Friday, April 21, 2006

Quintillion

So where did this McDonald's Quintillion come from? How did I get involved?

Second question first. I got involved because, circa 1991, I was working at Argonne National Laboratory and my immediate supervisor, Bob (all names will be first name only), happened to have a friend, Gary, who worked in the Marketing department at McDonald's. Earlier in 1991 Bob (Smart Adult), David (Smart Kid #1), and I (Smart Kid #2) had started a company together (we were later joined by Smart Kid #3, Gordon). Gary was looking for help in creating a mapping system to do analysis of existing stores and to determine where new stores should go. Oh... Did I mention that we had spent the previous 3 years working on mapping systems for the DoD at ANL?!? (That is a bunch of stories for another time.) I'm pretty sure that had some bearing on Gary thinking we could help him create the software he wanted. So one thing led to another and the next thing you know we were working for McDonald's on what was to become Quintillion.

First question: Where did it really come from, though?

Well, Quintillion, as it is known today, was really two different products. There was Impact, the executable responsible for determining what impact the placement of one store would have on the trade areas of other existing stores, and Speedy, the application that did geographic analysis of existing stores: look at a map, point and click, and get lots of information about whatever you clicked on.

Here are a couple of examples of what I'm talking about; see the link below for a more detailed view. On the left you see what I think were impact rings and dot maps of where customers came from and how far they traveled. This information came from surveys done in-store and then fed into the system.

On the right you see one of the cooler comparison tools that we (the royal we; I don't recall whose idea this actually was) came up with. It allowed us to compare how two stores were doing against some other computed metric. It made it very easy to overlay stores and see how they matched up on key metrics and what made a store good or bad.

Great, you say. What's so special about it? Well, first off, do you realize that McDonald's is an absolutely huge real estate company? If folks who own land find out McDonald's is interested, the price goes up. If competitors find out, they buy up land before McDonald's has a chance to run all its due diligence. It is simply amazing how many dollars are associated with real estate transactions at McDonald's. And I'm not talking about just the US either! It therefore seemed like it would pay if the market managers could have an automated way of seeing information about their current stores and weighing information about future sites. Apparently there weren't that many folks doing this type of GIS integration back then. There were GIS applications from the players you would be familiar with today, but none that really provided such an integrated vertical solution. A custom app was deemed the best solution, and it actually wasn't started by us. A couple of other guys started the development, and our team took over after the initial team had some issues. How come the McDonald's IT department wasn't doing the development? That is a long and interesting story in and of itself that I will cover some other time. Suffice it to say that the belief was the talent and ability to execute to the timeframe weren't there.

The applications themselves were initially developed on a Sun workstation in C utilizing the SunView window system. The underlying data to create the vector maps could have come from a number of sources, but it turned out that the easiest solution was to utilize the US Census TIGER data files. Now there were/are plenty of problems with the data in these files, but for our purposes it worked great. During development we were also asked to overlay the vector maps on top of satellite imagery (Google Hybrid view anyone?). Our team developed the algorithms to convert the map projections to a common format and warp the image data to the selected projection to allow for very accurate overlay.

Store data was ground-truthed in a couple of ways. Initially we would find the street or cross roads a store was located on and then, using the map, click the point to register the lat/lon where a store icon would appear. Later we were also able to utilize a GPS receiver. We also associated picture information of the store (layout, exterior, interior shots) with the store icon on the map and then began the work of associating the information obtained from in-store surveys with it as well. All this information was then available with just a couple of clicks. This led to some fascinating discoveries in our pilot markets (Dallas, Nashville? and Washington DC?), such as: how people wouldn't cross railroad tracks to visit a store that was closer to them than another, how economic boundaries can act like physical ones, the impact of large corporate or college campuses on stores and their business, and many more.
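To give a flavor of the georeferencing involved (the actual projection math is long gone, so this is a minimal sketch assuming a simple equirectangular mapping, with made-up structure and parameter names), converting between lat/lon and image pixels for the overlay and the point-and-click registration looked conceptually like this:

    #include <stdio.h>

    /* Hypothetical georeference for a raster: upper-left corner lat/lon
     * plus a degrees-per-pixel scale.  The real code handled full
     * projection conversion and image warping; this assumes a flat
     * equirectangular mapping purely for illustration. */
    typedef struct {
        double ul_lat, ul_lon;  /* upper-left corner of the image */
        double deg_per_px;      /* degrees of lat/lon per pixel   */
    } GeoRef;

    /* Forward: lat/lon -> pixel, e.g. to place a store icon. */
    static void latlon_to_pixel(const GeoRef *g, double lat, double lon,
                                int *px, int *py)
    {
        *px = (int)((lon - g->ul_lon) / g->deg_per_px);
        *py = (int)((g->ul_lat - lat) / g->deg_per_px);
    }

    /* Inverse: pixel -> lat/lon, e.g. when clicking a cross street
     * on the map to register a store's location. */
    static void pixel_to_latlon(const GeoRef *g, int px, int py,
                                double *lat, double *lon)
    {
        *lon = g->ul_lon + px * g->deg_per_px;
        *lat = g->ul_lat - py * g->deg_per_px;
    }

    int main(void)
    {
        GeoRef map = { 33.00, -97.00, 0.001 };  /* made-up numbers */
        int px, py;
        latlon_to_pixel(&map, 32.7767, -96.7970, &px, &py);
        printf("store icon at pixel (%d, %d)\n", px, py);
        return 0;
    }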

On the technology side, things continued to evolve. Because of internal politics between the Marketing and IT groups, it was necessary to port the application from the initial Sun/SunView platform to a more portable, and therefore politically acceptable, Unix/X-Windows system. It also required us to be able to run the client on a PC. This was accomplished by utilizing the XView toolkit for porting SunView applications to the OpenLook GUI for X-Windows. Then we had to port the Sun-specific build environment and code segments to more generic cross-platform Unix calls (stuff like bcopy and memcpy, vfork, etc.) and finally make it work on a little-endian system to boot. The IT group had "challenged" the team to support many platforms already within the business, which meant the server code ran on Sun, Ultrix, and AIX, and the display/client ran on those as well as a PC.
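For the curious, the bcopy/memcpy issue was typical of those porting chores. bcopy() is a BSD-ism that takes its arguments in the opposite order from ANSI memcpy() and is defined to handle overlapping regions, so a shim along these lines (the HAVE_BCOPY guard is an illustrative name, not our actual build configuration) papers over the difference:

    #include <string.h>

    /* bcopy(src, dst, n) vs. memcpy(dst, src, n): the argument order is
     * reversed, and bcopy is documented to tolerate overlapping buffers,
     * so memmove() is the safe ANSI stand-in. */
    #ifndef HAVE_BCOPY
    #define bcopy(src, dst, n) memmove((dst), (src), (n))
    #endif

vfork got similar treatment: in the common vfork-then-exec idiom it can be replaced by a plain fork(), at some cost in copy overhead on systems of that era.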

What that means is that we had to handle big/little-endian issues in both the server and client code. X really sucked when it came to images. Colormaps, overlays, and pixmap byte orders, though somewhat portable on paper, in reality still had to be specifically hand-coded (Java anyone?!?) for the particular servers you were running against. The major problems were the DEC server, the PC Hummingbird X server, and the paltry memory available on the PC. Now the PC was state-of-the-art at the time, but it was still just a 33 MHz 486 with an 8-bit ATI graphics card. It had a state-of-the-art EISA bus, sigh, and it ran the brand-new Windows 3.1 and the Hummingbird X server. Very little memory (especially for displaying the satellite pixmap overlays) caused most of our problems with this setup, but we also fought driver issues with both the network and video cards.
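As an illustration of the byte-order hand-coding (the helper names here are mine, not from the original source), the client had to compare its native byte order against what the X server reported and swap pixel data before pushing an image, roughly like this:

    #include <X11/Xlib.h>
    #include <stdint.h>

    /* Is this client little-endian (LSBFirst, in X terms)? */
    static int client_is_lsb_first(void)
    {
        unsigned int probe = 1;
        return *(unsigned char *)&probe == 1;
    }

    static uint32_t swap32(uint32_t v)
    {
        return (v << 24) | ((v & 0x0000FF00u) << 8) |
               ((v >> 8) & 0x0000FF00u) | (v >> 24);
    }

    /* Swap a buffer of 32-bit pixels in place when the client's native
     * order differs from the server's preferred pixmap byte order --
     * e.g. a little-endian PC running Hummingbird displaying data
     * prepared on a big-endian Sun. */
    static void fix_pixel_order(Display *dpy, uint32_t *pixels, long n)
    {
        int server_lsb = (ImageByteOrder(dpy) == LSBFirst);
        if (server_lsb != client_is_lsb_first()) {
            long i;
            for (i = 0; i < n; i++)
                pixels[i] = swap32(pixels[i]);
        }
    }

Multiply that kind of fixup across colormap handling and overlay planes and you get a sense of why each new server was its own little porting project.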

In time, as we looked to create the next generation of the application, we ported it to XView3 (from the original XView2 port), integrated a custom object-oriented database to support the object model we had created, and finally built a "bridge" to ingest data from Oracle databases (which customers seemed to have or ask for).

In hindsight none of this stuff seems quite so *wow* anymore, but back then we were doing stuff that very few others had really undertaken. In the end the product ended up with Dakota (see link below). In a future post I'll talk about how close we came to spinning out a stand-alone company to sell Speedy/Impact, some of the potential customers we created custom demos for, the political intrigue that was behind the creation of the product, and how it ultimately ended up where it did.

The Dakota Quintillion web site
