Bing Maps: A revealing discussion on Ch. 9

Charles Torre and Erik Meijer spent an hour talking to Bing Maps infrastructure architect Gur Kimchi for an episode of Expert to Expert on Channel 9, and crammed a lot of information and a few juicy tidbits into the session.

According to Kimchi, Bing Maps for Enterprise, which until a few weeks ago was known as Virtual Earth, is a team of some 600 people, including 260 developers (a third of whom hold PhDs).  He recalls the early days of VE, going all the way back to a 2005 Think Week paper called “Virtual Earth”.  Kimchi describes the premise of the paper:

“Imagine if you had mapping, search, and flight simulator, and what could happen”

An interesting note, as Flight Simulator has seemingly been dropped by Microsoft “to align Microsoft’s resources with our strategic priorities”.  Maybe this is reading too much into it, but might we see this Think Week dream come to life?  Will Flight Simulator live again in Bing Maps? (or at least using Bing Maps imagery?)

The discussion then moves to the whiteboard, where Kimchi draws up some of the basic architecture, and compares the infrastructure with Google:

All of Virtual Earth, end to end is about 1900 servers…

Each drop of Google Earth in each of their 22 to 25 data centers has over 15,000 servers, literally a couple of orders of magnitude away in terms of efficiency

(slightly paraphrased)

Of course Google Maps is much more popular at this point than Bing Maps, but the efficiencies built into the architecture are impressive.

Kimchi then pulls out the “long tail” diagram and discusses some of the business decisions about where to commit to high resolution imagery and low latency (sorry, Australia).  He hints at how Photosynth, which is now part of the Bing Maps team and already has some rudimentary integration, might make use of user generated content to produce automated 3D models, especially in some of those long tail scenarios.

And in one last hint at what may be to come for Bing Maps, Kimchi reveals that a team is working on a “very early” project to allow queries of map information over time.  Much of the historical data isn’t digitized, and making it available would be difficult and expensive, but the architecture of the system does allow for versioning, so once historical information is entered it could potentially be queried.
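Kimchi doesn’t describe the design, but the basic idea of a versioned store that answers “as of” queries can be sketched in a few lines.  To be clear, everything below (class names, the tile-ID scheme) is a hypothetical illustration, not anything from Bing Maps itself:

```python
import bisect
from collections import defaultdict

class VersionedTileStore:
    """Toy versioned store: each tile keeps a timestamped history,
    and a query returns the version in effect at a given time."""

    def __init__(self):
        # tile_id -> list of (timestamp, imagery) pairs, kept sorted
        self._history = defaultdict(list)

    def put(self, tile_id, timestamp, imagery):
        # Insert while keeping the tile's history sorted by timestamp
        bisect.insort(self._history[tile_id], (timestamp, imagery))

    def as_of(self, tile_id, timestamp):
        # Return the most recent imagery at or before `timestamp`
        history = self._history.get(tile_id)
        if not history:
            return None
        # Sentinel sorts after any real imagery value at the same timestamp
        i = bisect.bisect_right(history, (timestamp, chr(0x10FFFF)))
        return history[i - 1][1] if i else None

store = VersionedTileStore()
store.put("seattle/12/655/1432", 2005, "imagery-2005")
store.put("seattle/12/655/1432", 2009, "imagery-2009")
print(store.as_of("seattle/12/655/1432", 2007))  # -> imagery-2005
```

The point is that if every update is an append rather than an overwrite, time-travel queries come almost for free; the expensive part, as Kimchi notes, is digitizing the historical data in the first place.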

All in all an interesting discussion.  Check it out.