Design implications for software capable of assembling static photos into zoomable, navigable spaces
February 15, 2010
I admit I am excited about the concept of metadata and augmented reality. Imagine being able to zoom into a map of your home town, go to a building, and look at the structure, the HVAC systems, energy use, or the connection to the smart grid. Watch this video and you will see that that potential is really here. In one part of the video Blaise Agüera y Arcas briefly mentions time travel. What? Well, of course: if you can do this with “current” imagery, why not with historic maps and images? Then, hold on, why not use interoperable gbXML data from your BIM model and run a “what if” forward? In other words, put that big condo complex on the map and see what its implications are for the site, the power grid, traffic, and so on.
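To make that gbXML “what if” idea a little more concrete, here is a minimal sketch of reading building data out of a gbXML-style file with Python's standard library. The fragment below is simplified and illustrative only (the real gbXML schema at gbxml.org is far richer), and `total_floor_area` is a hypothetical helper, not part of any BIM tool.

```python
import xml.etree.ElementTree as ET

# A simplified, illustrative gbXML-style fragment. Element names follow the
# gbXML convention (Campus/Building/Space), but this is a toy example.
SAMPLE = """\
<gbXML xmlns="http://www.gbxml.org/schema">
  <Campus>
    <Building id="condo-block-A">
      <Space id="unit-101"><Area>85.0</Area></Space>
      <Space id="unit-102"><Area>92.5</Area></Space>
    </Building>
  </Campus>
</gbXML>"""

NS = {"gb": "http://www.gbxml.org/schema"}

def total_floor_area(xml_text):
    """Sum the <Area> of every <Space> — a toy input metric for a 'what if'."""
    root = ET.fromstring(xml_text)
    return sum(float(a.text) for a in root.findall(".//gb:Space/gb:Area", NS))

print(total_floor_area(SAMPLE))  # 177.5
```

Once model data is interoperable like this, feeding it into a simulation of site, grid, or traffic impact becomes a data-plumbing problem rather than a re-entry problem.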
From the Photosynth about page:
Photosynth is a potent mixture of two independent breakthroughs: the ability to reconstruct the scene or object from a bunch of flat photographs, and the technology to bring that experience to virtually anyone over the Internet.
Using techniques from the field of computer vision, Photosynth examines images for similarities to each other and uses that information to estimate the shape of the subject and the vantage point each photo was taken from. With this information, we recreate the space and use it as a canvas to display and navigate through the photos.
Providing that experience requires viewing a LOT of data though—much more than you generally get at any one time by surfing someone’s photo album on the web. That’s where our Seadragon™ technology comes in: delivering just the pixels you need, exactly when you need them. It allows you to browse through dozens of 5, 10, or 100(!) megapixel photos effortlessly, without fiddling with a bunch of thumbnails and waiting around for everything to load.
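The “just the pixels you need” trick rests on cutting each image into a multi-resolution tile pyramid, so the viewer fetches only the tiles that intersect the current viewport at the current zoom. The arithmetic below is a sketch under the assumption of a Deep-Zoom-style layout (each level halves the one above; the 256-pixel tile size is a typical choice, not Seadragon's exact parameter).

```python
import math

TILE = 256  # tile edge in pixels (an assumed, typical value)

def num_levels(width, height):
    """Pyramid depth: level 0 is 1x1, the top level is full resolution."""
    return int(math.ceil(math.log2(max(width, height)))) + 1

def level_size(width, height, level, max_level):
    """Image dimensions at a given level (each level halves the one above)."""
    scale = 2 ** (max_level - level)
    return math.ceil(width / scale), math.ceil(height / scale)

def tiles_for_viewport(width, height, level, max_level, vx, vy, vw, vh):
    """(column, row) indices of the tiles a viewport touches at this level —
    the only tiles a Seadragon-style viewer would actually download."""
    lw, lh = level_size(width, height, level, max_level)
    x0, y0 = max(0, vx) // TILE, max(0, vy) // TILE
    x1 = min(lw - 1, vx + vw - 1) // TILE
    y1 = min(lh - 1, vy + vh - 1) // TILE
    return [(c, r) for r in range(y0, y1 + 1) for c in range(x0, x1 + 1)]

# A "100 megapixel" photo, 12500 x 8000 pixels:
W, H = 12500, 8000
top = num_levels(W, H) - 1          # index of the full-resolution level
print(num_levels(W, H))             # 15
# A 1024x768 window panned into the middle at full zoom touches only:
print(len(tiles_for_viewport(W, H, top, top, 6000, 4000, 1024, 768)))  # 20
```

Twenty 256-pixel tiles is roughly 1.3 megapixels of data for a 100-megapixel photo, which is why the browsing feels effortless: the other 99% is never sent.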