ok, anyone who watched Dennou Coil will immediately recognize where this is headed:
Although the miniaturization of computing devices allows us to carry computers in our pockets, keeping us continually connected to the digital world, there is no link between our digital devices and our interactions with the physical world. Information remains confined to paper or to a screen. SixthSense bridges this gap, bringing intangible digital information out into the tangible world and allowing us to interact with it via natural hand gestures. ‘SixthSense’ frees information from its confines by seamlessly integrating it with reality, thus making the entire world your computer.
The SixthSense prototype comprises a pocket projector, a mirror and a camera, coupled in a pendant-like wearable device. Both the projector and the camera are connected to a mobile computing device in the user’s pocket. The projector projects visual information, enabling surfaces, walls and physical objects around us to be used as interfaces, while the camera recognizes and tracks the user’s hand gestures and physical objects using computer-vision techniques. The software processes the video stream captured by the camera and tracks the locations of colored markers (visual tracking fiducials) on the tips of the user’s fingers using simple computer-vision techniques. The movements and arrangements of these fiducials are interpreted as gestures that act as interaction instructions for the projected application interfaces. The maximum number of tracked fingers is constrained only by the number of unique fiducials, so SixthSense also supports multi-touch and multi-user interaction.
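If you're wondering how the finger tracking works, it's not exotic. Here's a rough sketch of the kind of color-marker tracking the description is talking about, done in Python with OpenCV. This is not the project's actual code; the HSV color ranges, marker names and pinch threshold are just my own guesses for illustration:

# Rough sketch of color-marker ("fiducial") fingertip tracking in the spirit of
# the SixthSense description. NOT the project's code; color ranges, marker names
# and the pinch threshold below are illustrative assumptions.
import cv2
import numpy as np

# Assumed HSV ranges for the colored caps worn on the fingertips.
MARKERS = {
    "red":    ((0, 120, 120),   (10, 255, 255)),
    "green":  ((50, 100, 100),  (70, 255, 255)),
    "blue":   ((100, 100, 100), (130, 255, 255)),
    "yellow": ((20, 100, 100),  (35, 255, 255)),
}

def find_marker(hsv, lo, hi):
    """Return the centroid of the largest blob in the given HSV range, or None."""
    mask = cv2.inRange(hsv, np.array(lo, dtype=np.uint8), np.array(hi, dtype=np.uint8))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4
    if not contours:
        return None
    c = max(contours, key=cv2.contourArea)
    if cv2.contourArea(c) < 100:          # ignore specks of noise
        return None
    m = cv2.moments(c)
    return (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    positions = {name: find_marker(hsv, lo, hi) for name, (lo, hi) in MARKERS.items()}

    # Toy "gesture": treat the red and yellow markers being close together as a pinch.
    r, y = positions.get("red"), positions.get("yellow")
    if r and y and np.hypot(r[0] - y[0], r[1] - y[1]) < 40:
        cv2.putText(frame, "pinch", (10, 30), cv2.FONT_HERSHEY_SIMPLEX, 1, (255, 255, 255), 2)

    # Draw whatever markers were found.
    for name, pos in positions.items():
        if pos:
            cv2.circle(frame, pos, 8, (255, 255, 255), 2)
            cv2.putText(frame, name, (pos[0] + 10, pos[1]), cv2.FONT_HERSHEY_SIMPLEX, 0.5, (255, 255, 255), 1)

    cv2.imshow("markers", frame)
    if cv2.waitKey(1) & 0xFF == 27:       # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()

Each additional uniquely colored marker gets you another trackable fingertip, which is roughly all the "multi-touch and multi-user" claim amounts to at this level.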
The thing is a project at the MIT Media Lab and can be built for $350 in off-the-shelf hardware. And just to make the obviousness of it all even more apparent, compare the following:
I posited in this slideshare presentation on the future of the Web that mapping a virtual layer on top of reality would be “web 4.0”. I think I may have been more right than I realized.
I was going to drop a link to the TED talk so you could see the demonstration of the Sixth Sense prototype, but they have that in the video section of the link you supplied.
Very interesting technology, but they have a long way to go… First, their approach relies on having a surface to project the images onto. Second, the environments they project onto are extremely cluttered. It’s hard to say where this technology will end up in several years, but it is a unique start.
That said, have you ever scrolled through the TED talks (ted.com)? Lots of interesting seminars, usually running in the 20-minute range. Everything from scientists and technology leaders all the way to artists. There are definitely worse places on the net to waste some time.
I had about the same thought.
Now, couple Sixth Sense with this:
http://blog.makezine.com/archive/2009/02/flartoolkit_augmented_reality_for_f.html
(One person I showed this to immediately said: “Dennosuke!”)
Or, for a somewhat creepy-otaku version: http://www.youtube.com/watch?v=yCCx7zANsGE