'SixthSense' is a wearable gestural interface that augments the physical world around us with digital information and lets us use natural hand gestures to interact with that information.
The Technology, Entertainment, Design conference – known simply as TED – is an exclusive annual event that serves as a launch pad for “ideas worth spreading.” Its participants typically include high-profile figures such as former presidents, Nobel Prize winners and successful executives from a range of fields, alongside lesser-known entrepreneurs with innovative, world-changing ideas.
Hundreds of lectures and demonstrations have been delivered since 1984, and one in particular, from Pattie Maes' lab at MIT, was the buzz of TED this year. Dubbed 'SixthSense', the device is the brainchild of student Pranav Mistry and combines optical sensors with mobile connectivity to augment the physical world around us with digital information.
For instance, one could seamlessly pull up data on items while shopping for groceries, interact with that information through gestures, and even take photos by forming a frame with one's fingers. You really have to watch the video demo after the jump to fully understand what the project is about, but in short, it enables users to interact with any real-life surface using gestures while having constant access to relevant, real-time information from the web.
The SixthSense prototype consists of a pocket projector, a mirror and a camera, coupled in a pendant-like wearable device. Both the projector and the camera are connected to a mobile computing device in the user's pocket. The projector projects visual information, enabling surfaces, walls and physical objects around us to serve as interfaces, while the camera recognizes and tracks the user's hand gestures and physical objects using computer-vision techniques. The software processes the video stream captured by the camera and tracks the locations of colored markers (visual tracking fiducials) on the tips of the user's fingers. The movements and arrangements of these fiducials are interpreted as gestures that act as interaction instructions for the projected application interfaces. The maximum number of tracked fingers is constrained only by the number of unique fiducials, so SixthSense also supports multi-touch and multi-user interaction.
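To make the fiducial-tracking step concrete, here is a minimal sketch of how colored fingertip markers could be tracked in a video stream using OpenCV in Python. The marker colors, HSV thresholds and noise cutoff are illustrative assumptions, not values taken from the actual SixthSense software.

```python
import cv2
import numpy as np

# Illustrative HSV ranges for colored fingertip markers. The real
# SixthSense thresholds are not published, so these are assumptions.
MARKER_RANGES = {
    "red":    ((0, 120, 120), (10, 255, 255)),
    "green":  ((50, 120, 120), (70, 255, 255)),
    "blue":   ((110, 120, 120), (130, 255, 255)),
    "yellow": ((25, 120, 120), (35, 255, 255)),
}

def track_fiducials(frame):
    """Return {color: (x, y)} centroids for each visible marker."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    positions = {}
    for color, (lo, hi) in MARKER_RANGES.items():
        mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
        # OpenCV 4.x signature: returns (contours, hierarchy)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            continue
        largest = max(contours, key=cv2.contourArea)
        if cv2.contourArea(largest) < 50:  # ignore small noise blobs
            continue
        m = cv2.moments(largest)
        positions[color] = (int(m["m10"] / m["m00"]),
                            int(m["m01"] / m["m00"]))
    return positions

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    print(track_fiducials(frame))  # feed these into the gesture layer
    cv2.imshow("markers", frame)
    if cv2.waitKey(1) == 27:       # Esc to quit
        break
cap.release()
```

Per-color thresholding is about the simplest "simple computer-vision technique" that fits the description: each fingertip wears a uniquely colored cap, so one threshold pass per color yields one tracked point per finger.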
The SixthSense prototype implements several applications that demonstrate the usefulness, viability and flexibility of the system. The map application lets the user navigate a map displayed on a nearby surface using hand gestures similar to those on multi-touch systems, zooming in, zooming out or panning with intuitive hand movements. The drawing application lets the user draw on any surface by tracking the movements of the index fingertip. SixthSense also recognizes freehand gestures (postures): for example, it implements a gestural camera that photographs the scene the user is looking at when it detects the 'framing' gesture, after which the user can stop at any surface or wall and flick through the photos they have taken. The user can also draw icons or symbols in the air with the index finger, and the system recognizes those symbols as interaction instructions: drawing a magnifying-glass symbol takes the user to the map application, while drawing an '@' symbol lets the user check their mail. Finally, SixthSense augments the physical objects the user is interacting with by projecting additional information onto them. A newspaper, for example, can show live video news, dynamic information can be projected onto a regular piece of paper, and drawing a circle on the user's wrist projects an analog watch.
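As an illustration of how tracked fingertip positions might be turned into the interaction instructions described above, here is a hedged sketch of a pinch-to-zoom handler and a symbol-to-application dispatch table. The symbol names, actions and zoom logic are hypothetical stand-ins; the real SixthSense gesture classifier has not been published.

```python
import math

def spread(p, q):
    """Euclidean distance between two fingertip positions."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

class PinchZoom:
    """Turns the changing spread between thumb and index markers into a
    zoom factor, in the spirit of the map application (a sketch only)."""
    def __init__(self):
        self.prev = None

    def update(self, thumb, index):
        d = spread(thumb, index)
        factor = 1.0 if not self.prev else d / self.prev
        self.prev = d
        return factor  # > 1.0 zooms in, < 1.0 zooms out

# Hypothetical dispatch table mirroring the examples above: a recognized
# in-air symbol (from some separate classifier) maps to an application.
SYMBOL_ACTIONS = {
    "magnifying_glass": lambda: print("launch map application"),
    "at_symbol":        lambda: print("open mail"),
    "wrist_circle":     lambda: print("project analog watch"),
}

def on_symbol(symbol):
    action = SYMBOL_ACTIONS.get(symbol)
    if action:
        action()

# Example usage with made-up fingertip coordinates:
zoom = PinchZoom()
zoom.update((100, 100), (140, 140))         # first frame sets the baseline
print(zoom.update((100, 100), (180, 180)))  # fingers spread apart -> 2.0
on_symbol("at_symbol")                      # -> "open mail"
```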