Pranav Mistry's SixthSense: integrating information with the real world

'SixthSense' is a wearable gestural interface that augments the physical world around us with digital information and lets us use natural hand gestures to interact with that information.
We have evolved over millions of years to sense the world around us. When we encounter something, someone or some place, we use our five natural senses to perceive information about it; that information helps us make decisions and choose the right actions to take. But arguably the most useful information for making the right decision is not naturally perceivable with our five senses: the data, information and knowledge that mankind has accumulated about everything, which is increasingly available online.

Although the miniaturization of computing devices allows us to carry computers in our pockets, keeping us continually connected to the digital world, there is no link between our digital devices and our interactions with the physical world. Information is traditionally confined to paper or to a screen. SixthSense bridges this gap, bringing intangible digital information out into the tangible world and allowing us to interact with it via natural hand gestures. 'SixthSense' frees information from its confines by seamlessly integrating it with reality, thus making the entire world your computer.
The SixthSense prototype consists of a pocket projector, a mirror and a camera, coupled in a pendant-like wearable device. Both the projector and the camera are connected to a mobile computing device in the user's pocket. The projector projects visual information, enabling surfaces, walls and physical objects around us to be used as interfaces, while the camera recognizes and tracks the user's hand gestures and physical objects using computer-vision techniques.

The software processes the video stream captured by the camera and tracks the locations of colored markers (visual tracking fiducials) on the tips of the user's fingers using simple computer-vision techniques. The movements and arrangements of these fiducials are interpreted as gestures that act as interaction instructions for the projected application interfaces. Because the maximum number of tracked fingers is constrained only by the number of unique fiducials, SixthSense also supports multi-touch and multi-user interaction.
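To make the tracking step concrete, here is a minimal sketch of how colored fingertip markers might be located in each camera frame, assuming OpenCV with simple HSV thresholding; the marker colors, threshold ranges and pixel counts are illustrative placeholders, not the prototype's actual calibration.

# A minimal sketch of the fiducial-tracking idea: threshold each marker
# color in HSV space and take the centroid of the resulting mask.
import cv2
import numpy as np

# Hypothetical HSV ranges for four fingertip markers (placeholders).
MARKERS = {
    "red":    ((0, 120, 120),   (10, 255, 255)),
    "green":  ((50, 120, 120),  (70, 255, 255)),
    "blue":   ((110, 120, 120), (130, 255, 255)),
    "yellow": ((25, 120, 120),  (35, 255, 255)),
}

def track_fiducials(frame):
    """Return the pixel centroid of each visible colored marker."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    positions = {}
    for name, (lo, hi) in MARKERS.items():
        mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
        m = cv2.moments(mask)
        if m["m00"] > 255 * 50:  # at least ~50 marker pixels in view
            positions[name] = (m["m10"] / m["m00"], m["m01"] / m["m00"])
    return positions

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    print(track_fiducials(frame))  # feed these into the gesture layer
cap.release()

Everything downstream of this step only ever sees a handful of labeled (x, y) points per frame, which is what keeps the gesture layer simple.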
The SixthSense prototype implements several applications that demonstrate the usefulness, viability and flexibility of the system. The map application lets the user navigate a map displayed on a nearby surface using hand gestures similar to those of multi-touch systems, zooming in, zooming out or panning with intuitive hand movements.
The drawing application lets the user draw on any surface by tracking the movements of the user's index finger.
SixthSense also recognizes the user's freehand gestures (postures). For example, the system implements a gestural camera that takes a photo of the scene the user is looking at when it detects the 'framing' gesture. The user can then walk up to any surface or wall and flick through the photos he or she has taken.
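How a posture like 'framing' is actually detected is not spelled out here; one plausible heuristic, sketched below, flags the posture when the four fingertip fiducials (both thumbs and both index fingers) sit near the four corners of their own bounding box. The function, its size cutoff and its tolerance are assumptions for illustration, not the system's published method.

# A sketch of a 'framing'-posture heuristic over four fiducial points.
def looks_like_frame(points, tol=0.2):
    """points: four (x, y) fiducial positions; tol: corner slack as a
    fraction of the frame's width/height."""
    if len(points) != 4:
        return False
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    w, h = max(xs) - min(xs), max(ys) - min(ys)
    if w < 50 or h < 50:               # too small to read as a photo frame
        return False
    corners = [(min(xs), min(ys)), (max(xs), min(ys)),
               (min(xs), max(ys)), (max(xs), max(ys))]
    unclaimed = list(points)
    # every corner must be claimed by exactly one fingertip
    for cx, cy in corners:
        near = [p for p in unclaimed
                if abs(p[0] - cx) < tol * w and abs(p[1] - cy) < tol * h]
        if not near:
            return False
        unclaimed.remove(near[0])
    return True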
SixthSense also lets the user draw icons or symbols in the air with the index finger and recognizes those symbols as interaction instructions. For example, drawing a magnifying-glass symbol takes the user to the map application, while drawing an '@' symbol lets the user check email.
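The post does not describe the recognizer itself; a common approach for this kind of unistroke symbol input is simple template matching in the spirit of the $1 recognizer (resample the stroke, normalize it, compare point by point), sketched below. The templates, point count and scoring are placeholders.

# A sketch of in-air symbol recognition via template matching.
import math

def resample(stroke, n=32):
    """Resample a stroke (list of (x, y)) to n evenly spaced points."""
    total = sum(math.dist(a, b) for a, b in zip(stroke, stroke[1:]))
    step, acc, out = total / (n - 1), 0.0, [stroke[0]]
    for a, b in zip(stroke, stroke[1:]):
        d = math.dist(a, b)
        while d > 0 and acc + d >= step:
            t = (step - acc) / d
            a = (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))
            out.append(a)
            d = math.dist(a, b)
            acc = 0.0
        acc += d
    while len(out) < n:                # float rounding can drop one point
        out.append(stroke[-1])
    return out

def normalize(stroke):
    """Translate to the centroid and scale to a unit bounding box."""
    xs = [p[0] for p in stroke]
    ys = [p[1] for p in stroke]
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    scale = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    return [((x - cx) / scale, (y - cy) / scale) for x, y in stroke]

def classify(stroke, templates):
    """Return the template name with the smallest mean point distance."""
    probe = normalize(resample(stroke))
    def score(name):
        return sum(math.dist(p, q)
                   for p, q in zip(probe, templates[name])) / len(probe)
    return min(templates, key=score)

# templates would map names like "@" or "magnifier" to pre-recorded
# strokes passed through the same pipeline:
# templates = {"@": normalize(resample(at_points)), ...}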
The SixthSense system also augments the physical objects the user is interacting with by projecting relevant information onto them. For example, a newspaper can show live video news, and dynamic information can be provided on a plain sheet of paper. Drawing a circle on the user's wrist projects an analog watch onto it.