
    Microsoft and Carnegie Mellon researchers put a touchscreen on the palm of your hand



    Researchers from Carnegie Mellon and Microsoft have created a new system that turns everyday objects into interactive touchscreens.


    Touchscreens have revolutionized the way we use computers, and now researchers are bringing the same kind of interaction to everyday objects with a system called OmniTouch. Developed by Carnegie Mellon Ph.D. student Chris Harrison, with Andy Wilson and Hrvoje Benko of Microsoft Research Redmond, the system consists of a shoulder-mounted pico projector that displays interface elements on any object in a user's immediate vicinity: a nearby wall, a notebook, or even their hand. A depth-sensing camera, similar to the Microsoft Kinect, then tracks the user's multitouch interactions with those elements in real time. The accuracy and speed, the researchers say, near that of actual physical touchscreens.
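    The core trick behind depth-camera touch sensing is simple to sketch: a fingertip counts as "touching" a surface when its depth reading comes within a small threshold of the surface behind it. The sketch below is purely illustrative; all names and values are assumptions, not details of the actual OmniTouch implementation.

    ```python
    # Hypothetical sketch of depth-based touch detection, the general idea behind
    # systems like OmniTouch. All names and thresholds here are illustrative.

    TOUCH_THRESHOLD_MM = 10  # fingertip-to-surface gap that counts as contact

    def is_touching(finger_depth_mm, surface_depth_mm, threshold=TOUCH_THRESHOLD_MM):
        """True if the fingertip is close enough to the surface to register a touch."""
        return 0 <= surface_depth_mm - finger_depth_mm <= threshold

    def classify_touches(fingertips, surface_depth):
        """Given (x, y, depth) fingertip readings and per-pixel surface depths,
        return the pixel coordinates of fingertips currently in contact."""
        return [
            (x, y) for (x, y, d) in fingertips
            if is_touching(d, surface_depth[(x, y)])
        ]

    # Example: two fingertips over a wall roughly 800 mm from the camera.
    surface = {(120, 80): 800, (200, 90): 805}
    fingers = [(120, 80, 795),   # 5 mm from the wall -> touch
               (200, 90, 760)]   # 45 mm away -> hover
    print(classify_touches(fingers, surface))  # -> [(120, 80)]
    ```

    A real system would first have to segment fingers from the depth image and estimate the surface depth under each fingertip, which is where most of the engineering effort lies.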

    As part of the proof of concept, the team ran the system through a number of trials, from simply tracking touches on the inside of a user's arm to a more elaborate drawing app mock-up, in which the subject would "paint" on a nearby wall with their finger, using their free hand as a virtual palette for choosing colors. The system is still clearly a research project, with crude graphics and an unwieldy rig, and it's not even the first projection-based touchscreen system out there: Light Blue Optics showed one at CES last year. OmniTouch impresses, however, with its responsiveness and sheer flexibility. The research project is set for its public debut this Wednesday at the UIST conference in Santa Barbara.