Carnegie Mellon researcher Chris Harrison and his team have developed a screen technology called TapSense that could change the way we interact with touchscreens, particularly on mobile devices. Until now, touch input has been limited to the fingertip, but TapSense recognizes input from other parts of the finger, allowing a greater variety of touch responses. For example, a knuckle tap could bring up a context menu, much like a right mouse click, while a fingernail tap could close an application.

Perhaps inspired by iPhone drumming apps that use the built-in microphone to help estimate tap velocity, the researchers attached an acoustic sensor to an iPod Touch and used software to tell apart the distinct sound each part of the finger makes. The team also demonstrated TapSense on a tabletop setup somewhat reminiscent of Microsoft Surface, where the technology can distinguish between different types of pens, letting several people collaborate on the same drawing.

The researchers believe TapSense could be built into a phone if it were given raw access to the built-in microphone, but they'll have to work on its accuracy — fingertip input is recognized correctly only a little more than half the time. Watch TapSense in action below.
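The core idea — telling finger parts apart by the sound a tap makes — can be sketched as a simple nearest-centroid classifier. This is purely an illustration: the feature names (spectral centroid, decay time) and all numeric values below are invented assumptions, not details of TapSense's actual method.

```python
# Toy illustration only: TapSense's real classifier is not described here.
# We assume each finger part produces a characteristic acoustic signature,
# summarized as two hypothetical features: spectral centroid (Hz) and
# decay time (ms). The per-class centroid values are invented.
CENTROIDS = {
    "pad":     (400.0, 40.0),
    "tip":     (900.0, 25.0),
    "nail":    (2500.0, 10.0),
    "knuckle": (250.0, 60.0),
}

def classify_tap(spectral_centroid_hz, decay_ms):
    """Return the finger part whose centroid is nearest (Euclidean)."""
    def dist(c):
        dc, dd = c[0] - spectral_centroid_hz, c[1] - decay_ms
        return (dc * dc + dd * dd) ** 0.5
    return min(CENTROIDS, key=lambda part: dist(CENTROIDS[part]))

# A sharp, high-frequency click with a fast decay maps to "nail";
# a dull, low-frequency thud with a slow decay maps to "knuckle".
print(classify_tap(2400.0, 12.0))   # → nail
print(classify_tap(260.0, 58.0))    # → knuckle
```

In a real system the features would be extracted from the microphone signal (e.g. via an FFT over the tap's onset window), and accuracy would depend heavily on how separable those signatures are — which is exactly where the researchers say fingertip input still falls short.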