When it comes to modern interfaces, touchscreens don't provide much feedback, and air gestures like those that might be used in Project Glass are even more confusing. That lack of feedback is exactly what the Syntact gestural interface is designed to fix. Built by Ultrasonic Audio, Syntact uses an array of transducers to emit ultrasound waves that converge on a single point in midair, letting the user feel a version of the sound they're creating. At the same time, a USB camera picks up hand gestures. The result, as seen in the video, looks something like an air keyboard, but one where the user can actually tell which keys are being pressed.
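The basic idea behind focusing ultrasound from an array is simple timing: if each transducer fires with just the right delay, all of the waves arrive at the target point at the same instant and reinforce each other. As a rough illustration only (Ultrasonic Audio hasn't published how Syntact actually drives its array), here's a sketch of computing those delays for a hypothetical linear array:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def focus_delays(transducer_positions, focal_point):
    """Return per-transducer trigger delays (in seconds) so that
    waves from every transducer arrive at focal_point simultaneously,
    interfering constructively to create a pressure spot there."""
    dists = [math.dist(p, focal_point) for p in transducer_positions]
    d_max = max(dists)
    # The farthest transducer fires immediately; closer ones wait
    # long enough for their shorter travel time to even out.
    return [(d_max - d) / SPEED_OF_SOUND for d in dists]

# Hypothetical example: a 5-element array along x with 1 cm spacing,
# focusing on a point 20 cm above its center.
array = [(x * 0.01, 0.0) for x in range(-2, 3)]
delays = focus_delays(array, (0.0, 0.20))
```

In this toy setup the outermost elements are farthest from the focus, so they fire first with zero delay, while the center element waits a few microseconds. Real haptic arrays steer this focal point around in real time, which is presumably how a device like Syntact could follow a moving hand.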

So far, interested parties can get in touch with Ultrasonic Audio directly, but we don't see any sign that the device will be widely available, nor any word on pricing. It is, however, apparently going to be at London's Beam Festival next month, so hopefully we can see whether it's as amazing as it looks.