This year at SXSW, Sony opened up what it calls the “Wow Factory” in a converted warehouse on Trinity Street in Austin, where members of its Future Lab program have set up some of the coolest and weirdest hardware concepts out there. The Future Lab program is a research and development initiative that urges Sony employees to think more about human interaction and creativity, and not just bigger screens and faster processors.
One theme Sony hit upon at last year’s show and brought back in full force this go-around is projector-based touchscreen technology. The company has essentially taken its expertise in display projection and married it with some truly unique user interface design. The result is a pair of products that can turn any flat surface into a screen you can interact with using your hands, and that can also take real-world objects and turn them into a kind of augmented reality version of themselves.
Imagine placing a copy of Alice’s Adventures in Wonderland on the table and then being able to drag a character off the page, or running your finger along a plain wooden surface and turning it into a responsive piano made of light. One demo even takes angular blocks of white-painted wood and transforms the table into a scale model of a home — using only light from the projector.
Those are features of two products that Sony’s Future Lab has cooked up: one a prototype, the other a soon-to-be consumer device. The first is a projector that Sony originally brought out at last year’s SXSW. It sits directly above a tabletop, transforming the surface into an interactive display with 3D tracking of hand movements and objects, as well as depth sensing. The device can tell when an object is placed in view, when your hand is touching that object, and when a pointed finger is resting on the table’s surface.
Sony created some clever software demos to show it off. One was a live music app that used cylindrical plastic blocks to build an increasingly elaborate version of a classic Beethoven tune. Another was the Alice demo we first saw last year, which showed how the software could identify when a teacup or deck of cards was on the table and overlay some cool graphics that could even be manipulated by a user dragging their hands across the table.
The third and final demo was the scale model one, which let a Sony rep construct a virtual home out of blocks of wood. He then manipulated the scene by dropping physical objects on the table that transformed into virtual trees, and added light to the scene by hovering his hand over the objects.
The second projector, called the Xperia Touch and due out later this spring, is new this year at SXSW. It’s an entirely different piece of hardware that relies on the same responsive projector technology. Instead of sitting overhead, this version is a small modem-sized box that sits at the peak of an angled surface. The projector turns the table into a number of different musical instruments by blasting light at the surface, while sensors track what your hands are doing on the table to let you produce sounds.
You could draw circles to create a series of drum pads and strike them with hand taps. You could also string together multiple projector units into a single, unified piano and play it just like a real one. Both of the projectors run on a modified version of Android, letting Sony’s software accept traditional touchscreen input methods even though there’s no screen in play whatsoever.
It’s hard to fully grasp what’s going on without seeing it in action, and it truly feels like Sony has pioneered something groundbreaking here. This type of tech isn’t necessarily cost-effective right now, though, and neither of these devices feels close to becoming a consumer product you might actually see in someone’s office or home any time soon.
But even as proof of concepts, this hardware goes a long way in helping us envision what the future of interactivity might look like — especially when it takes away screens and relies solely on light.