Google Glass + Microsoft Kinect... futurist dreaming
Imagine Google Glass-style glasses with a Kinect sensor across the rim that tracks your hands, letting you interact with projected objects via direct manipulation. Certain gestures, like "opening a book," could be used to access menus. Another approach to "system gestures" would be to mimic Windows 8's swiping from the edges, where the edge objects are also projected as hot spots you can interact with. To avoid accidentally hitting or groping people, you would be advised to look down or up while gesturing. Or consider that you could have all of this available while working from the comfort of a chair, no office desk required. Ergonomically, it may be advisable to have an actual physical object that proxies the virtual one and allows for complex interaction, like a Rubik's Cube or a ball... Could this be another direction for NUI?
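To make the edge-swipe idea a bit more concrete, here is a minimal sketch of how such a "system gesture" might be recognized from the rim sensor's hand-tracking stream. Everything here is a hypothetical illustration, not a real Kinect or Glass API: hand positions are assumed to arrive as x coordinates normalized across the field of view (0.0 = left edge, 1.0 = right edge), and the thresholds are made-up tuning values.

```python
# Hypothetical sketch: recognizing a "swipe from the left edge" gesture
# from a stream of normalized hand x positions (0.0 = left edge of the
# field of view, 1.0 = right edge). Names and thresholds are
# illustrative assumptions, not a real Kinect/Glass API.

EDGE_ZONE = 0.05       # hand must start this close to the left edge
SWIPE_DISTANCE = 0.25  # how far inward the hand must travel
MAX_FRAMES = 15        # gesture must complete within this many frames

def detect_left_edge_swipe(xs):
    """Return True if the sequence of x positions looks like an
    inward swipe that began at the left edge hot spot."""
    for start, x0 in enumerate(xs):
        if x0 > EDGE_ZONE:
            continue  # this frame doesn't start at the edge
        window = xs[start:start + MAX_FRAMES]
        # inward motion must cover the required distance and end
        # farther from the edge than it began
        if max(window) - x0 >= SWIPE_DISTANCE and window[-1] > x0:
            return True
    return False

# A hand entering at the edge and sweeping inward triggers the gesture;
# ordinary mid-view hand movement does not.
print(detect_left_edge_swipe([0.02, 0.08, 0.15, 0.24, 0.31]))  # True
print(detect_left_edge_swipe([0.50, 0.55, 0.60, 0.65]))        # False
```

The same pattern would apply per edge (top, bottom, right), and in practice you would also gate on gesture speed and on whether the wearer is looking down or up, per the social-safety point above.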