Intel's "perceptual computing" initiative might still be a loose collection of motion control and voice recognition technologies right now, but that doesn't mean they aren't impressive feats: We just tried a few computers equipped with a Kinect-like 3D camera that let us play video games merely by waving a hand, and let us digitally reach right into the screen. It's pretty simple, really: Intel's tracking the motion of your fingers with an accuracy now that can scan each and every joint, then recreate that motion in real time.
In Portal 2, that means you can reach out and grab a cube, then rotate it in 3D space just by rotating your hand the slightest amount. Push your hand toward the screen and the cube moves deeper into the scene as well. That makes for some delightfully interactive puzzles, but also some destructive fun, as certain segments offer you pinpoint control over a deadly laser beam to destroy those pesky turrets. The only letdown is that "grabbing" is a specific gesture (you make a fist); the game doesn't actually recognize when your fingers wrap around a virtual object. Just think: only two years ago, these exact same demos required a physical game controller and an electromagnetic orb.
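If you're curious how that fist-means-grab logic could work, here's a minimal sketch, assuming the tracker reports named joints as (x, y, z) positions in metres. The joint names and the 5 cm threshold are invented for illustration, not taken from Intel's SDK or from the Portal 2 demo itself.

```python
# Hypothetical gesture logic: a closed fist counts as "grab", and pushing
# the hand toward the screen (its depth value) pushes the cube deeper.

FINGERTIPS = ("thumb_tip", "index_tip", "middle_tip", "ring_tip", "pinky_tip")

def distance(a, b):
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

def is_fist(joints: dict[str, tuple[float, float, float]]) -> bool:
    """Treat the hand as a fist when every fingertip sits near the palm."""
    palm = joints["palm"]
    return all(distance(joints[tip], palm) < 0.05 for tip in FINGERTIPS)

def update_cube(cube, joints) -> bool:
    """Hold the cube while the fist is closed; hand depth drives cube depth."""
    if is_fist(joints):
        cube.depth = joints["palm"][2]   # push toward the screen, cube moves deeper
        return True                      # cube is held this frame
    return False                         # open hand releases the cube
```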
Even more impressive was a demo where we could hold out a hand and the computer would immediately create a digital clone on screen, right down to the motion of each and every joint of each finger, flexing as we flexed our digits. It's downright creepy, particularly when it fails to recognize the orientation of a hand and turns it into a flopping, disembodied monstrosity. We digress.
Intel bills perceptual computing as giving computers the ability to know what you're doing and how you're feeling, and to respond appropriately, but we can't help but think that the killer app here might be virtual reality.