To many people, Google Glass already seems like something straight out of Minority Report, and newly granted gesture patents from the company will only reinforce that association. The patents outline an entirely new way of controlling Glass (or any other wearable computing device): one that tracks a user's hand gestures to work out what the wearer finds important or significant. One example cited puts a physical spin on the ubiquitous "like" action used across social media. Google's patent shows a user framing real-world objects with a heart-shaped hand gesture. Using its built-in camera, the wearable device would then analyze the framed content and intelligently "like" the highlighted object or location.
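
The patent describes this only at a high level, but the basic loop (spot a framing gesture, identify what it encloses, then share the result) is easy to picture. The Python sketch below is purely illustrative: every function, type, and label in it (detect_gesture, recognize_content, post_like, GestureDetection) is a hypothetical stand-in, not anything drawn from Google's actual implementation.

```python
# Illustrative sketch of a gesture-to-"like" pipeline. All names and
# return values here are hypothetical placeholders, not Google's code.

from dataclasses import dataclass
from typing import Optional


@dataclass
class BoundingBox:
    x: int
    y: int
    width: int
    height: int


@dataclass
class GestureDetection:
    gesture: str          # e.g. "heart", "right_angle", "closed_loop"
    region: BoundingBox   # area of the camera frame enclosed by the hands


def detect_gesture(frame_pixels) -> Optional[GestureDetection]:
    """Stub: a real device would run a hand-tracking model here."""
    # Pretend the camera saw a heart-shaped gesture framing part of the scene.
    return GestureDetection("heart", BoundingBox(320, 180, 640, 360))


def recognize_content(frame_pixels, region: BoundingBox) -> str:
    """Stub: crop the framed region and identify the object or place in it."""
    return "Golden Gate Bridge"


def post_like(label: str) -> None:
    """Stub: share the 'like' to a connected social account."""
    print(f"Liked: {label}")


def handle_camera_frame(frame_pixels) -> None:
    """Tie the steps together: gesture -> framed region -> recognition -> like."""
    detection = detect_gesture(frame_pixels)
    if detection and detection.gesture == "heart":
        label = recognize_content(frame_pixels, detection.region)
        post_like(label)


handle_camera_frame(frame_pixels=None)  # dummy frame, just to exercise the sketch
```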

[Image: heart-shaped hand gesture framing a landscape]

The possibilities would theoretically be endless for this sort of recognition-based user interface. Google says other gestures could include forming a right angle with your thumb and index finger, or moving your hand in the shape of a closed loop. Keep in mind that these patents aren't proof that Google intends to roll out such an unconventional way of interacting with technology anytime soon, but the company is clearly experimenting with ways of controlling Glass that go beyond head gestures, voice commands, and the device's "swipe bar."

[Image: right-angle hand gesture]