Using a phone or smartwatch when you’re supposed to be talking to someone has become an accepted rudeness in the 21st century. So, a group of researchers has a possible solution to this minor societal ill: prototype smart glasses that let you control a computer just by rubbing your nose. Yes, you can reject a call, pause a video, or skip a song, simply by scratching your schnoz.
They aren’t (sadly) available to buy right now, or we’d all be wandering the streets, pawing at our noses like coked-up advertising execs. The glasses were designed as an experiment by researchers from South Korea’s KAIST University, Japan’s Keio University, the University of St. Andrews in Scotland, and Georgia Tech in the US, to create a way to “control a wearable computer without calling attention to the user in public.”
The specs work thanks to a trio of electrooculography (or EOG) sensors embedded in the bridge and nosepads of the frame, which measure the electric potential of the surrounding flesh. These types of sensors are usually used to record eye activity for doctors, but have also found their way into the film industry as a method of re-creating realistic eye movements in CGI.
Instead of recording the eye, though, these sensors are trained on the nose. When the wearer touches it in different ways, it changes the electric potential of the organ, and the researchers were able to identify the specific signatures of different motions. These include flicking and pushing the nose to one side or the other, and rubbing the bottom.
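The paper doesn’t publish its classifier, but the general idea of matching distinct signal signatures can be sketched with a toy example. Everything below — the signal shapes, the features, and the thresholds — is invented for illustration and isn’t taken from the ItchyNose study:

```python
# Toy sketch of signature-based gesture classification, loosely inspired
# by the idea described above. Assumed (not from the paper): a left flick
# shows a sharp negative spike, a right flick a sharp positive spike, and
# a rub a sustained back-and-forth oscillation.

def classify_gesture(samples):
    """Classify a window of hypothetical sensor samples into a gesture."""
    peak = max(samples)
    trough = min(samples)
    # Count zero crossings as a crude measure of oscillation (rubbing).
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    )
    if crossings >= 6:
        return "rub"
    if peak > 0.5 and peak > -trough:
        return "flick_right"
    if trough < -0.5:
        return "flick_left"
    return "none"

# Simulated windows (made-up data):
print(classify_gesture([0.3, -0.3, 0.4, -0.4, 0.3, -0.3, 0.4, -0.4]))  # rub
print(classify_gesture([0.0, 0.1, 0.9, 0.2, 0.0, 0.0, 0.0, 0.0]))      # flick_right
print(classify_gesture([0.0, -0.1, -0.8, -0.2, 0.0, 0.0, 0.0, 0.0]))   # flick_left
```

A real system would work on noisy multi-channel data and need per-user calibration, which is exactly the adaptation problem the researchers describe later on.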
The system (delightfully dubbed ItchyNose) could be used to minimize social awkwardness when using wearable computers, says the study’s first author Juyoung Lee. He has in mind the sort of device that might be worn directly in front of the eyes. Indeed, Google Glass had a similar-ish control system, using swipes down the side of the frame as an input.
“If an important text from a spouse came in during a business meeting, the user could check it and dismiss it quickly without calling undue attention to the interaction,” Lee tells The Verge. “Similarly, if the user had a list of names and faces to remind her of who is in the meeting, she could scroll through the list until she found the person whose name she forgot. These quick interactions can be very useful but are rare enough to not attract attention.”
There are some major limitations to the system: the movements aren’t suited to precise control and, if the demo video is anything to judge by, they have to be performed with a little gusto to get the message across.
Researcher Hui-Shyong Yeo says the current challenge facing the team is getting the system to distinguish between intentional and unintentional nose scratches. “Certain gestures like rubbing are very distinct and almost never false trigger. Other gestures, like the push gesture, have more false triggers per hour,” he says. “There is a trade-off between how consistent the users must be with the control gesture versus how many false triggers the system will have.”
The answer, says Yeo, is training the system to adapt to each individual’s own mannerisms. Once it nose enough, there’s snot a problem.