
Robots at the University of Tsukuba learn to read facial expressions


Robot training

Taking a page from how people learn, researchers at the University of Tsukuba in Japan have developed a method of teaching robots via facial expression. Humans and many animals pick up a huge amount of their behavior from the body language of those teaching them: the difference between a smile and a frown tells them whether they're doing something right or wrong. Applying the same approach to robots lets a trainer influence their actions in a way that's more intuitive and immediate than pressing a button or turning a dial.

The technology uses a small wireless electromyography (EMG) headband, which can accurately read smiles and frowns 97 percent of the time, and unlike camera-based facial recognition, it works with the wearer facing any direction and under any lighting. The video below shows how it's used, training a Nao robot in real time on whether it should hand over a ball or throw it. The robot's learning is almost palpable as it works out what the scientist wants based on whether she's smiling or frowning. It's a step toward a more natural way of interacting with robots, one that taps into universal human expressions rather than requiring a specific vocabulary.
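The researchers' exact learning rule isn't described here, but the core idea can be sketched as a simple reward loop: each EMG reading is classified as a smile (positive feedback) or a frown (negative feedback), and that signal nudges the robot toward or away from the action it just took. The Python sketch below is a hypothetical illustration of that loop, not the actual Nao system; classify_emg, choose_action, and the simulated teacher are all invented stand-ins.

```python
import random

# Hypothetical sketch: smile/frown feedback used as a reward signal.
# The EMG classifier, thresholds, and actions are illustrative stand-ins,
# not the researchers' implementation.

ACTIONS = ["give_ball", "throw_ball"]

def classify_emg(sample: float) -> int:
    """Toy stand-in for the EMG headband classifier.
    Returns +1 for a detected smile, -1 for a frown."""
    return 1 if sample > 0.5 else -1

def choose_action(values: dict) -> str:
    """Pick the action with the higher learned value (ties broken randomly)."""
    best = max(values.values())
    return random.choice([a for a, v in values.items() if v == best])

def train(n_trials: int = 20, learning_rate: float = 0.3) -> dict:
    values = {a: 0.0 for a in ACTIONS}  # learned preference for each action
    for _ in range(n_trials):
        action = choose_action(values)
        # In the real system this sample would come from the wireless EMG band;
        # here we simulate a teacher who smiles only when the ball is handed over.
        emg_sample = random.uniform(0.6, 1.0) if action == "give_ball" else random.uniform(0.0, 0.4)
        reward = classify_emg(emg_sample)  # +1 for a smile, -1 for a frown
        values[action] += learning_rate * (reward - values[action])
    return values

if __name__ == "__main__":
    print(train())  # "give_ball" ends up with the higher learned value
```

Run a few times and "give_ball" reliably wins out, which mirrors what the video shows: the teacher never touches a controller, and the expression alone steers the robot's behavior.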