While some recent studies have found that investment in facial recognition technologies hasn't yet yielded great success, researchers at the Massachusetts Institute of Technology continue to move the ball forward with a system that can outperform humans at recognizing the emotion behind a person's smile. The group's study focused on the difference between a happy smile and one produced out of annoyance. To begin, the researchers asked subjects to act out two different emotional reactions, joy and frustration, while being recorded by webcam. MIT's algorithms analyzed the footage to judge which emotion was being expressed, as did a group of human viewers; both identified the acted-out emotions with similar accuracy. The researchers then added some nuance to the experiment, showing subjects a video of a cute baby and also asking them to fill out a form on a computer — one that would reset after they clicked "submit."

Both experiences produced smiles — one born of delight, the other of frustration — but while the human analysts guessed the emotional state correctly less than half the time, MIT's computer algorithm guessed correctly more than 90 percent of the time. The researchers see the technology being used to help train those who have trouble interpreting human expression (individuals with autism are mentioned as possible beneficiaries). There are also commercial implications, however, with one of the students involved in the project pointing out that marketers may find the information incredibly useful. "Just because a customer is smiling, that doesn't necessarily mean they're satisfied," Ehsan Hoque told MIT News. "The underlying meaning behind the smile is crucial."
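To give a flavor of how a machine might beat humans at this task, here is a minimal, purely illustrative sketch — not MIT's actual pipeline. It assumes a single hypothetical feature (how quickly a smile ramps up to its peak intensity, a temporal cue human observers may overlook) and synthetic data, then applies a simple nearest-centroid rule:

```python
# Illustrative sketch (NOT MIT's actual system): classifying smiles by how
# quickly they build to peak intensity. All data and features are synthetic.
import random

random.seed(0)

def smile_curve(rise_frames, total=30, peak=1.0):
    """Synthetic smile-intensity time series: ramps to peak over rise_frames."""
    return [min(peak, peak * (t + 1) / rise_frames) for t in range(total)]

def rise_time(curve, threshold=0.9):
    """Frames until intensity first reaches 90% of its peak."""
    peak = max(curve)
    for t, value in enumerate(curve):
        if value >= threshold * peak:
            return t
    return len(curve)

# Hypothetical training data: delighted smiles ramp slowly, frustrated ones fast.
delighted = [smile_curve(random.randint(15, 25)) for _ in range(20)]
frustrated = [smile_curve(random.randint(2, 6)) for _ in range(20)]

# Nearest-centroid classifier on the single rise-time feature.
mean_delighted = sum(rise_time(c) for c in delighted) / len(delighted)
mean_frustrated = sum(rise_time(c) for c in frustrated) / len(frustrated)

def classify(curve):
    r = rise_time(curve)
    if abs(r - mean_delighted) < abs(r - mean_frustrated):
        return "delighted"
    return "frustrated"

print(classify(smile_curve(20)))  # slow ramp -> "delighted"
print(classify(smile_curve(3)))   # fast ramp -> "frustrated"
```

The point of the sketch is that a smile's dynamics over time, not just its presence in a single frame, can carry the discriminating signal — which is the kind of cue an algorithm can measure consistently while a human glancing at a smiling face might miss it.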