Why are all these robots staring at me?

Ah, the Massive Eye. Such a comforting symbol, and not at all reminiscent of omnipresent surveillance or literal towers of evil from Lord of the Rings. Which must be why so many tech companies have chosen to adopt a huge eyeball as a central design element of their home robots.

Just this week, we've seen appearances from two new eyeball-focused bots: the Big-I (which looks like a felt trashcan with a bowling ball stuck on top), and Moorebot (a smaller device that boasts an "eyeball dance" that will entertain the young and easily-amused while playing music). Moorebot's at the top of the page, and here's Big-I:

Not at all creepy, huh?

This is an established trend. Moorebot is itself a rip-off of the design of Jibo, an early "social robot for the home" with an abstract single eyeball that bounces, changes color, and scrunches itself up in a variety of ways to convey Pixar-like emotions. And even those robots that don't have just a single 'ball (I ran out of synonyms for "eyeball" pretty quickly) tend to use animated eyes as the central way of interacting with the user.

From left to right below we have the Asus Zenbo, Ninebot's Segway Robot, and Buddy by Blue Frog Robotics. (Yeah, I hate Buddy the most, too.) There are also bots like the new Furby Connect, which has eyes made from circular displays.

So why do all these robots focus on such a similar design? The answer is simple: emotions are easy, and robots are hard.

Apart from their mobility, none of these bots offer functions that you can't replicate by sticking a tablet or a smartphone on a shelf somewhere in your house. They can control your home's electronics, turning on lights or hooking up to burglar alarms — but so can smartphone apps. You can shout at them to play music, send texts, or schedule calendar events — but you can do exactly the same with Alexa, Siri, or Google Now. They sometimes add in a little extra functionality (like face recognition), but it's nothing you couldn't achieve with a mobile device.

The big tech companies know the limitations of this tech, which is why they're offering this functionality in static products like the Amazon Echo and Google Home. And it's why the makers of some personal robots make outlandish claims for their devices (like being able to detect Alzheimer's) in order to drum up interest. And completing actual household chores is far beyond the reach of even the most advanced prototype bots. To demonstrate the point, here's Boston Dynamics' robot being defeated by a banana peel again:


So, in the absence of anything of practical use to sell, the makers of personal robots are selling the dream. People want a cute robot they can interact with (they want Wall-E!), and animated eyes are a great shortcut to faking emotions and intimacy. We instinctively pay attention to eyes, and we read a lot into them. Plus, if you're going to incorporate a display into your robot somewhere, you may as well put a pair of eyes on it.

Like Tom Hanks and his best volleyball pal in Cast Away, if you want to bond with something, put a face on it. Just don't expect the face to make it any more useful.