Intel's RealSense technology taught a drone to fly itself

And the same technology can help the blind see

Four men — one in a suit, three in black shirts — stood around a single drone buzzing restlessly in the middle. One man rushed at the drone, which quickly veered out of the way... right into the waiting arms of another. The four passed the drone among themselves, never touching or controlling it, steering it only by waving their arms and stepping toward it. The drone knew the men were coming, and it avoided them all on its own.

This is drone ping-pong, and it's what Intel thinks its RealSense technology can enable. During its CES 2015 keynote, the company focused almost entirely on the possibilities opened up by its real-time RealSense processing engine, which helps a device do everything from assess its position in space to read the emotions of the person in front of it. CEO Brian Krzanich showed people opening doors and unlocking their phones with facial recognition, but the presentation took an entirely different turn when the drones began to fly. Krzanich sent one through an obstacle course, where it navigated left, right, up, and down without a controller. Then, at the very end, it came to a door. It paused, hovering in place; when the doors opened, it flew straight through.

What if your jacket could vibrate when you were getting too close?

Krzanich then spoke with an Intel employee named Daryl, who was wearing a jacket with RealSense built in. "I can feel people moving around me" while wearing the jacket, he said, which helps him react faster. Daryl is losing his sight, and the jacket gives him more comfort and power in his environment. It can tell him when he's too close to something, or when something is moving too fast — in a very real way, it helps him see. The tech can be built into something as small as a button, and with Intel's Curie module it's not even hard to implement.

It may have been just a demo — and far from the first we've seen for RealSense — but it was a remarkable one. If Intel can make our devices not only do things when they're told, but actually adapt to their surroundings and their owners, it will open up completely different use cases for almost every device we own. RealSense is already being embedded in laptops to change how we interact with games and apps (a cook on stage during Intel's keynote gestured in midair to scroll through a recipe while his hands were dirty), but the potential is clearly bigger. A lot bigger.
