A team of researchers from the Massachusetts Institute of Technology has just released a video showing a drone autonomously navigating its way through tightly clustered trees at 30 miles per hour, the most advanced obstacle detection and avoidance system we've seen to date. It's called Pushbroom Stereo and was developed by Andrew Barry and Russ Tedrake, both with the Robot Locomotion Group at MIT's Computer Science and Artificial Intelligence Lab (CSAIL), a hotbed of innovation in the world of robotics and computer vision.
"Everyone is building drones these days, but nobody knows how to get them to stop running into things," says Barry, who developed the system as part of his thesis with Tedrake, an MIT professor. "Sensors like lidar are too heavy to put on small aircraft, and creating maps of the environment in advance isn’t practical. If we want drones that can fly quickly and navigate in the real world, we need better, faster algorithms." The results look like lost scenes from the Top Gun special effects sequences.
The drone was built using off-the-shelf parts, including a camera and processor no more powerful than what you'd find in today's smartphones. All told, Barry says it cost about $1,700 to build. We've seen drones, like the Asctec Firefly with Intel's RealSense camera, navigate through forests before, but they moved much more slowly than this. Barry says that's because most systems map the world at multiple depths, for example at distances of 2, 4, and 6 meters along the flight path. This new system maps obstacles at only a single distance, 10 meters out, making it much less computationally intensive, although what happens if a squirrel suddenly darts into the previously empty path between the drone and that 10-meter mark is not exactly clear.
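To make that single-distance trick concrete, here is a rough sketch of the kind of check such a system performs, written in Python for readability. It is not Barry and Tedrake's actual code, which runs onboard and far faster than a naive loop like this; the function name, block size, and matching threshold are illustrative assumptions. The idea is simply that, for a calibrated stereo camera pair, an object sitting exactly 10 meters away appears shifted by one known number of pixels between the left and right images, so the matcher only ever tests that one shift.

    import numpy as np

    def detect_on_single_depth_plane(left, right, disparity_px, block=5, sad_thresh=400.0):
        """Flag pixels where the left/right patches agree at ONE fixed disparity.

        disparity_px is the pixel shift corresponding to the chosen depth
        (roughly focal_length * baseline / 10 m in this example); a good match
        at that shift suggests something is sitting on the 10-meter plane.
        """
        h, w = left.shape
        half = block // 2
        hits = []
        for y in range(half, h - half):
            for x in range(half + disparity_px, w - half):
                patch_l = left[y - half:y + half + 1, x - half:x + half + 1].astype(float)
                patch_r = right[y - half:y + half + 1,
                                x - disparity_px - half:x - disparity_px + half + 1].astype(float)
                # Sum of absolute differences: a small score means the patches
                # match at this one disparity, i.e. an obstacle on the depth plane.
                if np.abs(patch_l - patch_r).sum() < sad_thresh:
                    hits.append((x, y))
        return hits

A conventional stereo matcher would repeat that comparison at dozens of candidate disparities for every pixel; testing just one cuts the per-frame work roughly by that factor, which is what makes the approach plausible on smartphone-class hardware.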