MIT scientists are developing a camera system to enable an unmanned aircraft to respond to gestures from the crew on the deck of an aircraft carrier. The team uses a combination of proprietary software and a stereo camera to detect posture, arm, and hand movements, and compares the data against standard Air Force gestures. The system is very similar to Microsoft's Kinect, which unfortunately wasn't available when the project began. Although the technology certainly belongs in the same family tree, the researchers face some unique challenges in accurately tracking the crew members: an aircraft carrier's deck is not a static environment, and the crew is always on the move. Because of this, it's not possible for the system to track movement continuously, so it instead focuses on key body-pose sequences.
An algorithm has been developed that focuses on finding one of 24 gestures within sequences of roughly 60 frames. It analyses video in small chunks, looking at overall body shape and arm position to weigh up the probability that a given frame contains the start of a gesture. Once it thinks it has found such a key frame, it then analyses the frames immediately after it to estimate the probability that a gesture is taking place. After identifying the movement, it looks at hand shape to determine the exact gesture. Vetting the data in this way helps the system improve its accuracy, but it still isn't quite up to scratch: the system is currently only 76 percent accurate, a long way off the level needed for military applications. Moving forward, the researchers plan to retool the software to analyse arm and hand position simultaneously, which they believe will close that gap.
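The staged process described above can be sketched as a simple sliding-window pipeline: score each frame as a potential gesture start, check the window that follows for gesture-like motion, then classify the hand shape. Everything below is an illustrative assumption — the thresholds, frame features, and scoring functions are stand-ins for the researchers' actual models:

```python
WINDOW = 60            # sequence length examined per candidate (per the article)
START_THRESHOLD = 0.5  # assumed cut-offs; the real model's values are unknown
GESTURE_THRESHOLD = 0.5

def spot_gestures(frames, start_prob, gesture_prob, classify_hand):
    """Three-stage gesture spotting over a frame stream.

    start_prob(frame)    -> probability the frame begins a gesture (body/arm cues)
    gesture_prob(chunk)  -> probability the chunk contains a gesture
    classify_hand(chunk) -> gesture id (0..23) from hand-shape cues
    """
    detections = []
    i = 0
    while i + WINDOW <= len(frames):
        # Stage 1: does this frame look like the start of a gesture?
        if start_prob(frames[i]) >= START_THRESHOLD:
            chunk = frames[i:i + WINDOW]
            # Stage 2: do the following frames contain a gesture?
            if gesture_prob(chunk) >= GESTURE_THRESHOLD:
                # Stage 3: use hand shape to decide which gesture it is.
                detections.append((i, classify_hand(chunk)))
                i += WINDOW  # skip past the detected sequence
                continue
        i += 1
    return detections

# Synthetic demo: 200 frames of noise with one planted gesture (id 7) at frame 50.
frames = [{"start": 0.1, "motion": 0.1, "hand": 0} for _ in range(200)]
for j in range(50, 110):
    frames[j] = {"start": 1.0 if j == 50 else 0.0, "motion": 0.9, "hand": 7}

hits = spot_gestures(
    frames,
    start_prob=lambda f: f["start"],
    gesture_prob=lambda c: sum(f["motion"] for f in c) / len(c),
    classify_hand=lambda c: max(set(f["hand"] for f in c),
                                key=[f["hand"] for f in c].count),
)
print(hits)  # → [(50, 7)]
```

Gating the expensive hand-shape classification behind the two cheaper checks is what lets a system like this keep up with live video without tracking every frame.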