Rescue robots could be incredibly useful tools in the future, diving into situations too dangerous for humans. But if you’ve seen current-generation robots try to tackle these sorts of scenarios, you’ll know that even the most advanced lack the coordination and flexibility of movement required. To fix this, a group of researchers from Japan is trying a new method of robot control, letting a human operate a bot essentially like a giant puppet.
This sort of remote control system isn’t new, and usually falls under a branch of research known as “telepresence.” However, scientists from the University of Tokyo say their method (presented last month at the IROS conference) is more advanced than its predecessors. Earlier systems used smaller robots or controlled only the upper half of the bot; theirs controls an entire robot as big as an adult human, using controllers from HTC’s Vive virtual reality system.
The Vive’s “lighthouse” sensors track its controllers in 3D space using infrared light. By strapping a controller to each foot and hand, the researchers were able to map their movements and send them as commands to their robot. The really clever part, though, is the intermediary software that ensures any motion sent to the robot is tweaked to fit its capabilities.
“For example, stepping at a walking speed is allowed, but running and jumping are forbidden,” researcher Ishiguro Yasushiro tells The Verge over email. Jerky movements are smoothed out, and fast ones slowed down, he says. “We force the robot to keep its gait always safe.” Yasushiro admits that the system still has “many weak points,” but claims it offers more flexibility and responsiveness than other designs.
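The intermediary layer described above could be sketched roughly like this: smooth each tracked target to remove jerky operator motion, then clamp how far the robot may move per control tick. This is a minimal illustrative sketch, not the researchers' actual system; the function name, filter, and numeric limits are all assumptions.

```python
# Hypothetical sketch of the safety layer between operator and robot.
# All names and values here are illustrative assumptions.

MAX_STEP = 0.5    # max displacement per control tick (~walking pace, assumed)
SMOOTHING = 0.3   # exponential-smoothing coefficient (assumed)

def safe_command(current, target, smoothed_prev):
    """Return (next position command, new smoothed target) for one axis."""
    # Exponential smoothing filters out jerky operator motion.
    smoothed = smoothed_prev + SMOOTHING * (target - smoothed_prev)
    # Clamp the per-tick displacement so fast motions are slowed down
    # to something the robot can safely execute.
    delta = smoothed - current
    if delta > MAX_STEP:
        delta = MAX_STEP
    elif delta < -MAX_STEP:
        delta = -MAX_STEP
    return current + delta, smoothed
```

Even if the operator’s hand leaps a large distance in one tick, the robot’s commanded position advances by at most `MAX_STEP`, so running- or jumping-scale motions never reach the hardware.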
The robot itself was built several years ago by the university and is known as JAXON. It previously competed in DARPA’s Robotics Challenge, which is meant to test bots in disaster scenarios. Although these humanoid machines are currently too unwieldy to be used in the field, it’s hoped that they’ll be more convenient in the future. After all, if they have to navigate a space built for humans (with doors, and handles, and valves), it’s handy to be human shaped.
Yasushiro says that he and his colleagues have so far mastered only slow bipedal walking, and that they want to tackle more actions, like walking up and down stairs, jumping, and even running. Virtual reality headsets could help operators see what the robot sees, and force-feedback suits could help them feel what it feels. “Eventually, we aim to achieve everything humans can through the humanoid robot,” says Yasushiro.