Researchers working for the French government have found a new way to silently and remotely take control of Siri or Google Now on your smartphone, according to a report from Wired. The method is ingenious but limited, working only over short distances and requiring the target to have a pair of microphone-enabled earphones plugged into their phone. The hack takes advantage of the fact that a headphone cord can double as a makeshift antenna: attackers beam electromagnetic signals at the wire, and the phone converts them into audio input. The target device then interprets these signals as regular voice commands.
The hack could be used to send texts and place calls
"The possibility of inducing parasitic signals on the audio front-end of voice-command-capable devices could raise critical security impacts," wrote the researchers, José Lopes Esteves and Chaouki Kasmi, who discovered the hack. Their work was carried out on behalf of French government agency ANSSI, which handles computer security, and was published by IEEE. Speaking to Wired, Vincent Strubel, the director of the ANSSI research group which discovered the hack, said "the sky is the limit here." He gave an example scenario of hackers using the technique in a crowded bar or airport: "Sending out some electromagnetic waves could cause a lot of smartphones to call a paid number and generate cash."
However, there are many limitations. For a start, in addition to having a pair of microphone-enabled earphones plugged in, the target phone needs to have voice commands enabled on the lock screen (this is the default for iPhones but not for many Android devices, and it's easily disabled on both operating systems), and users might simply notice that their phone was doing something strange. There's also the problem of range. Using equipment small enough to fit into a backpack, the researchers were able to make the hack work over a distance of just 6.5 feet, reports Wired. Even if the hardware were scaled up to fill a car, the range would extend to no more than roughly 16 feet. It's an ingenious method, certainly, and it reveals some latent security issues with the use of personal assistants, but there aren't many hacks you can simply walk away from.