Computer researchers working for the French government claim they have been able to crack into mobile voice-controlled assistants like Siri, allowing them to remotely and silently give commands to smartphones.
The team, who work for the French government’s Network and Information Security Agency (ANSSI), used a “remote command injection” to access Siri and Google Now, the voice-controlled assistant apps available on iOS and Android smartphones.
As long as the phone has a pair of headphones with a built-in microphone plugged into the jack, the hackers can use the cord as an antenna to give mock voice commands.
By broadcasting a silent electromagnetic signal which the headphone wire picks up, they can fool the phone into thinking the signals coming from the headphones are simply regular voice commands.
Everything that can be done through voice commands can be done silently and remotely with this kind of technology.
By accessing Siri in this way, the hackers could command the phones to call certain people, send texts, make searches, or send the phone to a website containing malicious software, all without the owner realizing.
The hackers were able to hijack the phones from up to five meters away, with the use of an amplifier, laptop, antenna and radio.
As reported by Wired, two researchers on the team, Jose Lopes Esteves and Chaouki Kasmi, said the possibility that hackers could send these “parasitic” signals could have “critical security impacts” on the industry.
The method has its limits. Many Android phones don’t make the Google Now voice-activated assistant accessible from the home screen, as Siri is on the iPhone.
And most obviously, people with their headphones in their ears would be able to hear the distinctive sound of Siri or Google Now activating.
However, the team’s technology, combined with our appetite for futuristic and easy-to-use features on our smartphones, makes the method feasible as a hacking technique.
iOS 9, Apple’s latest mobile operating system, has a hands-free feature that allows owners to activate Siri simply by saying “Hey Siri”. Google Now has a similar feature, and users only have to speak to activate it. These features mean the hackers can easily activate the voice control before giving their instructions.
However, the hijack can work on earlier versions of iOS too. The team figured out a way to mimic the signal the headphones send when the home button is pressed, activating Siri even on older versions that don’t have the hands-free feature.
Vincent Strubel, director of the ANSSI team, told Wired that hackers could potentially go to a crowded train station or cafe, break into multiple devices, and get them to call a premium-rate phone number to generate money.
There’s no evidence that this hack has actually been used by criminals, and the team have tipped off Apple about the security breach.
The potential danger could be neutralized by fitting better shielding to headphone cords, or by requiring voice recognition or a spoken password before these voice control features can be used.