As if we needed another reason to hate annoying voice-command technology, researchers at Georgetown University have discovered it could lead to attacks on your smartphone.
In a new study, computer science professor Micah Sherr explains that the imperfect speech-recognition tech on our precious devices can understand vocal attacks that we can’t (maybe that’s because the attacks sound like demon robots). Basically, even though Siri can’t seem to understand that we want to call “Mom” and not “Bob,” she has no trouble with Exorcist-style commands outside of your control.
According to Sherr, these hypothetical commands could be hidden in the audio track of otherwise harmless-seeming media (say, a cat video). If you or someone nearby is watching said cat video, your phone could inadvertently obey the hidden voice command, consigning it to a fate similar to that of the 10 million Android phones that were recently corrupted.
There is a slight silver lining to Sherr’s research (emphasis on slight): his team has created a few defenses against these attacks. One defense alerts you with a tone whenever a voice command is accepted, but that might not help much if you’re busy watching a video. They also created machine-learning techniques to help devices distinguish between human and digitally produced speech. But that approach is also shaky for long-term use, since ever-evolving speech-recognition systems will keep changing what the classifier has to catch.
As Sherr points out, these attacks are possible because companies tend to be pretty liberal when deciding what counts as speech in an effort to make sure their systems are efficient enough for public use.
And if the question is “Sales or security?”, well, you can probably guess the answer.