Thu | Jul 14, 2016 | 9:10 AM PDT

By Courtney Theim
SecureWorld Media

Think of Siri as your personal assistant? It may be taking orders from far more people than you bargained for.

A team of eight researchers from Georgetown University and UC Berkeley showed how devices that enable voice-activated commands (such as Siri and Google Now) could be hacked. According to the study, the hack is possible through "hidden voice commands that are unintelligible to human listeners but which are interpreted as commands by devices."

Because voice commands are almost always enabled by default (even on your lock screen), an attacker who can get audio to your phone can issue commands to it without ever touching it.

"Depending upon the device, attacks could lead to information leakage (e.g., posting the user's location on Twitter), cause denial of service (e.g., activating airplane mode), or serve as a stepping stone for further attacks (e.g., opening a web page hosting drive-by malware). Hidden voice commands may also be broadcast from a loudspeaker at an event or embedded in a trending YouTube video, compounding the reach of a single attack," the study describes.

The researchers describe two forms of the attack: "black-box" and "white-box." A black-box attack requires little knowledge of the target's voice-recognition software and can be as simple as playing a muffled, distorted voice command. A white-box attack assumes detailed knowledge of the recognition system, allowing the attacker to craft commands that are hidden from the victim completely.

Audio examples of the black-box and white-box attacks used by the researchers are available on their demonstration site.
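For intuition only, here is a minimal Python sketch of the black-box idea: degrade a recorded command with a low-pass filter and background noise so a human listener struggles to make it out, while a forgiving speech recognizer may still accept it. This is not the researchers' actual technique, and the file names and parameters are hypothetical.

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, lfilter

def muffle(in_path, out_path, cutoff_hz=1500.0, noise_level=0.02):
    """Toy 'muffling' of a voice command. Assumes a mono, 16-bit WAV file."""
    rate, audio = wavfile.read(in_path)
    audio = audio.astype(np.float64)
    peak = np.max(np.abs(audio))
    if peak > 0:
        audio /= peak  # normalize to [-1, 1]

    # Low-pass filter: strips the high-frequency detail human listeners rely
    # on to pick out words, while keeping much of the low-frequency energy.
    b, a = butter(4, cutoff_hz / (rate / 2), btype="low")
    filtered = lfilter(b, a, audio)

    # Layer in light noise to mask the command further.
    noisy = filtered + noise_level * np.random.randn(len(filtered))
    noisy = np.clip(noisy, -1.0, 1.0)

    wavfile.write(out_path, rate, (noisy * 32767).astype(np.int16))

# Hypothetical file names for illustration.
muffle("ok_google_command.wav", "muffled_command.wav")
```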

Stopping these attacks can be complicated. Speaker recognition software is one option, although a hacker who obtains samples of your voice could still bypass it. Removing Siri's access to your lock screen also helps: go to Settings, tap Touch ID & Passcode, then toggle off Siri under Allow Access When Locked.
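To illustrate the speaker-recognition idea, here is a minimal sketch (using the librosa audio library) that compares an incoming command's voice features against an enrolled sample and rejects mismatches. It is a toy, not a production speaker-verification system; the threshold and file names are hypothetical.

```python
import numpy as np
import librosa

def voiceprint(path):
    """Average MFCC vector as a crude voice fingerprint."""
    audio, sr = librosa.load(path, sr=16000)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=13)
    return mfcc.mean(axis=1)

def same_speaker(enrolled, incoming, threshold=0.9):
    """Cosine similarity between fingerprints; reject anything below the threshold."""
    cos = np.dot(enrolled, incoming) / (np.linalg.norm(enrolled) * np.linalg.norm(incoming))
    return cos >= threshold

# Hypothetical file names: one enrollment recording of the owner,
# one incoming command of unknown origin.
owner = voiceprint("owner_enrollment.wav")
command = voiceprint("incoming_command.wav")
if not same_speaker(owner, command):
    print("Command rejected: voice does not match the enrolled user.")
```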
