15 March 2017


Voice Commands Hidden In YouTube Videos Can Hack Your Smartphone



Think Twice Before Watching YouTube Videos

Researchers at UC Berkeley and Georgetown University have jointly demonstrated how distorted voice commands hidden in YouTube videos can be used to attack a smartphone. The research shows that harmful commands intelligible to our voice assistants can be hidden inside a YouTube video.

Speech recognition systems have advanced faster than the world expected. We're well aware of the presence of mind shown by Siri and Cortana. But these qualities of our beloved voice assistants have now become a matter of concern.


According to a research paper titled Hidden Voice Commands, an attacker can use a video from YouTube or any other source to hack your smartphone. It sounds next to impossible, but the idea is not as far-fetched as you might assume. The researchers are confident in the experiments they've carried out.

What exactly happens is that voice commands which can be understood by voice assistants are packed inside a YouTube video in such a way that a human can't make them out.

Once you start playing such a YouTube video through your computer or laptop speakers, the commands hidden in the malicious video can trigger operations on a smartphone lying nearby. Tasks like opening a particular web page or downloading malware can be carried out without the user's knowledge. This would allow the attacker to do things on your device that you might regret.
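The paper's actual audio-mangling pipeline is more sophisticated, but the underlying idea is that speech recognizers and human listeners key on different aspects of a sound. As a toy illustration of that idea (a minimal numpy sketch, not the researchers' method), the snippet below preserves a signal's overall magnitude spectrum while randomizing its phase, which scrambles the temporal structure a human listener relies on:

```python
import numpy as np

def distort(signal: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Randomize the phase of a signal while keeping its overall
    magnitude spectrum intact. The result carries the same spectral
    energy but its time-domain shape is smeared beyond recognition."""
    spectrum = np.fft.rfft(signal)
    phases = rng.uniform(0.0, 2.0 * np.pi, spectrum.shape)
    return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n=len(signal))

fs = 8000                      # sample rate (Hz)
t = np.arange(fs) / fs         # one second of audio

# A toy "voice command": a 200 Hz tone burst in the first quarter second.
command = np.sin(2 * np.pi * 200 * t) * (t < 0.25)

garbled = distort(command, np.random.default_rng(1))
```

The burst's energy, originally confined to the first quarter second, ends up smeared across the whole clip, even though the two signals' magnitude spectra are (numerically) identical.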

Two models are proposed in the research conducted at UC Berkeley and Georgetown University. The first is the black-box model, which involves testing with distorted voices that a human can still decipher with some concentration. One caveat the researchers describe is that a listener who already knows which command is hidden in the distorted audio can recognize it easily, even when hearing it only passively.

Listen for yourself:

The white-box model involves fully distorted voices that a human has next to no chance of understanding.

Example:

The researchers are also working on an alert system that would warn users if such a command is triggered on their smartphone. Their detection scheme, which combines a machine learning classifier with a challenge-response system, achieves 99.8% attack detection accuracy. You can visit the Hidden Voice Commands website for more such commands and further technical details about the research.
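The paper's actual defenses are more elaborate than anything shown here, but the classifier idea can be sketched in a few lines. In this toy numpy illustration (the feature and threshold are illustrative assumptions, not taken from the paper), audio is flagged as suspicious when its spectrum looks noise-like rather than harmonic, since heavily mangled commands lack the tonal structure of natural speech:

```python
import numpy as np

def spectral_flatness(signal: np.ndarray) -> float:
    """Geometric mean / arithmetic mean of the power spectrum:
    close to 1 for noise-like audio, close to 0 for tonal audio."""
    power = np.abs(np.fft.rfft(signal)) ** 2 + 1e-12
    return np.exp(np.mean(np.log(power))) / np.mean(power)

def looks_distorted(signal: np.ndarray, threshold: float = 0.1) -> bool:
    """Flag audio whose spectrum is noise-like (illustrative threshold)."""
    return spectral_flatness(signal) > threshold

fs = 8000
t = np.arange(fs) / fs

# Toy "natural speech": a harmonic stack with strong tonal structure.
speech = sum(np.sin(2 * np.pi * f * t) for f in (200, 400, 600))

# Toy "distorted command": noise-like audio with no harmonic structure.
distorted = np.random.default_rng(0).standard_normal(fs)

print(looks_distorted(speech), looks_distorted(distorted))
```

A single-feature threshold like this is trivially evaded; the point is only to show the shape of a detector, which the researchers pair with a challenge-response step (e.g., asking the user to confirm) to reach their reported accuracy.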

Security issues like these are a real worry at a time when tech giants are busy enhancing the abilities of their voice assistants. For example, Google is working on a version of Google Now that works without an internet connection.

Check out the demo video uploaded by the researchers, which shows how a smartphone responds to distorted voice commands:

Last Words

Happy belated Holi, friends! Hope you liked this article. Feel free to share your views about these developments in the comments below. And don't forget to share it with your friends!

