Alexa! Don’t listen to that silent laser

Richi Jennings Industry analyst and editor, RJAssociates

Smart speakers are vulnerable to spoofing attacks using lasers. That’s the conclusion of research unveiled this week.

Apparently, a malicious laser beam, when modulated with speech patterns, will trigger the microphones used in all these devices. Despite there being no actual sound, the device thinks the attacker’s words are being spoken right next to it, and will do whatever the attacker tells it.

And an attacker could aim the laser through a window. In this week’s Security Blogwatch, we consider weird new threat models.

Your humble blogwatcher curated these bloggy bits for your entertainment. Not to mention: Alexa stumped.



What’s the craic? Nicole Perlroth reports—Researchers Say They Can Hack Alexa, Google Home or Siri:

Researchers in Japan and at the University of Michigan [say they’ve] found a way to take over … devices from hundreds of feet away by shining [a] laser … at the devices’ microphones. … They said they were able to hijack a voice assistant more than 350 feet away.

[It] was easy, the researchers said. … They said they could have easily switched light switches on and off, made online purchases … opened a front door protected by a smart lock [and] even could have remotely unlocked or started a car.

Takeshi Sugawara at the University of Electro-Communications in Japan and [Kevin] Fu, Daniel Genkin, Sara Rampazzi and Benjamin Cyr at the University of Michigan released their findings in a paper. … Genkin was also one of the researchers responsible for discovering … Meltdown and Spectre. [They] discovered that the microphones in the devices would respond to light as if it were sound.

Tesla, Ford, Amazon, Apple and Google … all said they were studying the conclusions. … An Amazon spokeswoman said … customers could … set up voice PINs for Alexa shopping or … use the mute button.

And Andy Greenberg adds color—Hackers Can Use Lasers to ‘Speak’:

In the spring of last year, cybersecurity researcher Takeshi Sugawara walked into the lab of Kevin Fu. … He wanted to show off a strange trick he'd discovered. Sugawara pointed a high-powered laser at the microphone of his iPad. [The] microphone had inexplicably converted the laser's light into an electrical signal, just as it would with sound.

Six months later [the] researchers have honed that curious photoacoustic quirk into something far more disturbing. … They can open garages, make online purchases, and cause all manner of mischief or malevolence.

When it comes to the actual physics of a microphone interpreting light as sound, the researchers [say] they don't know … what photoacoustic mechanics caused their light-as-speech effect. [But] given the physical nature of the vulnerability, no software update may be able to fix it [in] smart speakers.

These days, all vulnerabilities need a snappy name. Sugawara et al. employ reductive nomenclature—Light Commands:

We propose a new class of signal injection attacks on microphones based on the photoacoustic effect. … We show how an attacker can inject arbitrary audio signals to the target microphone by aiming an amplitude-modulated light at the microphone’s aperture.

This effect leads to a remote voice-command injection attack. … User authentication on these devices is often lacking or non-existent.

We have identified a semantic gap between the physics and specifications of MEMS (micro-electro-mechanical systems) microphones. [They] are particularly popular in mobile and embedded applications … due to their small footprints and low prices.

We find that 5 mW of laser power (the equivalent of a laser pointer) is sufficient to obtain full control over many popular Alexa and Google smart home devices. … [We can] unlock the target’s smart-lock protected front door, open garage doors, shop on e-commerce websites at the target’s expense, or even locate, unlock and start various vehicles (e.g., Tesla and Ford).

An attacker can build a cheap yet effective injection setup, using commercially available laser pointers and laser drivers. Moreover, by using infrared lasers and abusing volume features (e.g., whisper mode for Alexa devices) … an attacker can [minimize] the chance of discovery.
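The core trick the paper describes is amplitude modulation: the laser’s optical power is varied in step with an audio waveform, so the microphone “hears” the light. A minimal sketch of that idea, where the sample rate, modulation depth, and the 440 Hz tone standing in for a voice command are all illustrative assumptions (only the 5 mW figure comes from the paper):

```python
import numpy as np

# Sketch of amplitude-modulating an audio signal onto laser intensity.
# All parameters except the 5 mW power figure are illustrative assumptions.
fs = 16_000                                 # sample rate, Hz (assumption)
t = np.arange(fs) / fs                      # one second of time samples
audio = 0.8 * np.sin(2 * np.pi * 440 * t)   # stand-in for a voice command

P_bias = 5e-3                                # 5 mW average power, per the paper
depth = 0.5                                  # modulation depth (assumption)
power = P_bias * (1 + depth * audio)         # instantaneous optical power, W

assert power.min() >= 0                      # intensity can never go negative
```

The bias-plus-depth form matters because light intensity cannot swing negative the way a sound-pressure wave can; the audio rides on top of a constant carrier power.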

Here come the YouTube trolls. 10mintwo might be one:

Lame, absurdly improbable attack. … Like this is such a technical quandary it requires resurrecting Einstein's corpse from the grave to solve.

Here's the wildly complex solution I came up with, after 0.2 seconds of intense thought: … DON'T PUT THE DAMN THING NEXT TO A WINDOW. Now where's my personal uni. research lab and million $ budget?

Nothing to see here? Legatum_of_Kain disagrees:

The people … saying this is no big deal probably do not work in cybersecurity. Personal assistants usually have explicit or implicit access to contacts, phone calls, calendars, emails and all of these can be used to not only get around 2FA, but also to socially engineer further attacks.

This alone makes me want to disable voice assistants in all my devices just to kill that avenue of attack, let alone not ever own voice activated standalone devices.

So let’s review. @Pinboard adds to this vital list:

Threat model for home surveillance microphones so far:
  • parrots
  • small children
  • large children
  • lasers from across street
  • government subpoena
  • third-world gig economy workers
  • first-world gig economy workers
  • local police
  • remote police
  • giant tech monopolies.

Yes, but how does it work? athlon11 has educated guesses:

I did study MEMS in school … 9 years ago.

I have two simple guesses as to how this works: … I could see this being caused by photon pressure—the photons generating enough force to move the … diaphragm.

[Or] through heating of the diaphragm … which would cause expansion and contraction—causing it to vibrate as it does with sound.

But Baloroth doesn’t like that first hypothesis:

Light waves do carry momentum. However, the pressure from light is absolutely negligible, unless you're talking about megawatt+ lasers. More likely either:

a) the light heats the (very thin and sensitive) diaphragm, causing it to contract as if being hit by sound waves, or

b) some of the electronics in the microphone are themselves sensitive to light directly (basically, something is acting as a photodiode).
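A quick back-of-envelope check supports Baloroth’s point about radiation pressure. For a fully absorbed beam, the force is simply power divided by the speed of light:

```python
# Back-of-envelope radiation pressure check (5 mW figure from the paper).
P = 5e-3        # laser power, watts
c = 3.0e8       # speed of light, m/s
F = P / c       # force on a fully absorbing surface, newtons

print(f"{F:.1e} N")   # on the order of 1.7e-11 N
```

Tens of piconewtons is nowhere near enough to move a diaphragm audibly, which is why the heating and photodiode explanations are the more plausible candidates.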

Meanwhile, this Anonymous Coward asks the obvious question:

Why not just a giant … speaker? And they could just burn a hole through the garage door and walk in anyway, unless scifi has led me astray.

The moral of the story?

If you build voice-triggered apps, harden them against “novel” threat vectors such as this. Ask yourself: Is implied proximity enough?
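One concrete hardening step, echoing the voice-PIN mitigation Amazon mentions above, is to refuse sensitive actions on proximity alone and demand a spoken secret. A minimal sketch of that gating logic, where the intent names, PIN, and function are all hypothetical:

```python
from typing import Optional

# Hypothetical sketch: gate high-risk voice intents behind a spoken PIN,
# since a laser-injected command can't know the user's secret.
SENSITIVE_INTENTS = {"unlock_door", "open_garage", "purchase", "start_car"}

def handle_intent(intent: str, spoken_pin: Optional[str], expected_pin: str) -> str:
    if intent not in SENSITIVE_INTENTS:
        return "ok"                        # low-risk: execute as usual
    if spoken_pin != expected_pin:
        return "denied: PIN required"      # proximity alone is not enough
    return "ok"

print(handle_intent("unlock_door", None, "4711"))    # denied: PIN required
print(handle_intent("unlock_door", "4711", "4711"))  # ok
```

A PIN can of course be overheard and replayed, so it is a mitigation rather than a fix; but it does break the assumption that anything the microphone hears is an authorized, physically present user.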


And finally

Alexa pleads the Fifth

Previously in “And finally”

You have been reading Security Blogwatch by Richi Jennings. Richi curates the best bloggy bits, finest forums, and weirdest websites … so you don’t have to. Hate mail may be directed to @RiCHi. Ask your doctor before reading. Your mileage may vary. E&OE.

Image source: Douglas Muth (cc:by-sa)
