A group of students from UC Berkeley has demonstrated how malicious commands to Siri, Google Assistant and Alexa can be hidden in recorded music or innocuous-sounding speech.

Simply playing the tracks over the radio, in a streaming music service or in a podcast could allow attackers to take control of a smart home …

The NY Times reports that the work builds on research that began in 2016.

The 2016 research demonstrated commands hidden in white noise, but the students have this month managed to do the same thing in music and spoken text.

Similar techniques have been demonstrated using ultrasonic frequencies.

They were able to hide the command “O.K. Google, browse to evil.com” in a recording of the spoken phrase “Without the data set, the article is useless.” Human listeners cannot discern the hidden command.

The Berkeley group also embedded the command in music files, including a four-second clip from Verdi’s “Requiem.”
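For those curious how such an attack is put together, the broad idea is to nudge an audio waveform, sample by sample, until a speech-recognition model transcribes an attacker-chosen phrase, while keeping the change small enough that listeners still hear only the original recording. The sketch below illustrates that idea only in outline: the tiny stand-in model, placeholder audio, target phrase handling and parameter choices are illustrative assumptions, not the researchers’ actual code or the commercial systems they targeted.

```python
# Minimal sketch of a "hidden command" attack: optimize a small perturbation
# so a toy speech-to-text model transcribes an attacker-chosen phrase.
# Everything here (model, audio, constants) is a stand-in for illustration.
import torch
import torch.nn as nn

torch.manual_seed(0)

ALPHABET = "abcdefghijklmnopqrstuvwxyz ."   # index 0 is reserved for the CTC blank

def encode(text):
    return torch.tensor([[ALPHABET.index(c) + 1 for c in text]])

class ToyASR(nn.Module):
    """Stand-in speech-to-text model: waveform -> per-frame character logits."""
    def __init__(self, n_classes=len(ALPHABET) + 1):
        super().__init__()
        self.conv = nn.Conv1d(1, 32, kernel_size=400, stride=160)  # crude framing
        self.head = nn.Linear(32, n_classes)

    def forward(self, wav):                                # wav: (batch, samples)
        frames = torch.relu(self.conv(wav.unsqueeze(1)))   # (batch, 32, frames)
        return self.head(frames.transpose(1, 2))           # (batch, frames, classes)

model = ToyASR()
original = torch.randn(1, 16000)            # one second of placeholder "music"
target = encode("ok google browse to evil.com")
delta = torch.zeros_like(original, requires_grad=True)
opt = torch.optim.Adam([delta], lr=1e-2)
ctc = nn.CTCLoss(blank=0)

for step in range(300):
    # Keep the perturbation small; in the real attack it must stay inaudible.
    perturbed = original + delta.clamp(-0.05, 0.05)
    logits = model(perturbed)
    log_probs = logits.log_softmax(-1).transpose(0, 1)     # (frames, batch, classes)
    in_len = torch.tensor([log_probs.size(0)])
    tgt_len = torch.tensor([target.size(1)])
    loss = ctc(log_probs, target, in_len, tgt_len) + 0.1 * delta.abs().mean()
    opt.zero_grad(); loss.backward(); opt.step()

with torch.no_grad():
    pred = model(original + delta.clamp(-0.05, 0.05)).argmax(-1)[0]
    # Greedy CTC decode: drop blanks (index 0) and collapse repeats.
    chars, prev = [], 0
    for idx in pred.tolist():
        if idx != 0 and idx != prev:
            chars.append(ALPHABET[idx - 1])
        prev = idx
    print("Toy model now hears:", "".join(chars))
```

Against a real assistant the perturbation also has to remain imperceptible and survive being played over the air through speakers and microphones, which is considerably harder than this toy setup suggests.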

The Berkeley researchers say that there is no indication of the attack method being used in the wild, but that could easily change.

Apple said that Siri has protections in place that limit the opportunities to execute this type of attack.

Nicholas Carlini, a fifth-year Ph.D. student in computer security at U.C. Berkeley and one of the paper’s authors, [said that] while there was no evidence that these techniques have left the lab, it may only be a matter of time before someone starts exploiting them. “My assumption is that the malicious people already employ people to do what I do,” he said.

Apple said its smart speaker, HomePod, is designed to prevent commands from doing things like unlocking doors, and it noted that iPhones and iPads must be unlocked before Siri will act on commands that access sensitive data or open apps and websites, among other measures.