Google’s smart speakers are recording users when they least expect it, according to the temp-worker “language experts” hired by the company to listen to the snippets – which include some of users’ most private moments.
Google is able to claim it does not listen to the recordings Google Home devices are constantly generating only because it contracts the job out to temp workers. These “language experts,” as they are called, use a collaborative system built by the company to share and analyze sound snippets, assisting Google’s AI assistant in deciphering the nuances of human speech.
While Google emphasizes that it anonymizes the snippets, replacing the user’s name with a serial number, Belgian broadcaster VRT found that matching a voice snippet with its owner was not very difficult, given the ample supply of addresses and sensitive information in the recordings they were given. VRT listened to over 1,000 excerpts supplied by a Dutch contractor and discovered that more than 15 percent of them – 153 recordings in all – were made without the user’s knowledge.
In one “accidental” recording, a woman was in “definite distress,” the temp said. Other snippets included sex and pillow talk, fights, and professional phone calls packed with private information. While employees are instructed to treat account numbers and passwords as “sensitive,” they’re left to their own devices everywhere else, leading to potential errors in judgment…like leaking to the media, according to Google, which condemned the contractor who spoke to VRT while fiercely defending its own practices.
Insisting that Google has safeguards in place to prevent “false accepts” – recordings initiated without the user’s knowledge – Google Search project manager David Monsees wrote in a Thursday blog post that the use of “language experts” is “necessary to creating products like the Google Assistant” and claimed the experts only review 0.2 percent of audio fragments recorded by the device. Monsees warned the leaker that “Security and Privacy Response teams have been activated on the issue, are investigating, and …will take action.”
When Bloomberg reported that Amazon’s Alexa was using thousands of humans to transcribe and annotate recordings, many made without the user’s knowledge, Google gloated that its own assistant anonymized and distorted audio snippets. However, the recordings VRT heard were not distorted at all. Google Home snippets were “clear,” and Google Assistant, the cellphone app version, produced “telephone quality” audio.
Google Home owners who expect the company to respect their privacy might be wise to consult the history of the company, whose founders have made their distaste for the concept of privacy abundantly clear: an attempt to set up an AI “ethics council” lasted less than a week before collapsing, and a study published earlier this week showed over 1,000 apps for Google’s Android operating system collect data even when users deny them permission to do so. Earlier this year, users of the company’s Nest Secure home security system discovered the device had a hidden microphone when a downloadable update activated the feature.