Mon. Apr 22nd, 2019

How is it possible for Amazon workers to listen to sounds recorded by Alexa?




Smart speakers are already a reality in many homes. They sell by the millions each month, and the assistants that make up their respective "brains" are constantly updated with better responses and greater artificial-intelligence skills. Yet despite how useful they are, many people still have misgivings about installing a set of advanced microphones in their homes.


The potential for these devices to hear everything we say is always there, and the doubts are constant: who listens to what happens in my house? Where are the recordings stored? In Amazon's case, we already knew that part of the assistant's ongoing training involved manual labeling by real people to improve the service, but recent reporting has revealed more details.







When Alexa does not understand what we mean, the audio is flagged for possible review by an Amazon worker


When the Amazon assistant does not understand a command, the audio is tagged according to certain parameters for review. Some of these recordings are later listened to by an Amazon worker who tries to find out where the error was. This group of human reviewers, who work in offices around the world to cover a sufficient variety of languages, tries to correct the error after listening to the audio.

The most common cases are activation errors (when someone says something similar enough to "Alexa" that it wakes the microphones) and poorly phrased commands that the system could not interpret. This kind of labeling allows the assistant to learn to interpret similar audio better in the future. That is the theory.
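In rough terms, that review loop can be pictured as a queue of low-confidence interpretations waiting for a human label. The sketch below is purely illustrative: the class names, fields and the 0.6 confidence threshold are assumptions made for this example, not Amazon's actual pipeline.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative only: names, fields and the 0.6 threshold are assumptions
# for this sketch, not Amazon's actual review system.

@dataclass
class Utterance:
    audio_id: str
    transcript: str                      # what the speech recognizer thinks it heard
    confidence: float                    # how sure the system is of its interpretation
    human_label: Optional[str] = None    # filled in later by a reviewer

@dataclass
class ReviewQueue:
    pending: List[Utterance] = field(default_factory=list)

    def maybe_flag(self, utt: Utterance, threshold: float = 0.6) -> None:
        # Low-confidence interpretations are queued for human review.
        if utt.confidence < threshold:
            self.pending.append(utt)

    def annotate(self, audio_id: str, corrected_label: str) -> None:
        # A reviewer listens to the clip and writes down what was actually meant;
        # the corrected label is what later feeds back into training.
        for utt in self.pending:
            if utt.audio_id == audio_id:
                utt.human_label = corrected_label

queue = ReviewQueue()
queue.maybe_flag(Utterance("clip-001", "play some jass", confidence=0.42))
queue.annotate("clip-001", "play some jazz")
```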



Amazon workers cannot do anything if they hear what they believe is a crime; they can only wait for the police or a judge to request the files


In practice, things get more complicated. According to the multiple sources who discussed the issue with Bloomberg, their daily work allows them to share recordings with other workers through the internal chat system, where they can discuss what is happening in the audio. They do this "when they find a complicated or entertaining recording", and sometimes a disconcerting or uncomfortable one. Two of them said they believe they have heard recordings of a non-consensual sexual act.





If they come across a recording of what may be a crime, their job, according to the sources describing Amazon's policy, is to do nothing and not to notify the authorities. The data remains stored on Amazon's servers until the user decides to delete it. In the meantime, judges and police can request access to it if there is an investigation.



Amazon insists that employees cannot identify the source of the recordings


Amazon Spain explains that its employees "do not have direct access to information that can identify the person or the customer's account as part of this workflow. While all information is treated as highly confidential and we use multi-factor authentication to restrict access, encryption of communications within the service, and audits of our control environment to protect it, customers can always delete their utterances at any time."

Amazon's Alexa is the leading system both worldwide and in Spain. Its operation is simple: you just say the wake word ("Alexa") followed by a command or question. As soon as the microphones identify the wake word or invocation, the device starts recording and sending the audio to Amazon's servers, which analyze it and respond. Other companies with similar services, such as Apple's Siri or Google's Assistant, use virtually identical mechanisms. Most of the time it works, but when it does not, all the pieces of this complex puzzle come into play. Is it worth putting these microphones in your house?
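As a rough illustration of that "listen locally, upload after the wake word" behavior, here is a minimal sketch in Python. The helper functions simulate a microphone stream and a cloud upload; they are assumptions made for the example, not Amazon's real APIs.

```python
# Minimal sketch of the wake-word loop described above. The helpers below
# simulate a microphone and a cloud service; they are illustrative assumptions,
# not Amazon's real interfaces.

WAKE_WORD = "alexa"

def capture_audio_chunk(stream):
    """Pretend microphone: yields short text 'chunks' instead of real audio."""
    return next(stream, None)

def contains_wake_word(chunk):
    # On a real device this check runs locally, on the audio signal itself;
    # nothing is sent anywhere at this stage.
    return WAKE_WORD in chunk.lower()

def send_to_cloud(chunks):
    # Stand-in for uploading the recorded request to the assistant's servers.
    request = " ".join(chunks)
    print(f"Uploading for interpretation: {request!r}")

def run(stream):
    buffered = []
    recording = False
    while (chunk := capture_audio_chunk(stream)) is not None:
        if not recording:
            if contains_wake_word(chunk):
                recording = True          # wake word heard: start capturing
        else:
            if chunk == "<silence>":      # end of the request
                send_to_cloud(buffered)
                buffered, recording = [], False
            else:
                buffered.append(chunk)

# Simulated household audio: only what follows the wake word is uploaded.
run(iter(["background chatter", "alexa", "what's the", "weather", "<silence>"]))
```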






