October 28, 2020

The European privacy supervisor on alert over Mercadona’s facial recognition system



The proliferation of facial recognition systems has put European privacy authorities on alert, and they are examining the legal status of these systems as they are deployed in more and more spaces. Wojciech Wiewiórowski, the European Data Protection Supervisor, said on Wednesday that one of the cases he is following most closely is that of Mercadona, which since last July has been testing a system in 40 of its supermarkets that analyzes the faces of all its customers to check whether any of them have been barred by the courts from entering its establishments or are subject to a restraining order protecting its employees.

Wiewiórowski pointed out that, to achieve that goal, the chain analyzes every person who enters its stores. “The system processes the faces of all customers to detect the few individuals who are barred by court order. The first question that comes to mind should not be whether the system is accurate enough, but whether it is necessary and proportionate to reach its goal,” he said in a speech on the status of biometric identification systems in the European Union.

Shortly after Mercadona’s announcement, the Spanish Data Protection Agency (AEPD) opened an ex officio investigation to establish the details of the system implemented by the company. That investigation has not yet concluded. “I’m looking forward to their findings,” Wiewiórowski said. Mercadona declined to comment on the European supervisor’s remarks.

The supermarkets in which Juan Roig’s company has installed this identification technology are located in Valencia, Mallorca and Zaragoza. At the time of the announcement, Mercadona explained that the system recognizes “physical features” and detects “only and exclusively” the entry of people subject to a final judgment and an active restraining order. “We have the picture [which serves as the basis for identification] because it is part of the evidence we submitted at trial to prove the facts,” a company spokesperson told elDiario.es.

Mercadona also stated that it developed the system “in constant contact with the corresponding authorities to guarantee full protection and all its legal guarantees.” However, the AEPD opened its official investigation days later and noted that Mercadona had provided only preliminary information, in the last quarter of 2019. “Even then, the AEPD expressed various reservations about the system’s compliance with data protection regulations,” Agency sources said.

“We should not use even a perfect system if it is not proportionate”

In his speech, the head of the European privacy regulatory body used the Mercadona case as an example in which “the need” to establish a biometric identification system “is difficult to determine.” He noted that even if an identification technology were highly accurate (something facial recognition has not yet achieved), “we should also ask ourselves about its necessity and proportionality. We should not use a perfect system that we do not need, or one that processes biometric data disproportionately,” Wiewiórowski said.

In this regard, he recalled that this type of information can be used for many purposes other than those for which it was originally collected, whether through misuse by the company holding it or as a result of a security breach. “The same facial images that allow users to be identified can also be used to infer data about their health,” he explained, while “the voice data processed by intelligent assistants to respond to users’ commands makes it possible to infer their emotions.”

In June, the AEPD published a list of 14 points on which biometric identification often gives rise to “misunderstandings.” These included its precision (“there is a certain rate of false positives and false negatives”); the assumption that it is suitable in all cases (“some people cannot use certain types of biometrics because their physical characteristics are not recognized by the system”); and its privacy relative to other identification methods (“depending on the biometric data collected, data about the subject can be derived, such as their race or gender, their emotional state, illnesses, disabilities and genetic characteristics, substance use, etc.”).



