November 30, 2020

AI can simplify even the most unreadable privacy policies | Innovation

You need more than free time to read PayPal's terms and conditions. The agreement we sign when we join the platform is twice as long as Hamlet and three times as long as Lazarillo de Tormes, and it is not exactly light reading. The same goes for virtually every technology multinational that offers its services online. Who has thoroughly read the privacy policies of Facebook, YouTube, Spotify or Tinder?

Most of us have not read them thoroughly, or even skimmed them. In 2016, two researchers from the universities of York and Connecticut created a fake social network whose terms and conditions stated that the personal information collected would be handed over to the US National Security Agency (NSA) and required users to give up their firstborn child. More than 500 university students signed up. 98% of them accepted the privacy policy without reading it; those who did open it spent barely a minute on it before accepting.

"Terms and conditions are usually very long texts that contain complex legal clauses, in many cases ambiguous, as well as technical terminology inaccessible to most people," says data protection expert Isidro Gómez-Juárez. "Platforms must improve the quality of the information they give users in terms of transparency, reconciling the disclosure requirements of the applicable regulations with conciseness and clarity in how it is presented, so that it is accessible and fulfils what should be its ultimate purpose: informing the user."

  • An initiative made in Spain

Against the impunity with which many companies define the rules of the game, initiatives have emerged that seek to help users understand what they are signing. The latest to appear is called Guard, an artificial intelligence system that reads the privacy policies of different platforms and offers users an analysis of the main risks they face when using them.

"About a year ago, I discovered that Facebook kept an up-to-date copy of all your mobile contacts," recalls Javier Ramírez, the Madrid engineer behind the platform. "There are strangers in Silicon Valley who know whether you have saved your friends under their nicknames. It's outrageous. And it turns out this is stated explicitly in its privacy policy."

Strangers in Silicon Valley know how you have saved your contacts on your phone; it is explicit in Facebook's privacy policy

Javier Ramírez, the Madrid engineer responsible for Guard

A few months later, Ramírez took an artificial intelligence bootcamp (an intensive training course) and thought it would be interesting to apply the technology to simplifying terms and conditions for users.

To make this possible, a series of phrases had to be categorized according to the risk they pose. The phrase "We can sell your data to third parties" receives a different score from "We will not share your account contact information, such as email." Machine learning takes care of the rest.

"Guard is trained to recognize things it has not seen before," says Ramírez. "We are not talking about heuristic rules, where the system understands that if 'A' occurs, it must do 'B'. Rather, we teach it what kinds of things are positive and which are negative, and it judges which a new phrase most resembles."
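The idea Ramírez describes can be illustrated with a minimal sketch: label a handful of clauses as risky or safe, build a word profile for each label, and score new clauses by resemblance. This is an invented toy classifier for illustration only, not Guard's actual model or taxonomy; the training clauses and labels are hypothetical.

```python
from collections import Counter

# Hypothetical training clauses with risk labels ("risky" / "safe");
# these examples and labels are illustrative, not Guard's real data.
TRAINING = [
    ("we can sell your data to third parties", "risky"),
    ("we may share your location with advertisers", "risky"),
    ("your profile is visible to anyone on the internet", "risky"),
    ("we will not share your account contact information", "safe"),
    ("you can delete your data at any time", "safe"),
    ("we encrypt your messages end to end", "safe"),
]

def tokenize(text):
    return text.lower().split()

# Build one word-frequency profile ("centroid") per label.
centroids = {}
for clause, label in TRAINING:
    centroids.setdefault(label, Counter()).update(tokenize(clause))

def classify(clause):
    """Pick the label whose word profile overlaps most with the clause."""
    words = tokenize(clause)
    scores = {
        label: sum(counts[w] for w in words)
        for label, counts in centroids.items()
    }
    return max(scores, key=scores.get)

print(classify("we may sell aggregated data to advertisers"))  # → risky
print(classify("we will not share your email"))                # → safe
```

Even this crude overlap measure labels unseen clauses sensibly, which is the point Ramírez makes: the system generalizes from examples rather than matching hand-written "if A then B" rules.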

The platform has been made possible by the altruistic contribution of some 1,900 people, mostly Americans, who have provided more than 110,000 data points with which to train Ramírez's artificial intelligence. Several computer science PhDs work to validate that the process is carried out with guarantees.

  • A picture is worth tens of thousands of words

With Guard we can filter out the biggest threats in the terms and conditions before reading them, but there are also tools that present this information in another format. Polisis is a tool that uses machine learning algorithms to visualize privacy policies.

Polisis's analysis of Facebook's use of our personal data, as stipulated in its terms and conditions.

To train the artificial intelligence system to distinguish the different sections into which privacy policies are divided, 130,000 such documents from apps available on Google Play were used. The system was then trained on a further dataset so it could focus on details such as the types of data companies use and how they share them with third parties. Once the terms and conditions have been analyzed, Polisis shows the results through a series of diagrams representing the flow of information.
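The two-stage analysis described above can be sketched as follows: first assign each policy segment a high-level category, then extract finer details such as the data types it mentions, producing the raw material for a flow diagram. The category names, keyword cues and sample policy here are invented for the sketch; Polisis uses trained neural models, not keyword matching.

```python
import re

# Hypothetical category cues and data types, in the spirit of Polisis's
# two-stage labeling; these lists are illustrative, not Polisis's models.
CATEGORIES = {
    "first_party_collection": ["we collect", "we store"],
    "third_party_sharing": ["we share", "third parties", "partners"],
    "user_choice": ["opt out", "you can delete"],
}
DATA_TYPES = ["location", "contacts", "email", "browsing history"]

def label_segment(segment):
    """Stage 1: assign a segment its high-level category."""
    text = segment.lower()
    for category, cues in CATEGORIES.items():
        if any(cue in text for cue in cues):
            return category
    return "other"

def analyze(policy):
    """Stage 2: per segment, pair the category with the data types named."""
    flows = []
    for segment in re.split(r"\n\s*\n", policy.strip()):
        mentioned = [d for d in DATA_TYPES if d in segment.lower()]
        flows.append((label_segment(segment), mentioned))
    return flows

policy = """We collect your email and location to provide the service.

We share your browsing history with advertising partners.

You can delete your account at any time."""

for category, data in analyze(policy):
    print(category, data)
```

The resulting (category, data types) pairs are exactly the kind of structure a Sankey-style diagram of information flows can be drawn from.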

"Until now, privacy policies have played the role of being legally binding on the one hand and of serving as the interface for users on the other," its creator, Hamza Harkous, told Fast Company. "I hope that at some point we will reach a point where the legal text of the privacy policy is no longer the default interface for privacy information on websites."

