Paloma Llaneza (Madrid, 1965) was one of Facebook's first users. But this lawyer, an expert in data protection, deleted her account once she saw how the social network worked and what information it collected. The day Zuckerberg's company bought WhatsApp, she also uninstalled the instant messaging service from her phone. And she sent a message to all her contacts: "Facebook has bought WhatsApp, and what it has really bought is its users. It has bought it for your data. I'm leaving."
"Data is valuable because it says a lot about us: we are potential voters, potential buyers and potential users of transportation, health, education and credit services. The world revolves around our needs. The better I know you, the better I can sell you what I think you need, even if you don't think you need it yet, and deny you what you ask for," she explains in an interview with EL PAÍS. Llaneza, who is also a systems auditor and cybersecurity consultant, has just published Datanomics, a book that explains what technology companies do with our personal data.
The day that WhatsApp talks about everything we talk about, the world ends
The cost of having applications such as Facebook, WhatsApp or Instagram installed "is very high": "The day WhatsApp tells everything we talk about, the world ends." The technology giants get to know users better than a parent, a partner or even the users themselves. "What reveals most about a human being is what he hides from himself: his emotional side. Social networks let them know your mood in real time, whether you are lovesick or looking for medication for your nerves, whether you have depression, whether you drink too much alcohol, whether you go out a lot, or whether the music you listen to indicates a tendency toward suicide or a passing melancholy that is part of your character," says the lawyer.
Facebook analyzed data from more than six million Australian and New Zealand teenagers to determine their mood and provide advertisers with information about when they felt most vulnerable, according to an internal company document leaked in 2017 by the newspaper The Australian. By knowing what a person is like and how they feel at every moment, companies can sell them, at just the right time, "anything they emotionally need": "An idea, a thought, a political party, a way of life or even a feeling of national superiority." "What is worrying is that it has worked very well in Brexit, in Trump's election and in some recent elections in Spain," says Llaneza.
How to avoid the collection of our data
To prevent technology companies from collecting data about us, Llaneza says the only solution is to delete these kinds of applications: "There is no middle-ground advice; it makes no difference whether you share more or fewer posts." "The privacy settings Facebook offers are aimed at third parties, but Facebook sees everything and saves even your second thoughts, even that email you were going to send insulting someone and then decided not to. Because what you regret says much more about you than what you actually send," she explains. Making less intensive use of these applications would not help either, according to Llaneza, since they "have many permissions to access your phone": "Even from the way you move the phone and the way you type, they have a biometric fingerprint that identifies you with a high degree of probability."
Facebook sees everything and keeps even your regrets, which say much more about you than what you send
Devices and applications are designed to be "usable, cool and highly addictive." The problem is that users' perception of risk "is very low": "Nobody is aware of the brutal amount of information a mobile phone gives away even without being touched." "Having a cell phone or an Alexa on the table in your house seems perfectly normal, and yet you would not have a man sitting in your living room every day watching how you talk or what you snack on. It is much more dangerous to have Alexa on the table than that man, who has a human memory and will forget half of what he hears," Llaneza concludes.
Companies extract profit from their users' data "by selling targeted advertising and by generating other businesses around that information." While Europe has "more or less strict" regulation, in the United States "whether you receive a more or less severe penalty, gain access to certain universities, or are denied credit, insurance or a medical service will depend on the data held about you." Why, despite never having missed a payment on a debt, can you be denied a loan? "Because the new systems are predictive: they do not analyze the past, they read the future," says Llaneza. If a model predicts, for example, that someone has a high likelihood of divorcing and that their economic capacity will then fall, they may not be granted a mortgage.
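The mortgage example can be sketched as a toy scoring rule. Everything below (feature names, weights, the approval threshold) is invented for illustration; it does not reflect any real lender's model, only the idea that a predictive system can outweigh a spotless repayment history with forecasts about the future:

```python
# Toy sketch of a "predictive" credit model (hypothetical, illustrative only).
# It scores an applicant on forecast future risk rather than past behavior.

def predictive_credit_score(applicant):
    """Return a risk score in [0, 1]; higher means riskier.

    All features and weights are invented for illustration: a spotless
    history ("defaults": 0) can still be outweighed by predicted life events.
    """
    score = 0.0
    score += 0.5 * applicant.get("predicted_divorce_prob", 0.0)
    score += 0.3 * applicant.get("predicted_income_drop_prob", 0.0)
    score += 0.2 * min(applicant.get("defaults", 0), 1)
    return score

# An applicant who has never missed a payment, but with unfavorable forecasts:
applicant = {
    "defaults": 0,
    "predicted_divorce_prob": 0.8,
    "predicted_income_drop_prob": 0.6,
}

score = predictive_credit_score(applicant)
denied = score > 0.5  # hypothetical approval threshold
print(f"risk score {score:.2f}, denied: {denied}")
```

Despite zero past defaults, the predicted-future terms alone push the score over the threshold, which is exactly the situation Llaneza describes: the system reads the future, not the past.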
The use of these systems carries a risk, since the data with which the algorithms are trained are conditioned by our knowledge and our prejudices. In addition, the machines sometimes end up being a black box, making it impossible to understand what path the model followed to reach a given conclusion: "One of the great questions before us is algorithmic transparency. You have made a decision: why and how?" "Ownership of data is already regulated. What we now have to regulate is control over the results of processing that data," says the lawyer.