Last December, the Sorbonne in Paris organized a video conference with Edward Snowden. The former agent of the United States National Security Agency, and author of one of the greatest leaks in history, now lives in Russia. The event was a resounding success. Snowden spoke about privacy: "A careless remark, a youthful indiscretion or a mistake that gets forgotten: these are things that have ceased to exist," he said. They no longer exist because everything is recorded.
Facebook is one of the most visible companies that make a living from storing and analyzing all that information, but there are hundreds of players in the data collection and trading business. In 2018 Facebook was caught up in major scandals related to the privacy of its users: Cambridge Analytica; the use of the social network by Russian agents to influence US politics; the breach of data from nearly 30 million accounts.
This week, Apple decided that Facebook may no longer distribute apps on its platform for internal testing before releasing them to the market. The reason: Zuckerberg's company had used this channel to monitor young people who were paid in exchange for access to all their online activity. In spite of everything, Facebook has just announced more than 1.5 billion daily users in December (9% more than the previous year) and record profits in the last quarter.
The youngest users are the ones who use Facebook the least and are the most aware of the loss of privacy, but they are on Instagram and WhatsApp, both owned by Zuckerberg. It does not seem that the new generation will abandon the internet to preserve its privacy. The auditorium for the Snowden talk at the Sorbonne was full of members of Generation Z, the ones who come after the millennials. They assume that online privacy is impossible: they know that everything on their phones or in their browsers may one day become public.
The phone is too valuable
Snowden is an idol because he uncovered what a generation takes as inevitable: everything is recorded, and our online actions can be used to sink our reputation, manipulate our opinions or predict our behavior. But even if we sense it, we do not abandon our phones or the internet. They are too valuable: they are not up for negotiation.
What can we do? There are at least three options. One: legislate harder against the big companies whose business is collecting and selling data. Two: reduce phone use to a minimum and try to leave as small a trace as possible using open source tools. Three: resign ourselves to the spying and trust that no one will come after us because we are unimportant.
It is difficult to grasp the level of detail in the information about our lives that circulates on the market
No option is truly viable on its own. As with all big problems, the solution is complex. Legislation can help, but first we have to understand the challenge. Today our data is both an opportunity and a tragedy. It is difficult to grasp the level of detail in the information about our lives that circulates on the market.
Take Ovia, for example: an app that offers insurers and large companies a paid version with information on how many of their employees are trying to conceive. At the same time, analyzing thousands of X-rays can help detect a serious illness early. It is hard to craft legislation that permits the good uses without letting the bad ones slip through.
Legislative finesse on issues that are hard to understand and constantly changing is unlikely, but that is what must be demanded. A first step would be more transparency: companies should disclose what data they use and what for, without burying users in terms of service. In recent days, another of the scandals affecting Facebook ran along these lines: an initiative by several organizations that let users see exactly why they were shown certain ads on the social network, and what information about them was at advertisers' disposal, has been neutralized.
A multimillion-dollar business
Without strong social pressure it is difficult for anything to change. The Interactive Advertising Bureau estimates that in 2018 American companies spent "more than $19 billion on purchasing audience data and on solutions to target, process and analyze that data, 17.5% more than the previous year." This growth encourages more companies to enter a sector enjoying its golden age of lax oversight and anything-goes.
We have not yet seen the most serious consequences this may have. Some, like Yuval Noah Harari, suggest that data and algorithms will inevitably restrict our free will. The reality is more disconcerting: we have no idea of the immense stream of analyzable data we leave behind as we navigate the internet, nor of the final uses it may be put to.
Some even say: "It's good that they know what I'm like; that way they send me personalized discounts"
Until now, the indifference this provoked among ordinary people was justified with two excuses. One: who is bothered by a few personalized ads? And two: who cares so much about my little life as to want to see the messages I send my husband, or what I do every Friday night? I have nothing to hide. Some even say: "It's good that they know what I'm like; that way they send me personalized discounts."
But between personalized ads and the smallness of our lives there is a lot of gray area. It can be annoying that merely visiting a website gets us an e-mail asking why we have not bought anything. Or that the same thing happens when we leave a physical store. It can also happen that the way we touch our phone screen reveals that on Thursday, from 9:00 pm to 11:00 pm, we were drunk. That line will never be crossed, some say. Are we sure? New York is the first state that will allow health insurers to use their clients' social networks to charge them more according to their behavior. Companies must demonstrate that their algorithms do not discriminate unfairly.
And what about 2025?
Can anyone guarantee that in 2025 Google or Facebook will still be solvent US companies, with management that answers to public opinion? Who can assure us that the freest era the West has known, after almost half a century of peace, will not see its future threatened by unthinkable changes of government?
In 1942, René Carmille was a comptroller in the French Army. Under the Vichy regime, he worked on the census. The collaborationist government wanted a new census to find out how many Jews there were in France. Processing was done back then with punch cards. Carmille's team sabotaged column 11 (the one recording religion) so that the machines could not read it. Thousands of people suddenly appeared to have no religion. Carmille was discovered and died in the Dachau camp. France was one of the European countries that managed to save the largest share of its Jewish population. Carmille had something to do with that.
This story does not mean history will repeat itself. Nor that new computing heroes will be needed to erase our records (and what will count as wrongdoing then?). But it does show that a combination of details that seem harmless today can be explosive tomorrow. Today, to learn that a citizen is something, likes something or believes in something, there is no need to consult the census. It is all in their digital life. How can we avoid such data being processed? It is hard. That is why we must understand the consequences. That way, at least, we will know the size of the disaster.