Why do we only care about Pegasus if we are constantly watched?
Pegasus is not the only application capable of spying on us. In our daily lives we are constantly monitored, and we grant our permission entirely voluntarily.

Martha Beltran, University Professor, Rey Juan Carlos University; Carlos Alberto Villarán Núñez, Collaborator and researcher in Computer Security and Privacy, Rey Juan Carlos University

In recent months, a curious phenomenon has occurred: the constant news about the hybrid threat posed by some governments, or about the Pegasus tool, has led many citizens to distrust technology, to wonder whether they are running unknown risks, and to seek information about how to protect their privacy in an environment that is proving so hostile.

However, we continue to observe how, in their day to day, these same citizens keep making decisions that are far from prudent from the point of view of their privacy, the protection of their data and, in short, the guarantee of their rights and freedoms.

The privacy paradox

This is what is called the privacy paradox. Although most technology users claim to be concerned about their privacy, their daily decisions are not consistent with this concern. In many cases, these decisions too easily place large amounts of sensitive data in the hands of third parties, such as technology companies, telecommunications operators, the media, banks, data brokers or public administrations.

The first explanation given for this lack of coherence was ignorance, that is, a lack of sensitization and awareness. Citizens were concerned about their privacy, and protecting it was a priority for them, but they could not make decisions consistent with this concern if they did not understand the risks they ran when installing an application, registering on a platform or granting a certain consent.

Over time, the media, consumer organizations, user associations, public figures, academics and others have fostered an improvement in this sensitization and awareness through campaigns, publications and other activities. There is now greater awareness of data-driven business models, the economics of surveillance, and the risks we take when making certain decisions. Why, then, hasn't the attitude of most technology users changed?

It is at this point that new explanations appear, drawn from anthropology, sociology and psychology, to mention just a few examples. Mainly, it seems that the average citizen is pragmatic and performs what is called a privacy calculus: a rational decision that weighs the risks taken against the benefits obtained.

Except for users who are very protective of their privacy, this calculus tends to tip the balance towards the application that everyone has already installed, the platform with so many interesting features, or the consent that opens the doors to that supposedly free service they have been so eager to use.

Emotions matter a lot

All the agents that participate in the surveillance economy are aware of these contradictions to which users are subjected. And they know how to exploit them. The aim is for users to believe that they are really in control and that they voluntarily decide to give up their personal data in exchange for some short-term advantage.

The surrender is meant to seem unimportant, so it is often camouflaged behind (dubious) informed consents and endless privacy policies and terms of use that no one reads because they are deliberately long, convoluted and hard to understand. But if you have not read them, that is your responsibility: you are impatient, perhaps careless. Or perhaps you have nothing to hide, and you also trust the provider you are transacting with. What illegitimate interest could they have in your data? (Note the irony.)

At the same time, the benefit must appear very substantial compared to the perceived risk. I would be a fool, a paranoiac or a conspiracy theorist to give up all those advantages offered by the big technology company, the newspaper, the bank, the mobile operator or the streaming platform just to avoid clicking a simple button. What I give in exchange, I do not see; what they later do with it, neither. In some cases I suspect it; in others, not even that.

As if all this were not enough, the interfaces, the pages, the tools: everything is designed to give us that last little push. This is where so-called dark patterns come into play: designs that present some options pre-selected by default, that make it so easy to register and so complicated to unsubscribe, or that do not reveal until the last moment the obligation to provide a credit card number, to name just a few examples. Just when we could almost taste the candy...

What can we do?

Are we resigned to surveillance, then? Do we give in to the siege? For a time, some advocated giving citizens the choice between paying for technology with their data or with money, as is done with traditional products. But this carries a danger: that privacy becomes a luxury, a right available only to those who can afford it.

So work is currently proceeding on two fronts. The first involves changing business models, so that all companies that obtain direct or indirect benefits from trading in our data improve their transparency and have incentives to migrate to new, ethical models that respect the rights and freedoms of users.

In the meantime, citizens must be empowered and placed at the center of their own decisions. For this, they must be trained as digital citizens, aware of their rights and of all the impacts their decisions may have. And they must be provided with tools that allow them to easily make truly informed decisions: graphic labels and trust seals, privacy-friendly configuration automation mechanisms, non-compliance monitoring solutions, access and transparency panels, consent receipts, and recommendation and decision-support systems (our latest publication represents progress in this direction).

Only when these tools become widespread and available to all citizens will those who profit from their data feel compelled to behave differently.

This article has been published in 'The Conversation'.
