Social networks isolate users and accentuate their prejudices. Although it sounds paradoxical given the name attributed to these platforms, the algorithms that select content matching the affinities a user displays on the network, combined with the drive to keep users hooked in order to maximize advertising revenue, create a phenomenon called the "filter bubble": a space into which only notifications with certain characteristics enter, while contradictory content that could provide different points of view for making decisions and forming more reasoned opinions is kept out. A study of this effect by Gema Valencia, based on sexist, pro-equality and neutral profiles, has been recognized by Pablo de Olavide University in its Gender Studies awards.
The filter bubble began when Google changed its search system in 2009 to offer personalized results, and other companies incorporated similar mechanisms. "We are only shown what we want to see or, rather, what an algorithm thinks interests us. It is a passive process in which we do not know how this filtering or editing of content is carried out, so we do not know what part of reality we are missing," the study notes.
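As a minimal, purely illustrative sketch (not any platform's actual algorithm), this kind of personalization can be thought of as scoring each item by its overlap with what the user has already liked and discarding everything below a threshold; the names, tags and threshold below are all hypothetical:

```python
# Illustrative sketch of affinity-based filtering. Real recommenders are
# far more complex, but the bubble effect arises in the same way: content
# dissimilar to past likes never reaches the user.

def affinity(item_tags, liked_tags):
    """Score an item by the fraction of its tags the user has liked before."""
    return len(set(item_tags) & set(liked_tags)) / len(item_tags)

def personalize(feed, liked_tags, threshold=0.5):
    """Keep only items similar enough to the user's past likes."""
    return [title for title, tags in feed
            if affinity(tags, liked_tags) >= threshold]

feed = [
    ("sports story",   ["sports", "football"]),
    ("equality op-ed", ["feminism", "politics"]),
    ("transfer rumour", ["sports", "gossip"]),
]
liked = ["sports"]

# The dissenting viewpoint is filtered out before the user ever sees it.
print(personalize(feed, liked))  # → ['sports story', 'transfer rumour']
```

The filtering is passive in exactly the sense the study describes: the user never sees which items were dropped, only the ones that survived.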
In reinforcing positions such as machismo, according to the study, the effect is even more harmful. "If people with views opposed to equality have less access to feminist content and are exposed to information that reinforces their views, they can become trapped in a loop that leads to sexist polarization," warns Valencia.
Keywords that disappear
The researcher put the network to the test by creating profiles with deliberately sexist, feminist and neutral characteristics. One of the conclusions of her study reveals a worrying fact: "Content referring to harassment, abuse, sexual assault and sexist murders disappears for some users from one stage to the next. This is an indication of how the recommendation system could end up reinforcing certain positions over others by omitting news so crucial to the struggle for equality and democracy. If sexist violence disappears from users' news feeds, the distortion of reality and the perception of inequality could be affected," Valencia points out in the conclusions of her study.
Another result shows that Facebook, in most cases, does not even display 50% of the total publications posted by the pages for which the user has expressed a preference. "It is important to consider this filtering given that more and more people get their news through social networks, with Facebook at the head (...). The fact that news is lost hinders the development of a collective project in society and, consequently, the social consensus necessary for the fight against gender inequalities," concludes Valencia.
"Reality is out there, but it does not get through. If we do not leave the bubble, our beliefs are reinforced," says the researcher, who is now extending the work for her doctoral thesis.
Stella Luna de María, general director of the consultancy PentaQuark, who was not involved in the study, agrees. "The filter bubble is real; it is a vicious circle created by companies' intent to keep you hooked, to make you spend more time on the networks, built from the user's searches, profile, browsing pattern and likes. They offer what you like: more of the same," she explains.
"The social network does not want balance, not out of ideological interest but commercial. The result, however, is that it delivers information in a way that makes it easier to spread fake news and manipulation. The user is trapped and can lose their critical capacity," warns this big data expert.
Both Valencia and De Luna agree on the solution: an active attitude among users, and education. Escaping means actively searching for information to "break the vicious circle" and letting information that contradicts our prejudices into the bubble. "You have to change your habits on the networks and be critical of sources. If only instant gratification is pursued and only one version of events arrives, we are more easily manipulated," says De Luna. This is what Valencia calls "media literacy", to which she adds the need to be "responsible" for what we spread or endorse on the networks.
The bias of the information we receive through the networks is more worrying if we take into account that 48% of the population with Internet access in Spain uses Facebook to read, share or comment on news, according to the survey Digital News Report 2018, carried out by the University of Navarra in collaboration with the University of Oxford. To this network is added the growing influence of Instagram (the social network with more growth), Whatsapp or Twitter, where the same effect of the filter bubble occurs.