July 11, 2020

YouTube investigates how to improve recommendations and increase connection time | Technology

"How much could our last moments together have improved if it weren't for the delusions induced by YouTube?" Last September, the son of an 80-year-old retired scientist recounted how his father immersed himself in a toxic spiral of videos during his final years. He did so through a campaign by the Mozilla Foundation, created to raise awareness of the problems caused by the popular video platform's recommendation algorithm. Now, several researchers at Google, the company that owns the platform, have proposed a modification of this algorithm that they say improves recommendations and increases the time users stay connected.

Artificial intelligence controls much of the information consumed on the Internet today. The algorithms created by the different platforms "observe users' activity, infer things that may interest them and propose them," explains Pablo Castells, professor at the Higher Polytechnic School of the Autonomous University of Madrid. "There are many ways to do it, from the most trivial, such as simply offering the most popular content, to more complex approaches that involve looking at the behavior of each individual user."

In YouTube's case, the platform first builds a list of several hundred candidate videos related to the one the user is watching, and then refines that list taking into account the user's clicks, tastes and other interactions. The result is that of the billion hours watched every day on the platform, 70% corresponds to videos recommended by the algorithm.
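The two-stage pipeline described above can be sketched in a few lines of code. Everything here is an illustrative toy, not YouTube's actual system: the field names, the topic-matching retrieval and the scoring weights are all assumptions chosen to make the idea concrete.

```python
# Stage 1: coarse candidate generation (a few hundred related videos).
# Stage 2: re-ranking those candidates with the user's interaction history.

def generate_candidates(current_video, catalog, limit=200):
    """Retrieve videos related to the one being watched (toy: same topic)."""
    related = [v for v in catalog if v["topic"] == current_video["topic"]]
    return related[:limit]

def rank(candidates, user_history):
    """Reorder candidates using the user's clicks and likes (toy weights)."""
    def score(video):
        stats = user_history.get(video["id"], {})
        return 2.0 * stats.get("likes", 0) + 1.0 * stats.get("clicks", 0)
    return sorted(candidates, key=score, reverse=True)

catalog = [
    {"id": "a", "topic": "science"},
    {"id": "b", "topic": "science"},
    {"id": "c", "topic": "music"},
]
history = {"b": {"clicks": 3, "likes": 1}}   # user has interacted with "b"
current = {"id": "x", "topic": "science"}

recs = rank(generate_candidates(current, catalog), history)
print([v["id"] for v in recs])  # → ['b', 'a']
```

In a real system, stage 1 would use learned embeddings rather than topic labels, and stage 2 would be a trained model rather than a hand-written score, but the division of labor is the same: cheap broad retrieval, then expensive personalized ranking.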

The different platforms work to improve this system, make it even more precise and keep users in front of the screen for a few more minutes, and this is what a team of YouTube researchers seems to have achieved, according to an article published in the ACM Digital Library. "We demonstrate that our proposals can lead to substantial improvements in the quality of recommendations," the study says.

To refine the recommendations, the researchers gave more weight to clicks on videos near the bottom of the list, on the understanding that if the user clicked on one of those videos, it is because they spent some time looking for it. Thanks to this modification, the developers of the new algorithm claim to have achieved "substantial improvements in both engagement and satisfaction metrics."

"It's a smart way to address the problem," says Castells, "since we know that some areas of the screen are more exposed, so getting a click there has less merit than getting one on an element that is more hidden."
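The idea Castells describes is known in the recommendation literature as position-bias correction, often done with inverse-propensity weighting: a click is up-weighted in proportion to how unlikely the user was to even look at that slot. The sketch below is a hedged illustration of that general technique, not the formula from the YouTube paper; the exponential decay model for attention is an assumed choice.

```python
# Inverse-propensity weighting for position bias: clicks on items low
# in the list get a larger training weight, since reaching them took
# deliberate effort from the user.

def propensity(position, decay=0.9):
    """Estimated probability that the user examines this list position.
    Position 0 is the top slot; attention decays further down."""
    return decay ** position

def click_weight(position):
    """Weight for a click observed at this position: rarer exposure
    means the click carries more signal."""
    return 1.0 / propensity(position)

for pos in [0, 5, 20]:
    print(f"click at position {pos}: weight {click_weight(pos):.2f}")
# → click at position 0: weight 1.00
# → click at position 5: weight 1.69
# → click at position 20: weight 8.23
```

With this weighting, a ranking model trained on click logs no longer learns simply that "whatever sat at the top gets clicked"; a click twenty positions down counts roughly eight times as much as one at the top.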

An environment of hoaxes and echo chambers

However, this type of modification still does not solve one of the big problems these algorithms have. Because the system is optimized to keep users watching videos, it tends to offer recommendations that reinforce the user's tastes or beliefs, which can create an experience that excludes other opinions and fosters what are known as echo chambers.

Along these lines, a Google study on the impact of recommendation systems, published earlier this year, concluded that "feedback loops in recommendation systems can lead to 'echo chambers,' which can reduce a user's exposure to content and ultimately change their worldview."


Various studies conducted in recent years, including an experiment by journalists from EL PAÍS, have also shown that the algorithm tends to reward the most extreme and controversial videos, even when they are full of hoaxes. "Three years ago, my ex-wife, who suffers from mental health problems, began watching conspiracy-theory videos and believed them all. YouTube never stopped feeding her paranoia, fears and anxieties with videos, one after another," reads another of the testimonies compiled by the Mozilla Foundation.

"In the recommendation-algorithms community there is growing concern in this regard, and more and more efforts to promote responsible recommendation," says Castells. According to this specialist, we must bear in mind that "the goals of the user and of the companies are not necessarily aligned: the company needs the user to be happy, but in a way that is profitable, and that is achieved if the user spends more time connected." The problem, says this researcher, "is that the algorithm does not know when the user is happy and when they have entered a compulsive mode."

When Peppa Pig ate her father

The algorithm has also been questioned over its shortcomings in offering children's content. According to a study published this year on arXiv (a repository of scientific articles that are not peer reviewed), "there is a 45% chance that a young child who follows YouTube's recommendations will find an inappropriate video in less than 10 clicks."

The authors of this study maintain that the problem lies in the fact that certain adult videos use children's video content and the algorithm does not distinguish them. There are thousands of examples on the platform, from cartoons in which Mickey Mouse is hit by a car to others in which Peppa Pig appears eating her father.

The solution, according to Castells, "would involve not offering content based simply on the volume of response, but doing something more qualitative, identifying types of content." However, the researcher warns not only of the technical complexity of the problem, but also of the "ethical dilemma of deciding which content is marked as inappropriate."

The problems generated by these algorithms have led Mozilla, a non-profit organization dedicated to free software, to create a campaign to raise awareness about them. Thanks to this initiative, hundreds of testimonies have been collected from people affected by these recommendations, or who have watched a loved one plunge into YouTube's toxic spiral. "It is sad and frustrating to see a loved one sink deeper and deeper into this kind of dark, negative and harmful influence," one of them laments.

