YouTube moves to make 'conspiracy' videos less visible | Innovation

It is not difficult to find videos on YouTube claiming that the Earth is flat, that the Moon landing was faked, or that immigrants are taking all the jobs. Conspiracy theories, whatever their intentions, are not hard to find on the ubiquitous Google-owned platform. The American giant, however, is responding to criticism and has taken a step forward: it will stop placing such videos in recommended slots (200 million a day, no small number).

The company has published a post on its official blog explaining its intention to modify its recommendations. YouTube said it would no longer suggest videos with borderline content or those that "misinform users in harmful ways", even if the footage does not violate its community guidelines. In other words, YouTube will not delete the videos: it will simply stop recommending them. The decision whether to click (and whether to believe everything that reaches your screen or arrives on WhatsApp) remains, remember, yours.

YouTube, however, has explained that the number of videos affected by the policy change is less than 1% of all content on the platform. To carry out this change, YouTube will rely on both machine learning and "real people". Human evaluators drawn from across the US, the company said, will watch a range of YouTube videos and provide feedback on their quality. Those judgments will help inform what the algorithm flags.

This will be a gradual change that, for now, will only affect the US. "Over time, as our systems become more accurate, we will begin to roll out these changes to other countries."

YouTube and other powerful technology platforms have faced growing criticism for failing to monitor the content users post. Its recommendation engine has been accused of pushing users toward dubious content, even people who show no particular predisposition to consume it. It has also been blamed for widening the political divide in the country, pulling viewers and partisans toward more extreme views, as The New York Times has explained.

At the end of 2017, YouTube began labeling certain videos as "controversial religious or supremacist content", so that those videos are not monetized with advertising. It also disabled features such as comments and likes on them. Some videos appear behind a brief message warning that they may be inappropriate or offensive.

YouTube provided only three examples of the types of videos it will stop recommending: those that promote a phony miracle cure for a serious illness, those that claim the Earth is flat, and those that make blatantly false claims about historical events such as the September 11 attacks.

"We believe that this change will help us achieve a balanced service between maintaining a platform for freedom of expression and fulfill our responsibility to users," says YouTube in his post.

YouTube said it is constantly adjusting its recommendation system, and that last year alone it made "hundreds of changes". In its early years, it explains, it suggested the videos it thought would draw the most clicks or views. Over time, however, it found that video creators began gaming the system with clickbait titles. Now the company's aim is for users to consider the time they spend on the platform "well spent", and for the next recommendation not to be simply another video too similar to the one just watched.
