Francisco Castañares, a 58-year-old from Cáceres, opened a Facebook account in 2016 to "be where everything happens". Last Thursday, when he logged in to the social network, a notice popped up: "You will not be able to publish or comment for 24 hours." Zuckerberg's company had decided that his latest post violated the platform's "community standards" on nudity and sexual activity. Francisco had shared an EL PAÍS story about the breast ironing of African teenagers, a barely investigated ancestral torture carried out to keep the girls from attracting men's attention. The piece was accompanied by an image, taken in Cameroon, in which a woman's breasts are flattened with a cane to stunt their growth. That was the photograph at the center of the dispute, the reason Facebook censored the content.
"It strikes me as an outrage: the moment a centimeter of skin is shown, Facebook's automatic censorship tools kick in. This does not meet the informational needs of the 21st century," complains Castañares, who is still unable to take any action on the social network. He is referring to the algorithms that control what can and cannot be seen, mathematical formulas that escape human scrutiny. "I asked some friends to post the same news story on their walls in protest, but their accounts were blocked as well," he says.
For its part, Facebook assured this newspaper yesterday that images showing female nipples violate its standards. In the specific case of the Cameroon story, the company said the post would be reviewed. For Facebook there is "unacceptable content", posts that may be "sensitive" for some users. "By default we remove sexual images to prevent content from being shared without permission or involving minors," its code of ethics explains. Controversies over Facebook's recurring censorship are nothing new. One of the most famous dates back to 2016, when Norwegian Prime Minister Erna Solberg saw the platform remove from her profile the image of the Vietnamese girl burned by napalm, a symbol of the barbarity of the Vietnam War. The tech company backtracked.
"The right to information and freedom of expression must prevail over Facebook's internal policies," says Borja Adsuara, a lawyer specializing in digital law. The problem lies in the automated content-recognition systems the platform uses, according to Sergio Carrasco, a lawyer and telecommunications engineer. "They should change their review systems so that the same criteria are not applied to content from individuals and content from the media," he explains, calling it a technically viable option.
Carrasco partly justifies Facebook's approach. "We have demanded immediacy in the removal of violent content, and the machines are not perfect, so they have chosen to cut the problem at the root: they block suspicious content and review it afterwards." German law, for example, requires internet giants to remove posts that may encourage terrorist acts within a maximum of 24 hours; otherwise, they face fines of up to 50 million euros.
Other experts believe the problem is not Facebook. "We forget that it is not a ministry but a private company; we cannot demand a particular behavior from it because it answers only to its board of directors," says Paloma Llaneza, a lawyer and the author of Datanomics. "It is within its rights to set its own terms of use, as long as the contract with users is not contrary to public order. Its decisions are influenced by the puritanism of the United States, where swearing is not allowed on public television and some programs are broadcast on a delay to control any behavior outside the norm," says Llaneza. "Americans dread nudity," she adds.
"Deciding what content stays on Facebook and what gets removed is one of the toughest tasks we face," spokespeople for the social network told this newspaper yesterday. "Context is everything," they added. That, they explained, is why they have approved a set of rules detailing what is allowed. "As those guidelines show, we take nudity involving children and child abuse very seriously," they said. To that end they have a team dedicated to reviewing their policies, and they "appreciate" the collaboration of industry, experts and other groups in "making sure they make the right decisions."