The law, the unused key to protecting minors from "fake" images

Just two days after news broke of the spread of fake nude photographs of girls, manipulated with artificial intelligence, in Almendralejo, police reported a similar case in Ayamonte (Huelva). This time, a minor under 14 years of age had used a more rudimentary tool, the image editor Photoshop, to simulate the naked bodies of other underage victims.
The coincidence of these two events is an ideal occasion to place the focus where it truly belongs in one of the most talked-about controversies of the week. We have a social problem of growing magnitude, a problem affecting our children whose scale we have perhaps not yet gauged, a problem that spreads like wildfire through social networks. But the problem is not the technology. In less than a week, in two different corners of the country, two groups of young people used two very different technologies to commit the same alleged crime. It does not matter that in Almendralejo the weapon was an app distributed through Telegram that uses rudiments of artificial intelligence (some experts consider that such technology does not even deserve to be called artificial intelligence), that in Ayamonte the tool was semi-professional software well known in graphic design, or that they might have used a collage of photographs cut out with scissors and glued together.
If we are sick with something, it is not artificial intelligence. The children involved decided to use a tool at their disposal to sexualize female classmates who are minors. They could have committed any other crime, made a joke, or used the bodies of peers of their own sex. But no: the problem was not the technology used. The problem was a deep crisis of values, the hypersexualization of childhood, the deficit in equality education… with or without artificial intelligence.
At least, that is the thesis of many of the experts consulted in recent days in the heat of events. The loudest voice, for obvious reasons, is that of Dr. Miriam Al Adib, a gynecologist, sex education advocate and mother of two of the girls victimized in Almendralejo. Dr. Al Adib has long warned of the hypersexualization of childhood. “Today's girls,” she says, “don't play with dolls; they play at twerking on TikTok.”
Crimes against the privacy, honor and good name of people (and of women in particular) existed long before the arrival of easily accessible artificial intelligence technologies. For decades, more or less crude montages have circulated on the internet with the faces of celebrities (almost always women) pasted onto naked bodies or into explicit sexual acts. This is so-called fake porn.
But since 2017, artificial intelligence has been added to this form of attack, allowing fabrications that are more realistic and more accessible, even in video. The technique is known as the deepfake. According to a report by the company Sensity AI, more than 90 percent of the deepfakes circulating on social networks are pornographic.
Rebecca Delfino, a professor of digital law at Loyola University (United States), has devoted herself to studying the deepfake phenomenon. “The practice is clearly invasive, abusive and harmful. It is a kind of sexual violence,” she says.
But can these attacks easily be prosecuted under the law?
Unfortunately, no. In most countries, according to Delfino, the absence of specific legislation leaves this kind of attack practically unpunished. “You can't put a song on your YouTube channel without respecting copyright, you can't play a movie clip on a podcast without permission, but if you take the face of a famous actress and stick it on the body of a woman having sex, she may not be able to sue you.”
In some places, at least, legislation is beginning to put a stop to the practice. In the United States there is no federal legislation in this regard, but some states have created legal channels for regulation. These rely on so-called “celebrity rights,” which give famous people the power to control how their image is used for any purpose. Such legislation protects the most recognizable personalities. For anonymous citizens, the most common route is to file a lawsuit for violation of data protection laws: a rather indirect path, and one only loosely connected to the true depth of the aggression, which is sexual violence.
The European Union also lacks specific legislation on the matter. It is true that both the Parliament and the Commission have warned since 2021 of the difficulty of detecting false content of this kind and of the need to regulate its use. In fact, the European Parliament agreed this year to begin negotiating the content of an Artificial Intelligence Act that would limit the uses of this technology and its potential harmful or criminal effects. The law is not expected to come into force until 2026, so we still do not know what protection it will offer in cases like that of the girls from Almendralejo, or of famous women targeted by deepfake porn.
In Spain, debate continues over whether these attacks can be punished under the legislation that protects honor and privacy. That law considers an illegitimate intrusion into privacy “the capture, reproduction or publication by photography, film, or any other procedure, of the image of a person in places or moments of their private life or outside of them.” The problem is that these technological products are a false representation of the person, not a capture of their real image. When it was drafted in the 1980s, the law did not contemplate threats like those posed by the new technologies.
Pending legislative reform, these practices seem easier to stop by invoking violations of personal data protection than through criminal complaints. In fact, the recent Digital Services Act, in force in Europe since this year, does not even require companies providing digital services to remove adult content that has been reported for lacking the consent of those who appear in it.
Once again, the conclusion is that the way to tackle this growing problem must be education in values from early childhood. Neither limits on the technology nor the law has yet found a solution.