The influencer Laura Escanes issued a warning this week on Twitter: “I have received a link to AI-edited nude photos of me. Aside from feeling totally used and exposed, there is something that makes my blood boil. A woman’s body is not to be used. Not for pleasure, nor to abuse or manipulate. I am disgusted by the person who created them, but also by those who see them, find them funny and stay silent.”
What Escanes condemns is the proliferation of new applications that use AI (Artificial Intelligence) to create nude images and erotic scenes with anyone’s face. They have flooded social networks. The AI explosion took place at the end of 2022, with the release of ChatGPT. Once the floodgates opened, many companies dedicated themselves to exploiting the niche. And, on the internet, the niche that generates the most money is sex.
These technologies generate realistic (but not real) erotic images. The apps, whose names we will omit, have slipped into the stores of every operating system, and they appear as advertisements on the main social networks. Some have already been removed. But, like a hydra, they are reborn with new names and appearances. And the most extreme are advertised with an acronym that sets them apart from the others: NSFW.
NSFW
NSFW is the acronym for “Not safe for work”. It is really a euphemism warning that a program generates sensitive material, and that it is therefore unwise to have it open in settings such as the workplace. In this case, it creates erotic and sexual content. Through the relevant prompts, the user can invent a person and obtain nude photos of someone who doesn’t exist. Or do they?
The problem comes when, as happens with some of these applications, they allow real images to be uploaded. These apps accept photos of existing faces, as in the case denounced by Laura Escanes. And it is not necessary to be a public figure: it is enough for someone to obtain a picture of a person’s face and insert it. Although the body is not real, these programs can generate sexual ‘fakes’ with another person’s face without them ever knowing.
These applications, which are usually paid and whose fees range from 40 to 70 euros a year, do not ask for permission of any kind. A malicious user can steal a photo of a person’s face from any social network and, with it, build a pornographic image of that face without requesting any form of consent. They are not, however, real images. But could those who make them be committing some kind of crime?
The Atrioc case
This type of content is called deepfakes, a term combining fake and deep [learning], the family of machine-learning algorithms used by artificial intelligences. Since the arrival of artificial intelligence on the internet, some deepfakes have already become iconic, like the photo of Pope Francis dressed as a Philadelphia rapper.
That happened in March. Two months earlier, the first major scandal related to this type of material had already broken out. An American streamer named Brandon Ewing, known as Atrioc, was caught during one of his live streams with one of these sexual deepfake pages visible in an open tab. There it could be seen that he was creating this type of material with photos of two well-known streamers, Maya and Pokimane.
The discovery made the case explode because, in addition to the two women affected, other streamers decided to investigate and found material with their faces. Not only on the page in question, but on large pornography portals. Although they had done absolutely nothing, there were already sex videos of them on the main porn websites. A streamer known as Sweet Anita even identified videos supposedly of her on Pornhub, the world leader in the sector.
That page, which was called bavfakes, ended up being shut down, mainly thanks to the efforts of QTCinderella, another American streamer who got involved when she saw there was explicit content about her on the site. But, far from disappearing from the web, new applications of this type appear every day. And the main social networks, such as Instagram or TikTok, are not filtering these types of ads, showing them as soon as the algorithm is triggered. Google offers nearly 120 million search results for the words “deepfake porn”.
The law
In other words, we have reached the point where sexual material featuring a person can circulate without that person ever having created erotic material in private and had it leaked. Is this punishable? EL PERIÓDICO DE ESPAÑA, from the Prensa Ibérica group, spoke with Borja Adsuara, a lawyer specialising in digital law, who recalls that “I have been warning about this situation for some time”, to the point of having written a mini-guide on these terms “with the idea of being as didactic as possible”. Because more and more legal questions arise as these technologies advance.
“The question here is that we are talking about realistic images, but not real ones. The law makes that distinction. The first could pass for real because they are created by AI. The second are photos or videos that were actually taken at some point. And curiously, the law does penalise realistic sexual images of minors: the events do not have to have happened. Merely generating them is already a crime.”
But in the case of adults, “it is not contemplated. The solution would be to cut and paste the definition of ‘child pseudo-pornography’ and apply it to adults as well. But that has not been done so far. And all this material being generated by AI is something that will have to be legislated. Because until now, things were different, and they have been changing.”
The Hormigos case
“Before, the dissemination of real sexual images was not sanctioned if the person had consented to their capture. But that changed in 2015,” says Adsuara. It was after the case of Olvido Hormigos, the former Socialist councillor of Los Yébenes (Toledo), who suffered the leak of an intimate video.
“From then on, the law changed and such images could no longer be disseminated without the consent of the person. It was the best-known case, but there were and are many. It is so-called revenge porn: a person consents to being filmed or photographed in intimate moments with their partner. Then they break up and the ex, to get revenge, spreads the images. After the change in the law, they cannot be disseminated if the person recorded does not allow it.”
Adsuara also cites “the case of the Iveco worker who ended up taking her own life because a sex video of her was shared among her colleagues. She did not commit suicide because of the dissemination of the video, but because of the humiliation she was subjected to afterwards.” And there are even other cases in which images that are not of a person are attributed to them, “such as a soldier who shared the photo of a woman in the shower who looked like a colleague. It wasn’t even her, but it served to get that person, who knew nothing and was not the subject of the photo, harassed.”
Along the same lines, the lawyer recalls that a very similar case has already occurred in Spanish politics: a person shared a photo of a topless woman on social networks, falsely claiming that it was Teresa Rodríguez, the leader of Adelante Andalucía.
Crime against moral integrity
If a person finds that false sexual material bearing their face (photos or videos) has been disseminated, those responsible could be prosecuted under Article 173.1 of the Penal Code: “Anyone who inflicts degrading treatment on another person, seriously undermining their moral integrity, will be punished with imprisonment from six months to two years.”
The lawyer insists that the law must serve “not only to punish, but to deter. The norms have an ‘axiological dimension’ (values), which, in other words, is like when a child is told ‘that is not done’. And that message won’t get through until someone goes to prison for creating, with artificial intelligence, and disseminating false sexual images of another person without their consent. That is the only thing that will deter the rest.”