Taylor Swift’s “nude photos” spurred the social media giant into action

Messaging service X has taken new measures to curb the spread of fake images of the music megastar. The phenomenon is increasingly affecting ordinary people as well.

Pornographic content created with artificial intelligence is forcing social media giants to react and develop ways to root out non-consensual material from their platforms. The danger is the spread of believable-looking fakes, potentially in front of hundreds of millions of eyes.

Over the weekend, Elon Musk’s messaging service X had to restrict searches for megastar Taylor Swift by name within the service.

The reason is the AI-generated pornographic images of the artist, which have already amassed tens of millions of views on the platform. Last Thursday, it was reported that X had tried to delete the images and close the accounts spreading them.

Content cannot be searched for by Swift’s name

The measures taken have evidently not been enough to curb the spread of the images.

If you now try to search for content on X using Swift’s name, the search tool shows a message saying that posts cannot be loaded at the moment. The user is prompted to try again later.

According to CNN, which reported on the subject, no similar restriction has so far been observed on, for example, Instagram or Reddit. CNN says the images have been distributed mainly on X.

Like other major social media platforms, X’s terms of use forbid the sharing of manipulated and out-of-context content if it can mislead people or cause harm.

– This is a temporary measure, a representative of X told CNN over the weekend about the restriction of the search function.

Ordinary people targeted as well

Pornographic photo fakes of celebrities created with image-editing software are not a new phenomenon, but thanks to advanced artificial intelligence applications, practically anyone can now create new material without much skill.

In addition, artificial intelligence enables more convincing and harder-to-detect fakes than before.

In December, we reported how apps and websites used to create AI porn have grown in popularity. Users can upload images to them, which the artificial intelligence “undresses” in a few seconds.

Artificial intelligence cannot actually see under anyone’s clothes; it merely replaces the body in the image with a generated naked one. According to Bloomberg, many of these apps work only on female bodies.

According to a recent report, these services had almost 25 million users worldwide last September.

– We have noticed that this is increasingly being done by very ordinary people with pictures of very ordinary people, Eva Galperin, director of information security at the Electronic Frontier Foundation, told Bloomberg in December.

Bloomberg says the applications and sites are advertised on Reddit, X and YouTube, among other platforms.

TikTok and Meta have told Bloomberg that they have blocked certain words typically used in the marketing of these apps to prevent the questionable services from being promoted on their platforms.

Sources: CNN, Bloomberg