Artificial intelligence strips anyone naked – Some applications only work with pictures of women

Bloomberg warns that artificial intelligence applications which undress people in photos are becoming more common and are being promoted ever more aggressively on social media. The platforms, for their part, are fighting back against the unwanted advertising.

Artificial intelligence can be used to create pornographic material without the consent of the persons appearing in the images. Adobe Stock

Apps and websites that let users remove the clothes of a person in a photo are becoming increasingly popular, Bloomberg reports.

The artificial intelligence cannot actually see under anyone’s clothes; it merely generates a naked body in place of the one in the picture. According to Bloomberg, many of these apps only work on female bodies.

According to an analysis by the research firm Graphika, advertising for such applications and sites on social media has increased by as much as 2,400 percent since the beginning of the year.

According to Graphika, 24 million people around the world used the services in September.

The target is ordinary people

The apps are part of a worrying trend in which pornographic material is produced, and even distributed, without the consent of the people depicted.

The phenomenon itself is not new, but in the past non-consensual deepfake porn has mainly featured public figures, and producing the material has traditionally been very laborious. With the development of artificial intelligence applications, however, the technology is now within practically everyone’s reach.

“We have found that this is increasingly being done by very ordinary people with pictures of very ordinary people,” Eva Galperin, Director of Cybersecurity at the Electronic Frontier Foundation, told Bloomberg.

According to Bloomberg, the applications have been advertised on Reddit, X and YouTube, among other platforms. TikTok and Meta told Bloomberg that they have blocked certain keywords typically used in the marketing of these apps, in order to prevent the questionable services from being promoted on their platforms.

Source: Bloomberg
