This trick confuses AI, and it may come in handy

Creative professionals have begun cunningly "poisoning" their work to protect it from AI.

Gobbling up pictures? Image generators raise a number of moral problems. The image was created with the DALL-E 3 image generator.

The proliferation of large AI models has raised many ethical questions. One of the most significant concerns image generation, where an artificial intelligence trained on a huge image bank can create illustrations from the prompts given to it.

Illustrators, artists and other creative professionals have long accused companies working on image generators of using their work without permission as training material for artificial intelligence. For their part, AI companies and laboratories are often reluctant to reveal exactly what material has been fed to their models.

Computer science researchers at the University of Chicago have created a "poison" that lets creative professionals protect their work from unauthorized use.

The new Nightshade tool "shades" an image at the pixel level so that an artificial intelligence mistakenly thinks the image depicts something completely different, while to the human eye the differences are imperceptible. In this way the AI can be made to think it is looking at a hat instead of a cake, or a toaster instead of a handbag, as MIT Technology Review writes.
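The article does not reproduce Nightshade's actual algorithm, but the mechanism it describes, an imperceptibly small pixel change that pushes an image toward a different concept in a model's eyes, resembles a targeted adversarial perturbation. Below is a minimal sketch of that generic idea in PyTorch. The surrogate model, step sizes and the epsilon budget are illustrative assumptions, not Nightshade itself.

```python
# Hypothetical sketch of pixel-level "poisoning": nudge an image toward a
# DIFFERENT target concept as seen by a surrogate model, while keeping the
# change too small for humans to notice. This is NOT the Nightshade
# algorithm; the model choice and hyperparameters are assumptions.
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).to(device).eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

def poison(image_path: str, target_class: int, epsilon: float = 4 / 255,
           steps: int = 50, lr: float = 1 / 255) -> torch.Tensor:
    """Return a perturbed image the surrogate model reads as `target_class`."""
    x = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0).to(device)
    delta = torch.zeros_like(x, requires_grad=True)
    for _ in range(steps):
        logits = model(torch.clamp(x + delta, 0, 1))
        # Minimizing this loss pushes the image toward the target class.
        loss = F.cross_entropy(logits, torch.tensor([target_class], device=device))
        loss.backward()
        with torch.no_grad():
            delta -= lr * delta.grad.sign()   # step toward the target concept
            delta.clamp_(-epsilon, epsilon)   # cap change: stays imperceptible
            delta.grad.zero_()
    return torch.clamp(x + delta, 0, 1).detach()
```

The key design point mirrored from the article is the small perturbation budget (epsilon): the pixels barely move, so a person sees the original picture, while the model's reading of it shifts.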

In practice, an AI that has ingested enough Nightshade-treated images will eventually start producing images that no longer match the prompts given to it. The research team demonstrated the vulnerability by feeding poisoned images to a Stable Diffusion model they had trained themselves.
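To see why training on such images causes this drift, a toy example helps: if a large enough share of the training data systematically pairs inputs with the wrong concept, the model learns the wrong mapping. The sketch below uses a made-up linear classifier and a 60% poison rate purely for illustration; the researchers' actual experiment used a Stable Diffusion model, not this setup.

```python
# Toy illustration (not the paper's setup): training on systematically
# mislabeled "poisoned" examples makes a model learn the wrong concept
# mapping. The dataset and the 60% poison fraction are assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)
X = torch.randn(1000, 16)                  # stand-in "image features"
y = (X[:, 0] > 0).long()                   # the true concept
y_poisoned = y.clone()
y_poisoned[:600] = 1 - y_poisoned[:600]    # poison 60% of the labels

model = nn.Linear(16, 2)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
for _ in range(200):
    loss = nn.functional.cross_entropy(model(X), y_poisoned)
    opt.zero_grad(); loss.backward(); opt.step()

acc = (model(X).argmax(1) == y).float().mean()
print(f"accuracy on the TRUE labels: {acc:.2f}")  # far below 50%: the model
                                                  # learned the poisoned mapping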

According to VentureBeat, the team that developed Nightshade has been amazed by the tool's popularity: in just five days, the AI "poison" was downloaded 250,000 times.

Earlier, the same research team developed the Glaze tool, which lets artists hide their personal style from the gaze of artificial intelligence.
