AI deepfakes of real crime victims are a nightmare

“Grandma locked me in the oven at 230 degrees when I was just 21 months old,” says the cute baby with the huge blue eyes and floral headband in a TikTok video.

The baby, who speaks in an endearingly childlike voice over the plaintive strains of Dylan Matthew’s “Love Is Gone,” identifies herself as Rody Marie Floyd, a little girl who lived in Mississippi with her mother and grandmother. She says that one day she was hungry and wouldn’t stop crying, so her grandmother put her in the oven, killing her. “Please follow me so more people know my true story,” the baby says at the end of the video.

The baby in the video isn’t real, of course: it’s an artificial intelligence creation posted by the @truestorynow account on TikTok, an account with nearly 50,000 followers that posts videos of real crime victims telling their stories.

The gruesome story she tells is true, if only in part. The baby was not named Rody Marie, but Royalty Marie, and was found stabbed to death and burned in an oven at her grandmother’s home in Mississippi in 2018. The grandmother, 48-year-old Carolyn Jones, was charged with murder earlier this year. But Royalty was 20 months old when she died, not 21, and unlike the baby in the TikTok video, she was Black, not white.

The victims often speak for themselves, recounting their horrific deaths

Such inaccuracies are commonplace in the grotesque world of AI true-crime TikTok, which uses artificial intelligence to resurrect murder victims, many of them young children. In the videos, some of which have been viewed millions of times, victims describe the gruesome details of their deaths in the first person; most carry no content warning.

“They’re pretty weird and creepy,” Paul Bleakley, an assistant professor of criminal justice at the University of New Haven, tells ROLLING STONE. “They seem designed to elicit strong emotional responses, because that’s the surest way to get clicks and likes. It’s awkward to watch, but I think maybe that’s the point.”

Many of the accounts carry a disclaimer stating that the videos do not use real photos of the victims in order to “respect the privacy of the family,” as Nostalgia Narratives, an account that posts true-crime victim videos for its 175,000 followers, writes in its captions.

The account tells the stories not only of well-known child murder victims like Elisa Izquierdo, a six-year-old girl who was murdered by her abusive mother in 1995, and Star Hobson, a one-year-old killed by her mother’s girlfriend in 2020, but also of adult murder victims like George Floyd and JFK. None of the accounts ROLLING STONE contacted responded to requests for comment, but the fact that they alter the victims’ appearances is likely due to TikTok’s community guidelines prohibiting deepfakes of private individuals or minors, part of a synthetic media policy the platform rolled out in March.

The spread of these AI videos of real crime victims on TikTok raises ethical questions. Although documentaries like The Jinx and Making a Murderer and podcasts like Crime Junkie and My Favorite Murder have huge followings, many critics of the true-crime genre have questioned the ethics of consuming real stories of horrifying assaults and murders as pure entertainment, with the rise of amateur sleuths and true-crime obsessives potentially re-traumatizing victims’ families.

That concern is doubly true for videos like Royalty’s, which tell a victim’s story from their perspective and use their name, presumably without the family’s consent, to incredibly chilling effect. “It has the potential to re-victimize people who have been victimized before,” says Bleakley. “Imagine being the parent or relative of one of the children in these AI videos. You go online, and here is an AI image [based on] your deceased child, describing in very gory detail what happened to them.”

Lawsuits could come to nothing

As part of the true-crime genre, deepfake videos are distinct from deepfake porn for obvious reasons. Still, Bleakley can imagine grieving families taking civil action against the makers of such videos, especially if they are monetized; he notes, however, that because the individuals are deceased, it would be difficult for families to make a defamation argument. “It’s a very tricky, murky gray area,” he says.

One thing is clear, however: with AI technology developing rapidly and little regulation in place to curb its spread, the question is not whether videos like these will grow in popularity, but how much worse the marriage of true crime and AI will get.

It’s easy to imagine true-crime creators one day being able not only to recreate the voices of murder “victims” but also to recreate the gory details of the crimes themselves. “That’s always the question with any new technological development,” says Bleakley. “Where is it going to stop?”

This text is a translation of an article from ROLLINGSTONE.com
