Editor’s Note: This week, Block, a prominent technology company, laid off nearly half of its employees. Its founder, Jack Dorsey, described the mass firings as “an opportunity to move faster with smaller, highly talented teams using AI [Artificial Intelligence] to automate more work.” While businesses are making radical changes in the face of emerging technologies, the criminal justice system, with its precedents and well-worn procedures, is struggling to keep up. In this week’s The Rabbit Hole, reporter Peter Beck takes a look at how the distance between cutting-edge technology and decades-old criminal statutes is affecting our ability to protect the most vulnerable members of society. - Seamus

It was the middle of the school day when an alert popped up on the phone of an administrator at Corinth Middle School in Alcorn County, Mississippi. The notification came from Bark, an app that monitors the school’s internet network for illicit content being downloaded or shared, and it was labeled severe for its sexual content. Bark’s system had been triggered by a user who transferred three inappropriate videos, depicting Corinth students, to their personal Google Drive. That user was Wilson Jones, a teacher at the school.

The next morning, as classes began, the principal and vice principal met with Jones. Jones admitted to the two men that he knew about the three videos. But, he protested, none were sexual in nature. Instead, Jones told the school officials that he had created the videos using Hailuo AI, a generative artificial intelligence (AI) application.

Edward Childress, the superintendent of Corinth’s school district, allegedly hesitated over what to do next. According to a criminal complaint in the case, he allowed Jones to resign two days after the alert was issued. It would be two months before the incident was reported to Mississippi’s Department of Education, and another month after that before law enforcement began its investigation. A federal grand jury indicted Childress for allegedly accepting Jones’s resignation and failing to report him. Jones pleaded guilty and now awaits sentencing in May. Attorneys for Jones and Childress did not return separate requests for comment.

The Mississippi case is one of a small but growing number of federal prosecutions targeting the malicious use of AI platforms. Many include disturbing allegations from a dystopian world in which friends, family members, children, peers, and complete strangers are stripped of their autonomy, digitally unclothed by AI software. Others involve AI agents used to amplify the harm caused by criminal acts. The rate of federal prosecutions does not come close to reflecting the scale of this trend, which will only grow as AI platforms become more advanced and accessible. Yet the law appears unready for this new world of generative crime.

On November 8, 2023, a federal judge in Charlotte, North Carolina, sentenced 41-year-old David Tatum, a former child psychiatrist, to 40 years in prison for sexual exploitation of a minor and for using AI to generate images of child abuse. The case was among the first of its kind, coming a year after generative AI tools became widely available to the public with the November 2022 release of ChatGPT.

At trial, federal prosecutors said that Tatum used AI to manipulate innocuous pictures, one from a school dance and another commemorating the first day of school, into child abuse material. The pictures were decades old and showed Tatum’s own classmates from when they attended middle and high school together. One victim, who was 15 in the altered picture of her waiting by a school bus, is now in her 40s. Another, who appeared as a 13-year-old in a picture, did not recall ever meeting Tatum.

The case presented a series of investigative hurdles for FBI agents, who struggled to identify the victims depicted and to determine how the two images had been produced. Only through interviews with Tatum did the agents learn who the victims were. Even then, one victim was so skeptical that she initially believed she was being targeted by a scam. Tatum’s attorney did not return a request for comment.

Identifying alleged victims is an essential step in every criminal investigation, but AI expands the pool of possible victims almost without limit.

This Story Is Behind a Paywall

Let's explain why. Our Friday morning roundups are always free. However, this story is part of our weekly Sunday Series we call The Rabbit Hole where we choose a single federal court docket, filing, or topic and dive deep into the details. To do the stories in the series properly, we invest significant reporting resources that can only happen with subscriber support.
