The DOJ makes its first known arrest for AI-generated CSAM

The US Department of Justice arrested a Wisconsin man last week for creating and distributing AI-generated child sexual abuse material (CSAM). As far as we know, this is the first case of its kind as the DOJ looks to establish a judicial precedent that exploitative materials are still illegal even when no children were used to create them. “Put simply, CSAM generated by AI is still CSAM,” Deputy Attorney General Lisa Monaco wrote in a press release.

The DOJ says Steven Anderegg, a 42-year-old software engineer from Holmen, Wis., used a fork of the open-source AI image generator Stable Diffusion to make the images, which he then used to try to lure an underage boy into sexual situations. The latter will likely play a central role in the eventual trial on the four counts of “producing, distributing, and possessing obscene visual depictions of minors engaged in sexually explicit conduct and transferring obscene material to a minor under the age of 16.”

The government says Anderegg’s images showed “nude or partially clothed minors lasciviously displaying or touching their genitals or engaging in sexual intercourse with men.” The DOJ claims he used specific prompts, including negative prompts (extra input telling the AI model what not to produce), to coax the generator into making the CSAM.

Cloud-based image generators like Midjourney and DALL-E 3 have safeguards against this type of activity, but Ars Technica reports that Anderegg allegedly used Stable Diffusion 1.5, a variant with fewer restrictions. Stability AI told the publication that the fork was produced by Runway ML.

According to the DOJ, Anderegg communicated online with the 15-year-old boy, describing how he used the AI model to create the images. The agency says the accused sent the teen direct messages on Instagram, including several of the AI-generated images of “minors lasciviously displaying their genitals.” To its credit, Instagram reported the images to the National Center for Missing and Exploited Children (NCMEC), which alerted law enforcement.

Anderegg faces between five and 70 years in prison if convicted on all four counts. He is currently in federal custody ahead of a hearing scheduled for May 22.

The case will challenge the notion some may hold that CSAM’s illegal nature stems exclusively from the children exploited in its creation. Although AI-generated digital CSAM doesn’t involve any living humans (other than the one entering the prompts), it can still normalize and encourage the material, or be used to lure children into predatory situations. This appears to be something the feds want to clarify as the technology rapidly advances and grows in popularity.

“Technology may change, but our commitment to protecting children will not,” Deputy AG Monaco wrote. “The Justice Department will aggressively pursue those who produce and distribute child sexual abuse material — or CSAM — no matter how that material was created. Put simply, CSAM generated by AI is still CSAM, and we will hold accountable those who exploit AI to create obscene, abusive, and increasingly photorealistic images of children.”
