A software engineer from Wisconsin was arrested on Monday for allegedly creating and distributing thousands of AI-generated images of child sexual abuse material (CSAM).
Court documents describe Steven Anderegg as “extremely technologically savvy” with a background in computer science and “decades of experience in software development.” Anderegg, 42, is accused of sending AI-generated images of nude minors to a 15-year-old boy via his Instagram account. Anderegg came to the attention of law enforcement after the National Center for Missing and Exploited Children flagged messages he allegedly sent in October 2023.
According to information law enforcement obtained from Instagram, in 2023 Anderegg posted an Instagram story “consisting of a realistic GenAI image of minors wearing BDSM-themed leather clothing” and encouraged others to “check out” what they were missing on Telegram. In private messages with other Instagram users, Anderegg allegedly “discussed his desire to have sex with prepubescent boys” and told one Instagram user that he had “a ton” of other AI-generated CSAM images on Telegram.
Anderegg allegedly started sending the pictures to a different Instagram person after studying he was solely 15 years outdated. “When this minor reported his age, the accused didn’t refuse him and didn’t query him. As an alternative, he wasted no time in describing to this minor how he was creating sexually specific pictures of GenAI and sending tailor-made content material to the kid,” charging paperwork state.
Prosecutors said that when law enforcement searched Anderegg’s computer, they found more than 13,000 images, “with hundreds, if not thousands, of images depicting nude or scantily clad underage minors.” Charging documents say Anderegg created the images using Stable Diffusion, a text-to-image model created by Stability AI, and used “extremely specific and explicit prompts to create these images.” Anderegg also allegedly used “negative prompts” to avoid creating images depicting adults and used third-party Stable Diffusion add-ons that “specialized in producing genitalia.”
Last month, several major tech companies, including Google, Meta, OpenAI, Microsoft, and Amazon, said they would review their AI training data for CSAM. The companies committed to a new set of principles that include “stress testing” models to ensure they do not create CSAM. Stability AI also signed on to these principles.
This is not the first time Anderegg has come into contact with law enforcement over alleged possession of CSAM via a peer-to-peer network, prosecutors said. Prosecutors allege that in 2020, someone using the Internet at Anderegg’s Wisconsin home attempted to download multiple known CSAM files. That year, law enforcement searched his home, and Anderegg admitted that he had a peer-to-peer network on his computer and that he frequently reset his modem, but he was not charged.
In a brief statement supporting Anderegg’s pretrial detention, the government noted that he has worked as a software engineer for more than 20 years and that his resume includes recent work at a startup where he used his “excellent technical knowledge to develop artificial intelligence models.”
If convicted, Anderegg faces up to 70 years in prison, though prosecutors say the “guideline sentence may be as high as life in prison.”