Rep. Anna Paulina Luna (R-Fla.) said at a House Oversight subcommittee hearing on Tuesday that law enforcement agencies are confronting artificial intelligence (AI)-created abusive and sexually explicit material depicting minors. She told her fellow lawmakers that authorities are having a hard time prosecuting cases over the images.
Laws governing child sexual abuse material (CSAM) require “actual photos, real photos of the child in order to prosecute,” Carl Szabo, vice president of the nonprofit organization NetChoice, told lawmakers. Generative AI, however, can turn an ordinary photo of a minor into fictional yet explicit content.
“Bad actors are taking photos of minors and using AI to manipulate them into sexually compromising positions, skirting the letter of the law, not the purpose of the law,” Szabo said.
The attorneys general of all 50 states signed a bipartisan letter urging Congress to “study means and methods” by which AI is “used to exploit children” and to “propose solutions to stop and address such exploitation to protect America’s children.”
The letter also called on Congress to “explicitly address AI-generated CSAM” so that prosecutors can pursue such cases.
“This is actually something that the FBI specifically asked us to look at when they were talking about cybercrime, because the FBI is currently having trouble prosecuting these really sick individuals. Because it’s a generated image, strictly speaking, the child is not harmed in the process,” Luna said.
The Hill has reached out to the FBI for comment.
Although AI-generated CSAM currently represents only a small portion of the abusive content circulating online, its use is likely to continue growing given the ease of use, versatility, and highly realistic output of AI programs, said John Shehan, vice president of the exploited children division at the National Center for Missing and Exploited Children (NCMEC).
Lawmakers and witnesses repeatedly cited research from the Stanford Internet Observatory showing that generative AI makes it easier to create more CSAM and that the training data behind publicly available AI models has been contaminated with CSAM.
NCMEC operates the CyberTipline, the nation’s centralized reporting system for the online exploitation of children. Shehan said that despite the “explosive” growth in the number of available apps and services, only five generative AI companies have so far submitted reports to the tipline.
“State and local law enforcement agencies are left to deal with these issues because technology companies are not taking front-end steps to build these tools with safety by design,” he said.
Shehan also singled out “nudify” or “undress” AI applications and web services as particularly egregious when it comes to generating CSAM.
“None of the platforms offering these ‘nudify’ or ‘undress’ apps have registered to report to NCMEC’s CyberTipline. None have engaged with NCMEC on how to avoid creating child sexually exploitative or nude content, and none have submitted a report to NCMEC’s CyberTipline,” he said.
“The sheer volume of cybertips often prevents law enforcement from pursuing proactive investigations that effectively target the most egregious offenders,” said Rep. Nick Langworthy (R-N.Y.).
“In just three months, from November 1, 2022, to February 1, 2023, there were over 99,000 IP addresses across the United States known to be distributing CSAM, but only 782 were investigated. Right now, through no fault of their own, law enforcement agencies simply do not have the capacity to investigate and prosecute these overwhelming numbers of cases,” Langworthy added, quoting previous testimony given by John Pizzuro, CEO of the nonprofit organization Raven, at a February 2023 Senate Judiciary hearing on protecting children online.