Thousands of images generated by artificial intelligence (AI) depicting children, many under two years old, being subjected to sexual abuse could overwhelm the internet, according to new data published by the Internet Watch Foundation (IWF).
The foundation found that the generated images are now so realistic that, under United Kingdom law, they can be treated as real imagery. Thousands have already been uncovered.
“Earlier this year, we warned AI imagery would soon become indistinguishable from real pictures of children suffering sexual abuse, and that we could start to see this imagery proliferating in much greater numbers,” Susie Hargreaves, chief executive of the IWF, said in a statement. “We have now passed that point.”
Some of the images the IWF examined would “be difficult for trained analysts to distinguish from actual photographs.” The technology will only get better and pose more obstacles, the foundation warned.
The IWF, a U.K.-based organization responsible for removing images from the internet that exploit children, said its “worst nightmare” is coming true. The images are likely based on the real faces and bodies of children that were then built into AI models, it said.
Additionally, criminals are using the technology to create imagery of celebrities depicted as children in sexual abuse scenarios. There have also been instances where clothed photos of children uploaded online were manipulated with the technology to remove the clothing.
In one month, the IWF found nearly 3,000 images depicting child sexual abuse that breached U.K. law, the majority looking so realistic that they were treated as real.
The organization said it is worried that, as AI-generated images become more common, they could distract analysts and divert resources from cases involving real children.
“International collaboration is vital,” Hargreaves said in the statement. “It is an urgent problem which needs action now. If we don’t get a grip on this threat, this material threatens to overwhelm the internet.”