Outrage over rise in AI-generated child sex abuse videos: 'The fact that some of these videos are manipulating the imagery of known victims is even more horrifying'
An investigation by the Internet Watch Foundation demonstrates the dangerous use of artificial intelligence to generate pornographic content involving minors.
"[We can] create any child porn we desire… in high definition." Those were the words of offenders found by the Internet Watch Foundation (IWF) on dark web forums. The boast is now largely accurate, thanks to artificial intelligence.
Some videos show adult bodies with children's faces; others show real children's bodies with different children's faces. Still others are entirely artificial, though these remain poor in quality and therefore not very believable.
The organization warns that the tools used to create this material are becoming better and more accessible. "Free, open-source A.I. software appears to be behind many of the deepfake videos seen by the IWF," it states in a report published on Monday.
"The fact that some of these videos are manipulating the imagery of known victims is even more horrifying," explains Susie Hargreaves, CEO of the IWF. "Survivors of some of the worst kinds of trauma now have no respite, knowing that offenders can use images of their suffering to create any abuse scenario they want."
The agency also analyzed more than 12,000 AI-generated images. Around 90% of them, it found, appeared realistic enough to be assessed under the same laws as actual child sexual abuse material, and more than 3,500 were judged criminal.
While dark web forums offer advice on how to generate and refine this type of content, the images and videos are also being sold on the "clear web," the open internet that ordinary people use every day.