
Hundreds of child sexual abuse images discovered in AI training dataset

The Stanford Internet Observatory (SIO) revealed that "models trained on this dataset, known as LAION-5B, are being used to create photorealistic AI-generated nude images."


The Stanford Internet Observatory (SIO) revealed in an investigation that it had discovered hundreds of images containing child sexual abuse material in a dataset used to train artificial intelligence image generation models such as Stable Diffusion:

An investigation found hundreds of known images of child sexual abuse material (CSAM) in an open dataset used to train popular AI image generation models, such as Stable Diffusion ... Models trained on this dataset, known as LAION-5B, are being used to create photorealistic AI-generated nude images, including CSAM.

The study explained that "rapid advances in generative machine learning make it possible to create realistic imagery that facilitates child sexual exploitation using open source AI image generation models."


The Stanford team of researchers used data from the National Center for Missing and Exploited Children and worked with the Canadian Center for Child Protection to provide third-party validation of the findings.

According to the report: "It is challenging to clean or stop the distribution of open datasets with no central authority that hosts the actual data. ... Future datasets could use freely available detection tools to prevent the collection of known CSAM." As a result of the investigation, the material identified is now being removed from the internet.
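The detection tools the report refers to generally work by comparing hashes of collected images against lists of known abusive material maintained by child-protection organizations. The sketch below is a minimal, hypothetical illustration of that idea using plain SHA-256 digests from Python's standard library and an assumed local blocklist file; production pipelines rely on perceptual hashing and vetted industry hash lists rather than this simplified approach.

```python
import hashlib
from pathlib import Path


def load_blocklist(path: str) -> set[str]:
    """Load a newline-delimited list of known-bad SHA-256 digests (hypothetical file)."""
    return {
        line.strip().lower()
        for line in Path(path).read_text().splitlines()
        if line.strip()
    }


def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 digest of a file, streaming it in 1 MB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def filter_dataset(image_dir: str, blocklist_path: str) -> list[Path]:
    """Return only the images whose digests are NOT on the blocklist."""
    blocklist = load_blocklist(blocklist_path)
    kept = []
    for image in Path(image_dir).iterdir():
        if image.is_file() and sha256_of_file(image) not in blocklist:
            kept.append(image)
    return kept


if __name__ == "__main__":
    # Both paths are assumptions for the sake of the example.
    clean_images = filter_dataset("crawled_images", "known_bad_sha256.txt")
    print(f"{len(clean_images)} images passed the hash check")
```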
