According to an investigation by The Wall Street Journal together with Stanford University and the University of Massachusetts Amherst (UMass Amherst), loopholes in the Meta Platforms-owned social network allowed its algorithms to promote accounts engaged in buying and selling child sexual abuse material or in arranging in-person encounters between adults and children.
"Pedophiles have long used the internet, but unlike the forums and file-transfer services that cater to people who have interest in illicit content, Instagram doesn’t merely host these activities. Its algorithms promote them," the WSJ explained in its article. "Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests."
One of the problems highlighted by the research is the use of explicit hashtags. Instagram "enabled people to search explicit hashtags such as #pedowhore and #preteensex," allowing pedophiles to find profiles "that used the terms to advertise child-sex material for sale." According to the WSJ, those profiles claimed to be run by minors hiding behind sexual monikers.
Among the content offered, at listed prices, were "videos of children harming themselves and imagery of the minor performing sexual acts." The platform also tolerated users acting as intermediaries to facilitate encounters between adults and minors.
"That a team of three academics with limited access could find such a huge network should set off alarms at Meta. I hope the company reinvests in human investigators," said Alex Stamos, director of Stanford University's Internet Observatory and Meta's chief security officer until 2018.
The WSJ asked Meta to explain why it is so easy for pedophiles to operate on Instagram. "Child exploitation is a horrific crime. We're continuously investigating ways to actively defend against this behavior," said the tech giant.
In addition, Meta asserted that over the past two years it has dismantled 27 networks that used Instagram to buy and sell child sexual abuse material or engage in other forms of child exploitation. In statements to the WSJ, the company said it "has blocked thousands of hashtags that sexualize children, some with millions of posts, and restricted its systems from recommending users search for terms known to be associated with sex abuse." It also said it has created a dedicated task force "to address the issues raised."