According to an investigative report by The Wall Street Journal, Instagram helps “connect and promote a vast network of accounts” that commission and purchase underage sex content.
The Wall Street Journal and researchers at Stanford University and the University of Massachusetts Amherst determined that Instagram “connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests.”
The researchers found that the social media platform, owned by Meta, “enabled people to search explicit hashtags such as #pedowhore and #preteensex and connected them to accounts that used the terms to advertise child-sex material for sale.”
The accounts often claim to be run by the children themselves and use “overtly sexual handles,” according to the report.
BREAKING: Instagram algorithm promotes pedo networks, connects predators with victims: report https://t.co/exf1QsTlk7
— The Post Millennial (@TPostMillennial) June 7, 2023
Instagram algorithms recommend pedo content. (Non-paywalled link.) https://t.co/zYYYZvhXBK
— Katie S. (@katietighe) June 7, 2023
The Wall Street Journal reports:
Instagram accounts offering to sell illicit sex material generally don’t publish it openly, instead posting “menus” of content. Certain accounts invite buyers to commission specific acts. Some menus include prices for videos of children harming themselves and “imagery of the minor performing sexual acts with animals,” researchers at the Stanford Internet Observatory found. At the right price, children are available for in-person “meet ups.”
The promotion of underage-sex content violates rules established by Meta as well as federal law.
In response to questions from the Journal, Meta acknowledged problems within its enforcement operations and said it has set up an internal task force to address the issues raised. “Child exploitation is a horrific crime,” the company said, adding, “We’re continuously investigating ways to actively defend against this behavior.”
Meta said it has in the past two years taken down 27 pedophile networks and is planning more removals. Since receiving the Journal queries, the platform said it has blocked thousands of hashtags that sexualize children, some with millions of posts, and restricted its systems from recommending users search for terms known to be associated with sex abuse. It said it is also working on preventing its systems from recommending that potentially pedophilic adults connect with one another or interact with one another’s content.
According to the report, pedophilic accounts use certain emojis, such as a map and cheese pizza, to function as a code.
BREAKING: Instagram algorithm exposed promoting pedophile networks in massive investigation, video sales, ‘preteensex’ menus, in-person meetups with underage boys and girls, using emojis such as a map and cheese pizza – WSJ
— Jack Poso 🇺🇸 (@JackPosobiec) June 7, 2023
The Wall Street Journal report continues:
The pedophilic accounts on Instagram mix brazenness with superficial efforts to veil their activity, researchers found. Certain emojis function as a kind of code, such as an image of a map—shorthand for “minor-attracted person”—or one of “cheese pizza,” which shares its initials with “child pornography,” according to Levine of UMass. Many declare themselves “lovers of the little things in life.”
Accounts identify themselves as “seller” or “s3ller,” and many state their preferred form of payment in their bios. These seller accounts often convey the child’s purported age by saying they are “on chapter 14,” or “age 31” followed by an emoji of a reverse arrow.
Some of the accounts bore indications of sex trafficking, said Levine of UMass, such as one displaying a teenager with the word WHORE scrawled across her face.
Some users claiming to sell self-produced sex content say they are “faceless”—offering images only from the neck down—because of past experiences in which customers have stalked or blackmailed them. Others take the risk, charging a premium for images and videos that could reveal their identity by showing their face.
Many of the accounts show users with cutting scars on the inside of their arms or thighs, and a number of them cite past sexual abuse.