Shocking new reports show that Facebook's parent company, Meta, knew that parents were monetizing photos of their own children for pedophiles and even went so far as to actively promote accounts featuring “underage modeling” to users suspected of inappropriate behavior toward children online.
This is according to reports from the New York Times and the Wall Street Journal, which claim that Meta allowed its subscription tools to be used for child sexual exploitation and did nothing to stop it.
Although Instagram does not permit users younger than 13 to sign up, accounts may focus on children if they are managed by an adult. The news outlets claim that Meta’s safety staff warned superiors last year that hundreds of “parent-managed minor accounts” were being used to sell images of the parents' young daughters modeling clothing such as leotards and swimsuits via Meta's paid subscription tools.
Although the photos in question were not technically illegal because they did not contain nude or sexual content, many of those who purchased them made it clear to the parents selling the photos that they used them for sexual enjoyment. In fact, some even went so far as to interact with the children themselves.
The Wall Street Journal noted: “Sometimes parents engaged in sexual banter about their own children or had their daughters interact with subscribers’ sexual messages.”
An audience demographics company calculated 32 million connections to male followers across a set of 5,000 such accounts examined for the Times report. Some of the men bullied and outright blackmailed the girls and their parents in order to obtain increasingly racy photos. The paper monitored exchanges on the messaging app Telegram in which some of these men talked about sexually abusing the girls they follow on the platform and praised Instagram for making it so easy to find them.
In many cases, the parents were willing parties in the exploitation of their children in pursuit of financial gain. Some of the mom-run accounts rake in as much as $19.99 per month per subscriber by offering extra photos, chat sessions with their daughters, and other exclusive content and perks. Those with bigger followings attract more opportunities for products and discounts as well as higher visibility on Instagram, which can draw in even more followers.
Insiders say that Meta rarely takes action on reports of child exploitation on Instagram and Facebook, claiming that it lacks the resources to take decisive action and is often overwhelmed by the scope of the problem. When safety teams informed Meta about this behavior last year, the company rejected their suggestion to ban accounts featuring child modeling from offering subscriptions. Although Meta did implement an automated system aimed at blocking suspected pedophiles from subscribing to these accounts, users could easily get around it by creating new accounts, and the technology was unreliable.
Meta sued for facilitating child sex trafficking
The state of New Mexico sued Meta last year for facilitating child sex trafficking and distributing child sex abuse material, and Attorney General Raul Torrez announced this week that he will be investigating how the company's paid subscription services entice predators. As part of the state's previous investigation, a predator offered a fictitious 14-year-old girl $180,000 to appear in a pornographic video.
In addition, a group of more than 40 state attorneys general sued Meta last year, accusing its products of being harmful to young people and alleging the company was well aware of these harms.
Sources for this article include:
LifeSiteNews.com
NYTimes.com