Instagram stands accused of promoting networks of child sexual exploitation, according to a new report.
A new investigative article by the Wall Street Journal has shed light on Instagram’s role in facilitating and promoting networks that buy and sell explicit underage content.
Researchers from Stanford University and the University of Massachusetts Amherst collaborated on the study, which suggests that Instagram’s recommendation algorithms unintentionally favor pedophilic accounts and streamline the process of connecting them with suppliers of explicit content.
Contrary to legal regulations and Meta’s own content policies, Instagram has permitted searches for explicit hashtags that are directly linked to accounts promoting the sale of child sexual materials. Shockingly, these accounts often purport to be managed by minors and use suggestive usernames to attract attention.
Accounts involved in this illegal activity offer a variety of explicit content, often through “menus” that outline the nature of the available materials. These menus even include prices for videos featuring self-harm by children and minors engaging in sexual acts with animals, as the Stanford Internet Observatory researchers discovered. Physical meetings with minors are also advertised through these accounts.
Meta has confirmed the existence of these issues and stated that it is committed to combating this appalling misuse of its platform. An internal task force has been established to tackle the problem. Over the past two years, Meta has dismantled 27 such networks and plans to continue this effort. In response to the Wall Street Journal’s inquiries, Meta blocked thousands of hashtags that sexualize minors, some of which were associated with millions of posts. The company has also adjusted its algorithm to stop suggesting search terms related to sex abuse to its users.
Researchers from Stanford’s Internet Observatory and UMass’ Rescue Lab set up test accounts and observed that viewing a single account from these pedophilic networks instantly resulted in receiving “suggested for you” recommendations related to the purchase and sale of child sexual content, including links to off-platform content trading sites.
The Stanford team found 405 accounts selling self-produced child sexual content, some of which claimed to be operated by children as young as 12. In total, 112 of these accounts had amassed 22,000 unique followers. There are also accounts that curate pro-pedophilia memes or discuss access to children.
Employees involved in Instagram’s child-safety initiatives estimated that the number of accounts primarily engaging with such content could be in the hundreds of thousands, if not millions.
In 2022, the National Center for Missing & Exploited Children received 31.9 million reports of child pornography, a 47% increase from two years prior, with most reports originating from internet companies. Meta’s apps, which include Instagram, Facebook and WhatsApp, were responsible for 85% of these reports, with Instagram accounting for approximately 5 million.
The social media marketplace for child pornography has been noted for several years and even served as inspiration for the short story “Content Moderator” by the pseudonymous writer Jesus Is Victory. In the story, the narrator details his descent into the heart of darkness that is the child pornography market on Instagram.
Though this work of fiction is a grimly hilarious take on the subject, real child pornography on Instagram is no laughing matter. Until these foul perverts are cleansed from the platform entirely, Meta will have a lot to answer for in its role in facilitating the corruption of minors.