Instagram Is Pornifying Your Children

Dec 2, 2023

By Carmel Richardson, The American Conservative.

I don’t often agree with the editorial board at the Washington Post, but I did last week. An editorial over the holiday weekend titled “Schools should ban smartphones. Parents should help” listed just a handful of the numerous problems an ever-present screen has caused for children and their ability to learn.

It is practically impossible to prevent children from using phones in class when they are allowed to use them at every other point. The self-control muscle is weak in adults when it comes to screens; we should not be surprised that it is nonexistent in young children. (Ask yourself why it is the parents, not the children, who are most “enraged” at the suggestion that little Billy shouldn’t be allowed to check his notifications in the bathroom stall.) Banning phones in school completely, meanwhile, has been shown to make a world of difference in educational outcomes.

A report out of the Wall Street Journal on Monday gave a fresh edge to why caution is so necessary. After creating new accounts to test the Instagram Reels algorithm, the Journal found the app promoted sexually explicit and pedophilic content to accounts that followed young cheerleaders, gymnasts, and teen and pre-teen influencers. If the test accounts followed some of the (mostly adult male) accounts that follow these pre-teen influencers, the app sent an even bigger flurry of lewd reels. With only a whiff of provocation, Instagram attempted to send its users—who, based on the criteria, could easily have been underage—down a dark and wicked spiral.

Obviously, this test was designed to produce just such a response, though it was easily replicated by the Canadian Centre for Child Protection with similar results. Underage girls in cheerleading garb are promiscuous enough that you’d be concerned to find your husband following them, but innocuous enough to slide beneath the radar when mixed in with other accounts, and therefore a perfect test case of the algorithm’s inclination. Clearly, as the Journal demonstrated, that inclination is on the side of promoting vice.

For those who have been keeping score, this is not Meta’s first pedophilic infraction. Back in June, the Journal reported that Meta algorithms on both Facebook and Instagram were connecting large communities of users whom the platforms suspected may be interested in pedophilic content. In response, Meta set up an internal task force, purportedly to detect more of this suspicious content and to suspend more harmful accounts. After this week’s discovery, however, Meta declined to comment to the Journal on why the algorithm still promotes underage sexual content. Instead, the social media conglomerate pointed to its safety tools and to the employees who remove or reduce the prominence of some 4 million harmful videos each month, and claimed the Journal’s tests “produced a manufactured experience.”

