New Study Finds YouTube Pushes Gun-Related Content To Children

Photo from Alexander Shatov on Unsplash

By Movieguide® Contributor

A new study has found that YouTube’s algorithm is pushing videos about guns and shootings to young children. 

The Tech Transparency Project, a nonprofit research group that studies the impact of social media, began its experiment by creating two YouTube accounts set up to appear as though they were owned by 9-year-old boys. 

The accounts simulated boys who liked video games, especially first-person shooter games. The only difference between the two accounts was that one clicked on YouTube’s recommended videos, while the other did not. 

The account that clicked on YouTube’s recommendations soon began receiving videos about school shootings and gun training, as well as instructional videos on how to make a firearm automatic. 

In total, the account that clicked on YouTube’s suggested videos received 382 different firearms-related videos over the span of 12 months. The account that didn’t click on any suggestions received just 34 videos of the same type. 

The findings show that, despite YouTube’s efforts to moderate and regulate content, harmful videos are still slipping through the cracks and making their way onto children’s screens. 

“Video games are one of the most popular activities for kids. You can play a game like ‘Call of Duty’ without ending up at a gun shop — but YouTube is taking them there,” said Katie Paul, the director of the Tech Transparency Project. “It’s not the video games, it’s not the kids. It’s the algorithms.” 

YouTube isn’t the only social media platform that has been accused of pushing harmful videos to young people. Movieguide® previously reported on TikTok’s algorithm:

From eating disorders to self-harm, TikTok’s addictive algorithm sucks in unsuspecting teens and funnels harmful content to them. Parents are desperate for control over the app and want to protect their children from some of the more extreme videos. …

Parents aren’t the only ones wishing for more control over the app. Beth and other young people who have been affected by videos they’ve seen wish TikTok did a better job of weeding out the content it shows users on their pages. 

“There was a point where I had to physically delete TikTok so I could get a full whole new For You Page but then that didn’t work,” Beth said. “I kept getting those videos to keep creeping in again.”

TikTok’s algorithm makes it extremely difficult for users to avoid unwanted content. The algorithm takes note of what users click on and how long they watch each video; that data is how it decides what to show them next. 

“It’s scary but the algorithm does seem to know me and it does seem to,” said Perry Kornbluh, who also battled an eating disorder made worse by the platform. “If I’m having a bad day, I’ll start seeing more negative videos and if I’m having a really good day, a positive day, I’ll see more positive videos and it seems just really scary and specific in that way.”
