Pop culture obsessives writing for the pop culture obsessed.

Parents have found messages about suicide spliced into YouTube videos for kids

Photo: Brian Ach (Getty Images for YouTube)

Watch out, Twitter: YouTube is making a serious play for your position as the worst goddamn place on the whole internet! We can only assume that the ultimate goal of both of these sites (and Facebook, while we’re at it) is to become as thoroughly monstrous as possible, because they’re good at that and fucking awful at everything else—including but not limited to: banning Nazis, preventing abuse, and protecting users’ personal information. It hasn’t even been a week since Disney, Nestlé, and Fortnite studio Epic Games announced that they were pulling all YouTube ads due to the way the site was indirectly profiting from a “soft-core pedophile ring,” and now another way YouTube’s platform is being used to harm children has been exposed. Keep up the great work, YouTube. We’re sure you’re all very proud.


According to a post from an unnamed “physician mother” on pediatrician-focused parenting site PediMom (via The Washington Post), the mother was watching cartoons on YouTube with her child when the video suddenly cut away to a clip of an adult man holding out his wrist, saying “hey kids,” and offering information on how to commit suicide (the exact line is quoted at both of those links). This was on the YouTube Kids app, which was plagued with horrifying stories a few years ago, and the parent was eventually able to get YouTube to pull the video. However, as detailed in that Washington Post story, it wasn’t the only video like this.

Similar suicide clips, at least some of which appear to feature YouTube personality Joji/Filthy Frank, have appeared in videos related to the Nintendo game Splatoon. Parents have also recently found videos on the YouTube Kids app referencing or depicting suicide with video game and cartoon characters. YouTube has been pulling videos like this once it is notified of their existence, but as former American Psychological Association president Nadine Kaslow told The Washington Post, pulling the videos only solves part of the problem. “For children who have been exposed, they’ve been exposed. There needs to be messaging—this is why it’s not okay.”

Clearly, whatever changes YouTube made to its Kids app were not enough, and may have only made horrifying shit like this harder to spot. Of course, as Facebook taught us just yesterday, even the most direct solution is far from ideal. At times like this, it seems like the only reasonable solution is to just shut off the internet entirely until humanity can learn to be trusted with it—which we all know will never happen.

If you or someone you know is struggling with suicidal thoughts, the National Suicide Prevention Lifeline is available 24 hours a day at 1-800-273-8255.