Take a trip to the automated hellscape of YouTube videos aimed at kids

There is something seriously wrong with kids' videos on YouTube. Yesterday, a front-page article in The New York Times shined a spotlight on the service's inability to keep bizarre, disturbing videos aimed at children out of its YouTube Kids app, which is marketed as a more filtered version of the site containing "tons of fun and educational videos that are just right for kids." But the videos that slip through onto that particular app, while indicative of the broader exploitation going on, are just the tip of the iceberg. Today, the writer James Bridle published a piece that goes far deeper into the hellscape of kids' YouTube, trying to break down what might be fueling this flood of copyright-infringing, nightmare-inducing nonsense that YouTube's algorithms are serving up to children.
Basically, it comes down to automation. Whether it's pregnant Elsa videos or the rambling word salads that Bridle points to, these clips and their titles are constructed to exploit YouTube's computerized curation. They contain words and phrases that parents might search for, or that will trigger YouTube's recommended-videos feature, sliding the clips in alongside innocuous nursery rhymes and legitimate videos from popular kids' shows.
The same characters and keywords pop up over and over again—“Spider-Man,” “bad baby,” “finger family,” “education,” “tantrum,” “colors,” “superheroes,” “rhymes,” “Peppa Pig”—and they’re almost always jumbled together into some incomprehensible sentence that must look absolutely scrumptious to a YouTube algorithm. It’s all so finely tuned that it’s impossible to tell where the human element of these operations begins and ends, or if there is one at all. As Bridle writes, “This is content production in the age of algorithmic discovery — even if you’re a human, you have to end up impersonating the machine.”
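To make the pattern concrete, here's a purely illustrative Python sketch of the kind of keyword-jumbling a title farm might automate. The keyword list is drawn from the examples above; everything else, including the weighting and the "hook" suffixes, is an assumption for the sake of the example, not anything YouTube or these channels have published.

```python
import random

# Keywords of the kind Bridle catalogs. The list echoes the article;
# the rest of this script is a hypothetical illustration.
KEYWORDS = [
    "Spider-Man", "bad baby", "finger family", "education",
    "tantrum", "colors", "superheroes", "rhymes", "Peppa Pig",
]

# Search-bait suffixes of the sort these channels append (assumed).
HOOKS = ["Learn Colors", "Nursery Rhymes", "For Kids"]

def word_salad_title(n_terms=5, seed=None):
    """Jumble a handful of high-traffic keywords into a 'title'."""
    rng = random.Random(seed)
    terms = rng.sample(KEYWORDS, k=n_terms)
    return " ".join(terms) + " | " + rng.choice(HOOKS)

if __name__ == "__main__":
    # Print a few machine-flavored titles, e.g.
    # "bad baby Peppa Pig finger family colors rhymes | Learn Colors"
    for i in range(3):
        print(word_salad_title(seed=i))
```

The output is gibberish to a person but, crucially, every token in it is something a toddler's parent might type or a recommendation system might match on, which is the whole point.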