(Photo: Gabe Ginsberg/Getty Images)

YouTube’s troubles with algorithmic content for kids continued to intensify this week, after a Times Of London report came out yesterday, suggesting that a number of major brands were unknowingly running ads on videos containing content that might be inadvertently appealing to pedophiles. The Times report cited companies like Adidas, Cadbury, and HP, all of which have pulled ads from the video service in the wake of the allegations.

To be clear, most of these videos—which, usually incidentally, show young children in various states of undress—appear to have been uploaded with innocent intent, often by the kids themselves. But they have allegedly drawn comments from people viewing them with sexual intent, creating a bizarre situation in which ad revenue is being pulled not because of the specific content of the videos, but because of whom they were accidentally appealing to.


It seems pretty obvious that this latest round of troubles is linked to YouTube’s recent decision to ban ToyFreaks, one of the most popular channels in its roster. Featuring videos of a dad playing with (and sometimes tormenting) his two young daughters, ToyFreaks was among the 100 most-subscribed channels on the service, but YouTube shut it down last week, just one of 50 or more channels the service has shuttered recently as part of an ongoing effort to create more kid-safe content.

Examinations of kid-focused YouTube have made it clear that children—one of the site’s primary audiences—enjoy content that features other kids, with their endless bingeing creating plenty of ad revenue for content providers. It seems like an impossible task to create content that appeals to kids without also drawing in predators, although the Google-owned company recently released a blog post stating its commitment to filtering out objectionable material and providing guidelines to help creators keep children safe. In the meantime, multiple companies have said they’re pulling their ads from the service, at least until they can be sure that its algorithms won’t place them alongside objectionable material.


[via Variety]