One is a viral hoax. The other is rife with distressing and disturbing content, says Keza MacDonald, the Guardian's video games editor
I first heard about Momo in my local parents' WhatsApp group. Someone had screenshotted a Facebook post about a creepy puppet that supposedly appeared in unsuspecting children's phone messages and spliced into YouTube videos, dispensing advice on self-harm and violent acts. I reacted with suspicion: this would hardly be the first time that something on Facebook turned out not to be true, and the Momo challenge seemed a bit too on the nose, too obviously sinister, to be real.
It turned out that Momo was indeed a hoax, a viral shock-story driven by a frightening image and well-intentioned worry about children's safety online.
There have been videos on YouTube Kids with suicide advice spliced into otherwise innocuous cartoons as a malicious joke; they just don't involve Momo. Parents have spotted them before; the American paediatrician Free Hess recorded and documented one on pedimom.com. And this is, lamentably, the tip of the iceberg when it comes to inappropriate content on the video platform, even on the version that's supposedly curated for kids.
YouTube has been battling disturbing videos for years, but a 2017 Medium post by the writer and artist James Bridle brought the problem to widespread attention, kicking off a slew of stories about the various horrors that could be found through the YouTube Kids app. Frightening videos of Peppa Pig at the dentist or Mickey Mouse being tortured were appearing in searches. Weirdly sexualised videos of Disney princesses were easy to find. Supposedly family-friendly channels showed children wetting themselves, being injured or screaming in apparent terror; a father who ran one such prank channel allegedly lost custody of two of his children as a result.
YouTube has removed a lot of the worst videos that used to be rife on the platform, but they just keep coming, finding new ways to get around the algorithm. The most recent major scandal involves the discovery of a soft paedophile ring operating in YouTube comments, where users leave chilling comments on videos of children and exchange numbers to share further images, as reported by The Verge.
YouTube's key failing here is that it relies on a flagging system to find and purge inappropriate content, which means someone has to actually see the video in question and report it before anything can be done. Pre-moderation, where videos don't make it on to YouTube Kids until they've been watched in full by a human being, is realistically the only way to keep the platform safe from malicious pranksters. But YouTube has shown no appetite for this, instead emphasising its robust content-reporting features in its responses to these continual controversies.
When you download the YouTube Kids app, it tells you as much in the introductory screens: "We work hard to offer a safer YouTube experience, but no automated system is perfect." No shit. The truth is that YouTube was never intended to be a platform for children, and I have zero faith in its ability to adapt itself to that role.
Even on the less extreme end of things, YouTube can be a parenting minefield. When my teen stepson was a train-obsessed five-year-old who couldn't even read yet, we once left him watching videos of trains pulling into stations on the iPad for a few minutes and returned to find him innocently watching a video of a train accident that had appeared in the recommendations. Nowadays, with him having long since graduated from kids' YouTube to obnoxious gaming channels, we have regular dispiriting conversations about whichever of his favoured YouTube celebrities has recently done something incredibly stupid, like dropping the N-word on a stream or telling someone in the comments to kill themselves. That's not even to mention the alt-right, anti-social-justice personalities whom the algorithm regularly feeds to young male users watching Call of Duty compilations, or the dangerous flat-Earth or antisemitic content that the platform has recently been forced to address.
The majority of YouTube Kids content isn't distressing or disturbing, but it is mostly brain-numbingly terrible. A vast amount of the kid-friendly videos that are uploaded are straight-up garbage: cheap, algorithm-driven songs or nonsensical stories featuring 3D models or toys of popular characters such as Elsa, Spider-Man and Peppa Pig. They are designed purely to extract views, and thereby money, from common search terms, not to entertain or educate kids. Friends with young children regularly complain about the inane surprise-egg or toy review videos that have become household obsessions. My toddler would watch cheap, repetitive, unbearably cheery nursery rhyme videos for an hour if I let him.
The easiest solution for parents of young children might be to purge YouTube from everything: phones, TVs, games consoles, iPads, the lot. This is the approach we've taken in our household, which inconveniently contains two video games journalists and, consequently, an absurd number of devices. You don't need to be a tech luddite to find YouTube Kids both irritating and vaguely worrying. There is no shortage of good children's entertainment available on Netflix, through BBC iPlayer and catch-up TV, or through advert-free games designed for young players. And there's zero chance they'll come across any suicide tips there.
Keza MacDonald is video games editor at the Guardian