It’s been confirmed what we already knew: YouTube is the most popular video site among children, and despite its younger, safer offering (YouTube Kids), viewers under age thirteen keep turning to YouTube’s main site — and thus watching upsetting and potentially harmful videos. The truly worrisome thing here is that YouTube and its parent company, Alphabet Inc. (aka Google, or at least the company that now owns Google), kind of know that they’re not doing such a great job when it comes to children’s safety online.
Back in May, Common Sense Media held a summit about digital wellbeing; it included a discussion on “not safe for kids” content that devolved into a debate about whether YouTube has done enough to protect our children’s privacy — and ultimately, their lives. At the summit, Alicia Blum-Ross, YouTube’s policy chief for kids and family, tried to defend her company’s position, affirming that YouTube Kids is devoted to delivering quality content to our kids, and that the app includes parental controls and “take a break” reminders. Blum-Ross even said that YouTube staffers “strongly encourage parents that the general site is not made for kids.”
What she failed to mention, however, was that older children (for example, siblings of toddlers who are not into nursery songs or countless versions of “Baby Shark”) either don’t use YouTube Kids at all or quickly shift to the main app and never go back. The reason could be the restrictive, unappealing design of YouTube Kids — or, you know, the fact that it was originally created for preschoolers. Either way, YouTube Kids isn’t free of advertisers, nor is it fully screened for safety. Even Blum-Ross admitted that the company hasn’t found the perfect formula for implementing safety metrics without hurting its business.
At the end of the day, this is what it’s all about: business and money. That’s how YouTube operates at its core. Children under 13 are, by law, protected from digital data collection, but the sheer number of videos uploaded to YouTube makes the issue difficult to control and address. Since 2017, YouTube has relied on its engineers to sort videos — especially those targeted at kids — by genre, and even though the tech giant once tried to hand-pick options the brand could label as “safer,” children ultimately still surfed the regular site, leaving them exposed to messages, ads, and harmful trends that are definitely not for young viewers (remember the godawful Momo Challenge? There was no way to control similar videos in the beginning, and even now, if you try really hard, you can still find it on YouTube).
It’s difficult to guess what will come next for YouTube safety, largely because we know that preteens and preschoolers are searching for completely different things on YouTube — which makes housing all “kid” videos in one place seemingly impossible. Unless, you know, YouTube were to employ actual humans to monitor these things and put our kids’ safety first…