Screen Time

Stop Worrying About What Algorithms Do to Your Kids. Worry About What They’re Doing to You.

Photo illustration by Slate. Images via Thinkstock; ChuChu TV.

Screen Time is Slate’s pop-up blog about children’s TV, everywhere kids see it.

I was a young teen at the advent of the internet era, and as the internet and I came of age together, I saw all kinds of messed-up stuff online. I looked at disturbing fan art, perused rare-disease photos, and wandered among underground web rings stuffed with video clips so gruesome that I assumed they were fake. My parents let me do all of this without much supervision, which I now think must have been, to some extent, intentional. They weren’t clueless: My dad was a tech writer, much like I am now. One of their few ironclad rules was that I was never to give out my real name or contact information, and I obeyed it fairly strictly. Well, other than the mortifying time my parents intercepted a couple of the “romantic” letters I hand-wrote to strange internet men. Otherwise, though, they didn’t intervene much, and here I am, right? I was never really traumatized by anything I found, and I never got involved in anything troubling.

The mainstream social media age, in which parents commonly let their kids grab an iPad and wander among video and game content suggested for them by algorithms, brings its own fresh raft of horrors. When I wrote about “surprise egg” videos, I heard from dozens of parents who felt validated: They had been worried they were losing their grip in this new world, where little ones would much rather watch someone “unbox” Peppa Pig toys than the show starring Peppa herself. Which sounds super weird! The “learn colors with iPhone X” genre is weird. The “wrong heads” genre is weird. The sheer volume of YouTube videos in which Elsa from Frozen visits the dentist? Extremely weird.

Writer and artist James Bridle recently wrote of his horror at unearthing the bizarre kinks of algorithm-driven content and recommendation in the kids’ YouTube pipeline, in a piece that reverberated across social media. Many parents, already exhausted by these content minefields, despaired at yet another kind of media to fear; swaths of child-free folk were shocked to learn that the navigational habits of small children on YouTube have created creepily specific popular categories, like head swapping, box opening, color dipping, and tooth drilling. Content farms recombine these tropes like mutating viruses, hoping to automate higher view counts and climb recommendation lists. Some of these videos, Bridle argues, are likely to be disturbing to children.
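It helps to see just how little intelligence this churn requires. The hypothetical sketch below is mine, not anything recovered from a real content farm; it simply recombines, at random, keywords from the genres named above. A channel that uploads whatever such a generator produces, keeps what the recommendation engine rewards, and repeats is already running the mutation loop Bridle describes.

```python
import random

# Hypothetical sketch: these keyword bins are drawn from genres named in
# this piece ("wrong heads," surprise eggs, Elsa at the dentist), not from
# any real content farm's actual pipeline.
CHARACTERS = ["Elsa", "Peppa Pig", "Spiderman", "Joker"]
ACTIONS = ["Wrong Heads", "Opens Surprise Eggs", "Visits the Dentist", "Learn Colors"]
HOOKS = ["with iPhone X", "Finger Family", "for Kids", "Happy Meal"]

def generate_title(rng=random):
    # Recombine popular keywords at random; upload whatever results,
    # keep whatever the recommendation engine happens to reward.
    return " ".join([rng.choice(CHARACTERS), rng.choice(ACTIONS), rng.choice(HOOKS)])

if __name__ == "__main__":
    for _ in range(5):
        print(generate_title())
```

No taste, no judgment, no sense of what a child should see: just a dice roll against a list of proven search terms, repeated at scale.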

But the things kids like always seem exploitative and overwhelming to adults. Parents always try to steer their kids away from noisy, overstimulating consumerist dross and toward things that seem “good for them.” None of that is new, and neither is a certain juvenile interest in the grotesque. Kids behead, bury, deface, bind, and shear their toys regularly. They’re fascinated by toilet functions and forbidden words. They delight in the witches who eat children in fairy tales, or the maggoty cheese stuck in the beard of Roald Dahl’s Mr. Twit. They pretend to be kidnapped, imprisoned, dying, or in labor long before they truly understand what those concepts mean.

Bridle’s declaration that the platform is oriented to “systematically frighten, abuse and traumatize children, automatically and at scale” is alarming. For many readers, such an assertion easily becomes another note of certainty in what already seems like an inexorable march toward corporate tech dystopia. But it’s worth exploring how this particular panic about kids and media differs from the ones faced by previous generations. What can we learn from this bizarre new content landscape other than “supervise what your children are watching,” an adage that will probably be eternal?

What I wish readers were most frightened about in Bridle’s piece isn’t what children might see when left to their own devices. It’s what algorithms do when they’re left to theirs. Algorithms like these will disassemble and recombine culture into piecemeal nightmares, bodies stitched onto wrong heads. This affects not only our experience of media but eventually our experience of reality itself: When we search for something on a platform, we see less and less of what actually exists, and more and more of what the calculations of a rudderless but self-reinforcing machine have decided we should see.

This is the important takeaway from Bridle’s piece: not necessarily that there’s scary stuff out there for kids to find, because that has always been the case, but that the world of headless sobbing Disney bootlegs he discovered warns of the endgame we all face when we automate to that extent. The same kinds of pernicious systems leading your children down these eerie digital lanes also decide what adults see online, from advertisements to Facebook hoaxes to search results. It’s icky to think of kids watching “Elsa baby drives to McDonalds for Happy Meal with Joker family,” but what’s scarier is the growth of similarly unexamined algorithms in areas like the chillingly named “predictive policing.” Unchecked algorithms and the presumption of “neutral” technology can severely exacerbate existing problems with data methodology, and ultimately deepen socioeconomic inequality, distort the news media landscape, and automate unjust hiring decisions.

The thing I treasured most about my unsupervised internet time as a child was the chance to see the real world: I met pen pals from cities I would never visit, saw photos that challenged my understanding of the human body, laughed at faraway folk dance accidents. Today’s internet is not a reflection of the real world but a malleable distortion continually vomited back at us by simple-minded but harmful systems, artificial but not intelligent. That means we must not only question our experience of technology today but also stay skeptical, for a long time to come, of the view of society these self-interested machines select for us.

Yet the view of technology and automation as “neutral” persists, as if these things could somehow be untouched by the millions of smudged and troubled and messy human inputs from which they are built. Yes, your kids are being exploited by “the violence of algorithms.” So are you.