Future Tense

New Internet Users Aren’t Being Taught to Protect Themselves Online

You can’t teach someone to drive without teaching them about seatbelts. The same principle should apply to teaching new Internet users about privacy protections.


When the small-town adventurer sets out for a new life and hops off the bus in the big, bad city, callous predators come out to prowl, waiting to take advantage of his inexperience.

That’s the same anxiety that grips new and inexperienced Internet users as they go online for the first time: They feel on display and vulnerable.

Digital literacy classes encourage novice public Internet users—the 6 percent of the adult U.S. population that has no home Internet access and relies on community computer centers to get online—to apply for jobs, open online bank accounts, and complete a GED. But they aren’t adequately teaching them how to use the Internet securely and protect their privacy—in part because the programs lack the time and expertise. That’s like teaching someone how to drive but forgetting about seatbelts or stop signs.

I’ve spent a year and a half researching the work and impact of four digital literacy organizations that aim to relieve social and economic inequalities by giving people the technical skills required to navigate today’s society. But without the resources to teach online privacy, these overstretched digital inclusion programs may actually be putting their students at increased risk.

In a perfect world, new users would learn about privacy-protecting tools before they type their first Google search or craft their first tweet. Instead, the individuals I met were woefully unprepared for the challenges of being targeted, tracked, and profiled online. They didn’t know to avoid those jiggling pop-up ads on the side of the screen, to delete sketchy emails from unknown senders, or to resist the temptation to click flashing banners promising fast cash and weight-loss tricks.

Yet they weren’t ignorant about safety and security issues—privacy intrusions and surveillance concerned them quite a bit. They felt exposed when visiting training centers to seek help in filling out government forms. They worried that companies might try to sell them a bad deal. And they were afraid that cybercriminals might steal their identities and personal data online.

Their concerns are rooted in reality. Many individuals who enroll in digital inclusion programs already experience surveillance on a regular basis. They’ve been followed by payday lenders. They’ve faced stop-and-frisk practices. They’ve had to divulge personal details about themselves to stay eligible for public assistance programs. By the time members of underserved communities enter a public library, community organization, or computer training center, they know what it’s like to be monitored and preyed upon. And as companies and governments grow more sophisticated at data collection, sharing, and analysis, there’s a good chance they’ll be targeted, tracked, and profiled as new Internet users.

But at the digital literacy organizations I’ve observed, I found little evidence that program participants get the training they need to prepare them for the risks of sharing information online. Students got questionable advice on how to create passwords. They didn’t always understand instructions to log out of online accounts. Instructors glossed over the differences between ads and search results. And the idea of a digital footprint rarely came up in class.

Why the massive gap in the digital literacy curriculum? Time and expertise. Many digital literacy classes run short: At the public library, an introductory class on computers and the Internet met five times for an hour each, and only three of the sessions covered Internet-related topics. That’s not enough time to get students comfortable typing a URL into a browser’s address bar, let alone to introduce privacy-protecting tools.

In addition, staff members weren’t privacy-literate themselves. They felt ill-equipped to address their students’ questions and concerns about privacy and surveillance. I found that most staff knew little about how tracking and targeting work, or what to do about them. Some didn’t even know what cookies were and asked why they saw the same ads at work and at home.

That’s unacceptable. Privacy protection should be an equal opportunity for everyone—not just those with the computer-science savvy or disposable income to access tools that hide their identities online or secure their personal data. As policymakers contemplate the future of digital inclusion policies, the challenges of privacy and surveillance need to be dealt with head-on.

A good place to begin would be continued funding for public digital literacy programs, with an emphasis on privacy education. Two of the country’s most well-established digital literacy programs—the U.S. Department of Commerce’s Sustainable Broadband Adoption projects, whose $251 million budget has financed digital education at more than 14,000 community organizations, and Public Computer Centers, whose $201 million budget has financed hundreds of thousands of training hours—will end soon. As organizations supported by these programs seek new sources of local, state, or private funding, such as Connect2Compete or Comcast’s Internet Essentials program, why not earmark some of that cash to train staff on online privacy issues, so they can teach students about the stakes of sharing information online?

Another step: Shift the focus of privacy legislation to incentivize the development of effective privacy tools. Users need reliable products that build the cost of privacy protection into the price rather than charging extra for it, and whose design is so user-friendly that protection runs in the background of a digital device.

Until these changes happen, the promise of digital inclusion will remain at odds with the harms wrought by a surveillance society.