Future Tense

Whose Device Is It Anyway?

Samsung and Sprint won’t let me install a privacy-enhancing operating system on my smartphone.

Samsung Galaxy S III and Galaxy S III Mini
Whatever you do, don’t try to change the OS.

Photo by Ralph Orlowski/Reuters

When I bought a new personal computer last year, I didn’t want to use the Windows operating system that came with it as my primary computing environment. So I installed the Ubuntu version of the GNU/Linux operating system, which offers more freedom and functionality in ways that matter to me.

But when I recently bought a used Samsung mobile phone and tried to replace the bloated, privacy-invading system that comes standard with Samsung’s Android devices, I ran into all kinds of trouble. Why? Because Samsung and Sprint, the carrier that handles voice and data for that phone, have decided they—not I—will retain ultimate control over the device I purchased. Oh, I can use it, but only in ways they consider permissible.

This is the lockdown method of modern technology, a growing phenomenon that deserves much wider notice—and, for the most part, condemnation. To put it simply, we are being told that we don’t actually own what we buy. (Note: When I talk about “locking” here I’m not referring to the industry’s now-illegal refusal to let customers use their phones on other compatible networks; what I’m discussing here is a different kind of control-freakery.)

Mobile phones may be the most obvious example, but this kind of corporate thinking is spreading. Amazon has yanked books from customers’ Kindle e-readers. The company behind the Keurig one-cup coffee-maker is launching a new model designed to brew only with company-approved refills. (The good news is that competitors have already bypassed this system.) My hearing aids can be adjusted only by a specialist who plugs them into a proprietary machine. When Facebook changes the way your news feed operates, reducing the visibility of some posts and enhancing others, you take what you get or look for a new social network.

These are just a few examples of a much wider phenomenon as more of what we use is centralized and/or remotely configurable. It’s a ratchet process, an ever-tighter locking down of once-free (as in freedom)—or at least freer—technology and communications. In an arms race with big companies, the average user will lose.

In the case of the phone, I’d purchased a Samsung Galaxy S III, a model that is several years old now but more than good enough as a basic smartphone, at least in its hardware specifications. The device has been popular in the Android hacker community, where people modify their devices; I’m not the only one who wants to remove the unnecessary apps and other software that Samsung and Sprint (and other major carriers) load onto the smartphones they sell. As you’d expect, a cottage industry has sprung up to help customers remove this junk and keep the good stuff.

I wanted to go further: replace the operating system entirely with an open-source alternative called CyanogenMod. Think of it as Google’s basic Android system without the third-party crapware but with some useful enhancements, most notably privacy options that actually mean something.

Once, this was relatively straightforward. But according to the technical experts who populate sites like the excellent XDA Developers hacker site and forum, Samsung and the carriers issued an operating system upgrade to the original Galaxy S III last year. Some of the changes, such as patching security holes, were useful. But another part of the update modified what’s called the “bootloader”—software that checks the system at startup (boot up) and tells the phone what to do next. In this case, it essentially tells the phone not to allow any operating system to load unless it’s the authorized one.
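The gatekeeping itself is simple to sketch. The toy Python below is not Samsung’s Knox code, just an illustration under loudly made-up names: it uses a hash allowlist where a real locked bootloader would verify a cryptographic signature against a manufacturer key stored in the device, but the decision it makes has the same shape: fingerprint the OS image, and refuse to hand over control if the fingerprint isn’t approved.

```python
import hashlib

# Purely illustrative: a real locked bootloader checks a cryptographic
# signature against a manufacturer key baked into the device, not a hash
# allowlist, but the go/no-go decision it makes looks like this.
APPROVED_OS_DIGESTS = {
    # Made-up SHA-256 fingerprint standing in for the one "authorized" OS.
    "0000000000000000000000000000000000000000000000000000000000000000":
        "stock Samsung/Sprint Android",
}


def try_to_boot(os_image: bytes) -> None:
    """Boot the image only if its fingerprint is on the approved list."""
    digest = hashlib.sha256(os_image).hexdigest()
    if digest in APPROVED_OS_DIGESTS:
        print("Booting:", APPROVED_OS_DIGESTS[digest])
    else:
        # A custom OS such as CyanogenMod fails the check and never loads.
        print("Unauthorized operating system; refusing to boot.")


try_to_boot(b"bytes of a CyanogenMod image")  # rejected by the locked bootloader
```

An “unlocked” bootloader, roughly speaking, is one that skips this check or lets the owner enroll a key of her own choosing.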

To justify this level of control, mobile companies often cite security. That claim would ring truer if Android device-makers and the carriers that peddle them were quick to install security and other updates on already-sold phones. They aren’t. Sometimes, they outright refuse to update customers’ phones—leaving us wide open to genuine threats—and hope we’ll simply buy new ones. (Samsung does seem to update on a regular basis, at least.) Technologists at the American Civil Liberties Union have asked the Federal Trade Commission to insist on these updates, so we’ll have safer phones, but there’s little sign that federal regulators care.

Apple, at least, sends its security updates on a reasonably prompt basis to iOS devices. But in other ways Apple is even more eager to restrict customers’ options, even to the absurd point of refusing to let users load unapproved apps of any kind, much less tamper with the operating system. Needless to say, there are unauthorized hacks available for iPhone, though Apple keeps trying to stop them.

A more plausible reason for the industry’s controlling behavior is the difficulty of providing technical support for modified devices. Switching operating systems invalidates the warranty. OK, but when I buy a device that’s long since gone out of warranty, I’m not worried about that.

I asked Samsung and Sprint to explain why they are so ardently trying to stop people like me from making our phones better devices. Their responses never quite answered what I considered fairly simple questions. Here are two examples. (Note: “Knox” is a collection of security and management software, including a bootloader, that Samsung includes in its version of Android.)

Sprint: “The decision to unlock the bootloader is made jointly between the phone manufacturer and carrier. We do offer devices from other manufacturers with unlocked bootloaders. If a manufacturer would like to unlock the bootloader, we would work with them to develop the device.” (Translation: It’s not our fault! It’s Samsung’s.)

Samsung: “We designed Knox primarily to ensure data and OS integrity while also providing the development community the ability to tinker without jeopardizing the integrity of the data.” (Translation: Don’t blame us, either.)

With more effort, I can hack this phone and use it my way. I may not bother. Ultimately, it would probably be easier to just get a different phone that doesn’t require as many annoying contortions, and then try to resell this one.

But I shouldn’t have to do that. Imagine how outrageous it would be if Apple or the makers of Windows-based computers refused to honor warranties because you had modified the system in a way they found improper. The fact that we let phone-makers do this speaks to our weird acceptance of captivity when it comes to devices that are, in reality, nothing more than portable computers.

If we had proper consumer-protection laws in this country, it would—with very, very few exceptions—be illegal for manufacturers to lock down devices in this way. But expecting our current political system to favor customers over corporations is almost foolhardy.

And there’s a question you should ask yourself when you decide to buy something that contains software and can be connected to digital networks: Who ultimately controls it? You, or the company that “sold” it to you? If the latter, you aren’t buying. You’re just renting.

This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.