Future Tense

If You Can’t Control It, You Can’t Own It

Photo illustration: Keurig Green Mountain Inc. K-Cup coffee packs, March 5, 2015, in Miami, Florida. Joe Raedle/Getty Images

When Green Mountain Coffee Roasters announced its quarterly financial results last week, the news was bad for the company—but good for people who believe that when they buy something, they should decide how to use it.

Here’s what happened. A year ago, Green Mountain announced it would be selling a new version of its Keurig coffee machine, a single-cup brewer, that would work only with coffee sold by, you guessed it, Green Mountain. This would be accomplished through the use of digital rights management (DRM)—technology that restricts how customers can use their machines.

Customers rebelled, I’m glad to say. They broke the DRM and then voted with their wallets. And the company’s CEO, Brian Kelley, found himself telling journalists and analysts last week that he’d made a mistake with the DRM. Even though his mea culpa sounded more grudging than heartfelt, he said his company would henceforth allow its customers to use whatever coffee they liked.

This was a rare but important victory for you and me in an ongoing war. At stake is who will control the devices we use, and our communications. Increasingly, it’s not us.

Device makers have been taking liberties for years. Apple, for example, lets you decide what apps you want to load on an iPhone or iPad, but you can choose only from apps Apple allows in its online store. Amazon has actually removed book files (including, oh the irony, 1984) from its customers’ Kindles. The list is long and getting longer.

The war for control will expand in coming years, because computers and software are becoming part of almost everything we touch. The people who sell us things are embedding them with programmable chips, then connecting them to digital networks—and making choices for their customers about what’s allowed after we buy them.

Now, adding connected intelligence and memory to our everyday environment can have real benefits. We can understand our world better. If you’ve used traffic maps generated by monitoring the locations and velocities of countless cars, you know how helpful embedded, connected intelligence can be.

We hear a lot about the surveillance implications of this trend, and we’re right to worry. But we don’t talk enough about the outright control it can give others when we use devices in a world where we truly own nothing. If you need permission from a third party, it can be revoked.

The Keurig saga is, by itself, relatively trivial, in part because we have so many other options for coffee. But DRM and other third-party control mechanisms have had an enormous impact in areas such as the arts. The copyright cartel has squashed (or tried to) any number of useful innovations in recent decades and works constantly to torpedo or control any technology that might possibly be used for copyright infringement, never mind how much it might advance artistic creativity or spread useful information.

This is why the Big Sports branch of the entertainment business is going berserk over several new mobile apps, notably Meerkat and Twitter’s Periscope, that record video and simultaneously stream it to the Internet. An amazingly large number of people paid good money to watch the recent megabucks boxing match via live TV, but there were plenty of live streams available from folks who just pointed their phones at the TV screen. In a truly bizarre development, people are actually watching Game of Thrones via Periscope.

Hollywood and its allies have been in a minipanic for years about what they call the “analog hole”—another expression for our eyes and ears, since at some point audio and video have to be made available in a format that our analog selves can watch and hear. The copyright profiteers have periodically pined for new kinds of DRM that would blind and deafen our devices, requiring camera makers to build in technology that refuses to record anything carrying a “don’t record me” signal.

If our cameras can be ordered not to record one kind of thing, they can surely be told to stay off in other circumstances. Want to take a picture of a public building deemed “sensitive” by some paranoid government agency? Nope. Those videos of police misbehavior that are, at long last, shining a light on a national disgrace? Sorry, no longer allowed.

The implications of this go much further. Consider this example, less about DRM and more about ultimate control. Tesla, the innovative electric-car maker, periodically sends software updates to its customers’ Model S cars. One recent update improved the cars’ performance, making them faster.

But Tesla’s cars, and increasingly all cars, are computer networks on wheels. And if a car company can remotely give vehicles a way to go faster, it can also remotely take that away. Maybe Tesla won’t do that of its own accord, but you can bet it will someday when ordered to by a bureaucrat or judge.

Tesla’s software and hardware updates also point toward the driverless-car era, when algorithms will make the key decisions about how we get where we want to go. Already, companies that issue subprime auto loans have installed starter interrupt devices in cars so they can remotely disable a vehicle if someone misses a payment. Now consider what happens when a government decides to shut down automobile access to a certain area, or decides you shouldn’t be allowed to drive, period. It’ll issue an order to the companies that control the vehicles, and that will be that.

We’ve barely begun to think through these issues, unfortunately. We prefer to deploy and react, because we love our new gadgets and capabilities. Can we afford to do that now?