Future Tense
The Citizen's Guide to the Future

Nov. 12 2015 11:36 AM

Netizen Report: Russia Could Have the Power to Knock (Some) Websites Offline Worldwide

The Netizen Report offers an international snapshot of challenges, victories, and emerging trends in Internet rights around the world. It originally appears each week on Global Voices Advocacy. Ellery Roberts Biddle, Lisa Ferguson, Hae-in Lim, and Sarah Myers West contributed to this report.


Russian Internet watchdog Roskomnadzor may soon have additional powers to block websites without a court order. Currently, Roskomnadzor can block sites at the URL level, which prevents Russian users from accessing them but leaves the content available to outside users, as well as to Russian users deploying circumvention tools. Under the new expansion, the agency could block websites with the .ru and .рф top-level domains not only in Russia but for users around the world.

Net neutrality activists protest Facebook’s Free Basics at Internet Governance Forum
At the U.N.-sponsored Internet Governance Forum in Brazil this week, several activists were stopped from demonstrating in support of net neutrality and against Facebook’s Internet.org initiative, now known as Free Basics, which offers a limited set of online services, including Facebook, to mobile phone users in less-developed countries. The activists walked through the opening plenary session of the event, holding signs that read “No to Net Neutrality violation in Brazil and all the world!” and “Free Basics = Free of Basic Rights.” Multiple videos captured by conference participants show security guards and staff taking the signs from the demonstrators’ hands and forcibly removing them from the hall. The activists’ identification documents were briefly seized, and organizers confiscated their credentials.

Shortly thereafter, civil society leader and Brazilian free expression advocate Joana Varon, an invited speaker at the opening session, condemned the act, saying, “[L]et's let people who cannot be on the stage also symbolically express their key questions regarding the future of Internet in front of high level panels like this.” Grass-roots group Marco Civil Ja, a leading supporter of Brazil’s “Bill of Rights for the Internet,” quickly issued a statement reprimanding IGF leadership for its actions: “This kind of repression is incompatible inside an event that is precisely discussing how to protect freedom of speech [on] the Internet and how to guarantee privacy of those who use the world wide web.”

The activists’ credentials were reinstated several hours after the incident.

Facebook user in Hungary faces fine, irony
A local government in the city of Tata in Hungary fined a woman for a Facebook post critical of government spending.

Four Tanzanians prosecuted for WhatsApp messages during elections
Four political activists have been charged under Tanzania’s Cybercrime Act for publishing an audio message on a WhatsApp group that was “intended to mislead the public” during Tanzania’s October 2015 general elections. The message stated that the country’s ruling party had engaged in vote fraud, an allegation that both local and foreign media outlets echoed.

The Gambia garners low marks on Internet openness
In Freedom House’s recent review of free expression online in 12 countries in sub-Saharan Africa, the Gambia ranked right at the bottom, with only Ethiopia ranking worse on Internet freedom. In the past year, the Gambia has jailed online journalists, blocked opposition and news websites, and blocked Internet access entirely, for as long as a full week in April of this year.

U.K.’s new surveillance law is “worse than scary”
A proposed law granting greater surveillance powers to U.K. law enforcement has been described as "worse than scary" by U.N. Special Rapporteur for Privacy Joseph Cannataci and could bring "very dire consequences," according to Apple CEO Tim Cook. Among other things, the bill would provide a mandate for bulk surveillance, allowing law enforcement to hack into and plant monitoring tools on computers and phones. It would also obligate Internet companies to help government officials bypass encryption. The bill was introduced in the House of Commons last week and will now be open for debate.


Nov. 11 2015 6:32 PM

Microsoft’s New Plan for Fighting Government Surveillance: Store Data in Germany

Microsoft announced on Wednesday that it will open new data centers in Germany in late 2016. For a company with such extensive cloud infrastructure, that might not seem so unusual, but these data centers are part of an ongoing effort to reassure European clients who worry that using United States–based tech services means subjecting themselves to U.S. government surveillance. T-Systems, a subsidiary of Deutsche Telekom, will be the "data trustee" of the centers, and Microsoft employees won't have access to any of the data stored there, except with T-Systems’ approval.

Germany has strong privacy laws that Microsoft is hoping will protect the centers in case a U.S. government agency demands data from the company. The Financial Times reports that Microsoft customers who want to use the German data centers specifically will have to pay an additional fee. Microsoft CEO Satya Nadella said in a statement that the new data centers will "offer customers choice and trust in how their data is handled and where it is stored."

Microsoft and other U.S.-based tech companies have been scrambling to reassure European clients ever since Edward Snowden's revelations about the National Security Agency in 2013. And more recently, the European Union's highest court struck down the "Safe Harbor" data-transfer pact that allowed tech companies to move data from European to U.S. data centers. Negotiations on a new agreement are ongoing, and Microsoft's new German data centers may influence the discussion.

Paul Miller of Forrester Research told the Financial Times that it is unclear how well Microsoft's new approach will stand up. "As with all new legal approaches, we don’t know it is watertight until it is challenged in court," he said.

On Tuesday, Microsoft also announced that it will open data centers in the United Kingdom in late 2016 so local clients can use in-country cloud services. "Local data centers will help meet ... demand, especially for those organizations looking for solutions delivered from data centers based in the UK," said Michel Van der Bel, the area vice president and general manager of Microsoft UK. It's a subtle way of saying that enterprise customers know the privacy laws in each country and want to be able to choose where they keep their data. Microsoft also noted that its data centers in Ireland and the Netherlands are now operational.

Nov. 11 2015 11:32 AM

Future Tense Newsletter: Mapping the Library of Tomorrow

Howdy Slate-liens,

If you think the weather’s been strange lately, you’re not wrong: 2015 has been the warmest year in millennia, and it’s also exhibited baffling and frightening climatological patterns. As Eric Holthaus suggested in Future Tense, it may be time to step up environmental fights, especially now that the Keystone XL pipeline is finally dead. In one positive step, the EPA is developing tests that will make it harder for automakers to cheat on emissions tests. Another, less direct development may be found in global library initiatives that can, proponents claim, organically encourage climate activism, public health, and more.

Libraries have other civic roles to play, including protecting the data of their patrons as they begin to provide more and more digital services. And in other privacy news, federal lawmakers pushed to further regulate cellphone surveillance technology at the state and local levels. Of course, we already know what you’ll be doing on your cellphone in the future: streaming endless hours of video, which T-Mobile now won’t be counting toward data allotments, at least if you’re using the right services.

Here are the other stories that we were tempted to post on President Obama’s new Facebook page this week:

  • Gaming: So long as you set goals for yourself, video games can help you stave off depression.
  • Empathy: A new study shows that we exhibit empathic responses to robots in apparent pain.
  • Drones: Fighting fires just got a little easier thanks to a drone that shoots fireballs. Seriously.
  • Artificial intelligence: Researchers at Facebook announced some exciting developments in computer vision and other machine learning technologies.


Waiting until tomorrow comes,

Jacob Brogan

for Future Tense

Nov. 10 2015 6:58 PM

T-Mobile Is Making It Easier to Stream Video Without Using Up Your Data. What’s the Catch?

On Tuesday, T-Mobile announced at its 10th “Uncarrier” event in Los Angeles that it will stop counting video streaming from certain services against customers’ monthly data allotments. Starting on Nov. 15 for new customers and Nov. 19 for existing customers, T-Mobile users with 3GB data plans or higher will be able to do unlimited mobile streaming from more than 20 services like Netflix, HBO GO/NOW, Hulu, and VUDU.* If this isn’t living the dream, what is?

Current Binge On options. (Screencap from T-Mobile)

T-Mobile is calling the program Binge On. It builds off of the carrier’s Music Freedom initiative, which launched in June 2014 with a similar setup for unlimited music streaming.

Unlimited video streaming without data overages sounds like everything everyone has ever wanted. So what are the downsides? First of all, Binge On only streams at “DVD quality,” otherwise known as 480p. That may not be a total dealbreaker, but high-definition 720p or 1080p has been standard on 4G LTE for a long time, and these days you can actually see the difference on most smartphone screens. If you’re eating up your data allotment (or you’re on Wi-Fi), you can stream at whatever resolution you want, but if you have Binge On enabled you’ll be stuck at 480p.

The other problem has to do with the structure of these types of arrangements. As with Music Freedom, Binge On puts T-Mobile in the position to choose which video streaming services get priority in the marketplace. T-Mobile customers will get nudged toward the services on the list. This is a net neutrality issue that could affect small or up-and-coming services. It can also exclude services that are already popular. For example, YouTube and YouTube Red aren’t on the Binge On list right now.

T-Mobile seems prepared for this criticism. The company says, “If your go-to video streaming service isn’t part of the program yet, tweet us your favorite service @TMobile, along with the hashtag #BingeOn. If they meet our requirements, we’ll investigate the feasibility of adding them.” And just to emphasize that this isn’t paid prioritization, T-Mobile adds, “No one pays to join and no money is exchanged.”

It’s certainly better than an arrangement where streaming companies have to buy in, but it’s still a situation in which a carrier (the mobile version of an Internet service provider) is incentivizing customers to use some services over others. The moral argument probably won’t keep you from marathoning Narcos on your commute, though.

Correction, Nov. 10, 2015, 8:15 p.m.: This post originally misstated that Binge On will be available to all T-Mobile customers on Nov. 15. It will become available for new customers on that day, and to existing customers on Nov. 19.

Nov. 10 2015 4:44 PM

Fresh Climate Data Confirms 2015 Is Unlike Any Other Year in Human History

Over the past few days, a bevy of climate data has come together to tell a familiar yet shocking story: Humans have profoundly altered the planet’s life-support system, with 2015 increasingly likely to be an exclamation point on recent trends.

On Monday, scientists at Britain’s national weather service, the Met Office, said our planet will finish this year more than one degree Celsius warmer than preindustrial levels for the first time. That figure is halfway to the two-degree line in the sand that scientists say represents “dangerous” climate change and that global leaders have committed to avoid—an ominous milestone.

This year’s global heat wave—about two-tenths of a degree warmer than 2014, a massive leap when averaged over the entire planet—can be blamed most immediately on an exceptionally strong El Niño but wouldn’t exist without decades of heat-trapping emissions from fossil fuel burning. Separate data released on Monday by the U.S. National Oceanic and Atmospheric Administration showed the current El Niño, a periodic warming of the tropical Pacific Ocean, has now tied 1997 for the strongest event ever measured, at least on a weekly basis.

"We've had similar natural events in the past, yet this is the first time we are set to reach the 1 degree marker and it's clear that it is human influence driving our modern climate into uncharted territory," said Stephen Belcher, director of the Met Office’s Hadley Centre in a statement.

The Met Office data were quickly confirmed on Twitter by Gavin Schmidt, who leads the research center in charge of NASA’s global temperature dataset, which uses a slightly different methodology.

If that wasn’t enough, the World Meteorological Organization, a division of the United Nations, also confirmed on Monday that global carbon dioxide levels reached a new record high in 2014—for the 30th consecutive year. The more carbon dioxide in the atmosphere, the more efficient the planet is at trapping the sun’s heat, and so global temperatures rise. Since our carbon dioxide emissions have a lifespan of a hundred years or so, there’s a significant lag in this process—temperatures will keep rising for decades even if all human emissions ceased today.

That means not only will 2015 end up as the planet’s warmest year in millennia—and probably since the invention of agriculture more than 10,000 years ago—but that there’s a lot more warming that’s already baked into the global climate system.

All that extra heat is already changing the planet in complex ways. For example, as of last week, there’s fresh evidence that the Atlantic Ocean’s fundamental circulation system is slowing down.

Over the past few years, a notoriously persistent cold patch of ocean has emerged just south of Greenland in the north Atlantic. There have been several theories as to why this is happening, but most involve a slowdown of the Atlantic Meridional Overturning Circulation (AMOC), part of the global oceanic “conveyor belt” system of heat and water that helps regulate the Earth’s climate by cooling off the tropics and gently warming polar regions.

You wouldn’t necessarily expect persistent record-cold temperatures when the planet’s overall temperature is at record highs, but that’s exactly what’s happening.

That weird little cold patch in the north Atlantic has an interesting story.


The AMOC is so important that its slowdown has been linked to past episodes of abrupt climate change, like a three-degree Celsius drop in Northern Hemisphere temperatures in less than 20 years about 8,000 years ago, and formed the highly dramatized basis for the planetary chaos featured in The Day After Tomorrow. Earlier this year, an important study provided further strong evidence that melting ice from Greenland has begun to disrupt and slow down the ocean’s circulation by changing the density of the north Atlantic, with profound consequences: In 2009, East Coast sea levels sharply—and temporarily—jumped by about four inches as water piled up. Stronger winter storms and an interruption of the Atlantic marine food chain also may already be happening.

In a new analysis released last week, scientists used data from a pair of NASA satellites to track climate-related changes in the north Atlantic—the first time ocean currents have been tracked from space. Over the last decade, the satellites took highly precise measurements of the literal weight of the ocean between Florida and Iceland, measurements that corroborated data from a network of ocean buoys over the same general place and time. From that information, the researchers were able to calculate that the Atlantic’s circulation is indeed slowing down, a potential climate tipping point that has long been predicted to occur at some point in the 21st century. Call it one more data point from a rapidly changing planet.

Still, despite the blindingly clear data, there’s hope that the tide could—finally—be shifting on climate change. Later this month, world leaders will be gathering in Paris and are widely expected to agree to the first-ever global agreement to constrain future emissions trajectories in a meaningful way—possibly enough to avoid the worst-case climate scenario.

Nov. 10 2015 9:00 AM

Google Maps Will Now Give You Directions Even If You Don’t Have Service

You finish a hike and you want to look up where to get a snack, but your smartphone can't find service. Your car garage is a dead zone. Or maybe you're traveling internationally and don't want to pay for smartphone connectivity. On Tuesday Google announced a new feature for its popular Maps service that will allow you to store data offline so you can look up local businesses and start turn-by-turn navigation when your device is totally cut off. And if service comes back at some point, everything seamlessly updates with the real-time data you've been missing.

The idea is that you select a broad geographic area that encompasses all the regions you might go to on a typical day. Google Maps calculates how much storage it will take to keep the information for that area locally on your device, and then you start the download. Amanda Bishop, a product manager on Google Maps, says that the Bay Area weighs in at about 200 MB, New York City is around 250 MB, and London is around 300 MB. (To compare, the Google Maps app by itself takes up 36.1 MB on iOS.) In offline mode, you won't see photos or written user reviews for local businesses and you can't get traffic estimates, but there is still information like hours of operation and phone numbers for businesses, and routes update and adapt to traffic conditions if your phone regains service.

"This is not just a map, this is the full-blown thing," Bishop said. "The biggest engineering challenge is that Google Maps was architected from the beginning to be a cloud-based app that leverages the power of Google's cloud and all of those servers and those massive cores that can crunch on these really complex algorithms." She explained that condensing the Google Maps product into something that could be powered by a typical smartphone processor was difficult. "But we really did want to do that, we didn't want to build something for offline from scratch because then you can't do this seamless transition between the two."

If you travel all the time for work or are going on a long vacation, you can even download an enormous area, like all of Europe, if you have the space to store the data on your phone. But even if you have a low-end smartphone that doesn't have a very powerful processor, the offline maps feature is designed to work. Google sees one of the feature's most crucial use cases as being in emerging markets. "Mobile data speeds are really slow, 2G primarily, 3G at best," Bishop said. "And [data] is also really expensive relative to those users' incomes." Being able to store a region's data on a device through a one-time download (one that updates when the device has connectivity) means an initial investment in the download followed by the option of using zero data for mapping in the future.

It's going to be easy to get used to this. The hard part will be forgetting to do the big downloads for new places and just expecting it to work everywhere.

Nov. 9 2015 7:32 PM

President Obama’s Getting Ready for Civilian Life: He Now Has His Own Facebook Page.

Barack Obama has 437 days left in office, which I know because I correctly assumed that a lot of results would come up if I searched for "countdown to the end of obama presidency." As he prepares to mosey on to the role of public statesman (and possibly teach at Columbia University), the president has been building a social media presence fit for civilian life. He set up a personal Twitter account in May—and now he has his own Facebook page.

The first post on the account is a video tour around the backyard of the White House. As with most of the videos our Facebook friends post, it's pretty boring (sometimes he sees a fox!), so Obama’s fitting in well so far. The page bio describes him as a "Dad, husband, and 44th President of the United States," in that order. The backyard video is prefaced with a post explaining that the Facebook page is "a place where you can hear directly from me, and share your own thoughts and stories."

Unlike the "Politician" Barack Obama page, which is run by Obama's advocacy group Organizing for Action, the new page puts Obama in the "Public Figure" category. The one dicey thing here is that the old campaign Facebook page has the URL "https://www.facebook.com/barackobama" and the new page has "https://www.facebook.com/potus," which is basically the exact opposite of what the two pages are trying to convey. It would have been tough to plan for all of this around 2006 and 2007, though. (Custom Facebook URLs weren't even released to everyone until June 2009.)

The new URL is consistent with Obama's personal Twitter account, however, which is @POTUS. Apparently his strategists want to lock down the POTUS brand so no one else can have it. The Pope's Twitter account @Pontifex gets passed from one pope to the next, but Obama may not want to be so generous. So far his new page has more than 350,000 followers.

Nov. 9 2015 6:26 PM

What Is “TensorFlow,” and Why Is Google So Excited About It?

Google made an announcement Monday that will send ripples throughout the technology industry for years to come—even if it sounds like gobbledygook to the average person.

The company said in a blog post that it has built a new machine-learning system called “TensorFlow” and that it will make the software open-source, so that others can use and build on it.

That’s important for at least two reasons.

First, machine-learning algorithms are increasingly at the core of Google’s technology. I’ve written before that Google is no longer a search company—it’s a machine-learning company. What’s machine learning, you might ask? My colleague David Auerbach recently wrote a very nice column explaining the concept and its growing importance. In the simplest terms, you can think of it as software that makes inferences based on data and can learn from its mistakes.
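To make that definition concrete, here is a toy sketch in Python, written for this explainer and not drawn from Google's code: a model with one adjustable number repeatedly guesses, measures how wrong it was, and nudges itself toward a better answer. The example data and learning rate are invented for illustration.

    # Toy "learning from mistakes": recover the rule y = 3x from examples.
    data = [(1, 3), (2, 6), (3, 9), (4, 12)]    # (input, correct answer) pairs
    weight = 0.0                                # the model's single learned parameter
    learning_rate = 0.01

    for _ in range(200):                        # repeat: guess, measure error, adjust
        for x, y in data:
            guess = weight * x
            error = guess - y                   # how far off the guess was
            weight -= learning_rate * error * x # nudge the weight to shrink the error

    print(round(weight, 2))                     # settles near 3.0, the underlying rule

Production systems like Google's juggle millions of such parameters instead of one, but the guess-and-correct loop is the same basic idea.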

TensorFlow is already powering nifty Google features like speech recognition in the Google app, “smart reply” in the Inbox app, and the surprisingly powerful search function in the Google Photos app. (Search for “dogs” and it will attempt to show you all the photos in your library that have dogs in them. Oh, but maybe don’t search for “gorillas.”)

It’s also built to tackle more abstract, science-fiction-y challenges. TensorFlow is the successor to Google’s vaunted research-grade machine-learning infrastructure, called DistBelief. That system was responsible for such achievements as DeepDream—the subject of another excellent Auerbach explainer—and a computer network that taught itself to identify cats on YouTube. In a post on its research blog, Google says TensorFlow is faster, more flexible, and easier to use than DistBelief. And whereas DistBelief was geared specifically toward one type of machine-learning software, called neural networks, TensorFlow is designed to accommodate different approaches.
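For readers who want a feel for what open-sourcing TensorFlow actually hands developers, here is a minimal sketch using the graph-based Python API that shipped at launch (the numbers are arbitrary, chosen only for illustration): you first describe a graph of operations, then hand it to a session that actually runs it.

    import tensorflow as tf

    # Describe a tiny dataflow graph: nodes are operations, edges carry tensors.
    a = tf.constant([[1.0, 2.0]])       # a 1x2 matrix
    b = tf.constant([[3.0], [4.0]])     # a 2x1 matrix
    product = tf.matmul(a, b)           # a new graph node; nothing is computed yet

    # Nothing actually runs until the graph is handed to a session.
    with tf.Session() as sess:
        print(sess.run(product))        # prints [[ 11.]]

That separation between describing a computation and running it is what lets the same program scale from a single laptop to Google-sized clusters.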

Which brings me to the second reason the announcement is important: By making TensorFlow’s code open-source, Google is opening the door for companies and computer scientists around the world to implement cutting-edge machine-learning algorithms in their own products and research. As with many things Google does, this is at once altruistic and self-serving: It will help lots of people who are not Google, while at the same time helping to entrench Google’s approach to machine learning as an industry standard.

It’s fair to compare TensorFlow to Android, Google’s mobile operating system, which it has allowed smartphone and tablet developers to use in their own products. But a more instructive comparison might be MapReduce, the groundbreaking Google data-processing algorithm that has found an open-source implementation in Hadoop. It’s surely not a coincidence that both TensorFlow and MapReduce were developed in part by Google superprogrammer Jeff Dean. I profiled Dean in a 2013 Slate story that sought to explain how he became “the Chuck Norris of the Internet.” In short: He’s a wizard when it comes to simplifying problems of enormous complexity.
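If MapReduce itself is unfamiliar, its core idea is small enough to sketch. Below is a toy word count in plain Python, the canonical teaching example; it mimics the two phases conceptually and is not Google's or Hadoop's actual API. A "map" step turns each document into key-value pairs independently, so it can be farmed out to many machines, and a "reduce" step combines the pairs that share a key.

    from collections import defaultdict

    documents = ["the cat sat", "the cat ran", "a dog ran"]

    # Map phase: each document is turned, independently, into (word, 1) pairs.
    mapped = [(word, 1) for doc in documents for word in doc.split()]

    # Reduce phase: pairs sharing a key are grouped and their counts combined.
    counts = defaultdict(int)
    for word, n in mapped:
        counts[word] += n

    print(dict(counts))   # {'the': 2, 'cat': 2, 'sat': 1, 'ran': 2, 'a': 1, 'dog': 1}

The appeal, as with TensorFlow, is that once the simple pattern is fixed, the hard work of spreading it across thousands of machines can be solved once and reused.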

MapReduce and Hadoop helped to make “big data” a household phrase and a part of just about every Fortune 500 company’s strategy. TensorFlow could do much the same for machine learning.

Where MapReduce was about generating and processing data sets, TensorFlow is about harnessing them to build “smart” software programs that can do cool stuff. Machine learning already plays a role in applications ranging from Netflix recommendations to your Facebook feed to self-driving cars. In the years to come it will find an even broader range of applications—some ingenious, others inane. Google won't pocket a check when people use TensorFlow to build their own machine-learning software. But rest assured it will find fresh ways to profit from a world in which yet another of its core technologies has become ubiquitous.

Nov. 9 2015 3:41 PM

After VW Scandal, EPA Will Emission Test Diesel Cars in Real-World Conditions

More than 11 million Volkswagen diesel vehicles were able to game the Environmental Protection Agency's emissions tests by using sneaky software to identify when an assessment was going on—and the EPA doesn't want to get fooled again. The New York Times reports that the agency will begin conducting a randomized set of tests in real-world conditions rather than labs to make it significantly harder to manipulate results.

Volkswagen is currently the only car maker known to have cheated on emission tests. Before the scandal, the European Union was already planning to switch to road-based assessments in 2017, and now other countries like India, China, and Mexico are mulling the change as well. The EPA's director of the Office of Transportation and Air Quality, Christopher Grundler, told the Times that, “Manufacturers have asked us what the test conditions would be, and we’ve told them that they don’t have a need to know. ... It will be random.”

Last Monday the EPA published evidence that there are 10,000 more cars equipped with "defeat software" than previously known. Then on Tuesday, Volkswagen announced that an internal investigation had revealed 800,000 gasoline-powered cars that have “irregularities” related to their carbon-dioxide output.

As Volkswagen continues an investigation into what happened internally and who is to blame for the pernicious cheating software, a larger picture of internal corporate politics is starting to emerge. For example, the Times separately reported on Sunday that VW admitted to concerns about some of its gasoline engine cars because of pressure from an engineer/would-be whistleblower.

Volkswagen CEO Matthias Müller said in a statement last week that, "I have pushed hard for the relentless and comprehensive clarification of events. ... This is a painful process, but it is our only alternative.” Painful is the right word for it.

Nov. 6 2015 6:15 PM

Community Noise Complaints Fail to Shut Down French Data Center 

You might think of noise complaints as being primarily related to construction or raves. Maybe sometimes there's a dog that won't stop barking or a loud ventilation system. But residents in a Paris neighborhood have been dealing with the hum of a cloud computing center since 2012. And they're not happy about it. Unfortunately, a French court recently shot down their noise complaints.

For a little while it looked like residents from La Courneuve, Paris, might win. Last month an administrative court revoked cloud company Interxion's license for operating its PAR7 Paris data center. But Datacenter Dynamics reports that a new court ruling last week reversed this decision and allowed Interxion to apply for a new operating permit on Monday.

PAR7 is a $141 million data center with almost 50,000 square feet of space. ZDNet reports that it has eight diesel generators and houses more than 500 million liters of fuel. Community group Urbaxion ’93 Association argued that PAR7 is an environmental hazard due to factors like the noise it generates and the fuel it houses on site. But a 2014 noise study commissioned by Interxion may have affected this week's court decision. The evaluation showed that PAR7 was generating noise within the pre-set limits planned for it before it was built.

An Interxion spokesperson told Datacenter Dynamics, “Continuing to operate PAR7 allows us to meet the growing needs of our clients, whether in the public or private sector, and to address the increasing demand for digital infrastructures in France, both for business and individual use.”

We all want our photos and documents from the cloud, but local residents are probably not satisfied with that justification.