Future Tense

The Race to Build a Computer Powerful Enough to Predict the Future

It will take a machine 1 billion times faster than your laptop.

The U.S.’s Titan supercomputer is no longer even one of the three fastest computers in the world.

Photo illustration by Slate. Photos by Thinkstock, Courtesy of Oak Ridge National Laboratory, U.S. Dept. of Energy.

In June, for the first time in two decades, the United States did not operate one of the three most powerful computers in the world. Instead, China took the top two slots and Switzerland came in third, according to the Top500 list, a ranking of the world’s most powerful supercomputers.

The two fastest supercomputers, both in China, clock in at 93 and 33 petaflops. A petaflop is a measure of computing performance equal to one quadrillion (1,000,000,000,000,000) calculations per second. But even China’s 93-petaflop machine is slower than the supercomputer that the U.S., Japan, and other competing nations want to build, because what these countries really want is the world’s first exascale computer. An exaflop is 1,000 petaflops, and no computer that powerful exists anywhere in the world today. Not even close. For perspective, most consumer laptops operate at gigascale speeds, about 1 billion calculations per second. An exascale computer would be a billion times faster than that.
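To put those scales side by side, here is a minimal sketch in Python (not from the article; the machine figures are simply the ones cited above) that converts between the units and computes the speedups an exascale machine would represent:

```python
# Comparing the performance scales discussed above.
# All figures are calculations (floating-point operations) per second.

GIGAFLOP = 10**9              # "gigascale": roughly a consumer laptop
PETAFLOP = 10**15             # one quadrillion calculations per second
EXAFLOP = 1000 * PETAFLOP     # the exascale target: 10**18

china_fastest = 93 * PETAFLOP   # the fastest machine on the June 2017 Top500 list
titan = 18 * PETAFLOP           # roughly the fastest U.S. machine at the time

print(f"Exascale vs. a gigascale laptop: {EXAFLOP // GIGAFLOP:,}x")            # 1,000,000,000x
print(f"Exascale vs. the 93-petaflop leader: {EXAFLOP / china_fastest:.1f}x")  # ~10.8x
print(f"Exascale vs. Titan: {EXAFLOP / titan:.1f}x")                           # ~55.6x
```

The first line of output is where the billion-times figure comes from: an exaflop is a billion gigaflops.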

So what’s the point? The more powerful the computer, the more realistic the models it can create. Supercomputers are already used to predict weather and earthquakes, but there isn’t yet enough computing power to model complex physical and biological systems precisely enough to make endeavors like a large-scale transition to wind energy feasible. An exascale computer would be powerful enough to uncover answers to questions about, say, climate change and growing food that can withstand drought. It could even predict crime (hopefully with more accuracy and fairness than current predictive policing systems).

Building an exascale computer is a national-level project. And earlier this summer, the U.S. Department of Energy shelled out $258 million to six companies (Hewlett Packard Enterprise, Intel, Nvidia, Advanced Micro Devices, Cray, and IBM), all working on the components that would one day go into such a system.* “There is no single company that can afford to do this, and even a consortium of companies would not be able to do it,” said Thom Dunning, a chemistry professor at the University of Washington and co-director of the Northwest Institute for Advanced Computing. That makes the exascale project a perfect example of why government funding of science is so important. Even so, the firms awarded the funds will cover at least 40 percent of the cost of the research themselves.

Japan and China both have initiatives to build exascale computer systems, too, and the nation that does it first will unlock all kinds of ways of predicting the future and understanding the present. It could put that country far ahead of the rest of the world in terms of scientific and technological achievement, which in turn translates to economic power.

Take the problem of transitioning to more wind energy. At the moment, only about 5 percent of U.S. energy needs are met through wind power. That’s because wind farms aren’t always more cost-effective than fossil fuels once subsidies are factored out, even if wind energy is ultimately better for the environment. And that’s why, according to a recent paper by the National Renewable Energy Laboratory, the DOE has dubbed the effort to improve the efficiency of wind power plants a national “grand challenge” that requires “the world’s largest computers and advanced computationally efficient algorithms to resolve.” In other words, with better computing, researchers will be able to accurately model how wind flows through an entire wind plant. That know-how will filter directly into better turbine and plant designs and lower costs for sustainable energy systems. As part of the Department of Energy’s larger exascale project, the NREL is working to build predictive wind energy models that can run on an exascale-level machine by 2022.
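To see why that kind of modeling is an exascale problem, here is a hypothetical back-of-envelope sketch; the grid size, time steps, and per-cell cost below are illustrative assumptions, not figures from NREL or the DOE:

```python
# Hypothetical back-of-envelope estimate of simulating airflow through an
# entire wind plant. Every parameter is an assumption chosen to illustrate
# the scale of the problem, not a figure from NREL or the DOE.

grid_cells = 10**12           # assumed cells to resolve a multi-kilometer farm down to blade scales
flops_per_cell_step = 10**3   # assumed cost of updating one cell for one time step
time_steps = 10**7            # assumed steps to cover the flow period of interest

total_flops = grid_cells * flops_per_cell_step * time_steps   # ~1e22 operations

titan_class = 20 * 10**15     # roughly a Titan-class machine
exascale = 10**18

print(f"On a ~20-petaflop machine: {total_flops / titan_class / 3600:.0f} hours")   # ~139 hours
print(f"On an exaflop machine:     {total_flops / exascale / 3600:.1f} hours")      # ~2.8 hours
```

The absolute numbers are made up, but the ratio is the point: each step up in machine scale turns week-long runs into overnight ones, which is what makes routine design iteration practical.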

Stronger computers mean a better understanding of how to engineer a more sustainable future. The same goes for creating drought-resistant plants or biofuels, which is what Dunning’s research team is working on. Understanding why plants stress out in droughts, says Dunning, comes down to the way ions travel across a cell membrane. And to model that process with the kind of detail needed to make accurate predictions, “you’ve got to worry about the membrane, the ions, and everything that’s inside the cell.” That takes a lot of computing power. “But if you can understand that process better, you may be able to engineer crops that are used for biofuels, as opposed to food, or you could engineer them to better respond to droughts,” Dunning explained.
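A similarly hypothetical sketch (the atom counts and per-step costs are illustrative assumptions, not numbers from Dunning’s group) shows how the cost balloons once the membrane and the cell’s interior are included alongside the ions:

```python
# Hypothetical illustration of why modeling "the membrane, the ions, and
# everything that's inside the cell" multiplies the simulation cost.
# All atom counts and per-step costs are assumptions for illustration only.

FLOPS_PER_ATOM_STEP = 10**4   # assumed force-calculation cost per atom per time step
TIME_STEPS = 10**12           # assumed steps needed to observe ions crossing the membrane

systems = {
    "ions and nearby water only": 10**6,          # assumed atom counts
    "ions plus a membrane patch": 10**8,
    "ions, membrane, and cell interior": 10**10,
}

for name, atoms in systems.items():
    total = atoms * FLOPS_PER_ATOM_STEP * TIME_STEPS
    days = total / 10**18 / 86_400   # runtime at a sustained exaflop
    print(f"{name:35s} ~{total:.0e} operations, ~{days:,.1f} days at an exaflop")
```

Again, the specific values are invented; the takeaway is that each additional layer of biological detail multiplies the atom count, and with it the compute, by orders of magnitude.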

Exascale computing power would also allow the federal agencies tasked with making sense of surveillance data, like the National Security Agency and the FBI, to actually analyze the massive amount of information they sweep up in their dragnet global digital surveillance operations. That data is currently stored at a network of data centers across the country, but it’s not necessarily being analyzed at a rate fast enough to thwart attacks. With an exascale system, paired with exascale-level software, law enforcement could scan social media in real time. That, coupled with other data sources, could ostensibly predict more accurately when someone is about to commit a crime, so that authorities could try to stop them. While targeted surveillance and predictive policing might be among the more disturbing uses of these systems, national security is one of the key motivations behind exascale research. And like all technologies, it can be used in positive ways, such as promoting sustainable energy, and in potentially nefarious ways, such as deepening state surveillance.

The DOE hopes to get the first exascale computer working in the field by 2021. But for that computer to work in a meaningful way, it will need to run software that handles the processing in an ethical and efficient way, too. And considering how biased, and at times racist, software can be in today’s comparatively primitive systems, that software won’t be easy to build. Still, one of the hallmarks of the DOE’s work is a strong focus on software alongside raw computing power, which, according to Dunning, sets the U.S. apart from China’s more hardware-centric approach.

Currently, the fastest computer in the U.S., Titan, runs at about 18 petaflops, but by next year the government expects to unveil a new supercomputer that will run at about 200 petaflops at peak performance. That computer, Summit, is located at the Oak Ridge National Laboratory in Tennessee. That’s a huge jump in capacity, but China is pursuing its exascale computing project aggressively, too. And if the U.S. does hope to be the first to capture that level of computing power, a level so high that it can model the present and predict the future, then the $258 million the DOE invested this summer was the right move. Perhaps it’s the nationalist nature of the effort that will inspire President Trump to continue government funding of this scientific research, or its ties to national security. But whatever the reason, it’s exactly the type of scientific research that depends on government-level funding. And the future of our future depends on getting this right.

*Correction, Sept. 18, 2017: This article originally misstated the amount of money the Department of Energy recently invested in supercomputing research. It was $258 million, not $258 billion.

This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, follow us on Twitter and sign up for our weekly newsletter.