Moneybox

U.S. Cities Have Grown. Their Cores Have Not.

Los Angeles in a class of its own, as usual


The 1960 census marked the dawn of urban decline in the United States: It was the first to record population drops in pre-war industrial hubs like New York, Chicago, Detroit, Baltimore, St. Louis, Boston, and Cleveland. The ensuing decades saw American cities, particularly in the Northeast and Midwest, lose jobs and residents by the hundreds of thousands. Busy neighborhoods emptied out as workers moved to the suburbs, or to the Sun Belt.

In percentage terms, the country continued to urbanize all the while, with a dozen U.S. cities—led by Sun Belt powerhouses like Phoenix, Houston, San Antonio, and Nashville—adding more than 500,000 residents in that time. Today, rebounding city populations all across the country are often cited as evidence for a new urban era in America.

But the country’s second wave of urbanization hasn’t looked much like the first, to say the least. A new and illuminating analysis by Yonah Freemark, a project manager at Chicago’s Metropolitan Planning Council and the author of the Transport Politic blog, is well worth reading in full; it reveals some important trends in the past half-century of city-building. Only a handful of American cities have added people, on balance, to areas already developed in 1960. That can be attributed both to the severity of the urban crisis, a story often told, and, just as importantly, to the selective nature of the growth that has come since.

“Urban” growth has mostly been greenfield sprawl: the conversion of farmland or other lightly used tracts into housing. “The average of the 100 largest cities grew by 48 percent overall,” Freemark notes. “Yet the average city also lost 28 percent of its residents within its neighborhoods that were built up in 1960.” That’s not just true in Youngstown and Detroit, post-industrial Rust Belt cities that have struggled with blight. Houston, Dallas, Charlotte, Las Vegas, and Nashville, to name just a few booming cities, have all lost people in built-up, midcentury neighborhoods. In fact, Freemark shows, inner-core residential decline in Southern cities is virtually identical to that in Midwestern cities, despite divergent population trends in the cities at large. Older, denser, inner neighborhoods are, in almost every city, much less populous now than in 1960.

Even some growing cities with reputations for close-in urban living, like Denver, Austin, Portland, and New York City, have fewer people living in their older urban districts today than they did in 1960.

The threshold Freemark uses to classify neighborhoods as built up at midcentury is a density of 4,000 persons per square mile. That’s a fairly low bar (Las Vegas is denser), but it illustrates the extent to which postwar “urbanization” in the U.S. has meant, almost exclusively, turning fields into suburbs, not suburbs into urban neighborhoods.

Finally, one city is such an outlier that it looks almost like a data error: Los Angeles. A few cities, among them Miami, San Jose, San Francisco, and Seattle, have seen serious infill population growth since 1960. But Los Angeles and Long Beach have together added more than a million people to areas that were already urbanized in 1960.

That’s an astonishing statistic, especially when compared with development patterns in the rest of the country during that time. It helps explain why today’s L.A. is so different from the freeway metropolis lionized by Joan Didion and Reyner Banham.* And it explains both why L.A. is in the midst of America’s most ambitious mass transit construction program and why entrenched residents have opted to restrict the city’s building envelope and will put one of the nation’s most aggressive anti-development measures on the ballot next year.

They say Los Angeles is full. (Is such a thing possible?)

What’s beyond dispute is that it has, unlike virtually every other American city, filled up.

*Correction, August 22, 2016: This post originally misspelled Reyner Banham’s first name.