Posted Tuesday, Sept. 4, 2012, at 10:34 AM
President Barack Obama delivering an address in Washington last May.
Photograph by Kristoffer Tripplaar-Pool/Getty Images
A great deal more will be written on the question of whether an incumbent president who presided over an entire term of disappointing jobs and GDP growth deserves re-election. This is, of course, precisely the kind of question—devoid of context and precooked to elicit a particular answer—that mass-market democratic contests usually turn on.
Are we better off today than we were four years ago?
Economically, the answer is not in doubt: Yes, we are. But there are lots of caveats.
First, the view that President Obama wants to emerge from Charlotte: Four years ago the country was sliding over the edge of an economic cliff. Today, we’ve got one leg back on top, and even with the Republican congressional caucus holding onto the other leg and screaming, “I’d rather fall to my death than climb back onto that debt-strewn precipice”—we’re clawing our way to safety.
All this is true enough.
But is everything all right? No.
Is the country happier than it was four years ago? Hardly.
Is the country more certain of its future? Perhaps, but if so, we don’t like what that future looks like, and we’re desperate to blame someone else for our misfortunes (China, immigrants, Muslims, bankers, socialist Kenyans).
Are the structural economic problems that caused the 2008 debacle solved? Mostly no.
Are we getting back to normal? Well, of course not: Times were not normal to start. To get back to that normal would be national suicide—an asset-bubble-fueled normal more unsustainable than anything either of our political parties is flirting with today.
Nostalgia for the “good old days” is another staple of dumbed-down campaigning, and it is in full view today. In reality, “normalcy,” the alleged “promised land” that Warren G. Harding offered his Republican constituents in the 1920 presidential race, is always unattainable.*
In Harding's day, it disappeared when combustion engines replaced horses. Harding knew people pined for that simpler time—when don’t they? But it turned out there was no going back to the days before World War I then, and there’s no going back to the false prosperity of the bubbly Bush years now, either. Harding’s promise was the sham, and the absentee governance of his administration, and those of his two GOP successors, Calvin Coolidge and Herbert Hoover, set the stage for the collapse that followed.
Do we really pine for the bubble years? Remember, folks, the “prosperity” now implied by those who ask about “four years ago” was fueled by a runaway financial system that treated people’s homes, jobs, and lives like so many chips in a casino.
Would we be “better off” if the bubble loomed over us again? No, we’d be walking toward an even deeper cliff.
The fact is, a “balance sheet recession”—a downturn driven by over-leveraged households and firms forced to pay down debt after an asset bubble bursts—takes longer than other downturns to mend. It’s the difference between the headache that results from having a few too many glasses of red wine and the medically induced coma that comes from mixing your red wine with steroids, Ecstasy, West Texas moonshine, and anything else high-spending lobbyists put in your hands. The latter scenario does real damage—a near-death experience with a long recovery time and no guarantee that you will ever again be the same.
That said, Americans have plenty of reason to be disappointed in Obama’s performance. He took the high road too often on economic policy in the beginning, failing to lay out for Americans a realistic timeline for recovery or to explain the extent of the hole they were in. “Yes We Can” can-doism led to candy-coated politics. He should have been driving home the lesson of the Bush years (and, to be fair, Clinton’s second term): Deregulation of markets is not a religion; it is a theory. It has limits, and those limits must be monitored, or we will destroy the economy.
But he did not do that. Today, as he prepares for his second nomination speech, there will be no Gospel-like refrain that calls to mind those days when most everyone was glad to see him—a "pay any price, bear any burden" kind of phrase harkening back to a time when the whole country was yearning for leadership.
So yes, he steered us away from the cliff. But he seems to have fallen asleep at the wheel shortly thereafter.
Ironically, his economic policies are not the real problem. Again, this was always going to take a long time to solve. We can argue whether there should have been more stimulus (I think so). But on the finer economic points, the general direction has been correct.
The Economist, usually inclined toward centrists of either party, cannot bring itself to completely condemn Obama, just as it cannot seem to believe what Mitt Romney has morphed into. The magazine noted this week in a clever “End-of-Term” report on Obama’s first term, “His handling of the crisis and recession were impressive. Unfortunately, his efforts to reshape the economy have often misfired. And America’s public finances are in a dire state.”
Recessions, as Europe demonstrates every single day, are no time to cut government spending: the result is a vicious circle in which austerity kills growth and deficits become nearly insurmountable (especially in countries that have to fund them on the open market). So even if deficits rise during a recession, the idea is to hasten the return of growth that, in the end, is the only real solution to such gaps.
As the Economist points out, “normal standards of fiscal rectitude have not applied in the past four years. When households, firms and state and local governments are cutting their debts, the federal government would have made the recession worse by doing the same.”
How is it that a man who rode to office on such a raft of soaring rhetoric has failed to put that last point in simple language for the American voter? Whatever the depth of the damage he inherited or the obstructionism of his opponents, it could turn out that the lack of a mantra sinks him. At such times, a few choice words uttered under pressure—“nothing to fear but fear itself,” “morning in America”—are worth a million lines of legislation.
Correction, Sept. 5, 2012: This blog post originally referred to Warren G. Harding's 1924 presidential race. Harding ran for president—and won—in 1920.
Posted Monday, Sept. 3, 2012, at 12:27 PM
Photo by Streeter Lecka/Getty Images.
"All questions in a democracy [are] questions of money."
--Mark Hanna, 1896 campaign manager for GOP candidate William McKinley
Well-meaning just doesn’t cut it anymore. The history of attempts to prevent money from perverting our politics is as old as the Republic, but only in 1896 did Mark Hanna, a manipulative genius of the highest magnitude, turn political fundraising into a science.
The reaction of well-meaning people has been behind the curve ever since. Money finds a way.
As Jack Beatty noted in a wonderfully prescient piece in the Atlantic in 2007, William McKinley’s talent for channeling corporate money into his 1896 campaign won him the White House but also thoroughly blackened his name and compromised his administration.
After an anarchist murdered McKinley in 1901, his successor, Teddy Roosevelt, demanded a complete ban on corporate giving to presidential campaigns—something he got from Congress in 1907.
The Taft-Hartley Act of 1947 (passed by a Republican Congress over a Democratic president’s veto) extended the ban to labor unions. Campaigns, unions, and corporations continued to seek loopholes—Nixon famously had “bagmen” collect money from plutocrats like Yankees owner George Steinbrenner, helping spawn the Federal Election Commission in the reform-minded early 1970s. But for years, Taft-Hartley and TR’s 1907 reforms did for U.S. political campaigns what the Glass-Steagall Act had done for markets: ensured that efforts to manipulate them would be illegal.*
Alas, money found a way.
Eventually, in part due to the grandiosity of these “well-meaning” efforts, each one of these reforms failed, falling afoul of those who claimed their right to free speech was being unconstitutionally curbed because (as they argued) spending money on campaigns is a form of speech. Add to the list the post-Watergate effort to ensure only public money was available for campaigning, efforts in the 1980s to ban soft money, the largely failed campaign to get candidates to forswear PAC money in the 1990s and 2000s, and of course the McCain-Feingold effort that began in 2002.
Well-meaning people still haven’t taken the lesson, though. Current efforts to curb out-of-control super PACs and corporate and labor donations still rely on high-minded constitutional arguments. Americans Elect, that late-night dorm-room attempt to inject reason into our politics, died on the vine. In fact, the solution is much simpler: Attack the conflict of interest that exists at the heart of these transactions, not the transactions themselves.
Rather than legislating what types of money are or are not dirty or attempting to define which class of citizen (corporate or otherwise) deserves free speech, Congress should simply create incentives for civic-minded behavior.
How? Well, start with corporations: The law should make political neutrality a condition of doing business of any kind with the U.S. government, and should reserve certain privileges—the right to classify all of one’s income as investment income, or qualification for federal deposit insurance, for instance—for companies that stay out of electoral politics.
Corporations could always defy that and eschew government business and benefits—that’s their choice as “people.” Or, if they truly love democracy, they can choose to donate to the federal public campaign financing fund (the one funded by the little $3 check box at the top of your tax return). Banks could do the same, but interest rates on the cheap money being poured onto their balance sheets by the Fed would go up. Ditto defense contractors, think-tank professionals, members of federal employee unions. Freedom, after all, isn’t free.
This at a stroke would eliminate most of the corporate money and special interest money in our politics today. Indeed, it would eliminate the donations of a good number of the unions (whose pension funds are backed by U.S. pension insurance), and a whole lot of so-called nonpartisan and bipartisan groups (that avoid taxes with those phony monikers).
But wouldn’t that leave the country at the mercy of the billionaires and their super PACs—figures the right loves to hate (Warren Buffett, George Soros) and their counterparts hated by the left (Sheldon Adelson, Foster Friess)? Not if the law were properly configured.
To discourage the kind of political vandalism our moneyed class is funding this year, I would advocate changes to U.S. tax law that tie excess campaign donations to the tax code. In effect, anyone donating beyond the $50,000 individual limit—itself a huge amount to most people—would no longer be able to avail themselves of itemized deductions or special tax vehicles of any kind. The cost of doing that kind of business on our democracy, in effect, would be applying the 1040EZ formula to plutocrats. It’s a way to balance the budget, too!
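To make the mechanics of the proposal concrete, here is a minimal sketch in Python. Every number in it—the flat rate, the standard deduction, even the single-rate structure—is a hypothetical chosen for simplicity, not actual tax law; the only figure taken from the proposal itself is the $50,000 donation cap. The point is the shape of the incentive: exceeding the cap flips a filer from itemized deductions to the bare "1040EZ" treatment.

```python
# Illustrative sketch of the proposed rule. Rates and the standard
# deduction are hypothetical; only the $50,000 cap comes from the text.

DONATION_CAP = 50_000        # proposed individual donation limit
STANDARD_DEDUCTION = 12_000  # hypothetical "1040EZ" deduction
FLAT_RATE = 0.30             # hypothetical single tax rate

def tax_owed(income: float, itemized_deductions: float,
             campaign_donations: float) -> float:
    """Tax under the proposed rule: donating past the cap
    forfeits itemized deductions entirely."""
    if campaign_donations > DONATION_CAP:
        deduction = STANDARD_DEDUCTION  # 1040EZ treatment for plutocrats
    else:
        deduction = max(itemized_deductions, STANDARD_DEDUCTION)
    return max(income - deduction, 0) * FLAT_RATE

# A donor with $10M income and $4M in itemized deductions:
within_cap = tax_owed(10_000_000, 4_000_000, 50_000)     # keeps deductions
over_cap = tax_owed(10_000_000, 4_000_000, 5_000_000)    # loses them
```

Note the design: nothing is banned, so the free-speech objection never arises; the mega-donation simply carries a price in forgone deductions.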
One can imagine how Romney would feel about this. But don’t be unfair: The Democrats, in this cycle, may not be as rich, but they are at least as hypocritical.
Look at what’s going on in Charlotte, N.C., this week. Little real business will be conducted—we all know President Obama will leave at the top of the ticket, and that Smokin’ Joe Biden will be beside him.
But thanks to the failure of one BIG IDEA after another on campaign finance reform, there will be plenty of money changing hands in Charlotte, too. Even though the Democrats nominally banned corporate money from their convention back in February, like most such vows these days, this one was not worth the paper it was printed upon.
By now, the activities of New American City Inc., the shell company created by a bunch of banks and other companies to get around President Obama’s corporate money ban, are as famous on the right as Romney’s super-secret (or nonexistent!) tax returns are to the left.
When AFL-CIO President Richard Trumka chose not to pony up extra funds for the convention—a protest against the DNC’s choice of “right to work” North Carolina as the host state—it may have left a dent in party coffers. But, as Bloomberg notes, it will hardly put a crimp in the lobbying.
From airlines to investment banks to Disney, corporate lobbyists and senior executives will be present in force, sponsoring cocktail parties, shuttle services, raw bars, and after-hours entertainment. And, of course, many of the valued “super delegates” owe their very careers to these same folks who “educate” them with campaign money whenever times get tough. No questions asked—really—they do this just because they love democracy.
The Republican convention was, if anything, even more of a Gomorrah. But this is all part of the free market, of course, and since corporations are people, in Mitt Romney’s famous phrase, what’s the fuss about these “people” rubbing elbows with their elected representatives?
Google had a lavish media lounge; free copies were available from Xerox, free Cokes from Coca-Cola. Freed from having to pretend not to be in the thrall of corporate America, all the Wall Street versus Main Street rhetoric of the Tea Party somehow got lost in the officially sponsored confetti.
Corporate receipts, according to Public Citizen, amounted to $55 million for the GOP gathering—added to $50 million the federal government ponies up for security and another $17.7 million in taxpayer funds available to, well, fund just about anything. (Now there’s a spending cut I can get behind). The Democrats are still claiming it’s all done with small donations, but they have refused to comment on New American City Inc., which is, of course, not affiliated with the DNC. Yes They Can!
How can we let this happen? The answer, simply put, is that well-meaning people do not fare well in our overly litigious society. In our system, money will find a way as surely as water will enter a basement. The tactics we have pursued to shut down corporate money have been struck down on First Amendment grounds and high-minded notions of a city on a hill. It’s time to take a more incremental, targeted approach. Because if it’s money that’s the problem, chances are that money is the solution, too.
Correction, Sept. 4, 2012: This article originally misspelled the Glass-Steagall Act. (Return to the corrected sentence.)
Posted Tuesday, Aug. 28, 2012, at 4:58 AM
Photo by Peter Macdiarmid/Getty Images.
Over the past few years as I've been one of those touting a new, more optimistic trajectory for Africa, the predictable knee-jerk response from the left has been to call me either a fool or an apologist for Western neocolonialism. But like a handful of others, I persisted, pointing out that Western approaches to foreign aid often do as much damage as good, and also that increasingly Africans are saying "no thanks" to our advice.
The geoeconomics of it all are clear: sub-Saharan Africa last year grew faster than any other region, and some of its best performing countries—while still poor—have been growing faster than 7 percent a year for more than a decade. That's a pace that changes realities and brightens futures.
My employer, Renaissance Capital, has skin in this game to be sure, with stakes in major housing, infrastructure, agricultural, and mining operations around the continent. Among other things, the bank's development arm, Rendeavour, is funding the construction of middle-class housing on the outskirts of Nairobi, Kenya, and of Ghana's capital, Accra.
The typical knee-jerk response to this is to ask, "How can you build middle-class housing when so many Africans lack basic housing?" The answer: because Africa has the fastest growing middle class in the world. As the Observer (the Guardian's Sunday title) noted this weekend, almost one-third of Africa's 1 billion people are in or emerging into the middle class. Do they not deserve a decent living?
Ian Birrell, an old Africa hand and columnist at the Guardian, did a good job of providing a mea culpa (or nos culpae, to be more precise) on behalf of the left with a column entitled "Our Image of Africa Is Hopelessly Obsolete." This was the kickoff to a series called New Africa in which Guardian correspondents explore these changes.
Thanks to the Guardian for this and for the corrective from Birrell. The motives for focusing on Africa's destitution were never in question—but the inability of anything else to get through the murk has created real obstacles to progress. So welcome to the club.
Now, if we can just get the American right to look at Africa as something other than a base for drones and potential haven for al-Qaida ...
(On a related note: a bit of an update on an African topic explored by a guest blogger on The Reckoning a few months back—contraception in Africa. As always, the imbeciles came out of the woodwork to harangue my friend, Randi Hutter Epstein, a doctor and medical journalist of 20 years. Here's a debate between a Gates Foundation expert and a Catholic Nigerian obstetrician, taking admittedly predictable positions on the topic. But the point is—the debate is valid, no matter what Slate's snarky commentariat may say over their brie and Malbec.) - MM.
Posted Monday, Aug. 27, 2012, at 6:42 AM
Juvenile gulls fly above Blackpool, England on August 21.
Photo by Dan Kitwood/Getty Images
I don't remember the first time I heard the term Bank Holiday Weekend, but I definitely remember the light bulb that went on when I realized this was not simply a generic term for a day when banks don't open.
In 1993, the first year I lived in Britain (during my first stint here as U.S. affairs analyst at the BBC), my colleagues kept asking me what I was planning to do during the Bank Holiday weekend, and I suppose only after the first few times that odd phrase was used did it dawn on me that this might actually be the official name of Britain's end-of-summer long weekend.
Today, those of us who live and work on this island, as I do at the moment, have a day off—the end of a long weekend that takes place each year a week before America's own Labor Day (or Labour Day, as they would have it here).
This got me to thinking: In a nation with mellifluous surnames like "Mrs. Tiggywinkle" and colorfully named streets (Seven Sisters), squares (Piccadilly), insurance companies (Scottish Widows), and even Shadow Chancellors of the Exchequer (Ed Balls!!!), how can the populace stand a banality like "Bank Holiday Weekend"?
Indeed, there's even another one—the robotically named "Early May Bank Holiday," roughly equivalent to our Memorial Day. Never mind that the Left calls it May Day, the fact is "Bank Holiday" seems to be about the best they could come up with. (The English Wikipedia page on this topic makes for interesting reading).
My fellow Americans, and the rest of you, too, our British friends need our aid! Let's face it, given what "banks" have done for our countries of late, naming the two best secular holidays of the year after the scoundrels makes precious little sense. So, the nominations are open. I'll start with two of my own, but please let's have at it, as the Brits would say.
And to all my British friends, Happy Bank Holiday weekend. May it be the last!
MAY HOLIDAY: Liberation Weekend. On June 6, 1944, U.S., British, and Canadian troops began the Western Allies' part of the liberation of Europe in World War II: D-Day. The risk that this date will go the way of Dec. 7, 1941 (largely forgotten today in spite of its alleged "infamy"), is all too real now, with the generation that lived it fading away. Not a bad idea to remind the French, too, whose tourist industry benefits from a smaller, more casual British invasion about this time each year anyway.
AUGUST HOLIDAY: Mountbatten Weekend. On Aug. 27, 1979, the IRA managed to plant a bomb in a small fishing boat used by the Earl of Mountbatten—the Queen's cousin—killing the storied earl along with his grandson, a royal relative, and a young employee of the boat company. Mountbatten, whose career was checkered with quasi-disasters like the partition of India and the poorly planned raid on Dieppe in 1942, nonetheless hardly deserved to die like this. But beyond the man, the holiday could also serve to remind people of the foolishness of the imperial era and the importance of an achievement that remains as fragile as it is undervalued here: the ending of the Northern Ireland conflict.
Posted Thursday, Aug. 23, 2012, at 3:59 AM
Photo by Saul Loeb/AFP/Getty Images.
Irony—the cheap sunglasses of writing.
But in this case, I can’t resist donning the shades: The last time our two major parties nominated presidential candidates who had not served in the military was 1944, in the midst of the most violent war the U.S. ever fought. Neither Franklin D. Roosevelt nor his Republican opponent, New York Gov. Thomas Dewey, had ever marched across a freezing cold parade ground to a lousy breakfast of mass-produced chipped beef on toast (“S.O.S.” to us insiders). Somehow, the alleged “character issue” therein never arose.
Now, almost 70 years later, we’ve done it again. None of the four horsemen of the duopolypse that is our two-party system has ever served a day in the U.S. military. To my mind, that’s neither good nor bad—it’s always been a false issue. The military, as anyone who has served in it can tell you, contains some brilliant leaders, some self-promoting nuts, and a lot of average Joes and Janes besides (not to mention the odd Lee Harvey Oswald or Tim McVeigh).
Maybe we’re finally over this GI Joe fixation. Now that the generation of Americans that was subject to an active draft* is passing gradually from elective viability, maybe this ridiculous issue will just go away. Perhaps this will be the single good precedent established in the otherwise vile campaign of 2012. Because after decades of treating a career without a stint in uniform as something akin to treason, no one seems to care much this year that we have a band of civilians, as opposed to brothers, vying to lead us.
Before we add this milestone to the Wikipedia page for 2012 presidential election, however, we need to consider a more likely reason for this reasonableness. Tactically speaking, neither President Obama nor Mitt Romney has anything to gain talking about the topic. Obama, of course, is too young to have been subject to the Vietnam draft and too much a product of the late baby boom to have considered volunteering for the demoralized, post-Vietnam force of his youth. He’ll rest on his drone record and perhaps getting U.S. troops out of Iraq. He didn’t have that luxury vying against former POW Sen. John McCain in 2008.
Romney, in the great tradition of Dick Cheney, had better things to do during the Vietnam War—in his case, a marriage, lots of kids, and a Mormon Church-sponsored evangelizing mission to Paris (that’s France, not Texas). Twice, his university studies earned him “2-S” deferments—the Vietnam-era “free pass” for people lucky enough to be able to afford college (and maintain good grades there).
Vice President Joe Biden and Rep. Paul Ryan are a generation apart—meaning that Biden’s deferments have been attacked in the past by his unsuccessful electoral opponents. The fact that his son is an Iraq War vet sets him apart, though, from about 99.9 percent of Congress. (A fact not lost on the Obama campaign, which has added former Army Maj. Beau Biden to the campaign show.)
Ryan, like Obama, never faced the prospect of a draft. To the extent that he has faced criticism for not serving—and let’s face it, any politician who regularly stands in front of town-hall crowds is eventually criticized for just about everything—there’s no significant record of it, though Gen. Martin Dempsey, chairman of the Joint Chiefs of Staff, did allude to the fact after he felt Ryan had called him a liar at budget hearings in 2011.
It’s never been clear whether military service has been regarded as a prerequisite for a presidential candidate or whether the fact that America finds itself embroiled in war almost constantly renders this merely a question of physics. But the history can’t be denied: George Washington, Andrew Jackson, Ulysses Grant, and Dwight Eisenhower are only the highest ranking of this group. (Wikipedia maintains a full list of who did and didn’t serve.)
In the 20th century, the prevalence was even more striking. William McKinley volunteered to fight for the Union in the Civil War and fought at Antietam, to this day the bloodiest single battle in American history. His vice president (and, as fate would have it, successor) was the original Rough Rider, Teddy Roosevelt. Truman fought in World War I. Eisenhower defeated Hitler, of course, and John F. Kennedy won the Navy and Marine Corps Medal for his PT-109 bravery in the Pacific. Richard Nixon, Gerald Ford, Jimmy Carter, Ronald Reagan, and both Bushes all served, as did many of their opponents—George McGovern flew B-24 bombers in World War II; both Walter Mondale and Michael Dukakis served in the Army during the mid-1950s.
Until recently, those who did not serve appeared vulnerable. Candidates without medals, chevrons, or epaulets have tried to make up for it with their vice-presidential picks. Bill Clinton notably chose Vietnam vet Al Gore in 1992, in part because Clinton himself faced sustained attacks on his own alleged “draft dodging” during Vietnam for accepting precisely the deferments that saved Romney, Biden, and millions of others from the war.
Having a military record didn’t prevent attacks either. George W. Bush faced charges of manipulating the system and avoiding combat through his father, a congressman and eventually CIA director (not to mention the 41st president of the United States). Reagan took flak for service that entailed making war movies in Hollywood, though historical records suggest he took no steps to avoid service overseas. (Oddly, Nixon never faced such complaints, though he spent his war largely in sunny California, too.)
Nor did exposure to combat save you from enemy fire on the home front. Enemies found ways to challenge the war records of McCain (shot down, tortured, and imprisoned by the North Vietnamese), George H.W. Bush (who ditched his carrier plane in the Pacific after being hit by Japanese anti-aircraft fire), and most notably, John Kerry, a Bronze Star winner who was wounded in Vietnam and subjected to the slanderous “swift boating” ad blitz of the 2004 race.
Hopefully, military service will dwindle from here on as a litmus test for the presidency. In this modern age, it’s just not realistic to disqualify more than 90 percent of the U.S. population on this basis. It would be like insisting that all our presidents go to Harvard or Yale! (Sarcasm intended, by the way.) More seriously, a recent Pew Research Center survey found that, at any given time, only about 1 percent of the U.S. population is serving and that citizens with a history of military service represent a decreasing share of the overall population. Seems like a rather closed gene pool from which to choose.
But it’s almost certainly true that having a military record—at least since Vietnam—has been a very mixed blessing. The sad fact is that modern campaigning and the unhinged nature of our media today seem to favor candidates with the least “searchable” previous life, whether that means the lack of a military record (Obama) or the ability to withhold tax returns (Romney). In a world that struggles to distinguish between opinion and truth, any public record can be manipulated into a disadvantage because your enemies will never present it in context.
Kerry found this out the hard way in 2004, as did George W. Bush for his Vietnam-era National Guard service. Obama got the birthers in 2008, and now Romney’s business and tax records are in the crosshairs. Had he chosen Saigon over Paris when he was young, I doubt it would matter much. There would simply be more grist for the mill.
*The actual draft was suspended in 1973, with the U.S. withdrawal from Vietnam, and Gerald Ford then abolished the Selective Service registration requirement in 1975. But by the time I turned 18, Jimmy Carter had re-established it. Not many seem to know it, but it is still a requirement for men who reach age 18 to register for the draft. (Return to the annotated sentence.)
Posted Wednesday, Aug. 22, 2012, at 12:07 PM
A group of protesters pelt a Japanese restaurant with plastic bottles as they attend a rally against Japan's claim of islands known as Senkaku in Japan and Diaoyu in China last week.
Photo by STR/AFP/Getty Images
The most recent dustup between China and Japan in their long-running dispute over ownership of islets in the East China Sea has brought forth a new wave of coverage and vitriol, most of it provoked by the imperious way China is simply claiming as its own territory a good chunk of the coastline along the Pacific Rim of Asia.
The story hasn’t changed much in recent years, except that China has raised the temperature. China cites dodgy historical data and old maps (including the notorious “nine-dotted line” drawn up by the old Nationalist government) to lay claim to an area of open sea—some of which may sit over oil and gas reserves—the size of the Western United States. Worse, China—in keeping with Asia’s fetish for sovereignty over multinational mediation—refuses to discuss the issue and has recently escalated its disputes by naming a “regional government” for what otherwise seems more rightfully the realm of Neptune.
The wires are full of accounts of the thrust and parry of the past two weeks between Japanese and Chinese activists: Here’s a solid Associated Press look at the many conflicts in this region, along with a smarter piece from the Guardian on the cultural background.*
But what worries me is the relative complacency of the United States. As Democratic Sen. James Webb of Virginia pointed out this week in the Wall Street Journal, the United States has acted extraordinarily slowly to condemn China’s decision to raise the temperature of the dispute.
Secretary of State Hillary Clinton has raised the issue in public statements, but beyond lip service there has been no grand effort to convene a regional conference—and thus to call China’s bluff about its willingness to settle all issues peacefully.
In part, that’s because the United States has found itself on the receiving end of a lot of warm feelings from China’s cowed neighbors, most of whom have some stake in this dispute and worry the United States is the only thing preventing China from just taking what it wants. The list of new developments that flows from this dynamic is long:
- An advance force of Marines is stationed at a new base in northern Australia*
- Exercises with the Indian, Indonesian, Thai, Malaysian, and Filipino navies have stepped up
- Obama’s decision to brave China’s wrath by selling new warplanes to Taiwan last year
- The governing Japanese Democratic Party has changed its tune. Having come to office four years ago promising to put distance between Tokyo and Washington, the party has done an about-face on issues ranging from U.S. troops in Okinawa to the support Japan would offer in case of a crisis over Taiwan.
- A lifting of '90s-era limitations on U.S.-Indonesian military ties
- The Philippines is even asking us to return to Subic Bay and Clark Air Force Base, the colonial-era facilities it ejected us from in the early 1990s.
The United States should not overestimate its influence at this point or confuse the interest smaller Asian countries are showing in American military friendship with actual trust.
Webb, a Marine combat veteran of Vietnam and former secretary of the Navy, correctly states that the huge forward deployment of the U.S. military, especially the Navy’s 7th Fleet, maintains a crucial balance in the region, “providing the so-called second tier countries in the region [read: Japan, South Korea, Taiwan, Indonesia, Australia, and the Philippines] the opportunity to grow economically and to mature politically.”
Right as far as it goes. But Webb betrays a very American cultural myopia in his otherwise wise piece when he asserts: “Since World War II, despite the costly flare-ups in Korea and Vietnam, the United States has proved to be the essential guarantor of stability in the Asian-Pacific region ... ”
Well, maybe so, but those are some pretty goddamn huge caveats. Indeed, looked at another way (through Asian eyes, for instance), only one war since World War II in Asia, the India-Pakistan War of 1971, came anywhere near the carnage caused by the “costly flare-ups in Korea and Vietnam.”
Fatality figures are imprecise for such conflicts, but a conservative ranking would go something like this: Vietnam (1962-1973), 3.5 million deaths, including 57,000 U.S. troops; Korea (1950-1953), 3 million fatalities, including about 37,000 U.S. combat deaths; India-Pakistan, about 1 million deaths. (As you may have gleaned, most of those killed in all three wars were not in uniform.)
If the United States wants to make a difference in Asia, it will need to get serious about bringing China and its neighbors to an international conference. The UN, stymied by the outdated P-5 veto, will once again prove useless (as I’ve argued over and over). But the United States needs to take steps while we still have the influence to force a multinational dialogue. Otherwise, events will be driven by zealots who dive out of small craft and plant flags on uninhabited islands—risking a chain of events that puts millions of lives at risk.
Correction, Aug. 22, 2012: This article misstated the source of an article on conflicts in the Pacific Rim. It was written by the Associated Press, not Washington Post reporters, and published in the Washington Post. Additionally, this article misstated the number of people killed in the Vietnam War. The estimated number of deaths was 3.5 million, not 3,500.
Posted Tuesday, Aug. 21, 2012, at 8:08 AM
Photograph by Raveendran/AFP/Getty Images.
The death this week of Meles Zenawi, the longtime leader of one of Africa’s most important countries, Ethiopia, provides a good opportunity to consider the generational shift now under way on a continent that will soon have as many people as China and India combined.
Yes, that’s right, according to the United Nations Population Division, 2 billion people will live on the continent by midcentury, representing 21 percent of the global total and a doubling of its current population. This, more than any other reason, is why the succession in giant Ethiopia (population 84 million) is important to the world.
Meles’ obituaries—being written as I type—likely will touch on all the significant phases of his political life: his role in overthrowing the murderous socialist dictatorship of the Derg regime in 1991; his early tenure in the 1990s, when he pledged to lead market and democratic reforms of the country, aligning it with the United States; the futile border war Ethiopia fought for years with Eritrea after the latter’s independence; and finally an ugly turn toward despotism after it appeared he would lose a re-election bid in 2005. His mass arrests and the shooting of hundreds of demonstrators that year permanently scarred his reputation, even in Washington, for which Meles proved a willing proxy in battling Islamists in neighboring Somalia.
For all the ups and downs, he did some things very well. Given the surge of humanity Africa will produce over the next 40 years, it is important to remember what works in Ethiopia today as a result of Meles, even as we hope his successor emerges peacefully and ultimately embraces a more open form of government.
Meles should be remembered primarily for the transformation of Ethiopia’s economy. The very definition of a basket case when he took power, it has grown by an average of 11 percent annually since 2004. Ethiopia still has poverty, and its fragile ecosystem leaves it vulnerable to drought. But Meles, by opening the economy and insisting on rational monetary and fiscal policies, made a recurrence of the famines that scarred his nation for centuries much less likely.
To the extent that Ethiopia evokes anything in the average American’s mind, it remains primarily an association with the horrific 1983-85 famine that killed about 1 million people. Maybe one of the country’s fantastic distance runners caught your eye during the Olympics, or perhaps you’ve had a sweetly talkative Ethiopian taxi driver with an advanced engineering degree, but more likely it is the Live Aid/Band Aid rock-and-roll famine relief movement that introduced the country to average folks.
Today, decades later, Africa has made progress in improving the metrics outsiders use to measure its economic and social welfare. Annual GDP and per-capita GDP growth are both up drastically. Infant mortality, malnutrition, HIV/AIDS, and malaria rates are all down drastically. A middle class is emerging that is holding governments accountable. Africa’s financial markets and banking sector have become a serious destination for global investors, requiring upgrades in regulation and transparency that have helped curb corruption.
Understand, all the old ills still exist. It’s just that the trajectory is almost all positive. One of my favorite statistics: Transparency International gives Nigeria a poor 2.4 (out of 10) on its Corruption Perceptions Index for 2011. Clearly, there is work to be done. But that’s the same score as Russia (tied for No. 143 globally), and just barely worse than the Philippines, Vietnam, and Mexico, all serious targets of Western investment.
Setting the bar low, of course, hardly constitutes a success. Nigeria’s finance minister, Ngozi Okonjo-Iweala, has a book coming out titled Reforming the Unreformable about her success in reducing corruption in Nigeria’s banking system. And that system is, indeed, regarded as an increasingly attractive target for foreign investment. But as her title suggests, she’s under no illusion that the job is done—only that it is, in fact, doable and that progress has been made.
So, why do we care? The speed with which Africa will double its population over the next 40 years demands that we care. Some thinkers—Paul Collier, for instance, or Robert Kaplan of the Atlantic—have long worried about Africa’s sustainability, with Collier (like me) on the optimistic side.
Kaplan, in his 1994 essay “The Coming Anarchy,” foresaw a continent that would essentially implode and become the incubator of mass dislocation and violence all over the planet. This was written just in the wake of the collapse of the U.S. intervention in Somalia and just before the carnage of Rwanda’s genocide.
“Given that oil-rich Nigeria is a bellwether for the region—its population of roughly 90 million equals the populations of all the other West African states combined—it is apparent that Africa faces cataclysms that could make the Ethiopian and Somalian famines pale in comparison,” Kaplan wrote. “This is especially so because Nigeria's population, including that of its largest city, Lagos, whose crime, pollution, and overcrowding make it the cliché par excellence of Third World urban dysfunction, is set to double during the next twenty-five years, while the country continues to deplete its natural resources.”
Nearly two decades later, you could argue that the Democratic Republic of the Congo or Afghanistan proved his thesis. But Nigeria has defied his dire predictions, so far digesting its admittedly daunting population and religious challenges. Even counting the violence in Liberia and Sierra Leone, nothing approaching the disasters of Ethiopia or the state collapse of Somalia has recurred. Liberia, Rwanda, and Sierra Leone are on the mend and growing. Even Congo has stabilized somewhat—though it remains torn by warlordism.
Still, in most African countries now, a very clear picture of progress can be traced through World Bank or U.N. or IMF or WHO statistics—or conversations with anyone who visited the region 20 years ago and again today.
The wider view would be to remember that, at various points in the history of virtually every nation-state, the process of settling scores domestically makes waves internationally, whether that’s a medieval schism within Christianity, the U.S. Civil War, the Russian Revolution, or the well-meaning dislodging of Muammar Qaddafi, which has unleashed a flood of military-grade weaponry to the tribal and Islamist militants who roam Africa’s Sahara. Why would the 300-year process endured by Africa—slave trading, colonization, proxy wars, and resource mercantilism—be any different?
Even the worst places present a happier prospect today than Kaplan’s early '90s view—which, I’m quick to add, was quite reasonable at the time. Zimbabwe will succeed eventually once Mugabe dies. Libya, I would argue, is better off without Qaddafi in spite of the chaos the uprising sowed. Sudan and South Sudan have so far avoided the worst in their divorce, and recently signed the oil-sharing agreement we’ve all been waiting for that should lay the basis for growth in both. Somalia, still troubled, nonetheless continues to exist in spite of our “failed state” label. A new book by my former BBC colleague Mary Harper, Getting Somalia Wrong, points out how life goes on even there.
In Mali, DR Congo, and elsewhere, of course, troubles persist. But they seem containable today, and one great sign is that unlike earlier periods of phony “solidarity,” their African neighbors appear to understand the stakes. Mali has been suspended from regional groups because of its coup, and there are continuing threats of an intervention if new elections don’t restore democratic government soon.
And so, back to Meles. A man of his generation—the generation that casts off foreign rule or overthrows a bloody dictatorship—does not often turn out to be a model democrat. He put on a good act in the 1990s during the post-Berlin Wall euphoria, but his stupid war with Eritrea cost him domestic support, and then his real instincts took over. Awful decisions, especially when war with Eritrea began in 1998, cost thousands of lives. But on the life-and-death issue of creating a viable economy for his nation, he got it just about right, arguably saving millions of lives. If we want to avoid Kaplan’s world, other quasi-democratic states around the continent would do well to take note of Meles' victories as well as his mistakes.
Posted Thursday, Aug. 16, 2012, at 10:23 AM
An interview with Michael Moran by Reuters Digital Editor Chrystia Freeland for the Freeland Files.
Posted Thursday, Aug. 16, 2012, at 9:12 AM
Photo by Natalia Kolensnikova/AFP/Getty Images.
Mother Russia? Really? Does this strike anyone as an anachronism? Maybe it’s time, what with “Mother Russia” about to jail a bunch of lingerie-wearing, punk-rocking female political activists, for us to arrange a moniker trade. Germany—“the Fatherland”—seems wildly misnamed right now, for instance. With Angela Merkel’s finger controlling the flow of blood through Europe’s economic jugular, das Mutterland might be more suitable. Then Vladimir Putin and his boys can fulfill their judo-chopping, bare-chested destiny and become, at last, a mythical Slavic Vaterland.
I hear you—“What was in his cereal this morning?” you’re asking. Well, it’s all down to Pussy Riot, the female punk band facing seven years in prison in the latest, and most self-defeating, of the Kremlin’s overreactions to those who just want a little democracy in their country.
Many of my American readers will know Pussy Riot primarily because stars of stage and screen, from a bevy of Hollywood actors to Yoko Ono and Bono, have petitioned for their release. The three women, all in their 20s, are charged with hooliganism and sacrilege for performing an anti-Putin song last spring at Christ the Saviour Cathedral, the seat of the Russian Orthodox Church in Moscow.
The Kremlin was not amused, and within days a kangaroo court will issue a verdict that could mean up to seven years in Russia’s tuberculosis-infested prison system.
For some reason, Madonna’s protest on the band’s behalf during a Moscow concert was the thing that really pissed the Russians off. Madonna, of course, is no stranger herself to charges of sacrilege. But Russia’s former ambassador to NATO, summing up the private sentiments of Russia’s ruling elites, denounced her on Twitter as a “moralizing slut.” That’s Russian for “Madonna whore,” I think, which is the whole point of her persona, right?
“Either take off the cross or put on panties," he tweeted in his best diplomatic prose.
You really can’t make that stuff up.
Pussy Riot are no angels—and, again, that’s the point, right? But they’ve comported themselves in captivity in the finest tradition of patriots.
Nadezhda Tolokonnikova, a 22-year-old guitarist for the band, put it this way in her closing statements last week in Moscow: “This is a trial of the whole government system of Russia, which so likes to show its harshness toward the individual, its indifference to his honor and dignity. If this political system throws itself against three girls … it shows this political system is afraid of truth.”
You can hear the manly belly laughs emanating from the Kremlin.
Of course, there are wonderful Machiavellian (or should we say, Rasputinesque) opportunities here for Putin. By pardoning the “criminal” rock band, he can show that, underneath that iron exterior, there beats the heart of a bear (a bear mommy, of course, as this is still the Motherland). “Mother Bear gets quite angry sometimes when her cubs are naughty,” he might announce through the Kremlin’s spokesman. “But mommy loves you. Remember, anything I do to you is for your own good, darlings.”
Or, he can let the Russian Orthodox Church hold the bag. The church—allegedly the “offended party” here—has given no quarter, Christian charity be damned. The bearded zealots (and that’s just the women) feel violated by Pussy Riot’s sacrilege and want the full force of the law to rain down upon their harlotty heads in this world, before their thongs are burned off of them in a fiery, eternal afterlife in hell.
This is the “mother church” of Orthodox Christianity. Nice.
Whatever course Putin’s government chooses—and don’t kid yourself, his writ sways the courts, too—the Kremlin seems determined to drive home its point: A Pussy Riot is a spectacle, but for real, unadulterated violence and vindictiveness, a penis still helps.
Posted Wednesday, Aug. 15, 2012, at 12:07 PM
I'm gonna win one for The Flipper!
Photograph by Marc Piscotty/Getty Images
Is there no way to stop Republicans from pretending to be Ronald Reagan? One of the most painful and least valid aspects of the introduction of Rep. Paul Ryan as Mitt Romney’s running mate has been the clumsy but predictable effort to don “the mantle of Reagan.”*** Can we retire the man’s coat, already? If there is one thing that is clear about the modern Republican Party it is that none of the characters who have shuffled on and off its “frontrunner” list in this cycle have the shoulders for that garment.
Certainly not Mitt Romney. In spite of the historical revisionism about The Gipper since his death, he was neither Rambo nor a man who refused to raise taxes. Reagan was a conciliator, and his leadership style was practical even when his rhetoric was overblown. And Reagan was a creature of the midwestern middle class – the kind of guy whose “mantle” would be a “respectable Republican cloth coat,” as Richard Nixon described it so many years ago. Mitt’s bespoke silk-lined wardrobe hardly fits that humble standard.
Enter Paul Ryan, legitimately a son of the Midwest and of America’s middle class. Back in May, with a speech at the Reagan Presidential Library, he placed his own bid on the Ronald’s Technicolor Dreamcoat, promising that if Mitt Romney won in November, the GOP would turn 2013 into 1981 all over again.
“Romney Ratifies Reagan with Ryan Pick,” intoned Fox News.
With Romney offering so little detail on his own budget priorities to date, one has to assume at this point – whatever Romney may say to the contrary – that the budget Ryan championed as House Budget Committee chairman, first in 2011 and then again earlier this year, represents the campaign’s baseline.
So, if a Romney-Ryan ticket were to win, would an effort to ram the Ryan budget through Congress in 2013 satisfy the most important litmus test of the modern GOP: “What Would Reagan Do?”
Set aside the question of whether the 1981/2013 comparison is valid (the case is dubious at best) and let’s look at what Reagan actually did when he took office. In fact, rather than attack government spending with the Sword of Damocles, or even the Pen of Ryan, as the sepia-toned remembrances of party lore would have it, Reagan worked quite deftly with a Congress in which Democrats controlled the House.
Faced with a difficult economy and divided government, Reagan chose to mix tax cuts with stimulus spending and, yes, tax increases, too. Bruce Bartlett, an economic advisor to Reagan and a Treasury official under George H. W. Bush, professes amazement at the twisting of the historical record by those too young to remember the factual Reagan (as opposed to the guy in the red cape).
As I wrote in my book earlier this year, Bartlett reminds us, “The cumulative legislated tax increase during his administration came to $132.7 billion as of 1988 [$367 billion today]. This compared to a gross tax cut of $275.1 billion. Thus Reagan took back about half the 1981 tax cut with subsequent tax increases.”
Indeed, David Stockman, Reagan’s budget-cutting OMB director, denounced Ryan’s numbers as a “fairy tale” this week.
Imagine that! The alleged God of Tax Cuts actually raised taxes when he thought progress, political, economic or otherwise, required him to do so.
Fat chance we’ll see a Romney White House governing so practically. Certainly, that would not be the approach favored by Ryan. And most certainly, the House GOP – assuming as I do it remains a stronghold of the Tea Party – would never allow it. Reagan would have been denounced as a traitor, or a socialist or worse. (Was he really born in Illinois? And what about that nickname – Dutch! We’d better check the birth certificate.)
Sadly, many Democrats, too, prefer fantasy to fact—though perhaps more as political strategy than economic concept. Back when Paul Ryan first appeared on the scene, in early 2011, Democrats completely abandoned any effort to put Medicare’s trust fund on a sustainable course when polls revealed that Ryan’s proposal to reform it was political poison. Talk of compromise on this vital topic was replaced by tongue-in-cheek bumper stickers like Vote Republican, End Medicare.
Once again, one of our two parties confused partisan interest with national interest. While Ryan’s plan does indeed ultimately end Medicare in its current form and represents a radical view, it could have also been the basis for a real discussion of how to save Medicare. No one should operate under the illusion that the program—or, for that matter, Social Security and Medicaid—can continue unreformed without destroying the U.S. economy’s ability to grow. But politics, even two years before the 2012 presidential election, won the day.
Today, Medicare looks even less sustainable than it did in 2011. What’s the Democratic plan for saving it? So far, it seems like the plan is to confine any sentient conversation to two syllables so that Florida newspapers can write good headlines. If you wonder why American voters seem to be veering from right to left and back again in one election after another, this is part of it: “Victory” to both parties is defined in terms of votes rather than what is good for the United States and its citizens. What a shame its citizens aren’t just a bit harder to fool.
***An entertaining way to spend a few moments is to Google “mantle of Reagan.” You’ll find the usual suspects struggling to don it, of course, but also some real laughers.