The Future of American Power

Sept. 10 2012 12:05 PM

Watch on the Rhine: A German Decision on the Euro

Business as usual on the Rhine, for now.

This week brings another inconvenient intervention by “national institutions” in the workings of the multinational Eurozone, that portion of the even larger European Union that uses the euro as its currency. On Thursday, Germany’s constitutional court will rule on whether the creation of a EUR 700 billion “rescue fund” to bail out tottering member states like Ireland and Greece is a violation of Germany’s own national constitution, known as the Basic Law.

More than 12,000 Germans signed a petition asking the high court to strike down the bailout fund – the European Stability Mechanism (ESM) – which they argue violates a constitutional clause requiring Germany’s parliament, and not an outside body, to decide whether to expose the country to long-term debts.

This has the potential to cause an enormous legal train wreck – in essence, a restart moment for the European debt crisis that would be a major shock to already jittery global markets. Many legal experts expect a fudge, assuming the justices know an outright ruling against the ESM would cause havoc. Markets would tumble, the euro with it, and it could just be the straw that breaks the Eurozone’s back.

But these experts have no way of knowing how much tolerance is left in German minds for the endless bailouts already undertaken, nor how that may be weighing on judges.

Such surprises have become commonplace in recent years. In 2005, French voters rejected in a referendum a proposed EU constitution that would have created a much more powerful European government. Then Irish voters rejected its successor, the so-called Lisbon Treaty, in 2008. Since unanimity was required, those votes scrapped the initiatives.

What is striking, looking back at these two referendum campaigns, is that none of the reforms proposed in either of those documents would have addressed what has turned out to be the EU’s true shortcoming: the lack of a common fiscal policy.

The EU Constitution, which was never ratified, contained clauses providing greater powers for European versus national law, provisions for a common foreign policy and lofty words about shared European values.

Like the later Lisbon Treaty, the constitution said nothing about sharing the pain of a national bankruptcy, or issuing common “Eurobonds” to fund deficits, or enforcement or ejection provisions to be used if a country flouted the limits on government spending meant to keep the bloc solvent.  

So what is a German constitutional justice to do? Unlike the U.S. Supreme Court, Germany’s highest court has regularly backed the notion that international laws and treaties can override constitutional concerns in some cases. The principles laid down at Nuremberg after World War II – the idea that just because it’s the law doesn’t mean it is right – still weigh heavily on the country’s legal code, and German law bends to international will at times.

But Germany is reaching a tipping point. Whatever is wrong or right about its decision to force radical austerity on its weaker Eurozone (EZ) partners – and much is wrong, to my mind – Germans themselves are asking when they get a vote in how their tax money is spent.

Early in the Eurozone crisis, the German electorate was in a more generous mood: the German economy was defying gravity and exports were booming. But German GDP has barely budged since the start of 2011 – not a recession, but certainly a fall back to Earth. The jobless rate is 5.5 percent – not bad compared with Spain’s 25 percent – but exports and order books look slack.

Politically, there is no doubt that the bailouts of Greece and other countries are viewed by most Germans as an unfair handout to less hard-working cultures. This view is not entirely fair, as I’ve frequently written. The fact is, as with the US and its Chinese creditor, there is a co-dependency here. German banks would not have overextended themselves in the “periphery” if German industry hadn’t been making a killing there. The enlarged euro currency zone turned parochial German manufacturing companies into regional and then global powerhouses, pushing national favorites into bankruptcy. So there’s plenty of blame to go around.

Still, the ugly mood and German exasperation make Thursday’s ruling a crucial, potentially decisive moment. While the ESM rescue pool is jointly funded, Germany’s share, at 27 percent, is by far the largest. Since the bailouts of weaker EU states began in 2010, according to the German think tank Ifo, Germany has exposed itself to some $831 billion worth of liabilities to prevent the bankruptcies of Eurozone members Greece, Ireland, and Portugal – and this doesn’t include the many billions in dicey loans to these countries made by Germany’s leading banks, which would have to be bailed out, too, if a default were to occur.

No end is in sight, either. Last week, the cost of financing Spain’s much larger debts rose once again toward 7 percent – levels that are unsustainable. The Eurozone’s central bank, the ECB, intervened with soothing words about supporting Spain and other troubled EZ countries by buying their sovereign bonds – in effect, lending them money.

But the long-term solution has to involve reforms that give the central bank more authority to discipline member states that overspend (or lie about it, as Greece regularly did). Otherwise, Spain will eventually need a bailout, and the Germans know they will again be asked to pay.

Back at the constitutional court, located near the Rhine in Karlsruhe – far from Berlin, a symbol of post-war Germans’ determination to prevent over-centralization of their government ever again – over-centralization is again at issue. Political issues are supposed to be irrelevant, of course. Jurists will argue that Germany’s constitutional judges will decide the ESM case on legal merits. And so they might. Indeed, you can argue yourself in circles trying to figure out how applying a more nationalist filter might affect a judge’s thinking: Is it more dangerous to blow up the ESM and risk the quick, sharp pain of default that would follow, or is death by a thousand cuts, with Germany doling out rescue funds for a decade or more, the darker scenario?

The absurdity of this dilemma illustrates another point: the EU, or at least the EZ, suffers not only from a “democratic deficit” but from an acute lack of leadership, too. The debate right now should not just be about cutting more spending in Greece, Portugal, Spain, and Ireland, which have already cut severely. Yes, flabby labor laws and tax codes need reforming, but these economies cannot cut their way to health while remaining tied to a currency as powerful as the euro. Who’s going to buy a Greek anchovy when the Tunisian ones are just as salty and cost half the price?

Greece, and possibly others, need selective reintroductions of their old national currencies, giving them the ability to devalue, cheapen their exports, lower labor costs, and regain competitiveness – all with a path back to rejoin the larger bloc if they meet certain goals. At the very least, this gives them ownership of their own problems, a first step to incentivizing real solutions.

This penalty box approach (which I wrote about back in June) has complications, of course, that make it only a bit less painful than doing nothing. But so far, doing nothing – the Eurozone’s current policy – is only putting off the day of reckoning.


Sept. 4 2012 10:34 AM

Better Off Than Four Years Ago? Depends on How You Feel About Bubbles.

President Barack Obama delivering an address in Washington last May.

Photograph by Kristoffer Tripplaar-Pool/Getty Images

A great deal more will be written on the question of whether an incumbent president who presided over an entire term of disappointing jobs and GDP growth deserves re-election. This is, of course, precisely the kind of question—devoid of context and precooked to elicit a particular answer—that mass market democratic contests usually turn on.

Are we better off today than we were four years ago?

Economically, the answer is not in doubt: Yes, we are. But there are lots of caveats.  

First, the view that President Obama wants to emerge from Charlotte: Four years ago the country was sliding over the edge of an economic cliff. Today, we’ve got one leg back on top, and even with the Republican congressional caucus holding onto the other leg and screaming, “I’d rather fall to my death than climb back onto that debt-strewn precipice”—we’re clawing our way to safety.

All this is true enough.

But is everything all right? No.

Is the country happier than it was four years ago? Hardly.  

Is the country more certain of its future? Perhaps, but if so, we don’t like what that future looks like, and we’re desperate to blame someone else for our misfortunes (China, immigrants, Muslims, bankers, socialist Kenyans).  

Are the structural economic problems that caused the 2008 debacle solved? Mostly no.

Are we getting back to normal? Well, of course not: Times were not normal to start. To get back to that normal would be national suicide—an asset-bubble-fueled normal more unsustainable than anything either of our political parties is flirting with today.

The "good old days" is another staple of dumbed-down campaigning, and it is in full view today. In reality, “normalcy,” the alleged “promised land” that Warren G. Harding offered his Republican constituents in the 1920 presidential race, is always unattainable.*

In Harding's day, it disappeared when combustion engines replaced horses. Harding knew people pined for that simpler time—when don’t they? But it turned out there was no going back to the days before World War I then, and there’s no going back to the false prosperity of the bubbly Bush years now, either. Harding’s promise was a sham, and the absentee governance of his administration, and those of his two GOP successors, Calvin Coolidge and Herbert Hoover, set the stage for the collapse that followed.

Do we really pine for the bubble years? Remember, folks, the “prosperity” now implied by those who ask about “four years ago” was fueled by a runaway financial system that treated people’s homes, jobs, and lives like so many chips in a casino.

Would we be “better off” if the bubble loomed over us again?  No, we’d be walking toward an even deeper cliff.

The fact is, a “balance sheet recession”—a recession caused by reckless spending and our unwillingness to raise the revenues necessary to cover expenses—takes longer than other downturns to mend. It’s the difference between the headache that results from having a few too many glasses of red wine and the medically induced coma that comes from mixing your red wine with steroids, Ecstasy, West Texas moonshine, and anything else high-spending lobbyists put in your hands. The latter scenario does real damage: a near-death experience with a long recovery time and no guarantee that you will ever again be the same.

That said, Americans have plenty of reason to be disappointed in Obama’s performance. He took the high road too often on economic policy in the beginning, failing to lay out for Americans a realistic timeline for recovery or explain the extent of the hole they were in. “Yes We Can” can-doism led to candy-coated politics. He should have been driving home the lesson of the Bush years (and, to be fair, Clinton’s second term): deregulation of markets is not a religion, it is a theory. It has limits, and those limits must be monitored, or we will destroy the economy.

But he did not do that. Today, as he prepares for his second nomination speech, there will be no Gospel-like refrain that calls to mind those days when most everyone was glad to see him—a "pay any price, bear any burden" kind of phrase harkening back to a time when the whole country was yearning for leadership.

So yes, he steered us away from the cliff. But shortly thereafter, he seems to have fallen asleep at the wheel.

Ironically, his economic policies are not the real problem. Again, this was always going to take a long time to solve. We can argue whether there should have been more stimulus (I think so). But on the finer economic points, the general direction has been correct.

The Economist, usually inclined toward centrists of either party, cannot bring itself to completely condemn Obama, just as it cannot seem to believe what Mitt Romney has morphed into. The magazine noted this week in a clever “End-of-Term” report on Obama’s first term, “His handling of the crisis and recession were impressive. Unfortunately, his efforts to reshape the economy have often misfired. And America’s public finances are in a dire state.”

Recessions, as Europe demonstrates every single day, are no time to cut government spending: the result is a vicious circle in which austerity kills growth and deficits become nearly insurmountable (especially in countries that have to fund them on the open market). So even if deficits rise during a recession, the idea is to hasten the return of growth that, in the end, is the only real solution to such gaps.

As the Economist points out, “normal standards of fiscal rectitude have not applied in the past four years. When households, firms and state and local governments are cutting their debts, the federal government would have made the recession worse by doing the same.”

How is it that a man who rode to office on such a raft of soaring rhetoric has failed to put that last point in simple language for the American voter? Whatever the depth of the damage he inherited or the obstructionism of his opponents, it could turn out that the lack of a mantra sinks him. At such times, a few choice words uttered under pressure—“nothing to fear but fear itself,” “morning in America”—are worth a million lines of legislation.

Correction, Sept. 5, 2012: This blog post originally referred to Warren G. Harding's 1924 presidential race. Harding ran for president—and won—in 1920.

Sept. 3 2012 12:27 PM

Can We Get Big Money Out of Our Campaigns? Yes We Can (but Not Right Now)

"All questions in a democracy [are] questions of money."

--Mark Hanna, 1896 campaign manager for GOP candidate William McKinley

Well-meaning just doesn’t cut it anymore. The history of attempts to prevent money from perverting our politics is as old as the Republic, but only in 1896 did Mark Hanna, a manipulative genius of the highest magnitude, turn political fundraising into a science.

The reaction of well-meaning people has been behind the curve ever since. Money finds a way.

As Jack Beatty noted in a wonderfully prescient piece in the Atlantic in 2007, William McKinley’s talent for channeling corporate money into his 1896 campaign won him the White House but also thoroughly blackened his name and compromised his administration.

After an anarchist murdered McKinley in 1901, his successor, Teddy Roosevelt, demanded a complete ban on corporate giving to presidential campaigns—something he got from Congress in 1907.

The Taft-Hartley Act of 1947 (enacted during a Democratic administration, over Truman’s veto, to continue the irony) extended the ban to labor unions. Campaigns, unions, and corporations continued to seek loopholes—Nixon famously had “bagmen” collect money from plutocrats like Yankees owner George Steinbrenner, helping spawn the Federal Election Commission in the reform-minded early 1970s. But for years, Taft-Hartley and TR’s 1907 reforms did for U.S. political campaigns what the Glass-Steagall Act had done for markets: ensured that efforts to manipulate them would be illegal.*

Alas, money found a way.

Eventually, in part due to the grandiosity of these “well-meaning” efforts, each one of these reforms failed, all falling afoul of those who claimed their right to free speech was being unconstitutionally curbed because (as they argued) spending money on campaigns is a form of speech. Add to the list the post-Watergate effort to ensure only public money was available for campaigning, efforts in the 1980s to ban soft money, the largely failed campaign to get candidates to forswear PAC money in the 1990s and 2000s, and of course the McCain-Feingold effort that began in 2002.

Well-meaning people still haven’t taken the lesson, though. Current efforts to curb out-of-control super PACs and corporate and labor donations still rely on high-minded constitutional arguments. Americans Elect, that late-night dorm-room attempt to inject reason into our politics, died on the vine. In fact, the solution is much simpler: Attack the conflict of interest that exists at the heart of these transactions, not the transactions themselves.

Rather than legislating what types of money are or are not dirty or attempting to define which class of citizen (corporate or otherwise) deserves free speech, Congress should simply create incentives for civic-minded behavior.

How? Well, start with corporations: The law should make political neutrality a requirement for doing business of any kind with the U.S. government, and reserve certain privileges for companies that comply—the right to classify all one’s income as investment income, for instance, or qualification for Federal Deposit Insurance.

Corporations could always defy that and eschew government business and benefits—that’s their choice as “people.” Or, if they truly love democracy, they can choose to donate to the federal public campaign financing fund (the one funded by the little $3 check box at the top of your tax return). Banks could do the same, but interest rates on the cheap money being poured into their balance sheets by the Fed would go up. Ditto defense contractors, think-tank professionals, members of federal employee unions. Freedom, after all, isn’t free.

This at a stroke would eliminate most of the corporate money and special interest money in our politics today. Indeed, it would eliminate the donations of a good number of the unions (whose pension funds are backed by U.S. pension insurance), and a whole lot of so-called nonpartisan and bipartisan groups (that avoid taxes with those phony monikers).

But wouldn’t that leave the country at the mercy of the billionaires and their super PACs—figures the right love to hate (Warren Buffett, George Soros) and their counterparts hated by the left (Sheldon Adelson, Foster Friess)? Not if the law were properly configured.

To discourage the kind of political vandalism our moneyed class is funding this year, I would advocate changes to U.S. tax law that tie excess campaign donations to the tax code. In effect, anyone donating beyond the $50,000 individual limit—itself a huge amount to most people—would no longer be able to avail themselves of itemized deductions or special tax vehicles of any kind. The cost of doing that kind of business on our democracy, in effect, would be applying the 1040EZ formula to plutocrats. It’s a way to balance the budget, too!

One can imagine how Romney would feel about this. But don’t be unfair: The Democrats, in this cycle, may not be as rich, but they are at least as hypocritical.

Look at what’s going on in Charlotte, N.C., this week. Little real business will be conducted—we all know President Obama will leave Charlotte at the top of the ticket, and that Smokin’ Joe Biden will be beside him.

But thanks to the failure of one BIG IDEA after another on campaign finance reform, there will be plenty of money changing hands in Charlotte, too. Even though the Democrats nominally banned corporate money from their convention back in February, like most such vows these days, this one was not worth the paper it was printed upon.

By now, the activities of New American City Inc., the shell company created by a bunch of banks and other companies to get around President Obama’s corporate money ban, are as famous on the right as Romney’s super-secret (or nonexistent!) tax returns are on the left.

When AFL-CIO President Richard Trumka chose not to pony up extra funds for the convention—a protest against the DNC’s choice of “right to work” North Carolina as the host state—it may have put a crimp in party coffers. But, as Bloomberg notes, that will hardly put a crimp in the lobbying.

From airlines to investment banks to Disney, corporate lobbyists and senior executives will be present in force, sponsoring cocktail parties, shuttle services, raw bars, and after-hours entertainment. And, of course, many of the valued “super delegates” owe their very careers to these same folks who “educate” them with campaign money whenever times get tough. No questions asked—really—they do this just because they love democracy.

The Republican convention was, if anything, even more of a Gomorrah. But this is all part of the free market, of course, and since corporations are people, in Mitt Romney’s famous phrase, what’s the fuss about these “people” rubbing elbows with their elected representatives?

Google had a lavish media lounge, free copies were available from Xerox, and free Cokes from Coca-Cola. Freed from having to pretend not to be in the thrall of corporate America, all the Wall Street versus Main Street rhetoric of the Tea Party somehow got lost in the officially sponsored confetti.

Corporate receipts, according to Public Citizen, amounted to $55 million for the GOP gathering—added to $50 million the federal government ponies up for security and another $17.7 million in taxpayer funds available to, well, fund just about anything. (Now there’s a spending cut I can get behind). The Democrats are still claiming it’s all done with small donations, but they have refused to comment on New American City Inc., which is, of course, not affiliated with the DNC. Yes They Can!

How can we let this happen? The answer, simply put, is that well-meaning people do not fare well in our overly litigious society. In our system, money will find a way as surely as water will enter a basement. The tactics we have pursued to shut down corporate money have been struck down on the grounds of First Amendment objections and high-minded notions of a city on a hill. It’s time to take a more incremental, targeted approach. Because if it’s money that’s the problem, chances are that money is the solution, too.

(For two nice backgrounders on this travesty, here’s Steve Horn of Truthout and a more sober and specific look at the history of convention funding by Public Citizen.)

Correction, Sept. 4, 2012: This article originally misspelled the Glass-Steagall Act.

Aug. 28 2012 4:58 AM

At Last, the Left's Africa Gloom Fetish Is Lifting

Over the past few years, as I've been one of those touting a new, more optimistic trajectory for Africa, the predictable knee-jerk response from the left has been to call me either a fool or an apologist for Western neocolonialism. But like a handful of others, I persisted, pointing out that Western approaches to foreign aid often do as much damage as good, and that Africans are increasingly saying "no thanks" to our advice.

The geoeconomics of it all are clear: sub-Saharan Africa last year grew faster than any other region, and some of its best performing countries—while still poor—have been growing faster than 7 percent a year for more than a decade. That's a pace that changes realities and brightens futures.

My employer, Renaissance Capital, has skin in this game to be sure, with stakes in major housing, infrastructure, agricultural, and mining operations around the continent. Among other things, the bank's development arm, Rendeavour, is funding the construction of middle-class housing on the outskirts of Nairobi, Kenya, and of Ghana's capital, Accra.

The typical knee-jerk response to this is to ask, "How can you build middle-class housing when so many Africans lack basic housing?" The answer: because Africa has the fastest growing middle class in the world. As the Observer (the Guardian's Sunday title) noted this weekend, almost one-third of Africa's 1 billion people are in or emerging into the middle class. Do they not deserve a decent living?

Ian Birrell, an old Africa hand and columnist at the Guardian, did a good job of providing a mea culpa (or nos culpae, to be more precise) on behalf of the left with a column entitled "Our Image of Africa Is Hopelessly Obsolete." This was the kickoff to a series called New Africa in which Guardian correspondents explore these changes.

Thanks to the Guardian for this and for the corrective from Birrell. The motives for focusing on Africa's destitution were never in question—but the inability of anything else to get through the murk has created real obstacles to progress. So welcome to the club.

Now, if we can just get the American right to look at Africa as something other than a base for drones and potential haven for al-Qaida ...

(On a related note: a bit of an update on an African topic explored by a guest blogger on The Reckoning a few months back—contraception in Africa. As always, the imbeciles came out of the woodwork to harangue my friend, Randi Hutter Epstein, a doctor and medical journalist of 20 years. Here's a debate between a Gates Foundation expert and a Catholic Nigerian obstetrician, taking admittedly predictable positions on the topic. But the point is—the debate is valid, no matter what Slate's snarky commentariat may say over their brie and Malbec.) - MM.

Aug. 27 2012 6:42 AM

Let's Play "Name That End-of-Summer Holiday!"

Juvenile gulls fly above Blackpool, England, on August 21.

Photo by Dan Kitwood/Getty Images

I don't remember the first time I heard the term Bank Holiday Weekend, but I definitely remember the light bulb that went on when I realized this was not simply a generic term for a day when banks don't open.

In 1993, the first year I lived in Britain (during my first stint here as U.S. affairs analyst at the BBC), my colleagues kept asking me what I was planning to do during the Bank Holiday weekend, and I suppose only after the first few times that odd phrase was used did it dawn on me that this might actually be the official name of Britain's end-of-summer long weekend.

Today, those of us who live and work on this island, as I do at the moment, have a day off—the end of a long weekend that takes place each year a week before America's own Labor Day (or Labour Day, as they would have it here).

This got me to thinking: In a nation with mellifluous surnames like "Mrs. Tiggywinkle" and colorfully named streets (Seven Sisters), squares (Piccadilly), insurance companies (Scottish Widows), and even Shadow Chancellors of the Exchequer (Ed Balls!!!), how can the populace stand a banality like "Bank Holiday Weekend"?

Indeed, there's even another one—the robotically named "Early May Bank Holiday," roughly equivalent to our Memorial Day. Never mind that the Left calls it May Day; the fact is, "Bank Holiday" seems to be about the best they could come up with. (The English Wikipedia page on this topic makes for interesting reading.)

My fellow Americans, and the rest of you, too, our British friends need our aid! Let's face it, given what "banks" have done for our countries of late, naming the two best secular holidays of the year after the scoundrels makes precious little sense. So, the nominations are open. I'll start with two of my own, but please, let's have at it, as the Brits would say.

And to all my British friends, Happy Bank Holiday weekend. May it be the last!

My nominations:

MAY HOLIDAY: Liberation Weekend. On June 6, 1944, U.S., British, and Canadian troops began the Western allies' part of the liberation of Europe in World War II: D-Day. The risk that this date will go the way of Dec. 7, 1941 (largely forgotten today in spite of its alleged "infamy"), is all too real now, with the generation that lived it fading away. Not a bad idea to remind the French, too, whose tourist industry benefits from a smaller, more casual British invasion about this time each year anyway.

AUGUST HOLIDAY: Mountbatten Weekend. On Aug. 27, 1979, the IRA managed to plant a bomb in a small fishing boat used by the Earl of Mountbatten—the Queen's cousin—killing the storied Earl along with his grandson, a royal relative, and a young employee of the boat company. Mountbatten, whose career was checkered with quasi-disasters like the partitioning of India and the poorly planned Raid on Dieppe in 1942, nonetheless hardly deserved to die like this. But beyond the man, the holiday could also serve to remind people of the foolishness of the imperial era and the importance of an achievement that remains as fragile as it is undervalued here: the ending of the Northern Ireland conflict.

Aug. 23 2012 3:59 AM

The 2012 Field: A Band of Civilians

Irony—the cheap sunglasses of writing.

But in this case, I can’t resist donning the shades: The last time our two major parties nominated presidential candidates who had not served in the military was 1944, in the midst of the most violent war the U.S. ever fought. Neither Franklin D. Roosevelt nor his Republican opponent, New York Gov. Thomas Dewey, had ever marched across a freezing cold parade ground to a lousy breakfast of mass-produced chipped beef on toast (“S.O.S.” to us insiders). Somehow, the alleged “character issue” therein never arose.

Now, almost 70 years later, we’ve done it again. None of the four horsemen of the duolypse that is our two-party system has ever served a day in the U.S. military. To my mind, that’s neither good nor bad—it’s always been a false issue. The military, as anyone who has served in it can tell you, contains some brilliant leaders, some self-promoting nuts, and a lot of average Joes and Janes besides (not to mention the odd Lee Harvey Oswald or Tim McVeigh).

Maybe we’re finally over this GI Joe fixation. Now that the generation of Americans that was subject to an active draft* is passing gradually from elective viability, maybe this ridiculous issue will just go away. Perhaps this will be the single good precedent established in the otherwise vile campaign of 2012. Because after decades of treating a career that didn’t include a stint in uniform as something akin to treason, no one seems to care much this year that we have a band of civilians, as opposed to brothers, vying to lead us.

Before we add this milestone to the Wikipedia page for the 2012 presidential election, however, we need to consider a more likely reason for this reasonableness. Tactically speaking, neither President Obama nor Mitt Romney has anything to gain talking about the topic. Obama, of course, is too young to have been subject to the Vietnam draft and too much a product of the late baby boom to have considered volunteering for the demoralized, post-Vietnam force of his youth. He’ll rest on his drone record and perhaps on getting U.S. troops out of Iraq. He didn’t have that luxury vying against former POW Sen. John McCain in 2008.

Romney, in the great tradition of Dick Cheney, had better things to do during the Vietnam War—in his case, a marriage, lots of kids, and a Mormon Church-sponsored evangelizing mission to Paris (that’s France, not Texas). Twice, his university studies earned him “2-S” deferments—the Vietnam-era “free pass” for people lucky enough to be able to afford college (and maintain good grades once there).

Vice President Joe Biden and Rep. Paul Ryan are a generation apart—meaning that Biden’s deferments have been attacked in the past by his unsuccessful electoral opponents. The fact that his son is an Iraq War vet sets him apart, though, from about 99.9 percent of Congress. (A fact not lost on the Obama campaign, which has added former Army Maj. Beau Biden to the campaign show.)

Ryan, like Obama, never faced the prospect of a draft. To the extent that he has faced criticism for not serving—and let’s face it, any politician who regularly stands in front of town-hall crowds is eventually criticized for just about everything—there’s no significant record of it, though Gen. Martin Dempsey, chairman of the Joint Chiefs of Staff, did allude to the fact after he felt Ryan had called him a liar at budget hearings in 2011.

It’s never been clear whether military service has been regarded as a prerequisite for a presidential candidate or whether the fact that America finds itself embroiled in war almost constantly renders this merely a question of physics. But the history can’t be denied:  George Washington, Andrew Jackson, Ulysses Grant, and Dwight Eisenhower are only the highest ranking of this group.  (Wikipedia maintains a full list of who did and didn’t serve.)

In the 20th century, the prevalence was even more striking. William McKinley volunteered to fight for the Union in the Civil War and fought at Antietam, to this day the bloodiest single battle in American history. His vice president (and, as fate would have it, successor) was the original Rough Rider, Teddy Roosevelt. Truman fought in World War I. Eisenhower defeated Hitler, of course, and John F. Kennedy won the Navy and Marine Corps Medal for his PT-109 bravery in the Pacific. Richard Nixon, Gerald Ford, Jimmy Carter, Ronald Reagan, and both Bushes all served, as did many of their opponents—George McGovern flew B-24 bombers in World War II; both Walter Mondale and Michael Dukakis served in the Army during the mid-1950s.  

Until recently, those who did not serve appeared vulnerable. Candidates without medals, chevrons, or epaulets have tried to make up for it with their vice-presidential picks. Bill Clinton notably chose Vietnam vet Al Gore in 1992, in part because Clinton himself faced sustained attacks on his own alleged “draft dodging” during Vietnam for accepting precisely the deferments that saved Romney, Biden, and millions of others from the war.

Having a military record didn’t prevent attacks either. George W. Bush faced charges of manipulating the system and avoiding combat through his father, a congressman and eventually CIA director (not to mention the 41st president of the United States). Reagan took flak for service that entailed making war movies in Hollywood, though historical records suggest he took no steps to avoid service overseas. (Oddly, Nixon never faced such complaints, though he spent his war largely in sunny California, too.)

Nor did exposure to combat save you from enemy fire on the home front. Enemies found ways to challenge the war records of McCain (shot down, tortured, and imprisoned by the North Vietnamese), George H.W. Bush (who ditched his carrier plane in the Pacific after being hit by Japanese anti-aircraft fire), and most notably, John Kerry, a Bronze Star winner who was wounded and subjected to the slanderous “swift boating” ad blitz of the 2004 race.

Hopefully, military service will dwindle from here on as a litmus test for the presidency. In this modern age, it’s just not realistic to disqualify more than 90 percent of the U.S. population on this basis. It would be like insisting that all our presidents go to Harvard or Yale! (Sarcasm intended, by the way.) More seriously, a recent Pew Research Center survey found that, at any time, only about 1 percent of the U.S. population is serving and that citizens with a history of military service represent a decreasing share of the overall population. Seems like a rather closed gene pool from which to choose.

But it’s almost certainly true that having a military record—at least since Vietnam—has been a very mixed blessing. The sad fact is that modern campaigning and the unhinged nature of our media today seem to favor candidates with the least “searchable” previous life, whether that means the lack of a military record (Obama) or the ability to withhold tax returns (Romney). In a world that struggles to distinguish between opinion and truth, any public record can be manipulated into a disadvantage because your enemies will never present it in context.

Kerry found this out the hard way in 2004, as did George W. Bush for his Vietnam-era National Guard service. Obama got the birthers in 2008, and now Romney’s business and tax records are in the crosshairs. Had Romney chosen Saigon over Paris when he was young, I doubt it would matter much. There would simply be more grist for the mill.

*The actual draft had been suspended in 1973, with the U.S. withdrawal, and then Gerald Ford abolished the selective service (registration) requirement in 1975. But by the time I turned 18, Jimmy Carter had re-established it. Not many seem to know it, but it is still a requirement for men who reach age 18 to register for the draft.

Aug. 22 2012 12:07 PM

Our Olympus Syndrome in Asia

A group of protesters pelt a Japanese restaurant with plastic bottles during a rally last week against Japan's claim to the islands known as the Senkaku in Japan and the Diaoyu in China.

Photo by STR/AFP/GettyImages

The most recent dustup between China and Japan in their long-running dispute over ownership of islets in the East China Sea has brought forth a new wave of coverage and vitriol, most of it provoked by the imperious way China is simply claiming as its own territory a good chunk of the coastline along the Pacific Rim of Asia.

The story hasn’t changed much in recent years, except that China has raised the temperature. China cites dodgy historical data and old maps (including the notorious “nine-dotted line” drawn up by the old Nationalist government) to lay claim to an area of open sea—some of which may sit over oil and gas reserves—the size of the Western United States. Worse, China—in keeping with Asia’s fetish for sovereignty over multinational mediation—refuses to discuss the issue and has recently escalated its disputes by naming a “regional government” for what otherwise seems more rightfully the realm of Neptune.

The wires are full of accounts of the thrust and parry of the past two weeks between Japanese and Chinese activists: Here’s a solid Associated Press look at the many conflicts in this region, along with a smarter piece from the Guardian on the cultural background.*

But what worries me is the relative complacency of the United States. As Democratic Sen. James Webb of Virginia pointed out this week in the Wall Street Journal, the United States has acted extraordinarily slowly to condemn China’s decision to raise the temperature of the dispute.

Secretary of State Hillary Clinton has raised the issue in public statements, but real pressure to convene a regional conference has been absent. Beyond lip service, there has been no grand effort to call a conference and thus call China’s bluff about its willingness to settle all issues peacefully.

In part, that’s because the United States has found itself on the receiving end of a lot of warm feelings from China’s cowed neighbors, most of whom have some stake in this dispute and worry the United States is the only thing preventing China from just taking what it wants. The list of new developments that flows from this dynamic is long:

  • An advance force of Marines is stationed at a new base in northern Australia
  • Exercises with the Indian, Indonesian, Thai, Malaysian, and Filipino navies have stepped up
  • Obama’s decision last year to brave China’s wrath and sell new warplanes to Taiwan
  • The governing Japanese Democratic Party has changed its tune. Having come to office four years ago promising to put distance between Tokyo and Washington, the party has done an about-face on issues ranging from U.S. troops in Okinawa to the support Japan would offer in case of a crisis over Taiwan.
  • A lifting of '90s-era limitations on U.S.-Indonesian military ties
  • The Philippines is even asking us to return to Subic Bay and Clark Air Force Base, the colonial-era facilities it ejected us from in the early 1990s.

The United States should not overestimate its influence at this point or confuse the interest smaller Asian countries are showing in American military friendship with actual trust.

Webb, a Marine combat veteran of Vietnam and former secretary of the Navy, correctly states that the huge forward deployment of the U.S. military, especially the Navy’s 7th Fleet, maintains a crucial balance in the region, “providing the so-called second tier countries in the region [read: Japan, South Korea, Taiwan, Indonesia, Australia, and the Philippines] the opportunity to grow economically and to mature politically.”

Right as far as it goes. But Webb betrays a very American cultural myopia in his otherwise wise piece when he asserts: “Since World War II, despite the costly flare-ups in Korea and Vietnam, the United States has proved to be the essential guarantor of stability in the Asian-Pacific region ... ”

Well, maybe so, but those are some pretty goddamn huge caveats. Indeed, looked at another way (through Asian eyes, for instance), only one war since World War II in Asia, the India-Pakistan War of 1971, came anywhere near the carnage caused by the “costly flare-ups in Korea and Vietnam.”

Fatality figures are imprecise for such conflicts, but a conservative ranking would go something like this: Vietnam (1962-1973), 3.5 million deaths, including 57,000 U.S. troops; Korea (1950-1953), 3 million deaths, including about 37,000 U.S. combat deaths; India-Pakistan, about 1 million deaths. (As you may have gleaned, most of those killed in all three wars were not in uniform.)

If the United States wants to make a difference in Asia, it will need to get serious about bringing China and its neighbors to an international conference. The UN, stymied by the outdated P-5 veto, will once again prove useless (as I’ve argued over and over). But the United States needs to take steps while we still have the influence to force a multinational dialogue. Otherwise, events will be driven by zealots who dive out of small craft and plant flags on uninhabited islands—risking a chain of events that puts millions of lives at risk.

Correction, Aug. 22, 2012: This article misstated the source of an article on conflicts in the Pacific Rim. It was written by the Associated Press, not Washington Post reporters, and published in the Washington Post. Additionally, this article misstated the number of people killed in the Vietnam War. The estimated number of deaths was 3.5 million, not 3,500.

Aug. 21 2012 8:08 AM

The Man Who Saved Ethiopia

The death this week of Meles Zenawi, the longtime leader of one of Africa’s most important countries, Ethiopia, provides a good opportunity to consider the generational shift now under way on a continent that will soon have as many people as China and India combined.

Yes, that’s right, according to the United Nations Population Division, 2 billion people will live on the continent by midcentury, representing 21 percent of the global total and a doubling of its current population. This, more than any other reason, is why the succession in giant Ethiopia (population 84 million) is important to the world.

Meles’ obituaries—being written as I type—likely will touch on all the significant phases of his political life: his role in overthrowing the murderous socialist dictatorship of the Derg regime in 1991; his early tenure in the 1990s, when he pledged to lead market and democratic reforms of the country, aligning it with the United States; the futile border war Ethiopia fought for years with Eritrea after the latter won its independence; and finally an ugly turn toward despotism after it appeared he would lose a re-election bid in 2005. His mass arrests and the shooting of hundreds of demonstrators that year permanently scarred his reputation, even in Washington, for which Meles proved a willing proxy in battling Islamists in neighboring Somalia.

For all the ups and downs, he did some things very well, and in the face of the surge of humanity that Africa will produce in the next 40 years, it is important to remember what works today in Ethiopia as a result of Meles even as we hope his successor will emerge peacefully and ultimately embrace a more open form of government.

Meles should be remembered primarily for the transformation of Ethiopia’s economy. The very definition of a basket case when he became prime minister in 1995, it has grown an average of 11 percent every year since 2004. Ethiopia still has poverty, and its fragile ecosystem leaves it vulnerable to drought. But Meles, by opening the economy and insisting on rational monetary and fiscal policies, made a recurrence of the famines that scarred his nation for centuries much less likely.

To the extent that Ethiopia evokes anything in the average American’s mind, it remains primarily an association with the horrific 1983-85 famine that killed about 1 million people. Maybe one of the country’s fantastic distance runners caught your eye during the Olympics, or perhaps you’ve had a sweetly talkative Ethiopian taxi driver with an advanced engineering degree, but more likely it is the Live Aid/Band Aid rock-and-roll famine relief movement that introduced the country to average folks.

Today, decades later, Africa has made progress in improving the metrics outsiders use to measure its economic and social welfare. Annual GDP and per-capita GDP growth are both up drastically. Infant mortality, malnutrition, HIV/AIDS, and malaria rates are all down drastically. A middle class is emerging that is holding governments accountable. Africa’s financial markets and banking sector have become a serious destination for global investors, requiring upgrades in regulation and transparency that have helped curb corruption.

Understand, all the old ills still exist. It’s just that the trajectory is almost all positive. One of my favorite statistics: Transparency International gave Nigeria a poor score of 2.4 (out of 10) on its corruption perceptions index for 2011. Clearly, there is work to be done. But that’s the same score as Russia’s (the two are tied for No. 143 globally), and just barely worse than those of the Philippines, Vietnam, and Mexico, all serious targets of Western investment.

Setting the bar low, of course, hardly constitutes a success. Nigeria’s finance minister, Ngozi Okonjo-Iweala, has a book coming out titled Reforming the Unreformable about her success in reducing corruption in Nigeria’s banking system. And that system is, indeed, regarded as an increasingly attractive target for foreign investment. But as her title suggests, she’s under no illusion that the job is done—only that it is, in fact, doable and that progress has been made.

So, why do we care? The speed with which Africa will double its population over the next 40 years demands that we care. Some thinkers—Paul Collier, for instance, or Robert Kaplan of the Atlantic—have long worried about Africa’s sustainability, with Collier (like me) on the optimistic side.

Kaplan, in his 1994 essay “The Coming Anarchy,” foresaw an Africa that would essentially implode and become the incubator of mass dislocation and violence all over the planet. This was just in the wake of the collapse of the U.S. intervention in Somalia and amid the carnage of Rwanda’s genocide.

“Given that oil-rich Nigeria is a bellwether for the region—its population of roughly 90 million equals the populations of all the other West African states combined—it is apparent that Africa faces cataclysms that could make the Ethiopian and Somalian famines pale in comparison,” Kaplan wrote. “This is especially so because Nigeria's population, including that of its largest city, Lagos, whose crime, pollution, and overcrowding make it the cliché par excellence of Third World urban dysfunction, is set to double during the next twenty-five years, while the country continues to deplete its natural resources.”

Nearly two decades later, you could argue that the Democratic Republic of the Congo or Afghanistan proved his thesis. But Nigeria has defied his dire predictions, so far digesting its admittedly daunting population and religious challenges. Even including the violence in Liberia and Sierra Leone, nothing approaching the disasters of Ethiopia or the state collapse of Somalia has recurred. Liberia, Rwanda, and Sierra Leone are on the mend and growing. Even Congo has stabilized somewhat—though it remains torn by warlordism.

Still, in most African countries now, a very clear picture of progress can be traced through World Bank or U.N. or IMF or WHO statistics—or conversations with anyone who visited the region 20 years ago and again today.

The wider view would be to remember that, at various points in the history of virtually every nation-state, the process of settling scores domestically makes waves internationally, whether that’s a medieval schism within Christianity, the U.S. Civil War, the Russian Revolution, or the well-meaning dislodging of Muammar Qaddafi, which has unleashed a flood of military-grade weaponry to the tribal and Islamic militants who roam Africa’s Sahara. Why would the 300-year process endured by Africa—slave trading, colonization, proxy wars, and resource mercantilism—be any different?

Even the worst places present a happier prospect today than Kaplan’s early '90s view—which, I’m quick to add, was quite reasonable at the time. Zimbabwe will succeed eventually once Mugabe dies. Libya, I would argue, is better off without Qaddafi in spite of the chaos the uprising sowed. Sudan and South Sudan have so far avoided the worst in their divorce, and recently signed the oil-sharing agreement we’ve all been waiting for, one that should lay the basis for growth in both. Somalia, still troubled, nonetheless continues to exist in spite of our “failed state” label. A new book by my former BBC colleague Mary Harper, Getting Somalia Wrong, points out how life goes on even there.

In Mali, DR Congo, and elsewhere, of course, troubles persist. But they seem containable today, and one great sign is that unlike earlier periods of phony “solidarity,” their African neighbors appear to understand the stakes. Mali has been suspended from regional groups because of its coup, and there are continuing threats of an intervention if new elections don’t restore democratic government soon.

And so, back to Meles. A man of his generation—the generation that cast off foreign rule or overthrew a bloody dictatorship—does not often turn out to be a model democrat. He put on a good act in the 1990s during the post-Berlin Wall euphoria, but his stupid war with Eritrea cost him domestic support, and then his real instincts took over. Awful decisions, especially when the war with Eritrea began in 1998, cost thousands of lives. But on the life-and-death issue of creating a viable economy for his nation, he got it just about right, arguably saving millions of lives. If we want to avoid Kaplan’s world, other quasi-democratic states around the continent would do well to take note of Meles’ victories as well as his mistakes.

Aug. 16 2012 10:23 AM

Video: Chrystia Freeland Interviews Michael Moran

An interview with Michael Moran by Reuters Digital Editor Chrystia Freeland for the Freeland Files.

Aug. 16 2012 9:12 AM

Pussy Riot v Mother Russia

Mother Russia? Really? Does this strike anyone as an anachronism? Maybe it’s time, what with “Mother Russia” about to jail a bunch of lingerie-wearing, punk-rocking female political activists, for us to arrange a moniker trade. Germany—“the Fatherland”—seems wildly misnamed right now, for instance. With Angela Merkel’s finger controlling the flow of blood through Europe’s economic jugular, das Mutterland might be more suitable. Then Vladimir Putin and his boys can fulfill their judo-chopping, bare-chested destiny and become, at last, a mythical Slavic Vaterland.

I hear you—“What was in his cereal this morning?” you’re asking. Well, it’s all down to Pussy Riot, the female punk band facing seven years in prison in the latest, and most self-defeating, of the Kremlin’s overreactions to those who just want a little democracy in their country.

Many of my American readers will know Pussy Riot primarily because the stars of stage and screen, including a bevy of Hollywood actors, Yoko Ono, Bono, et cetera, have petitioned for their release. The three women, all in their 20s, are charged with hooliganism and sacrilege for performing an anti-Putin song this past February at Christ the Saviour Cathedral, the seat of the Russian Orthodox Church in Moscow.

The Kremlin was not amused, and within days a kangaroo court will issue a verdict that could bring a sentence of up to seven years in Russia’s tuberculosis-infested prison system.

For some reason, Madonna’s protest on the band’s behalf during a Moscow concert was the thing that really pissed the Russians off. Madonna, of course, is no stranger herself to charges of sacrilege. But Russia’s former ambassador to NATO, summing up the private sentiments of Russia’s ruling elites, denounced her on Twitter as a “moralizing slut.” That’s Russian for “Madonna whore,” I think, which is the whole point of her persona, right?

“Either take off the cross or put on panties," he tweeted in his best diplomatic prose.

You really can’t make that stuff up.

Pussy Riot are no angels—and, again, that’s the point, right? But they’ve comported themselves in captivity in the finest tradition of patriots.

Nadezhda Tolokonnikova, a 22-year-old guitarist for the band, put it this way in her closing statements last week in Moscow: “This is a trial of the whole government system of Russia, which so likes to show its harshness toward the individual, its indifference to his honor and dignity. If this political system throws itself against three girls … it shows this political system is afraid of truth.”

You can hear the manly belly laughs emanating from the Kremlin.

Of course, there are wonderful Machiavellian (or should we say, Rasputinesque) opportunities here for Putin. By pardoning the “criminal” rock band, he can show that, underneath that iron exterior, there beats the heart of a bear (a bear mommy, of course, as this is still the Motherland). “Mother Bear gets quite angry sometimes when her cubs are naughty,” he might announce through the Kremlin’s spokesman. “But mommy loves you. Remember, anything I do to you is for your own good, darlings.”

Or, he can let the Russian Orthodox Church hold the bag. The church—allegedly the “offended party” here—has given no quarter, Christian charity be damned. The bearded zealots (and that’s just the women) feel violated by Pussy Riot’s sacrilege and want the full force of the law to rain down upon their harlotty heads in this world, before their thongs are burned off of them in a fiery, eternal afterlife in hell.

This is the “mother church” of Orthodox Christianity. Nice.

Whatever course Putin’s government chooses—and don’t kid yourself, his writ sways the courts, too—the Kremlin seems determined to drive home its point: A Pussy Riot is a spectacle, but for real, unadulterated violence and vindictiveness, a penis still helps.
