Justice Department Likely to Recommend Blocking Comcast-Time Warner Cable Merger
Lawyers at the Justice Department are getting ready to recommend blocking Comcast’s proposed $45.2 billion bid to merge with Time Warner Cable, sources “familiar with the matter” tell Bloomberg. The staff attorneys will reportedly submit their review of the deal as early as next week. From there, the Justice Department will decide whether to file a federal lawsuit to block the buyout. From Bloomberg:
The Justice Department lawyers have been contacting outside parties in the last few weeks to shore up evidence to support a potential case against the merger, one of the people said.
Furthermore, officials at the antitrust division and the Federal Communications Commission, which is also reviewing the deal, aren’t negotiating with Comcast about conditions to the merger that would resolve concerns, such as selling parts of its business or changing practices, said two people familiar with the situation.
A representative for Comcast told Bloomberg that there’s “no basis for a lawsuit to block the transaction” and that the merger will significantly benefit consumers. Obviously Comcast is going to say that. Then again, the company behind the worst customer service call of all time might not be the best authority on that matter.
Poor Children May Have Smaller Brains Than Rich Children. Does That Tell Us Anything?
Social scientists have found that by the time children enter kindergarten, there is already a large academic achievement gap between students from wealthy and poor families. We still don't know exactly why that's the case. There's a sense that it at least partly has to do with the fact that affluent mothers and fathers have more intensive parenting styles—they're more likely to read to their kids, for instance—and have enough money to make sure their toddlers grow up well-nourished, generally cared for, and intellectually stimulated. At the same time, poor children often grow up in chaotic, food-insecure, stressful homes that aren't conducive to a developing mind.
A new study in the journal Nature Neuroscience adds an interesting biological twist to this issue. Using MRI scans of more than 1,000 subjects between the ages of 3 and 20, it finds that children with poor parents tend to have somewhat smaller brains, on some dimensions, than those who grow up affluent. Specifically, low-income participants had less surface area on their cerebral cortexes—the gray matter responsible for skills such as language, problem solving, and other higher-order functions we generally just think of as human intelligence. Poorer individuals in the study also fared worse on a battery of cognitive tests, and a statistical analysis suggested the disparities were related to brain dimensions.
How big a difference are we talking about? According to the researchers, children whose parents earned less than $25,000 per year had 6 percent less surface area on their cortex than those whose parents earned at least $150,000.
This is the largest published study of its kind, and it could well help us understand more about why low-income children start off behind academically. But it's also a little early to be drawing much in the way of conclusions from the paper, especially about the causes behind its findings. Lead author Kimberly Noble, a professor at Columbia University's medical school and Teachers College, believes that physical differences between rich and poor kids' brains may trace back to the environments in which they grow up, and is beginning a new research project to test that theory. But that didn't stop Charles Murray, the conservative author of the discredited book The Bell Curve, from basically telling The Washington Post that the gaps must be the result of genetic inheritance:
"It is confidently known that brain size is correlated with IQ, IQ measured in childhood is correlated with income as an adult, and parental IQ is correlated with children’s IQ,” Murray wrote in an e-mail. “I would be astonished if children’s brain size were NOT correlated with parental income. How could it be otherwise?"
That there is what one calls irresponsible science commentary.
It's also important to note that while Noble and her co-authors found a statistically significant correlation between income and brain size, it was not particularly strong. As you can see on the graph below, there were plenty of low-income subjects with relatively large brains, and lots of high-income subjects with relatively small brains. The relationship between income and neural growth seems to be tighter at the very bottom of the income distribution, where children may well be subject to extreme degrees of deprivation. But as Noble put it to me, “You would never be able to look at a child’s family income and from that information alone predict their cortical surface area."
In other words, when it comes to brain development, poverty isn't destiny.
AmEx’s Terrible Year Is Getting Even Worse
American Express shares are plunging after the company reported first-quarter revenue below expectations Thursday afternoon. The slump doesn’t bode well for AmEx, which could really use a bit of good news. AmEx took a big hit in February after announcing that its 16-year partnership with Costco would come to an end. The news caught investors by surprise, and on Feb. 12 the stock tumbled 6.4 percent, or $5.53, its biggest single-day percentage loss since August 2011.
With shares already off 5 percent on earnings, Friday is shaping up to be another ugly trading session for AmEx, as you can see in the Yahoo Finance chart below:
American Express CEO Kenneth Chenault said in the company’s earnings release that the quarter showed “solid core performance” despite “an impact from several of the headwinds we’re confronting.” Those headwinds include the strong U.S. dollar, which is broadly hurting corporate profits, as well as continued fallout from the Costco loss. (According to Bloomberg, the Costco partnership accounts for one in 10 AmEx cards and 20 percent of its loans.) AmEx did beat quarterly estimates on profit, with earnings per share that came in at $1.48 versus an expected $1.37.
All in all, AmEx is having a really rough year. Its stock is down 17 percent since January, putting it among the worst performers in the Dow Jones Industrial Average. In late January, the company said it would eliminate 4,000 jobs later this year as a cost-cutting measure. And JetBlue Airways is also dropping its AmEx partnership to team up with Barclays and MasterCard instead.
AmEx cards tend to carry higher fees than those from competitors Visa and MasterCard, which hasn’t helped the company convince more merchants to accept it as a payment. Analysts are also concerned that any investments AmEx is making in improving its offerings will hurt before they can start to turn the company around. “My faith in management has been shaken,” Walter Todd, chief investment officer of Greenwood Capital Associates, an AmEx shareholder, told Bloomberg late last month. “They need to have more realistic goals about what their revenue-growth potential is.” If AmEx doesn’t, chances are Wall Street will.
Etsy Waited 10 Years to Go Public. Is That Slow or Fast?
Etsy's IPO was roughly 10 years in the making. Is that a long time or a short time?
Not surprisingly, the answer depends on which set of initial public offering data you look at—and how you look at it.
For example: A glance at the historical chart from Renaissance Capital, manager of IPO-focused ETFs, reveals that 10 years is actually the average age of all 63 companies that have filed for IPOs so far in 2015.
Not only that, but 10 years is the youngest average age in recent years—by far:
Year Average Company Age in Years at IPO
2010 16 (253 companies)
2011 15 (258 companies)
2012 20 (140 companies)
2013 16 (256 companies)
2014 16 (365 companies)
2015 10 (63 companies so far)
Granted, 2015 is still young. But if the trend established in the first quarter holds up, then it's accurate to say Etsy's 10-year span from birth to IPO is fast, compared with historical IPO ages.
And it's also accurate to say Etsy's age-at-IPO is average for the IPO class of 2015.
Mind you, all this is just one set of IPO data. The numbers change in an interesting way when, instead of looking at all IPOs, you look only at the subset of venture capital–backed IPOs. (Etsy, having raised nearly $100 million in angel and VC cash prior to its IPO, is among that group.)
When you look at VC-backed IPOs, you can see another clear trend: The mean time in years for VC-backed startups to exit via IPO has increased. In 2010, the mean time was 5.9 years. In 2013, the mean time was 8.1 years. One important note: These numbers reflect time from first funding, as opposed to company inception.
Year Avg. Years to IPO No. of IPOs Avg. IPO Amount
2010 5.9 70 $111 million
2011 7.0 51 $210 million
2012 7.8 49 $438 million
2013 8.1 81 $137 million
Source: 2014 NVCA Yearbook, p. 73
These data make it clear that it's taking longer for VC-backed companies to IPO. And the trend appears to have held steady in 2014. In a column for Forbes, Bruce Booth dissected the ages of 2014's VC-backed IPOs in biotech and software. The mean age (again, from first funding to IPO) of VC-backed biotech companies that had an IPO in 2014 was 7.4 years. For software companies, it was 8 years.
So any way you slice it, you're still talking about at least a seven-year span between first funding and IPO.
In some prominent 2015 IPOs, the span was longer than seven years. Etsy, first funded in 2006, had a nine-year span. Box, the online storage provider that went public Jan. 22, also had a nine-year span. It was founded (and backed by Mark Cuban) in 2005, and VCs first funded it in 2006.
Of course, there are some recent IPOs whose spans were shorter than seven years. Shopify, the e-commerce software company that went public earlier this week, was founded in 2004, but its first funding came in 2010. Likewise, Hortonworks and New Relic—two VC-backed software companies that went public late last year—also had shorter-than-seven-year spans from first funding to IPO.
So if you're wondering whether Etsy's time to IPO is long or short, the answer is just three words long: compared with what?
Which Workers Are the Most Depressed?
Gallup recently decided to rank the occupations in which workers are most likely to suffer from depression. The takeaway: Your boss is probably in a pretty decent psychological place. Managers, executives, and officials were the least likely to have ever been diagnosed with depression. The clerical and office staff who toil beneath them, on the other hand, were diagnosed at somewhat higher rates. Meanwhile, workers in manufacturing or service industries were the most likely to have ever been depressed. The drudgery of clocking in on an assembly line or at a cash register apparently takes a toll, just in case you were wondering.
Anyway, remember kids: Money buys happiness.
How the Bush Administration Pointlessly Screwed Over Student Borrowers
There has never really been a good reason to bar Americans from discharging their student loans in bankruptcy. Back in the 1970s, a spate of newspaper stories claimed that unscrupulous college kids and law school grads were borrowing money from the government without planning to pay it back, knowing that they could just go to court and weasel out of their debts before they had any real assets to lose in the bargain. But, unsurprisingly, the reporting turned out to be mostly anecdotal trash that was later debunked in a study commissioned by Congress.
Didn't matter. In 1978, Capitol Hill passed a bankruptcy reform bill that, for whatever reason, limited borrowers' ability to discharge their federal student loan obligations. Over time, lawmakers tightened the rules to make it even tougher.
A similar story more or less repeated itself during the Bush administration. Major private lenders claimed they needed Congress to stop their customers from filing opportunistic bankruptcies. Despite the notable lack of evidence that this was actually happening, lawmakers listened, and inserted a clause into the 2005 bankruptcy reform bill making private student loans nondischargeable unless someone could demonstrate that repaying them would impose an "undue hardship"—a vague standard which the courts have subsequently interpreted as an incredibly high bar.
So, was it worth it? Is there any sign, in retrospect, that the Bush bankruptcy bill needed to single out student debtors? According to a new working paper from economists at the Federal Reserve Bank of Philadelphia, no, there is not. The researchers looked at how bankruptcy rates for private student loan borrowers changed after the reform bill went into effect, then compared them with the bankruptcy patterns for federal student loan borrowers and debtors without any education loans, who should not have been affected by the new law. If private borrowers had been filing for Chapter 7 in order to wiggle away from their debts pre-2005, you would expect their bankruptcy rates to fall significantly faster than they did for those without student loans or people who borrowed from the feds. That didn't happen, as shown on the graph below.
"Although the 2005 bankruptcy reform appears to have reduced rates of bankruptcy overall, the provisions making private student loan debt nondischargeable do not appear to have reduced the bankruptcy filing or default behavior of private student loan borrowers relative to other types of borrowers at meaningful levels," the authors write. "Therefore, our analysis does not reveal debtor responses to the 2005 bankruptcy reform that would indicate widespread opportunistic behavior by private student loan borrowers before the policy change."
So the 2005 bankruptcy bill effectively made life a bit more miserable for hundreds of thousands of Americans in order to deal with an imaginary scourge. Worse yet, it may have encouraged the sort of risky private student lending that mirrored the subprime mortgage boom, with financial institutions shoveling debt at marginal students who were poorly positioned to ever pay it back but had no recourse in the bankruptcy courts.1
Now, there is some academic evidence that meeting the "undue hardship" standard necessary to discharge student loans might be somewhat easier than the media has projected, especially if you're unemployed or have a medical condition. Princeton University Ph.D. student Jason Iuliano has found that of all bankruptcy filers who have student debt, just 0.1 percent try to have it wiped out during the proceeding. But of those who do, almost 39 percent are successful. Of the more than 239,000 Americans with student debt who filed for bankruptcy in 2007, he believes there were about 69,000 who stood a decent chance of winning at least a partial discharge. More debtors need to at least give it a shot.
Still, discharge shouldn't take a special effort. The "undue hardship" standard was unnecessary to start with. We'd all be better off scrapping the thing.
1Just to rant and rave about this at a little more length: In 2005, banks claimed that the nondischargeability rule was necessary to encourage more private student lending. But it is not at all clear that extra private student lending, especially to marginal students, is at all socially desirable. College students as a group are really bad borrowers. They default at high rates, in part because they often drop out of school. And while the federal government offers a number of forgiving loan-repayment programs that help troubled debtors, those protections are basically absent from the private sector. By eliminating dischargeability in bankruptcy, you're basically spurring banks to lend to high-risk individuals who have already maxed out their federal Stafford Loan limits (or, I should say, hopefully maxed them out, because there's no good reason for most students to pick a private lender over the federal government). I'm not sure who that's really helping.
Why It’s Absolutely Crazy That We Don’t Ask Millionaires to Pay More Taxes
This is just a stray, late-on-April 15 thought, but isn’t it kind of insane that we don’t ask millionaires to pay more in taxes? I mean, much, much more? Today, the top marginal income tax rate is 39.6 percent. Why not go to 50? Or higher? Some economists think we could go as high as almost 90 percent.
Obviously, this is not a politically viable idea. We live during a time in which a supposedly serious presidential candidate can propose eliminating all taxes on capital gains or inheritances with a straight face, as if supporting the Hilton and Walton families were an existential national concern. Even undoing the Bush tax cuts for roughly the top 1 percent of households took a herculean political effort on the part of President Obama and Senate Democrats. But just from the perspective of rational self-interest, it seems goofy that, somehow, soaking the rich is barely part of the national policy conversation (the largely unheralded efforts of the Congressional Progressive Caucus aside). At some point, the federal government is going to need more revenue in order to support the social welfare programs that the vast majority of Americans know and love. Obviously, not all of that money can come from inside the top 0.5 percent. But at least some of it can.
And it’s not at all clear that steeping the wealthy, so to speak, would significantly slow down the economy. I mean, it could. Maybe. Researchers generally do think that a major tax hike would be a drag on growth. But it’s a more complicated issue than many assume. Theoretically, there are at least two big, opposing forces at play. On one side, you have the so-called substitution effect, the idea that people work less when the IRS snatches more of their paycheck, because each hour of labor suddenly earns them less money, making it more attractive to spend time finally learning guitar or crafting birdhouses or otherwise chasing their Zen. On the other side, you have the “income effect”—the idea that when taxes go up, some people might actually work harder and longer in order to maintain their standard of living.
Which is stronger? When the Congressional Budget Office reviewed the literature a few years back, it concluded that substitution effects were a little more powerful, and that big earners like doctors and executives didn’t behave vastly differently from the rest of us. Thus, we should expect higher tax rates to make the rich (and thus, the country) a bit less ambitious and productive as a whole. In theory.
In reality, however, it’s just not clear how strongly taxes influence the overall direction of the economy, given how many other factors are at play. As any mildly snarky liberal will remind you, the country seemed to do just fine in the Eisenhower era, when marginal income tax rates topped out at a confiscatory 92 percent. It also fared pretty well after Bill Clinton raised rates to close the deficit in the early 1990s. If you’re looking for a slightly more formal source, when the Congressional Research Service looked at the issue in 2012, it found that there was no statistically significant correlation at all between top marginal tax rates and real GDP growth.1
A conservative will counter here that the astronomical tax rates of midcentury America were basically a fiction, or as commentators put it at the time, “a colossal illusion” riddled with loopholes. This is somewhat true. For one, capital gains taxes have always been significantly lower than the top rate on labor income, which gave well-to-do stock and bond owners an enormous break. One roughly contemporary analysis suggested that, in 1953, when top marginal rates were still hovering around all-time highs, households that earned more than $1 million were only really paying about 49 percent of their adjusted gross income in taxes. (That’s $1 million unadjusted for inflation, by the way. We’re talking the super-rich of the time.)
Still, the evidence suggests that America’s wealthiest faced a significantly higher tax burden during the country’s years of midcentury prosperity. Thomas Piketty and Emmanuel Saez, for instance, find that once corporate and estate taxes are added into the mix, the top 0.1 percent of earners paid 71.4 percent of their income to the IRS in 1960, compared with 34.7 percent in 2004. Reaching further back and using slightly different methodology, the Congressional Research Service finds that 0.1 percenters paid an average effective personal income tax rate of 55 percent in 1945, compared with around 25 percent during the late 2000s. The tax code really was more progressive back in the day—and more aggressive.
So, what would the ideal top marginal rate on the rich be now? That depends on your goals, and some of your beliefs about human behavior. But unless you’re philosophically opposed to government spending or the welfare state, chances are the magic number is quite a bit higher than today’s.
Let’s say your only interest is in maximizing the amount of revenue the Feds collect. Conservative guru Art Laffer became famous for pointing out that, at some point, raising taxes becomes counterproductive, because people either stop working or find ways to hide their income. Thankfully, we’re probably nowhere near that point. In their most recent work on the subject, co-authored with Harvard University’s Stefanie Stantcheva, Piketty and Saez conclude that governments would net the most money from a top marginal rate somewhere between 57 percent and 83 percent (that includes state taxes, too).* Why the range? The three researchers acknowledge that, when taxes go up, the rich seem to earn less on the job. If you think that’s entirely because they choose to work less, then 57 percent is your number. However, Piketty, Saez, and Stantcheva argue that lower taxes don’t seem to spur executives and other highly paid professionals to work harder so much as they encourage them to bargain harder for extra pay, whether it’s from their board of directors or their partners at a law firm. Negotiating a bigger paycheck for yourself doesn’t actually add anything to the economy. So, if you believe taxes simply discourage that kind of tough bargaining without making star workers much less productive, then 83 percent is your figure.
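For readers who like to see where a range like 57 to 83 percent comes from: it falls out of the standard optimal-tax formula, where the revenue-maximizing top rate is 1 / (1 + a·e), with a the Pareto parameter of the top income distribution and e the elasticity of top taxable income. A minimal sketch, assuming a Pareto parameter of about 1.5 for the U.S. and illustrative elasticity values of my own choosing (not the paper's exact estimates):

```python
# Revenue-maximizing top marginal rate from the standard optimal-tax formula
# tau* = 1 / (1 + a * e), where a is the Pareto parameter of the top income
# distribution and e is the elasticity of top taxable income with respect to
# the net-of-tax rate.

def revenue_maximizing_rate(pareto_a: float, elasticity_e: float) -> float:
    """Top rate that maximizes revenue under the Laffer-style formula."""
    return 1.0 / (1.0 + pareto_a * elasticity_e)

pareto_a = 1.5  # rough U.S. top-income Pareto parameter (assumption)

# A high elasticity (treating the response as real labor-supply changes)
# yields the low end of the range; a low elasticity (treating most of the
# response as pay bargaining) yields the high end.
low_end = revenue_maximizing_rate(pareto_a, elasticity_e=0.5)
high_end = revenue_maximizing_rate(pareto_a, elasticity_e=0.14)

print(f"{low_end:.0%}")   # roughly 57%
print(f"{high_end:.0%}")  # roughly 83%
```

The intuition matches the paragraph above: the less you think high earners' reported income responds to tax rates for "real" productive reasons, the smaller e you plug in, and the higher the revenue-maximizing rate comes out.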
What if your goal isn’t just to maximize revenue? What if you want to maximize people’s standard of living by balancing taxes, spending, and economic growth? In a 2014 working paper exploring that question, economists Dirk Krueger of the University of Pennsylvania and Fabian Kindermann of the University of Bonn came up with an even larger number than Piketty and Saez. According to their model’s calculations, the bottom 99 percent of Americans would be best off if the top 1 percent paid an 89 percent top marginal rate. In their model, the high taxes do discourage top earners from working and lead to lower economic growth. But as a trade-off, the government can afford far more social spending (or more tax cuts) for the 99 percent, improving their overall welfare. As Krueger put it to me, “The total pie shrinks, but it produces more food for the poor and fewer for the rich—so to speak.”
Like Piketty & co.’s, Krueger and Kindermann’s paper is just a modeling exercise—and models, as rough mathematical approximations of reality, are both frequently wrong and subject to revision. But it should give us a sense of how much room we likely have to raise rates, should Washington ever want to. It also shows that if we have to trade a bit of economic growth for a bigger safety net or lighter tax burden on the working class—to exchange efficiency for equity, as economists might put it—the deal might well be worth it.
So, why did I start off talking about taxing millionaires, and not just 1 percenters? That comes back to Krueger’s paper as well. One of the dangers it notes is that, over time, high tax rates might not only discourage people from putting in extra hours at the office, but also change people’s long-term career and education decisions. If you’re looking at a 60 percent or 80 percent marginal tax rate once your household starts earning $391,000, that might make seven years of medical school or a law degree somewhat less appealing. Over time, dissuading people from pursuing advanced degrees or from entering the workforce at all if, say, a spouse makes a great deal of money, would probably begin to undermine the economy in some nasty ways. But while plenty of people go to grad school with the expectation of making mid–six figures, not that many sign up because they expect to make a million. Those who get lucky and do, well, they can afford to pay a bit more to Uncle Sam.
1 One notable empirical study by University of California–Berkeley economists David and Christina Romer did find that certain kinds of tax increases, such as those meant to deal with an old budget deficit, “are highly contractionary.” But some of their results have been challenged. Meanwhile, when Thomas Piketty, Emmanuel Saez, and Stefanie Stantcheva looked across developed countries, they found that cutting the top marginal tax rate didn’t seem to boost growth—though it did lead to greater income inequality.
*Correction, April 16, 2015: This post originally misspelled the last names of economists Stefanie Stantcheva and Fabian Kindermann.
This Tumblr Mocks Brands for All Those Annoying Requests to “Share Your Story”
You never forget your first #FlonaseStory. You know, your favorite childhood memory involving nasal spray.
Wait—are you saying brands don’t play an integral role in your personal story? That you don’t think back fondly on all the bleachable moments you’ve had over the years, thanks to Clorox, or that you don't have a Purina Cat Chow story inside you just begging to be heard? How sad for brands, and for you.
As the Tell Us Your Story Tumblr demonstrates, brands these days want to be a part of your story. They want you to write in, tweet, and engage with them, and then they want to repurpose your content into marketing that will engender further good feelings toward their products. Copywriter Brian Eden started the Tumblr to collect examples of brands getting caught in the act, our favorites of which we’ve posted below.
Forget Steak and Seafood: Here’s How Welfare Recipients Actually Spend Their Money
Red-state lawmakers have been on a rather unnecessary crusade lately to stop welfare and food stamp recipients from spending their government aid on luxuries like cruises and supermarket king crab legs. This has, thankfully, led to some discussion about how low-income families actually use their money—which is to say, not all that differently than the rest of us. (More of their budgets generally go to food, because people have to eat.)
This all reminded me of one of my favorite graphs on this subject. In 2013, Ann Foster and William Hawk of the Bureau of Labor Statistics used data from the Consumer Expenditure Survey to analyze the spending habits of families who receive public assistance, including food stamps, cash welfare, housing aid, or Medicaid. Unsurprisingly, their budgets tend to be quite modest. Their big budget items are housing, transportation, and food; spending on food came out to about $6,460 per year, or about $124 per week. That's for an average family of 3.7 people—meaning roughly $33 per person per week to feed everybody. Based on some brief online searching, king crab legs cost about $34 a pound these days (though bulk discounts might be available).
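For anyone who wants to check the arithmetic, the per-week and per-person figures follow directly from the annual number; a quick sketch (the dollar inputs are the cited BLS figures, the division is mine):

```python
# Back-of-the-envelope check of the food-budget figures cited above.

annual_food_spending = 6460  # dollars per year, per the BLS analysis
household_size = 3.7         # average people per assisted family

weekly = annual_food_spending / 52          # dollars per week
per_person_weekly = weekly / household_size  # dollars per person per week

print(f"${weekly:.2f} per week")                # ≈ $124 per week
print(f"${per_person_weekly:.2f} per person")   # ≈ $33.58 per person per week
```

That per-person figure is what makes the king-crab comparison sting: a single pound of crab legs would eat an entire week's food budget for one family member.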
Here are those expenses broken down into weekly totals, which might be a bit more comprehensible.
The point of these charts isn't that food stamp and welfare recipients never overspend, or make what might seem to be poor financial decisions. (Personally, I would love to see a distribution curve showing the range of spending patterns among families). Nor am I suggesting that these programs are 100 percent free of fraud; believe it or not, investigators found cases in California where welfare beneficiaries withdrew their benefits on cruise ships (the state later banned them from doing so). The point is, these are fringe cases, and they're used to demonize a group of people who are often working extremely hard just to get by.
Southwest Is Making Its Plane Seats a Fraction of an Inch Less Squished
While the airline passenger experience is broadly declining, Southwest had a bit of good news for customers on Tuesday: Seats are getting wider. Yes, the incredible shrinking airline seat will finally begin to unshrink a bit in mid-2016, when Southwest rolls out new seats on the Boeing 737-800. At 17.8 inches wide, the new seats will be seven-tenths of an inch roomier than their economy-section predecessors and the “widest economy seats available in the single-aisle 737 market,” according to Southwest executive vice president Bob Jordan.
Competition among airlines and the never-ending quest for profits have caused the seats sold to passengers to shrink steadily in recent decades. A once standard 18 or 18½ inches has diminished to 17 or even 16½ inches on some of the narrowest carriers. Legroom has also dwindled, from something between 32 and 36 inches in the mid-1980s to a dismal 30-ish inches (or a tight 28 inches on Spirit Airlines) today. Those increasingly cramped quarters might help explain the popularity of space-protecting devices like the Knee Defender, and the bout of “recline rage” that seemed to sweep flights last summer.
Southwest says the new seats are lighter than what it currently uses, which will help improve fuel efficiency. A representative for Southwest said in an email that the updated economy seats will recline and come with 32 inches of pitch—the space between seats when they’re in an upright position. Southwest doesn’t plan to increase the number of seats on its 737-800 aircraft from its current 175.
Of course another way to make seats lighter is to eliminate the recline mechanism altogether. Ultra-low-cost carriers like Allegiant Air and Spirit already opt for nonreclining seats in part to keep costs down and in part to keep customers from angrily reclining on one another. Slate’s Dan Kois has previously made the case that reclining airline seats are nothing less than “pure evil,” so perhaps eliminating them is something Southwest should consider for a future update. If nothing else, it would at least be a better idea than surprise concerts in the skies.