McDonald’s Is Finally Phasing Out Antibiotics in Chicken. Thank Chipotle.
The battle by McDonald’s to bring customers back into its stores has been long and fraught. Don Thompson, who stepped down as the company’s CEO on March 1, couldn’t turn the chain around after more than two years at the helm. Nor could Ronald McDonald’s makeover or a transparency campaign featuring TV star Grant Imahara. Chances are the “Pay With Lovin’ ” campaign didn’t do it, either. But the latest attempt from McDonald’s to woo consumers isn’t a cheap publicity stunt, or a terrifying mascot rebranding, or even a limited-time offer of the McRib. Instead, it’s something an overwhelming number of consumers desire and seek out—and you should thank Chipotle for that.
McDonald’s said Wednesday that over the next two years it plans to phase out all “antibiotics that are important to human medicine” in the chicken it serves. Already, the move is being praised by advocates of sustainable and responsibly produced meat. McDonald’s is one of the biggest purchasers of chicken in the United States, and last year it bought an estimated 3 to 4 percent of the 39 billion pounds the nation produced. If the company decides it’s done with certain antibiotics, you can bet that the agriculture industry is going to listen.
The funny thing is that McDonald’s decision really didn’t start with McDonald’s—it began with fast-casual chains like Chipotle. The Mexican grill chain that won the hearts and stomachs of America has gone to great lengths to distance itself from the traditional fast-food sector. And it’s accomplished that largely by emphasizing its sustainability and leading the charge into antibiotic-free meat. In May 2012, NPR reported that the still-tiny antibiotic-free meat industry was receiving a sudden burst of attention because of Chipotle. Since then, consumer spending on chicken raised without antibiotics has surpassed $1 billion and retailers as mainstream as Walmart and BJ’s have begun stocking it.
Of course, Chipotle can’t claim all the credit. Lots of other restaurant chains—smaller than McDonald’s but still significant—have hopped on the antibiotic-free train, including Panera and Chick-fil-A. At the same time, Chipotle has probably been the most instrumental in raising consumer awareness of antibiotics in meat. “I don't think that Chipotle has directly put pressure on McDonald's,” says Darren Tristano, executive vice president of restaurant industry research firm Technomic. “I think that Chipotle’s use of proteins that don't have antibiotics has educated consumers and raised consumers’ expectations about what kind of food they find healthy.”
The World Is Running Out of Places to Store All of Its Oil
The world is now pumping so much more oil than it needs that corporations are apparently running out of space to store the stuff. If the globe were a giant gas tank, its meter would be getting close to full. Here's how the Wall Street Journal sums up the situation in numbers today:
U.S. crude-oil supplies are at their highest level in more than 80 years, according to data from the Energy Information Administration, equal to nearly 70% of the nation’s storage capacity. A key U.S. storage hub in Cushing, Okla., is expected to hit maximum capacity this spring. While estimates are rough, Citigroup Inc. believes European commercial crude storage could be more than 90% full, and inventories in South Korea, South Africa and Japan could be at more than 80% of capacity.
The main cause here, again, is that oil production is still outstripping demand. But the problem is being exacerbated because the crude market has entered what's known as contango, which is when buyers are willing to pay more for oil delivered a few months from now (when supplies might finally drop and bring up prices) than they are for oil delivered today. Investors have responded by snapping up cheap crude now, putting it in storage, and locking in futures contracts that amount to guaranteed money. (Good news for them: There's even talk of the market hitting "super contango.") As a result of all this activity, the Journal reports that the cost of storage itself is rising, which is leading to the creation of a brand-new trade in oil-storage futures. Weird things are happening.
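The arithmetic behind that storage trade is simple enough to sketch. The prices below are made up purely for illustration—actual spot, futures, and storage rates move daily:

```python
# Hypothetical contango trade: buy crude at today's spot price, pay to
# store it, and sell it forward at the higher futures price.
spot = 50.00              # dollars per barrel, paid today (made-up figure)
futures_6mo = 59.00       # price locked in now for delivery in 6 months
storage_per_month = 1.20  # monthly storage cost per barrel (made-up figure)
months = 6

# The locked-in profit per barrel, before financing costs:
profit_per_barrel = futures_6mo - spot - storage_per_month * months
print(round(profit_per_barrel, 2))  # 1.8
```

As long as the futures price exceeds the spot price plus the cost of carry, the trade is close to guaranteed money—which is also why rising storage costs eat directly into it.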
As storage space becomes ever more scarce, it could ultimately force prices lower, as drillers find themselves with fewer customers who have the capacity to hold onto the crude. That's one reason Citibank analysts have suggested the cost of a barrel could potentially drop to around $20. That said, if the situation gets bad enough, drillers may finally just choose to leave their oil in the ground. Also, companies are presently building more storage capacity, which could alleviate the problem a bit.
But anyway, the main takeaway here is: If you happen to have a large backyard swimming pool, or just a really big ditch, put oil in it. Lots of oil. It's a sure bet. No, no need to thank me for the advice. That's just what we're here for at Moneybox.
There’s Nothing Quaint About Etsy’s IPO
Etsy, the artisanal online marketplace and standard-bearer of the “quaint economy” and burgeoning Brooklyn startup scene, did a very unquaint thing on Wednesday: It applied to be listed on the Nasdaq Stock Market in an initial public offering that, according to preliminary documents, could raise as much as $100 million.
Etsy, which values itself at about $1.7 billion, has yet to turn a profit. In 2014, it recorded a net loss of $15.2 million, according to its prospectus. In 2013 and 2012, it lost $796,000 and $2.4 million, respectively. At the same time, its revenues have grown substantially. In 2012, Etsy said it produced $74.6 million from its marketplace, seller services, and “other” things; in 2013 that figure climbed to $125 million and in 2014 it reached nearly $200 million.
Where is all this revenue coming from? Etsy says it had 1.4 million active sellers and 19.8 million active buyers as of Dec. 31, 2014, and that year the average seller on its platform generated about $1,400 in sales. That said, it also defines these things rather loosely. An active buyer is someone who has made “at least one purchase in the last 12 months”; an active seller “has incurred at least one charge from us in the last 12 months.” Etsy makes money by charging sellers 20 cents for each product listed on its platform and taking a 3.5 percent cut of each transaction.
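Those two fees, as described in the prospectus, can be sketched in a few lines of Python. The sale prices here are invented for illustration:

```python
# Etsy's two revenue streams per the prospectus: a 20-cent listing fee
# for each item posted, plus a 3.5 percent cut of each transaction.
LISTING_FEE = 0.20
TRANSACTION_RATE = 0.035

def etsy_take(sale_price, listings=1):
    """Etsy's revenue from one sale: listing fee(s) plus the transaction cut."""
    return LISTING_FEE * listings + TRANSACTION_RATE * sale_price

# A hypothetical $40 ceramic coaster set, listed once:
print(round(etsy_take(40.00), 2))  # 1.6
```

In other words, Etsy clears less than two dollars on a $40 sale—the business only works at the scale of millions of sellers.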
With its IPO plans, Etsy sees itself capitalizing on several promising economic trends, or what is best summed up as the quaint economy (TQE). These include increased consumer interest in local or handmade products, a trend toward freelance work, and of course a ballooning market for mobile and online shopping. Etsy notes in its prospectus that failing to maintain the image of an “authentic, trusted marketplace” that values “unique offerings” and “handmade goods” could be a risk to its business.
And yet, sellers on Etsy’s platform are among the first to admit that the company long ago departed from its authentic and handmade roots. In late 2013, Etsy changed its policies to allow sellers to partner with outside manufacturers to produce their goods. The shift angered many of Etsy’s earliest users, but some still found it hard to leave because of the company’s huge customer base. “I think Etsy has already lost some of its quaintness, but as long as buyers can differentiate between the genuine handmade goods and the imported impostors, then it’s OK,” Rachel Pfeffer, a jeweler on Etsy, wrote me in an email.
It’s anyone’s guess whether Etsy will be able to hold onto what’s left of its image as it goes public. The important point would seem to be that to succeed it might not have to, despite what the risk factors say. The inherent irony in the quaint economy, after all, is that so much of “quaint” is tied up in keeping things small, and so much of “economy” is about scaling up to just the opposite. There’s nothing quaint about an IPO being led by Goldman Sachs, Morgan Stanley, and Allen & Company. And similarly, there’s nothing quaint about a $1.7 billion valuation, no matter how many ceramic water-lily coasters are behind it.
Ringling Brothers Will Drop Its Elephant Act
It's 2015, and compassionate human beings are beginning to feel queasy about watching charismatic megafauna poked and prodded to perform tricks for children's amusement (see SeaWorld's crashing attendance and financial troubles). So today, Ringling Bros. and Barnum & Bailey Circus is announcing it will retire its famous elephant act by 2018, according to the Associated Press. "There's been somewhat of a mood shift among our consumers," an executive at Feld Entertainment, the circus' parent company, told the wire service. "A lot of people aren't comfortable with us touring with our elephants."
This is a win for animal-rights groups that have long accused Ringling of abusing its pachyderms. Much of the controversy has focused on the circus' use of bullhooks, the long, steel-tipped rods that handlers wield to control and train the elephants, and look a bit like large fire pokers. PETA, for instance, has released undercover video of the animals seemingly being beaten with the instruments. Ringling and its supporters insist that the hooks don't inflict pain thanks to the elephants' tough skin, and are mostly used to nudge and guide the animals around. But some of the film can be a bit rough to watch.
In 2000, a former Ringling employee named Tom Rider, backed by animal activists, sued the circus alleging that its treatment of elephants violated the Endangered Species Act. But the case ended rather poorly when it finally went to trial nine years later. A federal judge concluded that Rider, who had received at least $190,000 in support from activist groups, was a "paid plaintiff" and ruled for the circus, which later brought a racketeering case against its accusers.1 By 2014, the Humane Society of the United States, the American Society for the Prevention of Cruelty to Animals, and other organizations had ended up paying $25 million to Feld Entertainment in order to settle various claims.
While their courtroom efforts can only be described as a misbegotten legal disaster, animal groups have been successfully pressing their case in public. Cities including Oakland, California, and Los Angeles have passed bans on the bullhooks, which Ringling said would prevent them from bringing the elephants—or the rest of the circus—to town. Feld Entertainment President Kenneth Feld told the AP that the proliferation of those laws was a major reason the company is dropping its elephant act:
Another reason for the decision, company President Kenneth Feld said, was that certain cities and counties have passed "anti-circus" and "anti-elephant" ordinances. The company's three shows visit 115 cities throughout the year, and Feld said it's expensive to fight legislation in each jurisdiction. It's also difficult to plan tours amid constantly changing regulations, he said.
Whether the bad publicity was cutting into Ringling's profits is difficult to say. Feld Entertainment was reporting record earnings and attendance as recently as 2012, but the private company is an entertainment conglomerate that also stages shows like Disney on Ice, Monster Jam truck rallies, and Marvel Universe Live!, so it's hard to tell whether bad press was leading to empty seats at the circus specifically. Still, it's worth noting that its wildly successful competitor, Cirque du Soleil, is animal-free. It seems reasonable to guess that audience tastes have been changing.
So, what happens to the elephants once they're no longer marching around in a circle? Feld Entertainment says that they'll be moved to its Center for Elephant Conservation in Florida, a 200-acre property where it already keeps some of its animals. I guess there are worse places to retire.
1In order to bring the suit, Rider was required by federal law to demonstrate that he had an emotional attachment to the elephants. The cash, suffice to say, raised questions about the purity of his intentions.
All Hail Austin, Texas, the Boomingest Big City of All
Stories about America's urban renaissance have become something of a cliché by now. But there's a reason for that—they're true! Big cities are growing faster than the country as a whole, which is basically for the best (dense urban areas tend to be more efficient and economically productive, after all). And today, the Census Bureau shared its estimates of which locales have expanded quickest in these post-recession years. Among the 25 largest cities in the country, top prize goes to Austin, Texas, which experienced a 12 percent population surge between 2010 and 2013.
It's not much of a mystery why Austin has fared so well. The city was only lightly affected by the recession, thanks in part to the fact that Texas was generally spared a housing bust, and its local economy is anchored by the state government, a massive university, and a tech scene. And yes, it's fun and still at least a tiny bit weird (RIP Leslie). But what's interesting here is that all over, large cities are outpacing the U.S. writ large. It's not even about regional migration: The South and West grew at 3.3 and 3.2 percent rates, slower than cities like Denver, Phoenix, and San Diego.
Of course, there are exceptions. Los Angeles, Chicago, and Philadelphia are lagging behind the national growth rate. And poor Detroit.
To be clear, meanwhile, we are talking about cities specifically here, not metro areas that encompass the suburbs. The Census Bureau is specifically analyzing growth patterns in "incorporated places," the legal entities you and I know as cities, towns, and so forth.
One other interesting tidbit: As a group, the largest cities, with populations exceeding 1 million, are growing far, far faster than before. During the entire first decade of the 21st century, they expanded by 2.1 percent. Between 2010 and 2013, though, they've bulked up by 3.1 percent. Though the Census Bureau doesn't break down the numbers on this front, my guess is that it mostly has to do with fast-growing cities in the South and West crossing the 1 million threshold and then continuing apace.
Anyway, well done Austin. Have a Lone Star.
Just Over Half of Americans Want Congress to Fix Obamacare if the Supreme Court Wrecks It
The Supreme Court is hearing arguments right now in King v. Burwell, a case that could thoroughly wreck Obamacare by nixing the insurance subsidies provided by the law for Americans in the 37 states that didn't set up their own health care exchanges. Earlier this week, I noted that a number of Republicans were getting nervous about the political ramifications of such a decision and were suggesting that Congress pass a "transitional" bill to keep the subsidies alive temporarily in order to avoid voter outrage over nightly news stories about sick people losing their coverage. If the justices gut the law, I argued, political expedience might save it for a while.
I might have spoken a little too soon. According to a new NBC/Wall Street Journal poll, just 54 percent of Americans say they want Congress to pass a law fixing the subsidies should the court strike them down. Thirty-five percent said Congress definitely should not. The rest said "it depends" or weren't sure. But crucially, there was a severe partisan split: Eight in 10 Democrats want lawmakers to restore the subsidies, while only 1 in 4 Republicans want them to. Since House GOP members are by far most concerned with placating their base and avoiding primary challenges, that suggests they won't have much reason to take action in the wake of a court ruling against the administration. Maybe public opinion will shift once voters actually witness the results of eliminating the subsidies, but that's obviously a hypothetical.
Samsung Can Mock Apple All It Wants, but the iPhone Is Officially Outselling It
Over the weekend at the Mobile World Congress in Barcelona, Spain, Samsung unveiled two new smartphones, the Galaxy S6 and the Galaxy S6 Edge. Samsung’s announcement came with a sleek product video, but also with several jabs at Apple. The company reportedly poked fun at Apple’s “Bendgate” scandal and aspects of the iPhone 6 Plus’ camera. Samsung also rolled out the slogan “design with a purpose”—presumably a dig at what it sees as frivolous aesthetic choices by Apple.
Well, Samsung can mock Apple all it wants, but the numbers tell a different story. In the latest quarter, Apple overtook Samsung as the world’s top smartphone seller for the first time since 2011.
Technology research firm Gartner said Tuesday that Apple sold 74.8 million smartphones in the fourth quarter of 2014—narrowly beating out Samsung’s 73 million units. That gave Apple the biggest share of the global smartphone market at 20.4 percent, followed by Samsung at 19.9 percent. The third-place seller, Lenovo, isn’t even close, with a 6.6 percent market share on 24.3 million units sold.
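Those unit counts and market shares are mutually consistent, which you can verify with a quick back-of-the-envelope check (Python here serving only as a scratchpad):

```python
# Sanity check on Gartner's Q4 2014 figures quoted above.
apple_units = 74.8   # millions of smartphones sold
apple_share = 0.204  # Apple's 20.4 percent market share

# Implied size of the whole market, in millions of units:
total = apple_units / apple_share  # roughly 366.7 million

samsung_units = 73.0
lenovo_units = 24.3
print(round(100 * samsung_units / total, 1))  # 19.9
print(round(100 * lenovo_units / total, 1))   # 6.6
```

Dividing Apple's units by its share implies a total market of roughly 367 million smartphones for the quarter, which reproduces Samsung's 19.9 percent and Lenovo's 6.6 percent exactly.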
For the year, Samsung’s numbers still top Apple’s. Samsung sold 307.6 million smartphones in 2014 compared with Apple’s 191.4 million. Still, the iPhone has been on a tear lately. The fourth quarter was Apple’s best ever—shattering records not only for iPhone sales but also for quarterly revenue and net profit. For the first time, the iPhone also became the best-seller in China—the world’s largest smartphone market—according to an estimate from research firm Canalys. Over the seven previous quarters, Apple had never ranked higher than the No. 4 spot.
Interestingly, both Apple and Samsung ended up losing market share overall in 2014 from the previous year as smaller manufacturers—such as Lenovo and Huawei—gained ground. But if we’re just comparing Apple to Samsung, the gap is shrinking. In 2013, Samsung had a roughly 15 percentage-point lead on Apple for market share. In 2014, that fell to 9 percentage points. It’s anyone’s guess what will happen in 2015. But for now, Samsung should realize that jokes alone aren’t going to cut it.
There Are No Rich People in America
Americans do not often admit to being rich. There are, of course, prominent exceptions. Warren Buffett will tell you he is in fact affluent. Kid Rock will too, and really exuberantly. Guys like Tom Perkins, the venture capitalist, will talk about how they're tired of being envied for their money and make unfortunate Nazi Germany analogies in order to illustrate their persecution complexes. But as a rule, we don't own up to our wealth. Just 1 percent of Americans say they are "upper class," according to the Pew Research Center. Everyone else thinks those words describe somebody richer than they are.
Richard Reeves of the Brookings Institution, who has made this quirk of our class identities into a hobby horse of sorts, recently put together a nice illustration of what he calls our national, "Me? I'm not rich!" problem. In 2011, Gallup asked Americans how much income they needed to be "rich." In general, they answered some amount that was higher than whatever they made. Most people who earned $30,000 a year or less thought you could be rich making under six figures. A majority of those who earned between $30,000 and $99,000 thought you needed to cross the $100,000 threshold. You get the idea.
None of this is especially surprising. People don't generally think about living standards in absolute terms. They think about them relatively, and tend to compare themselves with their peer group. And because most of us know at least a few jerks with a bigger house, nicer car, and more interesting-looking vacation photos, it's easy to conclude that, no, we ourselves are not truly rich. I'm making fun of Americans for it, but my guess is it's near universal. It just happens to be a problem in the U.S. because, as Reeves writes, those of us who think somebody should be paying more taxes tend to believe the rich should be the ones shelling out. That's a political problem when there are apparently no rich people to be found.
Via Danielle Kurtzleben at Vox
JPMorgan Adds Another $50 Million to Its Gigantic Settlements Tally
JPMorgan Chase, the bank that has already paid out more than $27 billion in settlements over the past two years, is adding another $50 million to that tally over a “robo-signing” scandal.
The Justice Department said Tuesday that JPMorgan will pony up $50 million in cash, mortgage loan credits, and loan forgiveness after it failed to properly review more than 50,000 documents filed in bankruptcy court. The term in this case is robo-signing, and it means basically what it sounds like—employees robotically signing off on documents without actually reviewing them. In some cases, this is because robo-signers assume whatever they’re signing is correct, and don’t bother to go over it. In others, the problem is that the signers just aren’t qualified to be reviewing the stuff in the first place.
In this case, the bank admitted to filing more than 50,000 robo-signed payment-change notices in bankruptcy court between 2011 and 2013. At least 25,000 of those were signed by former employees or by employees who had “nothing to do with” reviewing the documents, the Justice Department said.
“It is shocking that the conduct admitted to by Chase in this settlement, including the filing of tens of thousands of documents in court that never had been reviewed by the people who attested to their accuracy, continued as long as it did,” Stuart Delery, acting associate attorney general, said in a statement. “Such unlawful and abusive banking practices can deprive American homeowners of a fair chance in the bankruptcy system, and we will not tolerate them.”
While $50 million is a drop in the bucket for an institution like JPMorgan, the bank is starting to feel the weight of its fines. During the latest quarter, JPMorgan posted a 6.6 percent decline in profit amid more than $1 billion in legal costs. The institution's investment-banking side has suffered in particular, with a 16 percent drop in profits last year despite cost-cutting efforts. JPMorgan's CEO Jamie Dimon has declared banks “under assault.” The homeowners whose documents were robo-signed probably felt the same way.
Some Law Schools Will Now Accept Students Who Didn’t Take the LSAT. That’s an Awful Idea.
Recently, two law schools announced that they will begin accepting a select number of students who have not taken the LSAT, the much-loathed exam that has traditionally served as the make-or-break measuring stick for J.D. applicants and punctuated the academic career of many an aimless history major. As Bloomberg Business reported last week, "The State University of New York-Buffalo Law School and the University of Iowa College of Law said they would admit students from their respective undergraduate colleges" based on their grades and scores on other standardized tests. They're the first institutions to pounce on a major rule change by the American Bar Association that will let law schools start filling 10 percent of their classes with non-LSAT takers who meet other academic requirements. But chances are that other schools will follow suit.
Mostly, this is yet another example of just how desperate law schools are to fill their classroom seats. Enrollments have plunged—Buffalo's first-year class has shrunk 18.3 percent since 2011, while Iowa's is down 21.7 percent—and that has put those institutions under financial stress. Buffalo's dean says the changes are aimed at saving qualified students the pain and expense of taking the LSAT (the test requires a $170 fee, and usually a pricey prep course) but admits to Bloomberg that it might also help goose applications, "to the extent that they remove what is, for some students, an obstacle for applying to law school."
This is all very problematic.
I doubt that Buffalo or Iowa will end up admitting significantly weaker students as a result of these changes. Theoretically, it might let them recruit students who would have tested poorly without hurting their all-important ranking in U.S. News, which factors in LSAT scores. But the new ABA rules are still reasonably strict about academics. Law schools can only admit LSAT abstainers if they obtained a bachelor's degree at the same university or are applying to a joint degree program like a J.D./MBA. They also need a decent GPA and strong scores on either the SAT or a grad-school admissions test like the GRE. We're probably not talking about students who would typically fill out the bottom of Buffalo's 1L class.
The real issue is that applying to law school shouldn't be convenient. It should be hard—about as hard as possible, in order to scare off the kids who don't really belong. I don't think I can make this point better than Elie Mystal already has at Above the Law, but in short: Going to law school is a wildly expensive, life-altering choice that can lead to years and years of misery (even if things go well). Most people shouldn't go unless they really, truly want to be a lawyer, or have a crystal clear idea of the alternative career that a law degree is going to land them.1 And if you don't want it badly enough to take the LSAT, chances are you don't want it enough to spend three years of your life amassing tons of student debt to pursue that goal, either. As Mystal writes, if "you can’t be bothered to spend $170 for a test, then you shouldn’t be allowed to borrow $150,000 for a degree. Anybody who tells you different is trying to sell you something."
1 In which case, I still say now is a great time to go.