Dollar General Is Making a Hostile Bid for Dollar Store Dominance
Dollar General is done playing nice. After being rejected twice by Family Dollar in its bid for the smaller company, Dollar General on Wednesday set in motion a hostile takeover by bringing its $9.1 billion offer directly to Family Dollar's shareholders.
Drama has heated up at America's dollar stores throughout the summer: First Dollar Tree planned to buy the struggling Family Dollar, and then Dollar General swooped in with a competing bid. Dollar stores are a $48.2 billion market in the U.S. and are projected to grow 18 percent over the next five years, according to data from Euromonitor International, making them a lucrative slice of the retail market. Dollar General is by far the biggest of the three chains, with $17.5 billion in annual sales—more than twice the $7.84 billion that Dollar Tree brought in last year and well over Family Dollar's $10.39 billion. Both Dollar General and Family Dollar sell items across a range of low prices, while Dollar Tree is a "true" dollar store that sells everything for $1.
The cash tender offer from Dollar General is to purchase all outstanding shares of Family Dollar for $80 apiece. That well exceeds the $74.50 per share or $8.5 billion buyout offer made by Dollar Tree and also tops Family Dollar's current share price of around $78.60. Shareholders will have about a month to consider the proposal: Dollar General said that the current offer is set to expire at 5 p.m. on Oct. 8.
Approaching Family Dollar shareholders allows Dollar General to begin crucial discussions with the Federal Trade Commission over how it could get the deal past regulators without being blocked on antitrust grounds. In its repeated rejections of Dollar General, Family Dollar has argued that the proposal is riddled with antitrust concerns. More than 1,500 of Family Dollar's 8,000-some stores are located in areas where the only significant competition comes from a neighboring Dollar General. Thousands more compete with a Dollar General and a Walmart. If Family Dollar really is a crucial check on Dollar General prices in 1,500 or more of those locations—and possibly the ones with a Walmart as well—then it's not hard to see how a merger could be considered anti-competitive.
Scott Hemphill, a professor of antitrust and regulation at Columbia Law School, notes that antitrust issues with retail mergers can often be resolved on a case-by-case basis. So while the FTC might find Dollar General's proposed takeover to be anti-competitive in some parts of the country, it could be completely fine in others. "In retail we think these things are fixable, because you can sell stores in areas where there's a problem," he explains.
Dollar General has already attempted to go this route, initially proposing to sell 700 stores and saying on Wednesday that it would be willing to divest up to 1,500 stores if needed. For the rest of its locations, it seems to be betting that some combination of local competitors and Walmart will be sufficient to overcome regulators' concerns. Dollar General has also accused Family Dollar's CEO Howard Levine of hiding behind the antitrust line to look out for his own job. But if Levine is taking a risk to preserve his standing, so is Dollar General in initiating a hostile takeover—as the Wall Street Journal notes, the company has made its offer without even seeing Family Dollar's confidential financial information.
Take Away Harvard’s Nonprofit Status
The world’s richest university just got a little richer. On Monday, Harvard announced that it had received its largest-ever gift, $350 million, and that it will rename its school of public health after its benefactor’s father.
Public health is a wonderful and worthy cause, of course, and Harvard has a stellar program dedicated to it. But this gift—like so many other mega-gifts to mega-endowments—has a hint of the ludicrous about it.
There's an old line about how the United States government is an insurance conglomerate protected by an army. Harvard is a real-estate and hedge-fund concern that happens to have a college attached. It has a $32 billion endowment. It charges its rich students—and they are mostly from rich families, with many destined to be rich themselves—hundreds of millions of dollars in tuition and fees. It recently embarked on a $6.5 billion capital campaign. It is devoted to its own richness. And, as such, it is swimming in cash.
“But, wait!” you might say. “That $350 million is going to support an educational institution with tremendous public spillover! Harvard does basic scientific research! It teaches doctors! It studies cells and stars and history and it educates underprivileged youths!”
True, true, all of it. Still, from a purely utilitarian perspective, there are causes that need that $350 million more. Groups like GiveWell are devoted to figuring out where a dollar does the most good. It recommends initiatives like deworming in very low-income countries. Harvard, at the same time, is spending a billion dollars upgrading its coeds’ convenient, riverfront housing. If it wanted to maximize the utility of its $32 billion, it could, say, admit more students, especially poor ones, reduce its focus on property development, and double down on research, which currently accounts for $800 million of its $4.2 billion in annual operating expenses.
But there is a way to encourage the university to do that, or at least to ensure that it is also contributing more to the public good. That is to take away some of Harvard’s tax exemptions, as suggested by legislators in Washington and Massachusetts, as well as a number of economists. The idea is that such mega-rich schools hoard funds and real estate, tax-free, to the detriment of local communities or federal coffers, a situation that could be remedied with a wealth tax on endowments over $1 billion, property taxes, or a tuition sales tax.
Harvard and other universities are the Platonic ideal of nonprofits in a technical sense, and thus exempt from paying taxes as businesses do. The school does not distribute its operating profits as dividends. It has no shareholders. It does not exist to make anyone rich. Nevertheless, it is rich, as are several other universities in its cohort. Harvard manages about $32 billion, Stanford about $19 billion, and Yale about $21 billion.
All that money can have some perverse effects. Harvard, for instance, has purchased a tremendous amount of land in Cambridge and the surrounding towns—pushing up real-estate prices without contributing much, if anything, in property taxes. If the school lost its nonprofit status, it would owe the state of Massachusetts $80 million a year. (The $350 million donor owns $100 million of real estate in Harvard Square, by the way.) For now, it owes close to nothing on its land or its investment portfolio.
Were it to face the burden of some taxes, Harvard might shift its strategy. And “if donors knew that the money they bequeathed to their alma mater would partially or fully be heading to the state,” wrote the Harvard Crimson debating the issue back in 2008, “they would think twice before writing that check.” Exactly.
When College Grads Earn Like High School Grads
For the average graduate, going to college is a wonderfully profitable investment. The evidence is unambiguous. Even after subtracting tuition and all the years of foregone salary, the pay boost from a degree will still pay for itself, and then some. The problem is that the "average" college student doesn’t really exist; she’s an imaginary amalgam of state school grads and Ivy League alums, of education majors and engineering nerds.
Once you ignore averages, and start looking across the entire earnings spectrum, the question of whether higher education is financially worthwhile for everybody becomes more complicated. Recently, researchers from the Federal Reserve Bank of New York noted that the bottom 25 percent of college degree holders basically earn no more than the median worker who ended his or her education after high school.
“While we can’t be sure that the wages of this group wouldn’t have been lower if they had never gone to college,” the New York Fed’s researchers wrote, “this pattern strongly suggests that the economic benefit of a college education is relatively small for at least a quarter of those graduating with a bachelor’s degree.”
Other researchers have made similar observations. In a lengthy review of the literature on the economic value of college, for instance, Philip Oreopoulos and Uros Petronijevic point out that among Americans between the ages of 30 and 50, the median college graduate earns less annually than the top 10 percent of high school grads.
And what about over a lifetime? A widely cited study by economists Christopher Avery and Sarah Turner found that, among men, some exceptionally well-paid high school grads could expect to make more money during the course of their career than lower-earning bachelor’s recipients. This summer, writer and economist Allison Schrager mimicked their analysis in Bloomberg Businessweek, but included women and part-time workers in the math. She found that, long-term, a college graduate at the 25th percentile of income should expect to earn less than the typical working high school grad.
The bottom line: A large minority of college grads earn like high school grads. So what do we make of that fact?
One possible interpretation is that college simply isn’t worth it for a chunk of students, even if they graduate. That may be true to a degree. There are some young adults who would probably end up wealthier if they learned a trade or pursued a technical degree from a community college than they would by majoring in communications at, say, a nonselective state school.
But there are other issues to consider. The New York Fed says “we can’t be sure” if low-earning college graduates would have made even less money without their degree. But we can be close to sure. The strongest academic evidence suggests that regardless of their background, any given individual does boost his pay by going to college, even if he’s only a so-so student. The question is whether that earnings bump will outweigh the cost of school. In one important study of marginal students who were barely admitted to Florida’s state university system, the answer was yes. But at a more expensive private institution, the return on investment might turn out to be nil (or negative).
The useful question isn’t whether college is worth it to some students. It’s whether individual colleges offer a worthwhile return to the population of students they serve. The fact that some college graduates earn less than high school grads is one more reason for us to worry about the cost of tuition, not to doubt the value of higher education as a whole. And we should call B.S. whenever a school markets itself to potential undergrads based on the average income that a college grad can make.
Why Spirit Airlines and Allegiant Air Don’t Bother With Reclining Seats
Reclining seats on planes are all the rage these days—literally. People are either enraged at the jerks who try to prevent fellow airline riders from reclining their seats, or enraged that airlines were audacious enough to introduce reclining seats in the first place. The question—to recline or not to recline—has been debated furiously ever since a United Airlines flight from Newark to Denver was forced to divert to Chicago in late August after two passengers on board got into a fight over legroom. Over the next several days, two additional flights—one from American Airlines and one from Delta—were forced to do the same after similar legroom squabbles broke out.
Recline rage might be sweeping the skies, but two U.S. airlines are notably immune. On both Spirit Airlines and Allegiant Air, there are no reclining seats to be found. Allegiant was the first to convert to stationary seats in 2006; Spirit followed in 2009. The business reasons behind the decision are straightforward. First, nonreclining seats are simpler to construct and far less likely to break; a broken seat has to be taken out of service, which means costly repairs and lost revenue. Second, taking the recline mechanism out of seats makes them lighter, which lets the plane carry additional weight in passengers and spend less on fuel. Allegiant estimates that eliminating reclining seats saves it $3.5 million a year on maintenance and conserves 110,000 gallons of fuel (or about $350,000 worth, depending on prices).
Personal preference has also factored in to the decision. "I don't like reclining seats," says Andrew Levy, president and chief operating officer of Allegiant Travel and Allegiant Air. He recalls traveling in Europe on Ryanair about 10 years ago on a flight that had smaller seat sizes—or a tighter "pitch," as people in the industry term it—but no recline mechanism. "I sat in the seat and realized that yes, you might have a little less pitch, but what really affects the space is when the seat in front of you reclines. That's when it kind of just hit me that this makes perfect sense, and so we decided to build our own seats, design our own seats, and start putting them in our aircrafts."
Paul Berry, a spokesman for Spirit Airlines, said the company weighed similar business considerations—gas and maintenance—in its decision to cut reclining seats. And after five years of flying like that, he adds that customers don't seem to mind. "Almost every poll we've done or taken or seen shows that people would prefer not to have reclining seats if they had the choice," Berry says. Indeed, an October 2013 report from Skyscanner found that 91 percent of 1,000 people surveyed said short-haul flights should either ban or set time limits on reclining seats. Levy also says the nonreclining seats have been a "non-issue" for Allegiant's customers.
If people are so overwhelmingly in favor of banning reclining seats, why do the majority of planes still use them? Because they always have. "I think reclining seats have just been part of airlines for a long time," says Berry. "And meals were a part of airlines for a long time and when airlines started taking meals off planes consumers complained about it, but then they got used to it. When we started charging for checked bags people complained, but then they got used to it."
For now, it's unclear if other major U.S. airlines will follow Allegiant's and Spirit's lead and ditch their own reclining mechanisms. So long as they don't, Berry and Levy are in agreement on one last point: If an airline sells a seat that reclines, you should be able to recline it.
The Fast-Food Strikes Have Been a Stunning Success for Organized Labor
For the seventh time in nearly two years, fast food workers around the country walked out of their restaurants last week to demand a pay raise to $15 per hour and the right to unionize. In New York City, 21 workers were arrested for sitting in the middle of the street outside the McDonald’s in Times Square. Organizers said more than 50 protesters were arrested for similar acts of civil disobedience in Detroit. Another 50 were detained in Chicago.
And so the most interesting—and most successful—American labor push in recent memory rolls on. The strikes, which began in November 2012, have been organized by the group Fast Food Forward and bankrolled by the Service Employees International Union, which according to the New York Times has spent more than $10 million on the cause. These walkouts haven’t led to any unionized McDonald’s or Taco Bell franchises yet. But at this early date, it’s more useful to think of them as the spearhead of a broader living wage movement that has also seen retail workers at stores such as Walmart protest for better pay. Framed that way, the effort has been startlingly effective. For the cost of a few Super Bowl ads, the SEIU and some dedicated fast food workers have managed to completely rewire how the public and politicians think about wages.
Consider the numbers. Over roughly the past two years, 13 states have increased their minimum wage, as have 10 city and county governments, according to a tally by NBC News. Seattle voted to raise its citywide minimum to $15 an hour by 2018; San Francisco residents will vote on whether to do the same in November. The mayors of New York, Los Angeles, and Chicago have all backed a $13 wage floor. The president has come out in favor of a $10.10 national minimum. And just in case you were looking for a rough barometer of overall public interest in the issue, even Google searches for the phrase “minimum wage” have been consistently more common since the start of 2013. You don’t have to think a $15 minimum wage is a brilliant idea (personally, I don’t) to admire the efficacy of the effort.
All this has transpired at a time when organized labor has had little else to celebrate. The Tea Party wave of 2010 brought anti-union right-to-work laws to the industrial Midwest. The United Auto Workers’ longstanding dream of organizing a foreign car-maker in the South suffered yet another crushing setback when employees at a Tennessee Volkswagen plant voted against joining the union. Overall, union membership has continued its long decline.
The fast-food strikes are an exceptional bright spot, even if they don’t fit the typical notion of what labor unions are for. The movement has been patterned loosely on the SEIU’s Justice for Janitors campaign, which has managed to organize 225,000 building cleaners across the U.S. and Canada since it began in early-1990s Los Angeles. Steven Ashby, a professor at the University of Illinois School of Labor and Employment Relations, explained to me that like restaurant workers, janitors were considered especially hard to unionize, in part because buildings didn’t employ their own cleaning staff. Instead, they hired cleaning contractors whom they could legally terminate at the first rumblings of union activity. But by orchestrating citywide strikes and drumming up public outrage—much as fast food workers are attempting to do now—the SEIU eventually managed to win contracts for the janitors.
But whereas Justice for Janitors focused on improving conditions for one particular group of workers, the fast-food strikes have become part of a broader political project. The SEIU is also asking home health care workers, for instance, to march in the protests. As the New York Times’ Steven Greenhouse reports, the “union hopes that if thousands of the nation’s approximately two million home-care aides join in it would put more pressure on cities and states to raise their minimum wage.”
This is all part and parcel of an important shift in the way organized labor has begun to view its role. Since the mid-20th century, unions have largely existed to win higher wages and benefits for their members, while collecting plenty of dues for the effort. But unions won’t ensure their long-term survival without finding ways to expand their rolls. Back in 2012, Harold Meyerson wrote in the American Prospect that “unions can no longer confine their organizing to workers they hope to represent in collective bargaining.” By then, unions like the SEIU and AFL-CIO had begun focusing more of their resources on grassroots political organizing.
Crusading for higher pay for all has been a logical next step. And the results so far show that, while the unions have plenty of problems, they also still have a bit of life left.
Taco Bell Tacos Keep Getting Less and Less Taco-Like
Some five months have passed since Taco Bell debuted its new breakfast menu complete with its star item, the waffle taco. The syrup-drenched fold of meat and eggs shot Taco Bell into the fast-food spotlight and quickly became the chain's most talked-about item. In recent months, though, the allure of the waffle taco has started to wear off and interest in Taco Bell (at least as tracked by Google Trends) has declined. Which might explain why the Yum! Brands company is now testing a fresh take on its earlier success.
The biscuit taco, as the new item is called, is rolling out in a handful of locations near Orange County and Los Angeles. A company briefing describes it as a "warm, flaky, golden brown biscuit that happens to be shaped in the form of a taco." The biscuit taco can come with egg, cheese, and bacon or sausage for a breakfast flavor or crispy chicken with gravy or honey for a Southern take. The chicken varieties also get a drizzle of Taco Bell's "NEW signature jalapeño honey sauce," which we can only assume is the gelatinous orange goop displayed in the sample photo atop this post.
Whether the biscuit taco is in fact "warm" and "flaky" is hard to judge from the photo (having not tasted one, Slate will withhold judgment). But one thing is certain: The biscuit taco is evidence that Taco Bell's taco-branded products are becoming less and less taco-like. Could this signal an existential crisis for the Taco Bell taco? Sure, the waffle taco was a stretch. But the fast-food breakfast wars were raging and Taco Bell needed something that stood out. And with its more pliant shell, fold of meat, and topping of eggs and syrup, at least the waffle taco kind of looked like a taco. The biscuit taco does not. In fact, it sounds like something that would be better off on the morning menu at fellow Yum! Brands chain KFC, perhaps alongside the a.m. original recipe platter or the waffle and eggs. The biscuit taco is many things. A taco is not one of them.
Barclays Introduces a Finger Vein Scanner to Access Your Online Banking
Barclays is rolling out a new cybersecurity service for its corporate clients next year: finger vein readers. The biometric scanners are designed to scrutinize the unique vein patterns of customers' fingers so that they can access online bank accounts and make financial transactions without a password or PIN.
In the short and somewhat creepy demo video that Barclays included with its announcement, a woman sits down at her computer and, when prompted on her laptop's screen, places her finger into the biometric scanner, which looks a lot like the pulse oximeters they clip onto you at the doctor's. You can watch for yourself below:
Barclays assures clients that the finger scanner will "not hold the user's vein pattern and there will be no public record of it." It also notes that "unlike finger prints, vein patterns are extremely difficult to spoof or replicate." The Financial Times puts it a little more bluntly: "blood vessel patterns are much more difficult to replicate and the scanned finger must be attached to a live human body."
Then again, if we're going to go down that grisly path, it's worth noting that a dismembered finger won't unlock the iPhone 5S scanner, either.
Wearable Technology Goes Couture
All those Clydesdale struts down runways this coming week will be measured by more than just the approving nods they earn from certain magazine editors and boutique buyers, because this is the year wearable tech goes couture. Or at least makes a concerted attempt.
A dozen or more wearable companies—among them Intel, Fitbit, and Google Glass—are forging partnerships with fashion brands to earn time on the runways at New York Fashion Week, which begins today and runs through September 11. Similar companies aim for fashion week visibility through novel marketing moves, such as appearing in models' Instagram feeds. It's the next generation of the 2012 Google Glass-Diane von Furstenberg partnership, which saw bespectacled models on the runways, and Google co-founder Sergey Brin seated in the front row of the Lincoln Center show.
Perhaps the mainstream collision of fashion and tech is overdue. If you had visited a classroom at New York's FIT, or a graduate student showcase at New York University's Interactive Telecommunications Program over the past few years, you'd have seen prototypes of LED-powered skirts and dresses, hats and other headwear—even backpacks!—and bracelets and rings equipped with cameras, trackers, and other chip-powered embellishments.
Already the runways have seen wearable technologies that sync with one's smartphone—but that don't necessarily fit into the "quantified self" movement of biometric tracking. They're more for peacocking than fitness monitoring. A London fashion house, CuteCircuit, designs light-up dresses, skirts, and jackets for women, which can be controlled through an iPhone app. (You may have seen such glowing garments on Katy Perry or U2.) The company's show is one of the very first of this year's fashion week, and you can expect plenty of buzz.
According to a study by tech-research firm IDC, nearly 20 million wearable devices will be shipped worldwide in 2014. That number is expected to climb to more than 100 million in four years' time. That's a lot of awkward plastic-y armbands. But if the fashion world has anything to say about it, that look might get a lot more graceful.
Marie Claire creative director Nina Garcia called wearing a fitness tracker "a badge of honor, whether you're healthy or not," in the Wall Street Journal recently. And just last week, tech and fashion debuted a combined effort at the U.S. Open tennis championships. The product: A Ralph Lauren sports shirt, with knitted-in sensors that can read the wearer's heartbeat and respiration. It looks, on the surface, like a fitted black crew-neck shirt. Upon close inspection, a band of thicker fabric mid-torso is apparent, and it houses a Bluetooth transmitter, an accelerometer, and a gyroscope by OMSignal, a biometric-tracker company.
For much of Silicon Valley tech, taking to the runways this year is not so much about standing out as about blending in.
"Wearables need to take account for varying tastes, not least between genders," Nick Spencer with ABI Research told the Washington Post. "One size, or even design form—touch-screen designs, for example—doesn't fit all as it does to a much larger degree in consumer electronics. Designers need to play a key role here."
BaubleBar's co-founder put it less subtly in an interview with the New York Times: "There's a reason we all make fun of someone wearing a Bluetooth or a BlackBerry holster. Is it useful? Of course it is. Do I look like a tool? Yeah. I'm not going to wear it."
We've already seen fitness-tracker bracelets go from tech-gaudy to approaching wearability in non-gym settings. Nike this year released the FuelBand SE with embellishments nodding to aesthetics, rather than athletics: gold, rose gold, and silver. The Basis watch—beloved by techies, but frankly clunky-looking—debuted a leather strap and slightly more stylish chrome frame this year. It's called Carbon Steel. The Misfit Shine—a sleep and fitness tracker—is being marketed as not just a workout companion, fit to clip onto a swimsuit or basketball kicks, but also as a sleek accessory: Pin it to your tux or don it as a pendant on a necklace with that gown. (It comes in not just black and silver, but also an array of colors one might find in an Anthropologie catalog, such as "coral," "wine," and "sea glass.")
What we'll see on the runways this year is a bit of embellishment on these themes. Fitbit, the Jolly Rancher-sized fitness tracker that's worn as either a bracelet or a clip-on, has been collaborating with luxury brand Tory Burch. Together, they have created a small golden cage for the Fitbit that doubles as a pendant for a necklace, and a similar Fitbit cage for the wrist, in the form of a hinged metal bracelet, which retails for $195. (A silicone bracelet, available in pink or blue, is $38.) Look out for these on models at Tory Burch's September 9 show.
Not all these wearables are for measuring one's heart-rate and daily footsteps. A fascinating collaboration between Intel and noted New York-based fashion brand Opening Ceremony, along with the Council of Fashion Designers of America and Barneys, is resulting in a device that appears inspired by Dynasty colliding with Sherlock Holmes. From the outside, it's a large, stone-encrusted metal cuff. Open it up, and it reveals a curved display, useful for "communications purposes," according to Ayse Ildeniz, Intel's head of business development and strategy for its New Devices Group.
If that sounds cryptic, it is: Intel is debuting the bracelet on the runway of the Opening Ceremony show at Fashion Week, but it isn't revealing the exact purpose of the bracelet-shaped communications device yet. However, Ildeniz told Inc. that it may be most useful for reading one's social-news feeds and being "in touch with your loved ones." The device uses radio waves to communicate, so it requires no smartphone pairing. It already has a name, though: Mica, an acronym for "my intelligent communications accessory."
Ildeniz said for Intel, the partnership with fashion brands was a significant learning experience, one in which Intel let the designers lead, in order to focus on not just the technology, but also what consumers actually want from an aesthetic perspective.
"If we are to make wearables available to not just a few people, but to hundreds of millions of people, our philosophy is that the fashion industry needs to be in the driver's seat, not technology," she said.
Google Glass has been working with massive eyewear-maker Luxottica to design more mainstream-looking face computers. And Google's relationship with Diane von Furstenberg is deepening: It launched a collection called "DVF | Made for Glass" that includes five styles sold on website Net-a-Porter for roughly $1,500 to $1,800 each.
Far off the runway—but with auspicious timing—is Apple. The company made Sept. 9 its official launch date for what's widely expected to be a new iPhone and also its hotly anticipated smart watch. That date, of course, lands the event right in the middle of fashion week, albeit at the company's headquarters in Cupertino, California. (Critics speculate that what's been dubbed the "iWatch" may actually be more of a fashion accessory for the iPhone that provides extra health and fitness information to its wearer.)
Perhaps closest to something women might buy due to a perfect mix of its aesthetic appeal and technological usefulness is a Rebecca Minkoff bracelet that's debuting this week. It's one of several accessories the brand is announcing that will double as USB cables and cell-phone notification accessories. They're also relatively affordable: $40 to $120.
Still, these are no Harry Winston-level jewels. (Although, in fairness, a French jewelry designer who's worked with both Harry Winston and Louis Vuitton also worked on an attractive—and sparkly!—bracelet for tracking sun exposure, by tech company Netatmo.) While we'll see plenty of Tory Burch, Intel, and Fitbit armbands on the runways, and in fashion-magazine pages, it may still be years before a biometric device truly becomes a fashion statement—if it ever does.
California Wants to Be an Olive Oil King
First California came for Europe's dominance in wine. Now the state has its eye on olive oil. According to a recent piece in the Los Angeles Times, California growers and producers are in hot pursuit of the $5.4 billion olive oil market and are seeking new regulations that would help them compete with European importers. If new rules were approved, they could eliminate deliberately vague descriptors such as "light" and "pure" and require testing oil for purity and quality. ("Light," for example, does not indicate an oil with fewer calories, but rather one that contains low-quality oil refined through chemical processing.) Californians are betting that their olive oil is better than most of the stuff made in Europe, and think new rules and testing will demonstrate that.
While California oil makers currently account for less than 1 percent of global production and only slightly more of U.S. consumption, that share is growing. Since 2007, U.S. olive oil production has increased tenfold to 10,000 metric tons. To put that in perspective, Americans consumed 293,000 metric tons of olive oil in 2013, most of which came from European countries such as Spain and Italy. The dream for California producers is to create a better, higher-quality oil than their European competitors—and in doing so, convert their fellow Americans to choosing California olive oil first. New labeling standards could help make that dream a reality. They could also help attune Americans to the still-surprising fact that most extra-virgin olive oil made in Italy is neither extra-virgin nor made in Italy.
So far, European sellers are not taking kindly to the idea of stricter testing and labeling standards. The L.A. Times reports that the European Union warned in a letter to the California Department of Food and Agriculture that "the standards would be burdensome and confusing for consumers." But that line of argument sounds a little hollow coming from the same regulators that impose strict rules to govern the use of "geographical indications" in marketing and labeling on some of their countries' most prized products. Champagne, to take the most famous example, is only Champagne if it comes from the Champagne region of France; everything else is simply sparkling wine. So seriously are these restrictions taken that over the summer European winemakers vehemently protested the addition of .vin and .wine to the Internet's roster of domain suffixes because they thought it would make it easier for unethical wine sellers to hawk inauthentic and low-quality goods to unsuspecting buyers.
Jeff Colombini, an olive grower at Lodi Farming in Northern California, told the L.A. Times that Europe's hostility toward changes in labeling and testing is rooted in fear that new rules could give American production an edge in the market. "The importers know that if we establish ourselves as the premier, authentic producers of olive oil, we'll cut into their business over time," he said. "They're running scared." Given that Europeans certainly like authenticity guidelines that work in their favor, Colombini has a point.
The Job Market’s Hot Streak Just Snapped
It's all over, folks. Payrolls grew by just 142,000 in August, according to the Bureau of Labor Statistics, well below expectations. Previously, the economy had added at least 200,000 jobs for six months straight, something that hadn't happened since 1997. The streak was nice while it lasted.
Meanwhile, the unemployment rate (6.1 percent) and labor force participation rate (62.8 percent) barely changed. On the bright side, average hourly earnings are up 2.1 percent for the year.
It's always important to take the long view on the jobs report, which will be revised in the coming months—though probably not enough to put August over the 200,000 mark. But that view is pretty dull. The three-month rolling average of job creation has fallen back into the same range it's been hovering around for roughly the past two years. It's the same old story, plodding along.