A blog about business and economics.

Oct. 15 2014 1:42 PM

Uber Rival Gett Is Making a Risky, Clever Play in the Ride-Sharing Game

Sometimes it's easy to forget that there are contestants in the on-demand car-service game other than Uber and Lyft. One of them is Gett, and starting today it's making a big move to get more recognition from arguably the most important players: for-hire drivers.

Gett announced on Wednesday that it will begin paying drivers on its platform a flat rate of $0.70 per minute—an amount it says will double the typical payout for drivers on competing services UberX and Lyft. The $0.70 will come after sales tax and commission, and is separate from any money drivers make in customer tips. Under the new payment system, Gett claims, drivers working 40 hours per week will be able to touch six figures for the year. "Gett more money than ever before" is the tag line.
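Gett's math is easy enough to sanity-check. Here is a back-of-the-envelope sketch, assuming (optimistically, and purely for illustration) that every working minute is a paid, on-trip minute; these utilization figures are my assumption, not Gett's own accounting:

```python
# Back-of-the-envelope check of Gett's pay claim.
# Assumes every working minute is a paid, on-trip minute (a hypothetical simplification).
RATE_PER_MINUTE = 0.70   # dollars per minute, after sales tax and commission
HOURS_PER_WEEK = 40      # the workweek cited by Gett
WEEKS_PER_YEAR = 52

paid_minutes_per_week = HOURS_PER_WEEK * 60
annual_pay = RATE_PER_MINUTE * paid_minutes_per_week * WEEKS_PER_YEAR
print(f"Annual pay at full utilization: ${annual_pay:,.0f}")  # ~$87,360 before tips

# How many paid minutes per week would it take to reach $100,000 before tips?
minutes_for_six_figures = 100_000 / (RATE_PER_MINUTE * WEEKS_PER_YEAR)
print(f"Paid minutes per week to hit $100,000: {minutes_for_six_figures:,.0f}")  # ~2,747, about 46 hours
```

At $0.70 a minute, 40 fully paid hours a week works out to roughly $87,000 a year before tips, so the six-figure claim leans on tips or on drivers logging more paid time than that.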

The $0.70 rate for drivers is rolling out just over a month after Gett announced a similarly attractive incentive for New York City riders: a flat $10 fare for any trip in Manhattan. There's no surge pricing and no added cost for time ticking away as you sit in midtown traffic. "We don't try to cut corners to find a cheaper price point," says Ron Srebro, Gett's chief executive. "We always play fair, we always work only with licensed drivers."

In courting riders and drivers alike, the company is stepping into uncertain economic territory. Gett has now drastically cut fares on the consumer side of its platform and presumably increased expenditures on the driver side. That doesn't make a whole lot of sense—how can it offset higher costs to drivers while cutting prices for consumers? Srebro was reluctant to elaborate on how Gett plans to wrangle the finances of all this but admitted that Gett has not tested this particular pay-and-fare model in any of its other markets. (Gett is active in 24 cities including London, Moscow, St. Petersburg, Jerusalem, Tel Aviv, and Haifa.)

"I do think categorically that this is totally sustainable," Srebro says. "The $10 promotion for us is sustainable. The driver pay is sustainable. And even for us if we need to change that balance a little, it will be sustainable." It's certainly possible that in the short term, Gett's markets abroad—which Srebro says are bigger, more established, and more competitive with Uber—will help subsidize losses in New York. "This is a growth business so no, we're not losing a lot of money," he adds.

What's most interesting about Gett's decision is the implicit understanding that drivers are every bit as important as customers—if not more so—on its two-sided platform. So far, Uber and Lyft have largely played to consumers' interests as they undercut each other's fares and pushed down driver pay along the way. But lately drivers have grown frustrated with this system. When Uber tried to force its high-end SUV drivers to take cheap UberX fares, the drivers protested until Uber backed down. More and more, drivers are shuttling back and forth among Uber, Lyft, and other on-demand car services to work for whichever one currently pays the best rate and treats its workers most fairly.

Gett isn't as big as Uber and Lyft, and it certainly isn't as well-known in the U.S. But if Gett can draw enough drivers from its competitors to slow down their services without going broke in the process, it has a shot at carving out a bigger space for itself in the ride-sharing market.


Oct. 15 2014 11:57 AM

HBO Finally Gives In, Says You Can Get HBO Go Without Paying for Cable Next Year

Cord cutters of the world, rejoice! HBO says that starting in 2015, fans will be able to sign up for a standalone version of its online streaming service, HBO Go, in the United States without paying for cable TV. The news comes directly from CEO Richard Plepler, who was speaking at an investor presentation for HBO's parent company, Time Warner. Per Peter Kafka at Re/code:

Plepler said the company will launch a “standalone, over the top” version of HBO in the U.S. next year, and would work with “current partners,” and may work with others as well. But he wouldn’t provide any other detail.
Even that vague statement is a milestone for HBO, Time Warner, and the TV business in general. For years, Time Warner and HBO have said they’re happy with the existing system, where HBO is sold to consumers by TV providers, and is usually only available to customers who are already buying another bundle of TV networks.

HBO has been served very well by its old model—as Kafka notes, it earned $4.9 billion in revenue last year. So why toy with it and risk the wrath of cable companies like Comcast and Time Warner Cable that are now so integral to HBO's business? A lot of the answer probably lies in this chart.

Chart: Netflix stock price.

Yep, no-cable-required Netflix is doing pretty well right now, and it presents some nasty competition for HBO. In any event, broke twentysomethings will soon no longer be able to cite the lack of a standalone streaming package when they steal their parents’ HBO password.

Oct. 15 2014 11:23 AM

No, Mainstream Economists Did Not Just Reject Thomas Piketty’s Big Theory

Did the economics profession just resoundingly reject Thomas Piketty’s most famous theory? You might have thought so if you were hanging around Twitter yesterday afternoon.

Every so often, the University of Chicago’s Initiative on Global Markets asks a panel of top economists to weigh in on hot public policy issues, such as the effect of stimulus spending, whether Uber and Lyft are good for consumers, or whether the minimum wage kills jobs. It's a way of gauging mainstream opinion in the field. On Tuesday, it released a poll on Piketty’s trademark argument that inequality will increase when the return on capital exceeds the rate of economic growth, shorthanded as r>g.

Specifically, it asked economists whether they agreed or disagreed with the following statement: “The most powerful force pushing towards greater wealth inequality in the US since the 1970s is the gap between the after-tax return on capital and the economic growth rate.”

Or to simplify: Does Piketty’s theory explain why the wealth gap has been rising for the past 40 years? Overwhelmingly, the panel’s answer was no, with only one of the 36 panelists agreeing with the statement.

Afterwards, a number of journalists, economists, and other wags took to Twitter and blogs to talk about how Piketty had just gotten a black eye. Sample headline courtesy of the American Enterprise Institute's James Pethokoukis: “Survey: 81% of top economists disagree with the Piketty inequality argument.”

Except they didn't really. Piketty’s Capital in the Twenty-First Century never suggests r>g is the main reason behind the recent rise of inequality. Rather, it theorizes that, in the absence of government intervention, r>g ensures the future concentration of income and wealth. Tellingly, Berkeley professor Emmanuel Saez, one of Piketty’s longtime collaborators, disagreed with IGM's statement, noting: “Income and savings inequality increases are now fueling US wealth inequality. Down the road [r>g] will be central as predicted by Piketty.”
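For readers who want the mechanics, here is a stylized sketch of why r>g is a claim about the future rather than about the past 40 years (a simplification for illustration, not Piketty's full model): if capital owners reinvest a fraction s of the return r while national income grows at rate g, the wealth-to-income ratio compounds upward whenever sr exceeds g.

```latex
% Stylized sketch, not Piketty's full model: W_t is private wealth, Y_t is national income.
% Capital owners reinvest a fraction s of the return r; national income grows at rate g.
W_{t+1} = W_t\,(1 + s r), \qquad Y_{t+1} = Y_t\,(1 + g)
% The wealth-to-income ratio therefore evolves as
\frac{W_{t+1}/Y_{t+1}}{W_t/Y_t} \;=\; \frac{1 + s r}{1 + g} \;\approx\; 1 + (s r - g),
% so wealth pulls away from income, slowly but relentlessly, whenever s r > g.
```

The compounding is gradual, which is exactly why Saez calls r>g a force for "down the road" rather than the engine of the past four decades.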

As Justin Wolfers wrote in the New York Times yesterday, some of the panelists did explicitly reject Piketty’s real argument. For instance, MIT’s Daron Acemoglu wrote in a comment, “Theoretically and empirically the case that [r>g] is a major determinant of inequality or even top inequality is weak.” Acemoglu has co-authored an entire paper making the point. Ultimately, though, IGM was asking economists to opine on an argument that nobody was making in the first place.

I found myself wondering: How would Piketty himself weigh in?

“Well,” he told me in an email this morning, “I think the book makes pretty clear that the powerful force behind rising income and wealth inequality in the US since the 1970s is the rise of the inequality of labor earnings, itself due to a mixture of rising inequality in access to skills and higher education, and of exploding top managerial compensation (itself probably stimulated by large cuts in top tax rates). So this indeed has little to do with r>g.”

In short, you can add Piketty to the "Disagree" column, too.

Oct. 15 2014 8:56 AM

Jimmy John’s Makes Its Employees Sign a Ridiculous Noncompete Agreement

Jimmy John's must think it knows an awful lot about the art of the sandwich, because it's doing its utmost to keep employees from taking their sandwich-making skills elsewhere. According to a noncompete clause the Huffington Post dug up, Jimmy John's makes low-wage employees such as sandwich-makers and delivery drivers agree not to work for competing establishments for two years after leaving the company. Noncompetes tend to be reserved for managers or high-ranking employees with inside information about the business, but that's not even the strangest part of this situation. The strangest part is Jimmy John's definition of a competitor.

According to the noncompete clause, employees must agree that for two years after leaving Jimmy John's, they will "not have any direct or indirect interest in or perform services for ... any business which derives more than ten percent (10%) of its revenue from selling submarine, hero-type, deli-style, pita and/or wrapped or rolled sandwiches and which is located within three (3) miles of either [their current place of employment] or any such other Jimmy John's Sandwich Shop." The clause, which has come up because of a lawsuit filed against Jimmy John's this summer, has drawn criticism from employees for being broad and "oppressive." Kathleen Chavez, the lawyer representing employees in the case, told HuffPo that the terms of the noncompete would prevent a former Jimmy John's employee from working in 6,000 square miles in 44 states and Washington, D.C.

Rules on noncompete agreements vary by state, but generally such clauses are considered enforceable only when they are appropriately narrow and designed to protect a legitimate business interest. California, which is particularly protective of competition and of employees' right to change jobs, has a blanket rule that noncompetes are unenforceable. Since Jimmy John's clause is so broad and seems to have little or no vital business justification behind it (what state secrets do those sandwiches contain?), it's likely to be considered unenforceable pretty much anywhere. That said, an unenforceable clause is still problematic if it's scaring employees who don't know any better into thinking they can't work at another sandwich shop—or another restaurant of any sort with a trade in sandwiches—for the next two years.

Oct. 14 2014 8:11 PM

The Decline and Fall of the San Francisco Bay Guardian

Another alt-weekly is dead. But this one wasn’t just another alt-weekly. It was the San Francisco Bay Guardian, one of the most venerable, staunchly independent, and defiantly weird of America’s great alternative weekly newspapers.

The paper, founded in 1966, is shutting down for “financial reasons,” the San Francisco Chronicle reported on Tuesday. The decision was made by San Francisco Media Co., the parent company that bought it in 2012 from its founder, Bruce Brugmann. “It is the hardest decision I’ve had to make in my 20-year newspaper career,” the San Francisco Media Co.’s publisher, Glenn Zuehls, said in a statement. A notice on the Bay Guardian’s website says the final issue will be published on Wednesday, although one of the paper's fired leaders vowed to SFist that it will "live on in some form."

The loss of the paper in its current incarnation may not deal a great blow to San Francisco. The Guardian, like many alt-weeklies, had been sliding for years as readers and local advertisers turned from print media to the Web.

Over the decades, though, the Guardian as an institution had come to stand for something more than just convenient concert listings, seamy classified ads, and snarky coverage of the local political scene. It was an embodiment of a certain vision of the city—a vision of San Francisco as a haven for artists, immigrants, eccentrics, hobos, bohos, gays and lesbians, and any extant members of that perennially endangered species, the local working-class family.

Today, as the city’s booming technology industry drives housing prices beyond the means of even the upper-middle class, that vision has begun to take on a sepia tone.

It was, in many ways, Brugmann’s vision. An inveterate gadfly, he wielded the Guardian as a bullhorn in his endless battles with the city’s business interests, real-estate developers, and his personal bête noire, the Pacific Gas and Electric Co. The paper also reflected the old-school leftist ideology of longtime editor Tim Redmond, who left in 2013 after 30 years rather than acquiesce to the new owners’ plans to slash editorial staff. It was a brawling, muckraking paper whose reporters doggedly pursued injustice wherever their editors agreed it existed—which is to say, wherever the perpetrators were rich or powerful or aligned with the forces of capitalism.

Photo: The Guardian railed against the "Manhattanization of San Francisco." (Justin Sullivan/Getty Images)

PG&E, which Brugmann saw as an illegitimate private monopoly run at the public's expense, was not the only target. In the 1960s and ’70s, the paper led a long-running activist campaign against high-rises and new public transit lines in the city's downtown, which it decried as "the Manhattanization of San Francisco." In the 2000s it was home to investigative reporter A.C. Thompson, whose Guardian stories were credited with exonerating two local men wrongly convicted of murder. 

It was also the purveyor of such hard-hitting features as an annual special issue on nude beaches.

The Guardian’s politics were often glossed as progressive, but critics reckoned that was a misnomer. Progress in the sense of modernization or urbanization had little place on its agenda. It could actually be rather reactionary in its defense of what it saw as San Francisco’s historic character. In a city that leaned left to begin with, even former employees admit the Guardian’s editorials often amounted to “preaching to the choir.”

That predictability left it vulnerable to competition from upstart rivals, most notably the SF Weekly, which at times leaned libertarian but was generally more catholic in its outlook. (Former Slate media columnist Jack Shafer was the Weekly’s editor for a brief period in the mid-1990s. Now at Reuters, Shafer wrote an astute piece last year about the forces behind alt-weeklies’ long decline.)

In the 2000s, the rivalry flared into open combat, with the Guardian suing the Weekly for selling ads below cost in a bid to drive the Guardian out of business. The case dragged on for years in what the New York Times’ David Carr called “one of the last great newspaper wars.” The Guardian kept winning in court, but the Weekly and its corporate owners refused to give up—or pay up. At one point the Guardian got so frustrated that it seized two of the Weekly’s delivery vans.

In the end, they both lost. Locked in brutal battle with one another, they failed to meet the greater challenge of the media’s move online, which has punctured the business models of newspapers nationwide. The young readers who once turned to alt-weeklies turned instead to blogs.

Many of the alt-weeklies that are now failing in cities around the country will scarcely be mourned. While they trained generations of young reporters and occasionally produced inspired journalism, most were formulaic copies of a handful of innovators like the Village Voice and the Chicago Reader. The Bay Guardian was never quite in that class, but it was the genuine article nonetheless, capturing sides of the city’s character that the broadsheet Chronicle overlooked.

“For all its bluster, the Guardian did a lot of important civic journalism, and a lot of investigative journalism,” said Tali Woodward, who wrote for the Guardian from 1999 to 2006 and is now director of the Master of Arts program at the Columbia Journalism School. “It was the opposite of today’s blog-heavy, commentary-style journalism in a lot of ways, because even though it had a viewpoint and was very political, it was really a reporter’s paper.”

Even one of the Guardian’s former sworn enemies couldn’t muster any joy in the news of its passing. “I feel weirdly bummed out,” admitted Tommy Craggs, the editor in chief of Deadspin, who wrote for the SF Weekly from 2002 to 2006. “It’s just shitty to see what’s happening with alt-weeklies.”

Years ago, Craggs gleefully poured gas on the Weekly/Guardian conflagration with a column in which he subjected a hand-written letter from Brugmann to psychoanalysis by a local graphologist. Yet the Guardian’s demise had him feeling uncharacteristically wistful on Tuesday. “It should have been a fun rivalry,” he said, “but it had a nastiness behind it, especially on the Weekly’s side.”

The saddest thing about the Guardian’s downfall is that the paper has faded from relevance at a time in San Francisco’s history when the high-flying tech sector is threatening to engulf everything about the city that the Guardian and its loyal readers held dear. A backlash is brewing in the streets, on blogs, and in magazines. For once, the Guardian is not leading the charge.

Oct. 14 2014 3:49 PM

New Converse Lawsuit Says There Can Only Be One All Star

Will the real Converse All Star please stand up? First popularized as iconic basketball sneakers and inextricably linked to counterculture figures and the high-schoolers who imitate them, the Chuck Taylor All Star is one of the world’s most famous sneakers. It’s no surprise that imitators have cropped up over the years, and now Converse is trying to knock imposters off the racks with a lawsuit against 31 companies including Walmart, Skechers, and Kmart.

The New York Times reports that Converse is accusing the companies of trademark infringement and seeking monetary damages; Converse is also pursuing a complaint with the International Trade Commission to bar look-alike counterfeits from entering the country.

The alleged design infringements include one or two black stripes and, probably most important, the rubber cap above the toe. Those elements have led the company to send more than 160 cease-and-desist letters, according to the Times, which also stresses how difficult trademark infringement cases can be in the fashion world: functional designs can’t be protected, and the attributes in question must be directly linked to the company in consumers’ minds.

In short, Converse needs to prove that when people see the black stripes on the soles and the rubber toe cap, they actually think “Converse.”

I have no legal degree, admittedly, but I did spend many years as a counterculture-obsessed teen, and when I purchased off-brand rubber-toed footwear with black stripes and flat laces, it was solely to supplement my one legitimate pair of worn-in, white canvas Chucks. In other words, I wanted more Converse, but could not afford the real thing, so I purchased an imitation.

My nonlawyer's opinion, therefore, is that the imitation evident in, say, the Faded Glory Women's Canvas Lace to Toe available at Walmart or in Skechers' Bobs Lo-Topia is borderline undeniable. The smoking gun? I found both of those shoes by searching the sites for the word "Converse."

Oct. 14 2014 3:00 PM

No, Washington, D.C., Is Not the Most Expensive City in America

Yesterday, the Washington Post reported that, according to a recent government study, the D.C. region was "the most expensive place to live in the country, ahead of the pricey markets of New York and San Francisco." I hate to be the guy complaining that somebody said something wrong on the Internet, but the study stated no such thing.

The Post article is based on a report by the Bureau of Labor Statistics that looked at how much the average household spent on housing and related expenses in 19 U.S. metro areas—meaning cities and their surrounding suburbs. And indeed, Washington metro residents shelled out the most for their homes, their utilities, and things like furniture.

Chart: Highest average household spending on housing, by metro area. (Source: BLS)

Here’s the problem. Looking at what the average household spends on housing doesn’t actually tell us whether a metro area is expensive. It just tells us what people spend, whether it’s because they can’t find more affordable options or because they’re well off and want to own an oversize home with a nice backyard. Washington, D.C., isn’t the cheapest city in which to rent an apartment. But the region has the highest median income among the 25 largest metro areas. And if you’ve ever driven through its particularly affluent suburbs, you know they’re chock-full of McMansions that probably push up average spending on things like home furnishings and air conditioning bills. Washingtonians—and their suburban neighbors—spend a lot on housing in part because they can afford to.

There’s a slightly bigger point to make here. Even if Washington, D.C., did have the least affordable housing in the country, that wouldn’t necessarily make it the least affordable city. As I’ve written before, judging whether a city is relatively expensive means taking into consideration factors like transportation, since it’s much cheaper commuting every day on a bus or subway than it is to own a car. In cities with weak public schools, families have to worry about the cost of educating their children. Taxes change the equation, too. And so on. Any report that only looks at a few statistics about housing costs to declare which city is the least affordable would be misleading at best.

Oct. 14 2014 11:36 AM

Man Wants $150,000 for Ebola.com, Compares Self to a Doctor

Because no calamity is complete without a little bit of small-time profiteering, the Washington Post reports that the owner of Ebola.com now wants $150,000 for the URL. Jon Schultz, whom the Post describes as a "merchant of disease domains," also owns such properties as birdflu.com and H1N1.com. He bought Ebola.com in 2008 for $13,500, and now, with thousands of infections in Africa and the entire U.S. beside itself about a few cases in Dallas, he thinks it's time to cash out. “We’re getting inquiries every day about the sale of it," he told the Post. "I have a lot of experience in this sort of domain business, and my sense is that $150,000 is reasonable."

Is there anything inherently wrong with making a little money off a public health crisis in a way that probably won't cause any lasting harm? Investors make far more money betting on the misfortune of others all the time—think about the hedge funders who made bank shorting the housing market before its collapse. But Schultz isn't his own best advocate. When asked how he felt about profiting off a disease that had already killed thousands, he offered up this dazzling analogy:

But you could say the same thing about doctors. ...They can become very well-off treating very sick patients. Besides we have sacrificed a couple of thousands in parking page income to put up links about Ebola on the site. And people can also donate to Doctors Without Borders at the site.

Yes, Mr. Schultz, you are just like a doctor.

Oct. 13 2014 12:37 PM

The Nobel Economist Knew About “Too Big to Fail” 10 Years Before the Rest of Us

There’s something a bit ironic about the work of Jean Tirole, the French economist who won the Nobel Prize today for his influential research on how to regulate large and powerful corporations. Even in papers published decades ago, the subjects of his work feel ripped from today’s headlines—he was writing about the threat of too-big-to-fail banks and the hazards of bailouts all the way back in 1996. Want to talk about how to prevent another financial crisis, deal with Comcast, or think about the meaning of a monopoly in the era of free Internet services such as Google and Facebook? Tirole’s your man, and has been for a long time. Yet he’s not a name you’re likely to see all the time in the New York Times or Wall Street Journal.

“Many of his papers show ‘it’s complicated,’ rather than presenting easily summarizable, intuitive solutions which make for good blog posts,” economist Tyler Cowen wrote in a summary of Tirole’s work. “That is one reason why his ideas do not show up so often in blogs and the popular press, but they nonetheless have been extremely influential in the economics profession.”

And “it’s complicated” is hardly a bad message for the Nobel Committee to be sending. “In the 1980s, before Tirole published his first work, research into regulation was relatively sparse, mostly dealing with how the government can intervene and control pricing in the two extremes of monopoly and perfect competition,” the committee wrote in its announcement. The problem is that perfect competition and total monopolies for the most part only live in econ textbooks. Instead, many markets are dominated by a small group of very large companies—or oligopolies. To get the picture, think about today’s increasingly concentrated airline, music, or beer industries. Or the smartphone wars between Apple and Samsung. Or your local cable market, which probably has no more than two (deeply unsatisfactory) providers.

In the 1980s, Tirole and his late colleague, Jean-Jacques Laffont, started using game theory and other rigorous mathematical approaches to model these oligopolistic markets and to figure out how regulators ought to deal with them in consumers’ best interest. The answers tended to change according to the industry, a rebuff to the old idea that regulators should apply a few simple rules broadly across the entire economy. Many of his ideas have been picked up by governments, but not all. As Joshua Gans wrote today at Digitopoly, “Want to know why there is so little competition in telecommunications and broadband service in the US? go open one of Tirole’s books; it is time you listened.”

As the Nobel Committee put it, “The best regulation or competition policy should therefore be carefully adapted to every industry’s specific conditions.” Another way to frame it: It’s not about light regulation, or heavy regulation, but smart regulation. Which is complicated.

Oct. 10 2014 5:08 PM

Wayward Olive Garden Might Start Salting Its Pasta Water Again

About a month ago, it came out that Olive Garden had been committing a culinary crime against humanity: The Italian chain was not salting its pasta water.

The scandal was first revealed in a 294-page slide presentation compiled by Starboard Value, a hedge fund and activist investor that holds an 8.8 percent stake in Olive Garden's parent company, Darden Restaurants, and had been engaged in a lengthy proxy fight for control of Darden's board. Whether the pasta water or something else was the final straw we may never know. But on Friday, Starboard emerged from the battle victorious, ousting the entire Darden board and electing all 12 of its nominated directors.

As Reuters notes, Starboard's win was an unusually large one for activist investors, who typically lock up no more than a few spots on a company's board. Starboard and another activist investor, the Barington Capital Group, had spent months lobbying for Darden to spin off Olive Garden, Red Lobster, and LongHorn Steakhouse from higher-growth chains in Darden's portfolio. Then in May, Darden abruptly decided to sell Red Lobster alone for $2.1 billion over strong shareholder objections. That move escalated tensions and pushed more investors to Starboard's side.

In a statement released Friday, Starboard CEO Jeffrey Smith said that "Darden has all the right ingredients to regain the strength and prominence it once enjoyed. We look forward to continuing our hard work from inside the boardroom and working with management on a shared goal of excellence for Darden."

We can only assume those "right ingredients" will include a restored dose of salt to the Olive Garden pasta water.
