
Not So Swift

Apple has us wondering: Does the world really need new programming languages?

[Image: Swift's logo. "The Swift bird cares not for thee." Courtesy of Swift]

Apple’s new programming language for iOS, Swift, is helpful and useful and redundant and ugly and wrongheaded and a great development and a waste. If you’re wondering how those descriptors could all possibly apply to the same thing, then you probably aren’t a computer programmer.

Swift, introduced at Apple’s annual Worldwide Developers Conference on Monday, has been met with a generally positive response from the Apple community—but it was a pretty low bar to clear. Apple’s default development environment for iOS, Xcode, has used the venerable language Objective-C since iOS was released in 2007; Apple heavily encourages use of Objective-C and the surrounding Cocoa framework APIs, themselves written in Objective-C, and it’s certainly the most popular, most efficient, and best-supported option. But compared with Java, the default development language on Google’s Android platform, Objective-C is clunky, antiquated, more bug-prone, and all around painful. (Full disclosure: I used to work at Google, and my wife still does.) There’s also the issue that while Java is a popular and ubiquitous language, Objective-C is used (almost) nowhere in the known universe outside of Apple’s operating systems. (Software architect David Green, comparing the two, finds Objective-C falling short of Java in every measure.)

So the news that Objective-C—a dinosaur that was costing iOS application developers a lot of time and sweat—was being replaced with anything at all would have been greeted with cheers. Swift indeed looks to be vastly easier to develop in, adding Java-like improvements and efficiencies. But I hesitate to applaud Apple for Swift given how long they let such a terrible application development platform endure. Work on Swift apparently started in about 2010, around the time that Android was making its largest leaps in adoption. (It passed the iPhone in 2011.) Google more or less forced Apple’s belated decision simply by making its mobile application developers less miserable than Apple’s.
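
To make "vastly easier" concrete, here is a small, hypothetical example of my own, in launch-era Swift syntax; the comments sketch the rough Objective-C equivalent:

    // The rough Objective-C equivalent needs boxed numbers and NSLog ceremony:
    //   NSDictionary *ages = @{@"Ada": @36, @"Grace": @45};
    //   for (NSString *name in ages) {
    //       NSLog(@"%@ is %@", name, ages[name]);
    //   }
    // In Swift, the types are inferred and the loop destructures key and value:
    let ages = ["Ada": 36, "Grace": 45]
    for (name, age) in ages {
        println("\(name) is \(age)")
    }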

Given all that, it may sound strange to say that as a programming language, Swift lacks a reason for existing. Yet, unless I’m missing something big, there is nothing in it that hasn’t been done before—and just as well—in another language, be it Java, Python, Ruby, Scala, Rust, Go, Clojure, OCaml, or many others. And all of those other languages offer things that Swift lacks. Unlike all of those other languages, Swift is proprietary, meaning Apple controls it lock, stock, and barrel. It’s close enough to a lot of these others that many developers shouldn’t have too much trouble picking it up. But for all the excitement over Swift, shouldn’t we ask why the world needs another language that doesn’t do anything new—much less one that Apple controls?
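
Consider, as a hypothetical illustration of my own in launch-era Swift syntax, a quick tour of the marquee features, each annotated with a language that got there first:

    // None of this is new under the sun:
    let squares = [1, 2, 3].map { $0 * $0 }    // type inference and closures: ML, Lisp, Ruby

    var nickname: String? = nil                 // optionals: Haskell's Maybe, Scala's Option
    if let name = nickname {
        println("Call me \(name).")             // string interpolation: Ruby, Scala
    } else {
        println("No nickname set.")
    }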

It doesn’t. Swift fills a very particular need: Apple wanted a language compatible with its existing infrastructure. Rust creator Graydon Hoare points out that Swift “appears to have ‘Objective-C object model interop’ as a hard design constraint.” In English, that means Apple decided that Swift needed to mesh with Objective-C wherever possible. There were several reasons for this. One was to make it easier for developers to migrate code from Objective-C into Swift. Another was to make it easy to preserve Apple’s own existing Objective-C frameworks; viewed as a black box, the less Swift does that is functionally different from Objective-C, the less Apple needs to do to support two different models, and the more easily the two can coexist during a drawn-out transition. And needless to say, none of this would be possible unless Apple had total control over both the language and its implementation, just as it did over Objective-C.
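
Here is a minimal sketch, mine rather than Apple's, of what that constraint looks like from the Swift side, assuming nothing beyond the Foundation framework:

    import Foundation

    // Subclassing NSObject slots this Swift class straight into the
    // Objective-C object model: the Objective-C runtime can see it unchanged.
    class Greeter: NSObject {
        func greet(name: String) -> String {
            // Swift's String bridges to NSString, so a legacy NSString
            // method is callable as if it were native Swift.
            return "Hello, ".stringByAppendingString(name)
        }
    }

    println(Greeter().greet("Cocoa"))   // "Hello, Cocoa"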

This is fairly backwards. Cutting-edge languages should not be driven by their implementations. They should not be designed based on their particular platforms, and they certainly shouldn’t be kneecapped based on what other languages they need to be compatible with. To be sure, these considerations sometimes have to play a part: The perfect can be the enemy of the good, but let’s not cheer too hard for the wildly imperfect. Swift is a grab bag, and the absence of many of the advantageous features of those other languages owes purely to Apple’s need to support its legacy Cocoa platform, and to its inability to provide anything beyond what was relatively easy to implement. The language semantics look nice indeed—credit where it’s due!—but behind the much-needed front-end improvements, Swift is a Potemkin village, built out of what was handy rather than what was right. (Apple apologists will respond: Look, it’s still so much better than Objective-C. And they’ll be right.)

The thing is, Apple screwed up a long time ago. It needed a new language not now but in 2007, when iOS rolled out—not just because it would have saved iOS developers a lot of grief, but because it would have allowed Apple to make Swift a better language now. (Of course, that would have hurt iOS application development then, and, well, it’s never a good time to switch over. But the cost only gets worse over time.) Instead, Apple patched up Objective-C and Xcode over the years, introducing incremental features like Grand Central Dispatch, collection literals, and blocks, making the whole edifice increasingly labyrinthine and ad hoc. By sticking with Objective-C until there was such an established iOS application base, Apple let a language driven by legacy constraints win out over one driven by design considerations.

Look at programmer Steve Streza’s list of complaints about Objective-C from February of this year, and you can see that Swift fixes some of them—that is, the ones that Objective-C’s infrastructure made it easy to fix. (The Swift manual doesn’t so much as mention concurrency, which refers to multiple sets of instructions executing simultaneously.) Apple may plan to retire Objective-C gradually and “enhance” Swift over the years with the missing features, but such an ad hoc approach didn’t work out so well with Objective-C.
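
To be concrete about the omission: in launch-era Swift, concurrency means calling into Grand Central Dispatch, the same C-level API Objective-C uses; the language itself contributes nothing. A minimal sketch, mine rather than Apple's:

    import Foundation

    // Swift has no concurrency constructs of its own, so you invoke the
    // C-based GCD functions directly, exactly as you would from Objective-C.
    let background = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)
    dispatch_async(background) {
        var sum = 0
        for i in 1...1_000_000 { sum += i }     // some long-running work
        dispatch_async(dispatch_get_main_queue()) {
            println("sum = \(sum)")             // deliver the result on the main queue
        }
    }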

[Image: The Go gophers. Courtesy of Renee French, Creative Commons, via http://golang.org/doc/gopher/pencil/]

In contrast, the Go programming language, while developed at Google, had a clear reason for its creation: to try to solve problems of concurrency as well as dependency (the network of relationships between pieces of code that depend on each other)—problems that hadn’t been solved well by other languages. Go was released under the permissive BSD free software license, and its developers solicit public feedback. Functional programming mavens like Bryan O’Sullivan may whinge that Go lacks tuples (which group heterogeneous types into a single ordered list), but as someone who wrote part of a Java compiler in the functional ML language, I still think that Go has a lot more integrity than Swift ever will. It also has a cuter mascot than Apple’s soulless Swift bird.
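
For the curious, here is what a tuple buys you, sketched, somewhat perversely, in Swift, which does have them; the example is mine:

    // A tuple groups heterogeneous values into one ordered, anonymous bundle.
    let status: (Int, String) = (404, "Not Found")
    let (code, message) = status               // destructure in one line
    println("\(code): \(message)")             // prints "404: Not Found"

    // Elements can be labeled, too:
    let point = (x: 3.5, y: -2.0)
    println(point.x)                           // prints "3.5"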

So, I hope iOS developers enjoy Swift. They’ve certainly earned it. But a lot of the cheerleading for Apple is simply embarrassing, like getting excited after a night of losses at the blackjack table because you finally won a single hand. Yes, Apple threw us all a bone. If we don’t act too grateful, maybe they’ll even throw us another.