Hypercritical


Can’t help falling in love

Wise men say…

Last year, I described the iPhone as a new frontier for Mac developers, offering the promise of a clean skill transfer from Mac OS X development combined with the thrill of a completely new, uncharted world of user interface design. From the moment that giant "X" appeared on the screen during the iPhone introduction, Mac developers have been bursting with desire to get on this platform.

Yesterday’s SDK announcement brought a year’s worth of iPhone application fantasies to an explosive climax. (Sorry.) The technology and tools are better than most dared hope for in the first SDK release for a new platform: a simulator, a remote debugger, an impressive media stack, Interface Builder support—it’s all there.

Developer reaction to the announcement has revealed an aspect of iPhone application mania that was not as apparent to me during the run-up to SDK Day. It’s the subtext of the “frontier” view of the iPhone: developers see dollar signs—a lot of them.

Skilled Mac developers are uniquely positioned to be the first to market with the iPhone applications they’ve been designing in their heads since last year. They know the tools, they know the technology, they even know a lot of the APIs already, and those they don’t know look a lot like the ones they do.

Mac developers also know the audience, which means that not only can they be the first to market, they can also arrive with the best products. Aesthetically and culturally, the iPhone is a platform extremely close to the Mac.

The big difference for Mac developers is the size of the market. The phone market is bigger than the PC market, and Apple has a much larger chunk. Oh yeah, and don’t forget the portable media player market, which might as well be called “the iPod market” at this point. It seems inevitable that the iPhone OS (née OS X) will sweep across all (non-Shuffle) iPod models in the coming years.

All told, the number of potential customers is staggering, especially to a Mac developer accustomed to selling to a tiny fraction of 5% of a smaller market. Cha-ching, indeed.

This heady mix of a potentially huge, but still nascent market and a year’s worth of pent-up technological and capitalistic enthusiasm is what makes the iPhone development business proposition offered by Apple pass muster with developers. The deal: all iPhone application sales must go through Apple, and Apple takes 30% of every sale. In exchange for its cut, Apple handles application hosting, automatic updates, and customer billing.

But perhaps most importantly, Apple will also put your application a few impulsive finger taps away from every single iPhone and iPod touch user by including an icon for the new App Store on the home screen of all iPhone OS devices. Customers can wirelessly pay for, download, and install your application, using their existing iTunes account and credit card information. (Developers may pause to wipe up their drool.)

Is Apple’s cut too big? As one developer put it, “70% is 100% more than you’re making on iPhone apps right now.” (Or, more bluntly, “How about we make shitloads of money at 70% and ask questions later.”) In a nutshell, it’s a land-rush mentality.

But how is this sales model going to look in a few years, when land has been claimed and lines have been drawn? For a strong argument against this kind of closed development model, look no further than the PC (and yes, Mac) software market. As Steve Wildstrom of BusinessWeek put it:

Apple is trying to establish a walled garden for applications at a time when the whole push of the industry is toward open platforms; in effect, they are just trying to displace the carriers as the keeper of the garden. Other than the iPhone, all of the smartphone platforms […] have open application development. Android is going to push that openness down into the “feature phone” market. […]

I’m not sure how Apple can march against this tide. Jobs is right; the iPhone is a little computer. But he won’t let owners treat it like a little computer and he won’t accept a computer development model. Wrong, wrong, wrong.

Is he right, right, right? My initial inclination is to agree wholeheartedly. But I can’t help but at least consider one particularly strong argument for the viability of Apple’s model: the game console business. Nintendo, Sony, and Microsoft all control which software gets sold on their respective platforms, and they all take a cut (in some form or another). And, like Apple, all game console manufacturers also create and sell software for the platform they own.

While the exact numbers differ wildly, and the console business also has an unseemly and grim brick-and-mortar retail component, it’s still a strong validation that a “gatekeeper” model can work. Now, does it work well? Ask around the game development industry and you’ll get a wide range of answers. Many times over the long history of the video game business, power has shifted too far in one direction or another—though usually towards the platform owners—and software developers have been hurt badly.

The average life expectancy of an independent game development studio should give "indie" Mac developers some pause. Granted, the financial realities of the game business are very different than those of the traditional software business, with artwork and associated digital asset management now dwarfing all other aspects of the game development process in terms of time, money, and manpower. And it’s also true that the console industry has recognized this weakness in its model and has moved to address it on many fronts. But the fact that it’s gone so far in the other direction that it requires this kind of (thus far scattershot and largely ineffectual) reaction definitely says something.

Still, from the lofty perch of the boardroom, the game industry looks pretty good on the all-time graph. Peaks and valleys aside, that line is pointing up, and the numbers on the y-axis are big. Looking up from what is potentially the bottom of such a graph, iPhone developers can be forgiven for getting a little starry eyed.

In the end, it’s a question that the iPhone development community will have to answer for itself. As a means to jump-start the market for iPhone applications, there’s no disputing the clear advantages of Apple’s “App Store” model. Today, it’s a win-win-win for Apple, developers, and consumers. But if you plan to be part of this world, ask yourself what kind of environment you’d like to live and work in five years down the road. It’s definitely something that indie Mac developers in particular should think long and hard about in between cashing all those early-days iPhone application sales checks signed by Apple.


This article originally appeared at Ars Technica. It is reproduced here with permission.