Avoiding Copland 2010: Part 2
The garbage man cometh…but to what end?
I’ve gotten a lot of feedback on my earlier post describing the potential for a Copland-style crisis in Apple’s future. After seeing a much lower standard of reading comprehension demonstrated by certain readers of another popular blog in recent days, the quality and breadth of commentary from Ars readers is refreshing.
I have gotten some requests for elaboration, however, particularly on the topic of Objective-C and garbage collection. To that end, I’d like to begin by expanding upon my earlier arguments in order to give you an idea of where I was coming from. I’ll get to garbage collection in Objective-C eventually, I promise.
In Avoiding Copland 2010, I described two Apple OS crises: one in the past (the lack of memory protection and preemptive multitasking) and one that potentially lurks in the future (the lack of a memory-managed language and API). Many readers focused on the technical details of the two features, noting how one is a change to the core OS and the other is purely additive. But the analogy I was making does not hinge on the technology. It’s the circumstances that are similar.
I was looking back on the situation that led to Copland—an increasingly pressing need for Apple’s OS to have features that contemporary developers would eventually come to expect—and then looking into the future in order to determine where and how Mac OS X might find itself in a similar situation.
As with any forward-looking statements, mine had several built-in assumptions. The first is that fully automatic memory management will eventually be an expected feature of the primary application development API for a desktop OS. This assertion is based on many factors, but none more compelling or more broad than the unstoppable force of ever-increasing abstraction.
All other things being equal, when faced with two technologies, one less abstracted but higher performance, and one more abstracted but slower, the more abstracted, slower technology will always win in the long run. While hardware performance increases over time, the human capacity to deal with complexity does not—at least, not in anything approaching the same time scale as technological advancement.
Despite the performance hit, increased abstraction always wins in the end. This has happened time and time again in the PC industry, and it will hold true as long as computing power continues to increase. Despite the hand-wringing over multi-core CPUs, MHz barriers, and power/heat issues, I think performance will continue to improve for the foreseeable future.
My next assumption is that, by 2010 or so, the rest of the industry (meaning the majority of the industry, since I also assume that Apple’s market share will still be below 50%) will have adopted languages and APIs that feature fully automatic memory management. I based this assumption on the work that Microsoft and others have already put into this area: Java, C#, CLR, WinFX, etc.
I actually think the year 2010 is a bit too early, but I didn’t want to use a date that was too far in the future. People’s brains tend to switch off when faced with distant dates. (“Surely we’ll all have jet-packs and flying cars by 2020!”) So, on the timing issue, there’s definitely some leeway. Let’s just say 2015ish.
My final—and most controversial—assumption was that several existing technologies, and obvious evolutions thereof, do not adequately fill the need for a language and API with fully automatic memory management. The list of technologies I rejected is long and distinguished.
First there’s Objective-C and Cocoa as they exist today. Some people think that memory management in Cocoa is already close enough to “fully automatic” that the differences don’t matter in any practical sense. I disagree, but clearly it’s an issue that’s still hotly debated. I got about equal amounts of email on both sides of the issue, from a wide range of people.
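For readers who haven’t lived with it, Cocoa’s memory management is manual reference counting: every object carries a count that the programmer must increment and decrement by hand, following ownership conventions. Here’s a minimal sketch of that discipline in plain C—a toy illustration of the bookkeeping, not actual Cocoa API:

```c
#include <assert.h>
#include <stdlib.h>

/* A toy reference-counted object. In Cocoa, retain/release are
   methods on NSObject; here they're plain functions on a struct. */
typedef struct {
    int refcount;
    int payload;
} Obj;

/* Analogous to +alloc/-init: the creator owns one reference. */
Obj *obj_create(int payload) {
    Obj *o = malloc(sizeof *o);
    o->refcount = 1;
    o->payload = payload;
    return o;
}

/* Claim a share of ownership. */
void obj_retain(Obj *o) {
    o->refcount++;
}

/* Relinquish ownership; frees the object when the count hits zero.
   Returns 1 if the object was deallocated, 0 if it's still alive. */
int obj_release(Obj *o) {
    if (--o->refcount == 0) {
        free(o);
        return 1;
    }
    return 0;
}
```

The point of contention is exactly this bookkeeping: forget one release and the object leaks; release once too often and you crash. Whether Cocoa’s conventions make that burden negligible or merely tolerable is the debate.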
(The productivity of Objective-C programmers vs. say, Java programmers often comes up as well, but that’s mostly a tangential issue. On the other hand, it may stave off the day of reckoning for Mac OS X, depending on how much competing APIs like WinFX and the Java library improve over the years.)
Next, there are the various bridges to Cocoa from other languages: the Java/Objective-C bridge (now dead), the Python bridge, the Perl bridge, and any other language you can imagine. I rejected these because of the “second-class citizen” problem. No one wants to invest heavily in an API using anything other than the “native” language. While bridges are handy, and a Godsend for people doing certain kinds of work, they’re not a good general solution for political reasons alone. Then there are the technical and cultural issues that arise from the differences between the bridge language and the native language. In my previous post, I simply said that “bridges suck,” which was probably a bit too succinct. I hope I’ve provided more context for that statement.
Finally, there’s Objective-C with garbage collection. A reader posted some pretty compelling evidence that Apple is at least thinking about adding garbage collection to Objective-C. Also, garbage collection has already been added to at least one implementation of Objective-C. Doesn’t this fulfill all the requirements?
Perhaps surprisingly, I also rejected garbage-collected Objective-C as a suitable solution for Apple’s OS crisis of the future, but this time my reasoning is a bit more subtle. Ignoring garbage collection for a moment, I don’t think Objective-C, in any form, will ever be suitable competition for the memory-managed languages of the future. The fact that Objective-C is a superset of C has been a tremendous strength in the past, and continues to be one today. But this strength will eventually become a weakness as developers come to expect an environment that is more, well, “managed.”
A developer working on high-level code in a GUI application is most productive when working in a development environment that provides a similarly high level of abstraction. In the past, NeXTSTEP provided one of the best such environments with Interface Builder and its Objective-C frameworks. Mac OS X continues that legacy today with Cocoa. But going forward, adequate abstraction will mean “managed code,” to use Microsoft’s terminology, from top to bottom, and an assurance that things stay that way without an explicit effort.
In other words, there has to be some sort of gate between the world of managed code and the low-level, “unsafe” stuff. Java chose to go with a “pure” approach and removed that gate entirely. With C#, Microsoft left the gate, but put a latch on it. Developers can drop down to “unmanaged” code, but it takes a deliberate act by the developer to do so.
In Objective-C, on the other hand, it’s all just one big grassy field. Raw C code is just one keystroke away at all times. I think that’s a dangerous arrangement that will be perceived as a relic eventually. Again, it’s a strength today, but in the future, I think the desire for “safety by default” will win out.
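To make the “grassy field” concrete: everything below is legal, ordinary C, and therefore legal inline in the middle of any Objective-C method, whereas C# confines this kind of raw pointer work to an explicitly marked `unsafe` block. (A hypothetical illustration, not code from any Apple framework.)

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* Walk the raw bytes of an integer through an unsigned char pointer.
   This is well-defined C, but it's exactly the sort of unmanaged
   memory access that a "safe by default" language puts behind a gate. */
unsigned byte_sum(uint32_t x) {
    const unsigned char *p = (const unsigned char *)&x;
    unsigned sum = 0;
    for (size_t i = 0; i < sizeof x; i++)
        sum += p[i];  /* inspect each byte of x's representation */
    return sum;
}
```

In a managed language you’d never see an object’s raw bytes at all; in Objective-C, nothing stops you, and nothing marks the code that does it.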
There’s one more problem with garbage-collected Objective-C, and this one has almost nothing to do with the technology itself. Where is the audience for this new language variation?
One of the many blessings of the NeXT acquisition was that it came with a time-tested API and a substantial, preexisting community of extremely talented developers who were already familiar with that API. Unfortunately, the corresponding curse is that these are exactly the people who will be the least interested in garbage-collected Objective-C. Retain/release is as easy as inhale/exhale to these people. So while they may actually endorse the idea of adding garbage collection to the language, they probably don’t have any plans to actually use it.
Fine, you say, garbage collection isn’t really for the old pros anyway. It’s for the new Cocoa developers, right? Get on board the Cocoa train, it’s easy! But as in any community of this kind, all the newbies inevitably want to be like the old pros. And if the old pros retain and release, then by gum, so will Joe Newbie. Again, it’s the second-class citizen problem. “Sure, kid, you can use GC if you want. But ‘real’ Objective-C programmers do it manually. That’s the ‘native’ way to do it.” Ugh, the kiss of death.
So, is garbage-collected Objective-C not even worth doing? That depends. In order to make it worthwhile, Apple would have to really commit to it. They can’t just put it out there, a la the Java/Cocoa bridge. If you build it, they probably won’t come.
To pull it off, Apple would have to lead by example. They’d have to build an important application of their own using garbage-collected Objective-C. They’d have to really wring out all the bugs. They’d have to test every single existing framework and make sure everything works from top to bottom.
That’s a tall order. It’s a lot of work with very little “forward progress” from the perspective of Apple management. I don’t know what Apple will actually decide to do, but it’ll be interesting to watch.
The final unwritten assumption in all of this is that, in all the areas that I didn’t mention, I think Apple is well positioned with Mac OS X. Multiple CPU support, scalable UI, stability, graphics performance, and yes, even multi-threading within Cocoa and other high-level APIs. That’s not to say that things are perfect, or even adequate in all of these areas today. But I feel like Apple is on the right track; the foundations have been laid, and people are clearly already working on what needs to be improved. The memory-managed language/API issue is much more cloudy, and therefore even more concerning.
Let me try to bring this all together. This is a long list of assumptions that were part of my Copland 2010 warning. Rejecting any one of them makes my argument less convincing. A lot of email and comments I’ve received do contest at least one of these premises. But despite this, there still seems to be an overwhelming recognition that eventually, someday in the future, “managed code” will be the norm, and Apple will need to have a competitive solution.
It could just be the hype building around C# and WinFX that’s causing Mac fans to worry about whether or not Apple has all its bases covered, and that’s not really a bad motivation. When you get right down to it, it’s just good business sense: stay aware of what the competition is doing and reassess how you stack up, both now and in the future.
In the end, that’s the most important thing to take away from the Copland 2010 warning: stay alert, reassess regularly, think long term, select a good plan, and then execute it well. That’s how to avoid another Copland. Apple seems to have the “alertness” part down, but I wonder about the procrastination factor. At some point, hopefully sooner rather than later, Apple needs to commit to something.
I worry that they’ll wait too long, and I worry that they’ll make the wrong decision. It’s happened before—not just with Copland, but even in the Mac OS X era. Witness the file system metadata detour. Mac OS X started off with a totally wrong-headed approach to file system metadata, inspired by a dangerously retro view of files as nothing more than simple streams of bytes. Four years later, Apple has started to turn that ship around. Sure, that’s hardly what you’d call “nimble,” but it’s better than nothing. (Who knows? Maybe even the Finder will get some attention someday. Ha, just kidding, of course.)
What should Apple do? Adopting any competitor’s technology wholesale is a bad idea, for what I hope are obvious reasons. And you can just forget about adopting a competitor’s API in any form. This is all the more true when the competitors are Sun and Microsoft. Yikes.
What’s worked in the past may work again, however. Rather than adopting Java or C# or CLR or WinFX, Apple could harvest the technologies it needs—only considering the “open” ones, obviously—and then combine them into its own re-branded technology.
The other alternative is to do some primary research in this area and come up with a complete solution in-house. But Apple’s not really that kind of company anymore. In days past, Apple was afflicted with NIH, or Not Invented Here syndrome. Everything was created from whole cloth. Sometimes it was good (QuickTime) and sometimes not so much (PowerTalk). These days, it’s exactly the opposite. Basic R&D has been slashed. Anything not directly contributing to an existing or future product is not worth doing. Adding value to existing, proven technologies is now the order of the day. Call it NIE, or Not Invented Elsewhere syndrome. Like NIH, this can be both good and bad. In the Mac OS X era, it has been mostly good, so it’s hard to fault Apple for sticking with what works.
Despite my desire to see an increase in primary research at Apple, my recommendation is to take the Safari/WebKit approach to the memory-managed language/VM issue. Evaluate C#, Java, Smalltalk, Lisp, JVM, CLR, and whatever else is out there. Grab the bits and pieces that are the best fit for Apple’s needs, assemble them into a cohesive whole, re-brand the whole shebang, and go to market.
On the API front, however, I think Apple should turn to its in-house talent. Sure, learn from WinFX, but then give the people behind Cocoa a chance to design a new API around the new memory-managed language. Let them correct the mistakes of the past and create an API based on assumptions that make sense in 2010 and beyond, rather than the late 1980s.
I recognize that my recommendation is so vague that it’s nearly useless. It’d be a lot more useful to say something like, “Apple should adopt C# and CLR via Mono, then port Cocoa to C# and make that the new, official API of Mac OS X.” The problem is, I think that all such recommendations are bad ideas right now. Thus, my vague advice. I don’t envy Apple’s task when it comes to making this decision.
Speaking of which, having said (sort of) what I think Apple should do, here’s what I think they will do. I think they’ll try adding garbage collection to Objective-C, but they won’t really commit to it and it will go largely unused. Long term, as garbage-collected Objective-C is slowly shaken out, I think it’ll start to creep into the mainstream thanks to mounting pressure to at least appear to match the level of abstraction provided by an increasingly mature CLR/WinFX platform.
No clear successor to Cocoa and Objective-C (with or without garbage collection) will be invented inside Apple. If suitable “pieces” do not appear in the market for Apple to adopt and assemble, then Mac OS X will stay with Cocoa and Objective-C “forever.” That is, until the external pressure to be better finally reaches crisis proportions.
I think Apple can postpone this crisis for a long time thanks to its small market share and excellence in other areas. But eventually, the day will come when Cocoa and Objective-C just don’t cut it anymore. In the most optimistic scenario, that date could be a very long time from now indeed. I just hope Apple is considering the worst case, and perhaps reconsidering its ruthlessly product-focused strategy when it comes to the future of the Mac OS X application development environment.
This article originally appeared at Ars Technica. It is reproduced here with permission.