Hypercritical


February 2013


Annoyance-Driven Development

High Maintenance

I’ve been watching House of Cards, the new TV series available exclusively on Netflix, which reportedly outbid HBO, Showtime, and others for the rights to the show. This is part of Netflix’s ongoing effort to “become HBO faster than HBO can become us.” That quote, from Netflix’s chief content officer Ted Sarandos, neatly draws the battle lines between the old and new worlds of TV.

Once the upstart, HBO now finds itself playing catch-up with Netflix in terms of pricing and distribution. Netflix, meanwhile, is shelling out its own money to try to overcome its historic inability to offer the very best content.

I’m not ready to predict a winner in this race—though the two-year wait for HBO to add AirPlay support to its HBO Go iOS app does not inspire confidence in the old guard. I’m more interested in what Netflix offers that HBO doesn’t.

The answer is obvious to anyone who has used the service. For a fixed, low monthly fee, Netflix lets customers watch TV shows and movies whenever they want, wherever they want, on phones, tablets, “smart” TVs, game consoles, streaming media boxes, Blu-ray players, even personal computers—remember those?

Netflix’s decision to release the entire first season of House of Cards all at once is in keeping with its disregard for the traditional limitations of TV. This is how products and services endear themselves to consumers: remove everything that gets in the way of what we want. We want to be entertained. We don’t want to arrange our schedules around your TV show. We don’t want to watch commercials. We don’t want to be forced to use a particular device. We just want it the way we want it.

But even Netflix has been unable to escape some of the trappings of the days of video past. A TV series like House of Cards that’s released a season at a time naturally lends itself to multi-episode viewing sessions. But as I recently tweeted, watching a minute and a half of opening credits before each episode can get tiresome.

This position proved somewhat controversial on Twitter. Hard-working people deserve credit, some said. Others said that the credits set the mood for the show. Some people just plain liked the credits, with no qualifiers.

But there were also people who agreed with me, people who routinely skip the opening credits (often lamenting the limited content-skipping tools provided by their chosen Netflix viewing device). One person even read my tweet while killing time as the House of Cards credits ran in another browser tab.

To be fair to Netflix, the existence of opening credits may not be entirely under its control, even when it’s paying for a series itself, given existing union contracts for actors, directors, writers, etc. But getting bogged down in the details of this debate misses the point.

Yes, opening credits are a longstanding part of traditional TV—but so were fixed broadcast schedules, commercial breaks, and viewing all TV shows on a television set. As the delivery mechanism changes, the content itself must also adapt to its changing context.

Not everyone binges on House of Cards four episodes at a time, but the people who do really love Netflix for making it possible. Every time I fast-forward through those 90-second opening credits (made more difficult by the occasional variable-length pre-credits scene), I get the opposite feeling about Netflix. It’s an unhappy reminder of the old world of TV. No explanation of contractual obligations or artistic credit is going to convince me that I’m mistaken about my own desires. I just want it the way I want it!

This may sound comically selfish, but true innovation comes from embracing this sentiment, not fighting it. For companies looking to get the best bang for their buck out of technology, this is the way forward. Find out what’s annoying the people you want to sell to. Question the assumptions of your business. Give people what they want and they will beat a path to your door.

This brings us, perhaps surprisingly, to the PlayStation 4, the newly announced successor to the six-year-old PlayStation 3. Six years is an eternity in the world of technology. For the first few decades of console gaming, each new hardware platform surpassed the capabilities of its predecessor by leaps and bounds. There was little question about what to do with technology. More, better, faster was an end in and of itself. If you build it, the games will come.

The Wii was the first console to break that cycle, directing a large chunk of its innovation toward a novel control scheme, sacrificing raw computing power to do so. It worked. The Wii became the best-selling console of its generation, and its competitors soon followed with non-traditional control schemes of their own.

Based on what’s been announced about the PlayStation 4 so far, it seems that Sony has learned at least some of the lessons of the Wii. While the PS4 will indeed be substantially more powerful than the PS3 (and embarrassingly more powerful than its competitor from Nintendo, the Wii U), Sony has not chosen to sink millions into developing a radical new CPU architecture like the PS3’s Cell processor in the hopes that raw MIPS will inexorably lead to market dominance.

Instead, Sony has built the PS4 using a nicely balanced arrangement of existing technology. All the time, money, and energy that would have otherwise gone toward a true Cell successor has been refocused on ensuring that the PS4 does things that make Sony’s customers happy.

Game developers are one kind of customer. There may not be many of them relative to the number of people Sony hopes will buy its products at retail, but developers can make or break a game console by choosing which games to develop for which platform, and when. And developers sure weren’t happy with the PS3, which was unlike any piece of gaming hardware that had come before it. Thanks to its familiar combination of an x86 CPU and an AMD GPU, the PS4 will be much easier to write games for.

Sony feels gamers’ pain as well. The PS4 appears to have been designed by identifying the parts of the PS3 experience that are annoying and deploying technology to eliminate them. Deciding to play a game and being delayed by 30 minutes of mandatory system updates is not fun, so Sony added a dedicated processor to handle background downloads, and a low-power state for the entire system to allow this to happen unattended. Resuming an interrupted gaming session only to find yourself back at the last checkpoint in the game is not fun, so Sony promises the ability to suspend a game’s state in its entirety and resume later at the instant you left off. Waiting an hour for a multi-gigabyte game to download before you can start playing it is not fun, so the PS4 will allow games to be played as they download.

Sony is providing new features as well. A dedicated video encoder allows gameplay to be recorded in real time with no loss of performance, and a “share” button on the controller allows that video to be uploaded (in the background, naturally) without leaving the game. That same video encoding hardware plus Sony’s game-focused social network will allow players to invite their friends to watch them play in real time. Sony even promises the ability to play games remotely. If a player is having trouble with some part of a game, he could invite one of his friends to remotely assume control for a bit to help out.

Now, anyone who remembers Sony’s promises about the PlayStation 3 knows all too well how far they can be from the eventual reality. I’m very skeptical about Sony’s ability to deliver all the announced PlayStation 4 capabilities in a competent and timely manner. And then there are all the areas where the interests of gamers and game developers may conflict (e.g., the market for used games).

But when I look at the PlayStation 4 hardware itself, I see a shrewd acknowledgement of the true nature of innovation. It doesn’t cost much to add dedicated silicon to handle background network transfers and video encoding and decoding, and it sure isn’t sexy, technologically speaking. Low-power sleep states, instant suspend/resume, progressive downloads, and remote play are all features that are a giant pain to implement and do precisely nothing to make games look, sound, or perform better. But it’s these things, not the number of CPU/GPU cores or the amount of RAM, that really have a chance of making the PS4 gaming experience stand head and shoulders above what has come before.

We nerds love technology for its own sake. Indeed, there’s always something to be gained by advancing the state of the art and providing more of a good thing. But the most profound leaps are often the result of applying technology to historically underserved areas. By all means, make everything better and faster, but also find the things that seem like minor annoyances, the things that everyone just accepts as necessary evils. Go after those things and you’ll really make people love you. Accentuate the positive. Eliminate the negative.


Don’t Stop Thinking About Tomorrow

The iPhone 5 caught some flak for being “too light.” Similarly, some consider the latest revision of the iMac to be “too thin.” You’ll find some incredulity in the articles that address this topic. It’s a little silly, right? After all, what’s the alternative? Thicker and heavier? Stagnation? But these complaints are not entirely unreasonable.

When it comes to electronics, density is often a signal of quality. A product that feels like an empty metal box seems cheap. A tiny item with surprising heft seems expensive. For handheld items, higher density can also help produce stronger, more concentrated pressure on the hand. This helps to more clearly distinguish the sensation of a securely held item from that of one about to slip out of the hand. I’ve heard this complaint about the iPhone 5 many times: “It’s so light, I’m afraid I’m going to drop it!”

No one is holding an iMac while using it, so there’s no fear of dropping it. But if it’s not being held, why the rush to slim down? Dissatisfaction with the ever-slimming iMac is exacerbated by the removal of the optical drive in the latest revision. In all likelihood, that optical drive was going away regardless of the thickness of the iMac’s edge. (Apple’s been steadily dropping optical drives from the Mac line for years.) Still, some people can’t help but infer a cause and effect relationship, blaming Apple’s seemingly pointless drive for thinness for the loss of the slot for the spinning shiny things.

In the past, I’ve voiced my own complaints about the edge of the latest iMac and how the iPhone 5 feels in the hand. But though I might disagree with the timing and details of these changes, I fully support the broader long-term trend towards lighter, thinner hardware. Here’s why.

In technology, things that can be measured appear to exist on a smooth continuum: large to small, slow to fast. But the experiences provided by these measurable quantities often have sharp discontinuities.

Consider touch-screen user interfaces. They’ve existed for decades, but it wasn’t until the iPhone arrived that they entered widespread usage. Yes, there are many non-tech factors that contributed to this, but the responsiveness of the iPhone’s interface was an essential factor. With the iPhone, touch interfaces finally crossed the threshold from frustrating to joyful.

I’m not sure where the threshold is, or even what quantities it applies to (e.g., frames-per-second of animation, input lag, finger pressure), but it’s definitely there. It’s not a steady ramp from unacceptable to acceptable. It’s a perceived discontinuity—a leap.

Most measurable qualities of tech products have experiential discontinuities like this. In fact, there are usually multiple discontinuities. It’s human nature to think that we’re at the pinnacle of useful achievement, but it’s never actually true. Watch what happens to the experience of using a touch-screen when we go in search of the next discontinuity—what the Microsoft researcher in this video calls “a perceptual cliff.”

This phenomenon is not limited to performance measurements. It extends to every aspect of a product, including size, weight, and even shape. Let’s reconsider the iPhone. The change in thickness and weight between the iPhone 4S and the iPhone 5 was very small. Using an iPhone 5 does not feel dramatically different than using a 4S. Clearly, the iPhone 5 has not yet reached the next perceptual cliff—but it’s out there.

Consider a distant-future iPhone roughly the same width and height as the iPhone 5, but as thin and as durable as a credit card. Accidentally drop such a phone and it’d flutter harmlessly to the ground. Now maybe this would be a terrible design—the edges might dig into your hand, and it might be even less secure-feeling when held—but it’d clearly change the equation when it comes to fear of dropping your iPhone (not to mention where and how to carry it, and so on).

Don’t get distracted by the details. I’m not arguing for or against a particular design. My point is that it’s important to keep making progress towards the next discontinuity, wherever it may be.

Apple has its compass trained on “thinner and lighter,” a direction that’s proven fruitful in the past. But as much as we’d all like to jump right to the next big win, you can’t just skip to the end. The original iPhone was never going to be followed by the credit-card-thin iPhone—again, ignoring whether this is actually a good idea; stay with me! Instead, it was followed by the 3G (thicker in the middle, but thinner-feeling on the edge), then the 4 (thinner overall), then the 5 (thinner still), and so on.

The same goes for the iMac, with the same caveats about the direction and endpoint. How does the iMac change as a product when it’s as thin as an iPad, or a cafeteria tray, or a credit card? Does it even need to exist at that point? Maybe the distant-future iMac is “just a big iPad.” Or maybe some new I/O device makes all of this moot.

Mistakes will be made in the march towards the future. But the worst possible mistake is neglecting to do the work required to get there because you think we’ve already arrived. There is no destination; there is only the journey. Pick a direction or get out of the way.


Apple’s 2013 To-Do List

I didn’t just lead Apple to a record quarterly profit of $13.1 billion on sales of $54.5 billion, so I don’t expect to be consulted. But were Tim Cook to ask me, here’s what I would tell him Apple should do in 2013—in broad strokes, and in no particular order. (We’ve got people to work out the details—right, Tim?) This is not a fantasy wish list. These are things I think Apple can and should do this year. This list is not exhaustive.

Should be a cinch, right? Too bad there are only two items on this list that will help Apple’s stock price recover from its calamitous 35% drop over the past four months. Uneasy lies the head that wears a crown.