{ "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "feed_url" : "http://hypercritical.co/feeds/main", "home_page_url" : "http://hypercritical.co", "items" : [ { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "
\n\nThe recent Beeper controversy briefly brought the “blue bubbles vs. green bubbles” topic back into the mainstream. Here’s a brief review for those of you who are (blessedly) unaware of this issue. Messages sent using the iMessage service appear in blue text bubbles within the Messages app. Messages sent using something other than the iMessage service (e.g., SMS, or (soon) RCS) appear in green text bubbles.
\n\nThe iMessage service and the Messages app are only available on Apple devices. This is usually presented as a competitive advantage for the iPhone. If you want to use the iMessage service, the only (legitimate) way to do so is to buy an Apple device. If Apple were to make iMessage available on non-Apple platforms, that would remove one reason to buy an iPhone—or so the argument goes.
\n\nI think this popular conception of the issue is slightly wrong—or right for a different reason, at least. The iMessage service is not so good that it makes the iPhone more attractive to customers. It’s the iPhone that makes iMessage attractive. The iPhone gives iMessage its cachet, not the other way around.
\n\nThis truth is plainly evident at the core of the “blue bubbles vs. green bubbles” debate. One of the biggest reasons green bubbles are looked down upon is that they indicate that the recipient doesn’t have an iPhone. iPhones are expensive, fancy, and desirable. Blue bubbles put the sender into the “in” crowd of iPhone owners.
\n\nThe iMessage service itself, when considered in isolation, has considerably less draw. Here’s a 2013 assessment from within Apple, revealed during the recent Epic trial by internal emails discussing the idea of making iMessage work on non-Apple devices.
\n\nEddy Cue: We have the best messaging app and we should make it the industry standard. […]
\n\nCraig Federighi: Do you have any thoughts on how we would make switching to iMessage (from WhatsApp) compelling to masses of Android users who don’t have a bunch of iOS friends? iMessage is a nice app/service, but to get users to switch social networks we’d need more than a marginally better app.
\n\nWhile I appreciate Eddy’s enthusiasm, I think Craig is closer to the mark: if iMessage is better than its competitors at all—and this is highly debatable—it is only marginally so.
\n\nThose Apple emails were written more than a decade ago. In the years since, iMessage has improved, but so has the competition. Today, it still feels like the iPhone is carrying iMessage. Anecdotally, both my teenage children have iPhones, but their group chats with their friends take place in WhatsApp.
\n\nApple has almost certainly missed the most advantageous window of time to make iMessage “the industry standard” messaging service. But as the old saying goes, the best time to plant a tree is 30 years ago, and the second-best time is now. Apple has little to lose by expanding iMessage to other platforms, and there still may be something to be gained (even if it’s just making mixed Android/iPhone conversations in Messages a bit smoother).
", "date_modified" : "2024-02-09T15:15:39-05:00", "date_published" : "2024-02-09T15:15:39-05:00", "id" : "http://hypercritical.co/2024/02/09/the-imessage-halo-effect", "title" : "The iMessage Halo Effect", "url" : "http://hypercritical.co/2024/02/09/the-imessage-halo-effect" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "\n\nThe graphical user interface on the original Macintosh was a revelation to me when I first used it at the tender age of 8 years old. Part of the magic was thanks to its use of \"direct manipulation.\" This term was coined in the 1980s to describe the ability to control a computer without using the keyboard to explain what you wanted it to do. Instead of typing a command to move a file from one place to another, the user could just grab it and drag it to a new location.
\n\nThe fact that I’m able to write the phrase “grab it and drag it to a new location” and most people will understand what I mean is a testament to the decades-long success of this kind of interface. In the context of personal computers like the Mac, we all understand what it means to “grab” something on the screen and drag it somewhere using a mouse. We understand that the little pictures represent things that have meaning to both us and the computer, and we know what it means to manipulate them in certain ways. For most of us, it has become second nature.
\n\nWith the advent of the iPhone and ubiquitous touchscreen interfaces, the phrase “direct manipulation” is now used to draw a contrast between touch interfaces and Mac-style GUIs. The iPhone has “direct manipulation.” The Mac does not. On an iPhone, you literally touch the thing you want to manipulate with your actual finger—no “indirect” pointing device needed.
\n\nThe magic, the attractiveness, the fundamental success of both of these forms of “direct manipulation” has a lot to do with the physical reality of our existence as human beings. The ability to reason about and manipulate objects in space is a cornerstone of our success as a species. It is an essential part of every aspect of our lives. Millions of years of natural selection have made these skills a foundational component of our very being. We need these skills to survive, and so all of us survivors are the ones who have these skills.
\n\nCompare this with the things we often put under the umbrella of “knowing how to use computers”: debugging Wi-Fi problems, understanding how formulas work in Excel, splitting a bezier curve in Illustrator, converting a color image to black and white in Photoshop, etc. These are all things we must learn how to do specifically for the purpose of using the computer. There has not been millions of years of reproductive selection to help produce a modern-day population that inherently knows how to convert a PDF into a Word document. Sure, the ability to reason and learn is in our genes, but the ability to perform any specific task on a computer is not.
\n\nGiven this, interfaces that leverage the innate abilities we do have are incredibly powerful. They have lower cognitive load. They feel good. “Ease of use” was what we called it in the 1980s.
\n\nThe success of the GUI was driven, in large part, by the fact that our entire lives—and the lives of all our ancestors—have prepared us with many of the skills necessary to work with interfaces where we see things and then use our hands to manipulate them. The “indirection” of the GUI—icons that represent files, windows that represent documents that scroll within their frames—fades away very quickly. The mechanical functions of interaction become second nature, allowing us to concentrate on figuring out how the heck to remove the borders on a table in Google Docs1, or whatever.
\n\nThe more a user interface presents a world that is understandable to us, where we can flex our millennia-old kinesthetic skills, the better it feels. The Spatial Finder, which had a simple, direct relationship between each Finder window and a location in the file hierarchy, was a defining part of the classic Macintosh interface. Decades later, the iPhone launched with a similarly relentlessly spatial home-screen interface: a grid of icons, recognizable by their position and appearance, that go where we move them and stay where we put them.
\n\nNow here we are, 40 years after the original Macintosh, and Apple is introducing what it calls its first \"spatial computer.\" I haven’t tried the Vision Pro yet (regular customers won’t receive theirs for at least another three days), but the early reviews and Apple’s own guided tour provide a good overview of its capabilities.
\n\nHow does the Vision Pro stack up, spatially speaking? Is it the new definition of “direct manipulation,” wresting the title from touch interfaces? In one obvious way, it takes spatial interfaces to the next level by committing to the simulation of a 3D world in a much more thorough way than the Mac or iPhone. Traditional GUIs are often described as being “2D,” but they’ve all taken advantage of our ability to parse and understand objects in 3D space by layering interface elements on top of each other, often deploying visual cues like shadows to drive home the illusion.
\n\nVision Pro’s commitment to the bit goes much further. It breaks the rigid perpendicularity and shallow overall depth of the layered windows in a traditional GUI to provide a much deeper (literally) world within which to do our work.
\n\nWhere Vision Pro may stumble is in its interface to the deep, spatial world it provides. We all know how to reach out and “directly manipulate” objects in the real world, but that’s not what Vision Pro asks us to do. Instead, Vision Pro requires us to first look at the thing we want to manipulate, and then perform an “indirect” gesture with our hands to operate on it.
\n\nIs this look-then-gesture interaction any different than using a mouse to “indirectly” manipulate a pointer? Does it leverage our innate spatial abilities to the same extent? Time will tell. But I feel comfortable saying that, in some ways, this kind of Vision Pro interaction is less “direct” than the iPhone’s touch interface, where we see a thing on a screen and then literally place our fingers on it. Will there be any interaction on the Vision Pro that’s as intuitive, efficient, and satisfying as flick-scrolling on an iPhone screen? It’s a high bar to clear, that’s for sure.
\n\nAs the Vision Pro finally starts to arrive in customers’ hands, I can’t help but view it through this spatial-interface lens when comparing it to the Mac and the iPhone. Both its predecessors took advantage of our abilities to recognize and manipulate objects in space to a greater extent than any of the computing platforms that came before them. In its current form, I’m not sure the same can be said of the Vision Pro.
\n\nOf course, there’s a lot more to the Vision Pro than the degree to which it taps into this specific set of human skills. Its ability to fill literally the entire space around the user with its interface is something the Mac and iPhone cannot match, and it opens the door to new experiences and new kinds of interfaces.
\n\nBut I do wonder if the Vision Pro’s current interaction model will hold up as well as that of the Mac and iPhone. Perhaps there’s still at least one technological leap yet to come to round out the story. Or perhaps the tools of the past (e.g., physical keyboards and pointing devices) will end up being an essential part of a productive, efficient Vision Pro experience. No matter how it turns out, I’m happy to see that the decades-old journey of “spatial computing” continues.
\n\nSelect the whole table, then click the “Border width” toolbar icon, then select 0pt. ↩
While the utility of Generative AI is very clear at this point, the moral, ethical, and legal questions surrounding it are decidedly less so. I’m not a lawyer, and I’m not sure how the many current and future legal battles related to this topic will shake out. Right now, I’m still trying to understand the issue well enough to form a coherent opinion of how things should be. Writing this post is part of my process.
\n\nGenerative AI needs to be trained on a vast amount of data that represents the kinds of things it will be asked to generate. The connection between that training data and the eventual generated output is a hotly debated topic. An AI model has no value until it’s trained. After training, how much of the model’s value is attributable to any given piece of training data? What legal rights, if any, can the owners of that training data exert on the creator of the model or its output?
\n\nA human’s creative work is inextricably linked to their life experiences: every piece of art they’ve ever seen, everything they’ve done, everyone they’ve ever met. And yet we still say the creative output of humans is worthy of legal protection (with some fairly narrow restrictions for works that are deemed insufficiently differentiated from existing works).
\n\nSome say that generative AI is no different. Its output is inextricably linked to its “life experience” (training data). Everything it creates is influenced by everything it has ever seen. It’s doing the same thing a human does, so why shouldn’t its output be treated the same as a human’s output?
\n\nAnd if it generates output that’s insufficiently differentiated from some existing work, well, we already have laws to handle that. But if not, then it’s in the clear. There’s no need for any sort of financial arrangement with the owners of the training data any more than an artist needs to pay every other artist whose work she’s seen each time she makes a new painting.
\n\nThis argument does not sit well with me, for both practical and ethical reasons. Practically speaking, generative AI changes the economics and timescales of the market for creative works in a way that has the potential to disincentivize non-AI-generated art, both by making creative careers less viable and by narrowing the scope of creative skill that is valued by the market. Even if generative AI develops to the point where it is self-sustaining without (further) human input, the act of creation is an essential part of a life well-lived. Humans need to create, and we must foster a market that supports this.
\n\nEthically, the argument that generative AI is “just doing what humans do” seems to draw an equivalence between computer programs and humans that doesn’t feel right to me. It was the pursuit of this feeling that led me to a key question at the center of this debate.
\n\nComputer programs don’t have rights1, but people who use computer programs do. No one is suggesting that generative AI models should somehow have the rights to the things they create. It’s the humans using these AI models that are making claims about the output—either that they, the human, should own the output, or, at the very least, that the owners of the model’s training data should not have any rights to the output.
\n\nAfter all, what’s the difference between using generative AI to create a picture and using Photoshop? They’re both computer programs that help humans make more, better creative works in less time, right?
\n\nWe’ve always had technology that empowers human creativity: pencils, paintbrushes, rulers, compasses, quills, typewriters, word processors, bitmapped and vector drawing programs—thousands of years of technological enhancement of creativity. Is generative AI any different?
\n\nAt the heart of this question is the act of creation itself. Ownership and rights hinge on that act of creation. Who owns a creative work? Not the pencil, not the typewriter, not Adobe Photoshop. It’s the human who used those tools to create the work that owns it.
\n\nThere can, of course, be legal arrangements to transfer ownership of the work created by one human to another human (or a legal entity like a corporation). And in this way, value is exchanged, forming a market for creativity.
\n\nNow then, when someone uses generative AI, who is the creator? Is writing the prompt for the generative AI the act of creation, thus conferring ownership of the output to the prompt-writer without any additional legal arrangements?
\n\nSuppose Bob writes an email to Sue, who has no existing business relationship with Bob, asking her to draw a picture of a polar bear wearing a cowboy hat while riding a bicycle. If Sue draws this picture, we all agree that Sue is the creator, and that some arrangement is required to transfer ownership of this picture to Bob. But if Bob types that same email into a generative AI, has he now become the creator of the generated image? If not, then who is the creator?
\n\nWhere is the act of creation?
\n\nThis question is at the emotional, ethical (and possibly legal) heart of the generative AI debate. I’m reminded of the well-known web comic in which one person hands something to another and says, “I made this.” The recipient accepts the item, saying “You made this?” The recipient then holds the item silently for a moment while the person who gave them the item departs. In the final frame of the comic, the recipient stands alone holding the item and says, “I made this.”
\n\nThis comic resonates with people for many reasons. To me, the key is the second frame in which the recipient holds the item alone. It’s in that moment that possession of the item convinces the person that they own it. After all, they’re holding it. It’s theirs! And if they own it, and no one else is around, then they must have created it!
\n\nThis leads me back to the same question. Where is the act of creation? The person in the comic would rather not think about it. But generative AI is forcing us all to do so.
\n\nI’m not focused on this point for reasons of fairness or tradition. Technology routinely changes markets. Our job as a society is to ensure that technology changes things for the better in the long run, while mitigating the inevitable short-term harm.
\n\nEvery new technology has required new laws to ensure that it becomes and remains a net good for society. It’s rare that we can successfully adapt existing laws to fully manage a new technology, especially one that has the power to radically alter the shape of an existing market like generative AI does.
\n\nIn its current state, generative AI breaks the value chain between creators and consumers. We don’t have to reconnect it in exactly the same way it was connected before, but we also can’t just leave it dangling. The historical practice of conferring ownership based on the act of creation still seems sound, but that means we must be able to unambiguously identify that act. And if the same act (absent any prior legal arrangements) confers ownership in one context but not in another, then perhaps it’s not the best candidate.
\n\nI’m not sure what the right answer is, but I think I’m getting closer to the right question. It’s a question I think we’re all going to encounter a lot more frequently in the future: Who made this?
\n\nNon-sentient computer programs, that is. If we ever create sentient computer programs, we’ll have a whole host of other problems to deal with. ↩
I first read about the “blue ocean” strategy in a story (probably in Edge magazine) about the Nintendo Wii. While its competitors were fighting for supremacy in the game-console market by producing ever-more-powerful hardware capable of high-definition visuals, Nintendo chose not to join this fight. The pursuit of graphics power was a “red ocean” that was already teeming with sharks, fighting over the available fish and filling the water with blood.
\n\nNintendo’s “blue ocean” strategy was to stake out a position where none of its competitors were present. The idea of creating a standard-definition game console in the generation when all the other consoles were moving to HD seemed ridiculous, but that’s exactly what Nintendo did. In place of impressive graphics, the Wii differentiated itself with its motion controls and a low price. It was a hit.
\n\nLately, I’ve been thinking about the blue ocean strategy in the context of Apple. Like Nintendo, Apple has made some bold moves with its products, many of which were ridiculed at the time: a smartphone without a physical keyboard, a candy-colored desktop computer with no floppy drive and no legacy ports, a $695 (in 2023 dollars) portable music player, a digital music store in the age of ubiquitous music piracy.
\n\nUnlike Nintendo, Apple has seen its competitors move quickly to imitate its innovations, turning these oceans red and leaving Apple to compete on the basis of execution…until it finds its next blue ocean.
\n\nBut what is that? It’s tempting to point to the Vision Pro. AR/VR headsets are not new, but then, neither were smartphones or portable music players. The Vision Pro hasn’t shipped yet, so the jury’s still out. Let’s keep an eye on it.
\n\nI have something else in mind. It’s actually related to one of Apple’s earlier \"blue ocean\" changes: the elimination of removable batteries. In the beginning, Apple’s laptops all used removable battery packs. Some even let the user pull out the floppy-drive module and replace it with a second battery.
\n\nStarting in 2009, Apple began to phase out removable batteries across its laptop line in favor of batteries that were sealed inside the case and were not user-accessible. The iPod and the iPhone arguably started this trend by never including removable batteries to begin with. (The iPhone defied so many other norms that the sealed battery was less remarked upon than it might have been, but it was still noted.)
\n\nThe upsides, which Apple touted, were many: lighter weight, smaller size, better reliability, longer battery life. We are still reaping these benefits today, and we Apple fans rarely question them. Today, predictably, non-removable batteries are a red ocean in many product categories. They are the norm, not an innovation.
\n\nWhen thinking about Apple’s next blue ocean, it’s tempting to ignore past innovations. Technological progress seems like an arrow pointing in only one direction, never turning back. But I just can’t shake the idea that a return to removable, user-accessible batteries has now become a blue-ocean opportunity just waiting for Apple to seize it.
\n\nFollow me here. Yes, sealed batteries still offer all the same advantages they always have. And, yes, a return to removable batteries would bring back all their problems: increased size and weight, increased risk of liquid and dust ingress, decreased aesthetic elegance.
\n\nBut some things have changed in the past couple of decades. Battery technology has improved, and Apple has moved its entire product line to its own silicon chips that lead the industry in power efficiency. There’s more headroom than there has ever been to accommodate a tiny bit more size and weight in Apple’s portable products.
\n\nThat’s still a step backwards, right? But there are several countervailing forces, one of which is rapidly increasing in importance. The first is the fact that, as noted earlier, removable batteries are now a blue ocean. Apple would be alone among its biggest competitors if it made a wholesale change (back) to removable batteries in any of its product lines.
\n\nSecond, people still crave the advantages of removable batteries that were left behind: increasing battery life by swapping batteries instead of using a cumbersome external battery pack, inexpensively and conveniently extending the life of a product by replacing a worn-out battery with a new one—without paying for someone else to perform delicate surgery on the device.
\n\nFinally, related to that last point, worn-out batteries are an extremely common reason that old tech products are traded in, recycled, or replaced. Removable batteries are an easy way to extend the useful life of a product. This leads to less e-waste, which is perfectly aligned with Apple’s environmental goals as 2030 approaches.
\n\nOf course, longer product lifetimes mean fewer product sales per unit time, which seems to run counter to Apple’s financial goals. But this is a problem that can be solved using one of Apple’s favorite financial tools: higher product margins. If Apple can actually make products that have a longer useful life, it can charge more money for the extra value they provide.
\n\nIt’s easy to think of product ideas that run counter to accepted wisdom; it’s harder to think of the right one. Sometimes a blue ocean is free from sharks simply because there are no fish there. But I think this idea has merit. I am not making a prediction, but I am making a suggestion.
\n\nI know some of you remain unconvinced. How can a removable battery be easy to swap and yet also be sealed against the elements? Won’t removable batteries ruin the appearance of Apple’s existing products by adding unsightly cut lines? Won’t they become unacceptably large and heavy? How can structural integrity be maintained with a giant hole cut out of the product frame? What about the risk of fire due to faulty battery connections or battery packs coming in contact with something metal in someone’s pocket? The list of problems goes on and on.
\n\nInnovation is never easy, but since when has Apple shied away from a challenge? As the industry leader in consumer-electronics design and manufacturing, Apple is best positioned to overcome the obstacles and reap the benefits of removable batteries. There’s no question it will be difficult, but if done well, it will undoubtedly be a hit. And as the company that led the transition away from removable batteries, it’s only fitting1 for Apple to be the one to bring them back.
\n\n“The Plumber Problem” is a phrase I coined to describe the experience of watching a movie that touches on some subject area that you know way more about than the average person, and then some inaccuracy in what’s depicted distracts you and takes you out of the movie. (This can occur in any work of fiction, of course: movies, TV, books, etc.)
\n\nHere’s an example. A plumber is watching a movie with a scene where something having to do with pipes is integral to the plot. But it’s all wrong, and the plumber’s mind rebels. No one else in the audience is bothered. They’re all still wrapped up in the narrative. But the plumber has a problem.
\n\nI’m not sure how long ago I came up with this phrase. The earliest recorded occurrence I can find is from 2021, in episode #153 of Reconcilable Differences (at 47:02) where I explain it to my cohost, Merlin, so it obviously predates that.
\n\nThe Plumber Problem is loosely related to the “Gell-Mann amnesia effect” which is “the phenomenon of experts believing news articles written on topics outside of their fields of expertise, yet acknowledging that articles written in the same publication within their fields of expertise are error-ridden and full of misunderstanding.”\n\n
Anyway, I was thinking about this today thanks to some people on Mastodon sending me examples of The Plumber Problem. Here are a few (lightly edited):
\n\nSimon Orrell: My first exposure to “The Plumber Problem” was sitting in a theatre with my dad in 1973 watching “Emperor of the North” and my dad leans over to whisper, “They didn’t make culvert pipe like that back in the ’30s. It was plate, not corrugated.”\n\n
Tim Allen: In Speed 2, a plot point involves a laden oil tanker about to collide explosively. My wife, native to a major oil port city, couldn’t follow the plot because she could tell the tanker was empty just by looking at it, so she didn’t understand why everyone was saying it would explode.\n\n
Dan Morgan: Interstellar’s farming scenes were just SO BAD. I’m not going to detail them here, but this retired farmer and agronomist found it hard to watch. I’m sure the physics were fine though. 😂\n\n
Someone also mentioned that “The Plumber Problem” is not an easy phrase to look up online, so here’s hoping this post remedies that situation.
\n\nHere’s one more bonus post that I enjoyed:
\n\nmagic: In Star Wars, Luke turns off his targeting computer to use the Force for his attack run on the Death Star. I’ve flown from one side of this galaxy to the other. I’ve seen a lot of strange stuff, but I’ve never seen anything to make me believe there’s one all-powerful Force controlling everything. There’s no mystical energy field that controls my destiny.\n\n", "date_modified" : "2023-08-18T12:44:19-04:00", "date_published" : "2023-08-18T12:44:19-04:00", "id" : "http://hypercritical.co/2023/08/18/the-plumber-problem", "title" : "The Plumber Problem", "url" : "http://hypercritical.co/2023/08/18/the-plumber-problem" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "\n\n\n\n\n\n
It is said that every five years, Hypercritical t-shirts return. The last sale was in 2018, so the time has come! This sale ends on Saturday, August 12th, so if you want a shirt, don’t delay. Last time, I hinted that it would be five years before the shirts were sold again, and some people didn’t believe me. But it was true then, and it’s true now. If you want a shirt, buy it now, or resign yourself to waiting until 2028!
\n\nThe shirts are available in men’s and women’s styles and in light and dark colors:\n\n
My sincere thanks to everyone who has purchased a shirt, past and present, and to all the people who continue to read this website and listen to and support my podcasts.
\n\nA lot has changed since my last Hypercritical t-shirt sale. Most notably, I went independent in March of 2022, turning my former “side projects” into the sole source of my income.
\n\nOver the past decade or so, advertising has made up the vast majority of my podcast income. The podcast ad market has taken a big downturn this year for shows like mine, which has been rough. Thankfully, podcast membership has helped make up some of the difference.
\n\nMerchandise sales like these also help—though less than you might think. Manufacturing and shipping physical products is expensive, and the costs are always increasing. But every little bit helps. And as a podcast fan myself, I understand the draw. A shocking amount of my daily wardrobe consists of podcast t-shirts from the shows I listen to.
\n\nThat’s really what these sales are about: fans want shirts, and I want to provide them. And, yes, each shirt sold does make me a few bucks, so the more I sell, the better. But there’s a reason I only do these sales once every five years. I want these shirts to be special.
\n\nAnd when people’s shirts are starting to become threadbare five years from today, I’ll have another sale for those who want to buy replacements. Think of it as a really slow, non-renewing subscription plan for Hypercritical t-shirts. Just try not to spill anything on your shirts in the meantime. (Or consider buying backups! You know me and backups…)
", "date_modified" : "2023-07-12T11:01:56-04:00", "date_published" : "2023-07-12T11:01:56-04:00", "id" : "http://hypercritical.co/2023/07/12/hypercritical-t-shirts-return", "title" : "Hypercritical T-Shirts Return", "url" : "http://hypercritical.co/2023/07/12/hypercritical-t-shirts-return" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "\n\nI’m part of the MTV generation. If you can immediately picture the videos for Hey Mickey, The Safety Dance, You Might Think, Money For Nothing, and Take On Me, you might be too. I was transfixed from day one, not just by the bands and the music, but by the format. Some videos told a story (of varying levels of coherence). Others were more of a vibe, as the kids say these days. But always, the combination of sound and images, intertwined, synchronizing and diverging, pressed all my buttons.
\n\nMy affection for an equal partnership between music and video is reflected in many of the movies I love. Goodfellas, one of my all-time favorites, is arguably structured as a series of music videos separated by exposition. The best Star Wars movies are famous for their pervasive and dominating scores.
\n\nEven today, the alchemy of carefully combined music and video has not lost its power. Witness the outsized cultural impact of a certain scene in Stranger Things season 4.
\n\nIn all these cases, it’s not just the fact that there’s music in addition to dialog and sound effects. It’s that the music steps forward—both technically (in the audio mix) and emotionally. The music is a main character in much of the media that I love.
\n\nWhen game consoles added the ability to easily record gameplay, I immediately knew what I wanted to do with that capability. I wanted to make music videos.
\n\nI’ve been playing Destiny since shortly after it was released in 2014. For complicated and mostly business-related reasons, the game I’m playing today is called Destiny 2, but it’s been a largely unbroken experience across the two games for the past eight years.
\n\nThere’s a huge amount of Destiny-related video content on YouTube, and I’ve watched a lot of it. Two things are very clear about these kinds of videos. First, much like golf or tennis on TV, you’ll find it a lot more interesting if you’ve ever played the game yourself. Second, also like televised sports, the people playing Destiny in these videos are usually very good at the game.
\n\nI am not very good at Destiny. Even after literally thousands of hours1 of playing, I am just about average. And although Destiny is a popular game with millions of players, the chances of someone seeing one of my videos and also being a Destiny player are quite small.
\n\nThis is not a formula for success. My lack of game-playing skill means I can’t produce the raw material (i.e., gameplay recordings) needed to make really great videos, and my existing audience of Apple tech nerds has only a small overlap with the world of Destiny.
\n\nBut did I let this stop me? I did not. Six years ago, I started with a few tentative uploads of some awful (even by my standards) gameplay with minimal editing, no commentary, and no music. (I also snuck in a gag video based on my realization that the movie Moana and the first season of Westworld both have the same emotional climax. It’s true!)
\n\nMy first Destiny music video shows me learning how to not be irredeemably awful at using a sniper rifle in Destiny. It’s a record of the moment when, after four years of playing the game, I finally understood how sniping is supposed to work. It shows me graduating from “truly awful” to “merely bad.” (This was back when a single sniper headshot wouldn’t kill a roaming super, whippersnappers!)
\n\nNext came the “quest” videos, each of which cataloged my journey to acquire some in-game item (e.g., a pinnacle weapon). This is where I started to develop a recognizable style and format.
\n\nRather than retreating from the skill and audience problems described earlier, I embraced them. Since few people would ever see my videos, I could remain blissfully unconcerned about enticing titles and custom thumbnails. As for the assumed knowledge necessary to get the most from these videos, I piled it on instead of trying to minimize it.
\n\nTake my Revoker Quest video as an example. To understand its premise, you’d have to know that “Revoker” is a sniper rifle and that the quest to obtain it requires a large number of sniper kills while playing against other people in Destiny.
\n\nOn top of that, it would also help to know some things about me. You might know (perhaps from watching earlier videos) that I’m not very good at sniping in Destiny, and you might also know that my preferred weapon in PvP is a shotgun (or at least you might know that shotguns are widely considered “easier to use” than sniper rifles). If you were playing Destiny when the Revoker quest was active, you might be familiar with how quest progress is presented in the user interface, and you might further know that the Revoker quest had multiple components, not all of which required sniper kills.
\n\nYou need all of this context to understand the orchestrated climax of the video (starting at around 5:12), in which I realize that I have completed the sniper-kills portion of the quest and can finally switch back to a loadout where I feel much more competent: a shotgun and a hand cannon. Oh, and that hand cannon? It’s Luna’s Howl, the arduous acquisition of which was documented in an earlier quest video.
\n\nSimilarly, only someone who is average (or worse) at Destiny and has suffered through the pain of having to get 200 double-kills with a grenade launcher in PvP and 100 Calculated Trajectory medals in order to complete the (pre-nerf, dagnabbit!) Mountaintop quest can truly appreciate the pain and suffering documented in my video about it. Fighting for heavy ammo to get “easy” kills with a heavy GL; getting one kill and then immediately dying; learning how to use Fighting Lion, a weapon I’d ignored until its ability to use primary ammo made it uniquely suited to this quest—it’s all in there.
\n\nIf this is all starting to sound like gibberish to you, I understand. It’s asking a lot of the audience to have so much background information and experience. The fact that I’m unable to communicate the prerequisite knowledge in the videos themselves is a condemnation of my skills as both an editor and a game player. (I can only work with gameplay recordings that I generate myself, after all.)
\n\nAnd yet…I love these videos. I love the idea that a handful of people might watch them with all the context required to fully appreciate them. I love watching them myself from time to time, if only to see my own progress as a player and an editor.
\n\nI also love the moment during my normal life when inspiration strikes and I know what song I’m going to use for my next music video. Sometimes it’s months between the moment of inspiration and when I finally get around to making the video. This was the case with my most recent release, but I’m glad I waited long enough for it to be my first video made with 60-fps gameplay from my PlayStation 5.
\n\nIt’s not a “quest” video (Bungie removed pinnacle weapons a few years ago), so the scant narrative scaffolding that used to exist is gone now. Instead, I’ve gone back to my roots. I’m just trying to make a good music video. Here’s hoping someone else out there enjoys them as much as I do.
\n\nIf you don’t want to wade through everything on my channel, here’s a list of my Destiny music videos in reverse-chronological order.
\n\n4,124 hours as of February 5, 2023. ↩
SwitchGlass 2.0, the first major update to my customizable app switcher for macOS, is now available on the Mac App Store. It’s a free update for existing SwitchGlass users.
\n\n\n\nSince the initial release of SwitchGlass in 2020, the top feature request has been the ability to manually reorder apps in the app switcher. Version 2.0 adds that feature, and many more. To learn more about SwitchGlass, read the FAQ and the introductory post from 2020.
\n\nThough SwitchGlass 2.0 does not appear very different on the outside, more than 50% of the code has changed since the last 1.x release in April 2022. The view that runs the app switcher saw the most significant revisions, thanks to my graduation from “absolute beginner” to “novice” when it comes to writing SwiftUI code. Baby steps.
\n\nI had to bump up the minimum supported OS to macOS 12.0 Monterey in order to implement drag-and-drop reordering in the app switcher. This is the price of using a framework like SwiftUI that’s still in its infancy on the Mac, I suppose. I would love to continue to update and support the 1.x version that runs on macOS 10.15 Catalina and later, but the Mac App Store does not allow it. Customers who purchased an earlier version of SwitchGlass can still use and re-download that version on pre-Monterey systems, but I can’t publish any new 1.x releases to the Mac App Store.
\n\nI started using TestFlight for macOS to distribute early versions of SwitchGlass 2.0 to a small group of beta testers. Thanks to everyone who provided bug reports and feature suggestions. If you’re interested in testing prerelease versions of SwitchGlass, let me know. There are always more bugs to be found…
", "date_modified" : "2022-10-25T13:38:57-04:00", "date_published" : "2022-10-25T13:38:58-04:00", "id" : "http://hypercritical.co/2022/10/25/switchglass-2", "title" : "SwitchGlass 2.0", "url" : "http://hypercritical.co/2022/10/25/switchglass-2" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "\n\nIn the Spring of 2019, I was looking for a way to promote one of our time-limited merchandise sales for Accidental Tech Podcast.\n\n\n\nAs part of these sales, we receive promo codes from our vendor for hitting certain milestones. Each promo code is good for a free t-shirt (including free shipping). I decided to give away these promo codes to fans on Twitter.
\n\nI wanted to do it in a fun way, perhaps with an Apple-themed trivia contest. Sadly, most trivia succumbs immediately to the power of a web search engine. I needed something that wasn’t so easy to Google. My first attempt was to post some hand-drawn line art, then ask people to identify it. Since I’d just created the drawing, I knew it wouldn’t be in any search results. And the crude nature of the art meant that a Google image search wouldn’t turn up any matching photos.
\n\nIt worked (I think), but I couldn’t come up with anything to draw after that. Instead, I posted a small portion of a larger image which I asked people to identify. Again, success. The image I’d chosen happened to be a frame from a TV show, and that gave me an idea.
\n\nFrom that point on, I’d post a small portion of a frame and then ask people to identify the movie or TV show from which it was extracted. I created a notes document to keep track of everything, and I titled it “Frame Game.”
\n\nSince then, I’ve posted almost sixty frames over three years, including a few excursions into audio. People seem to enjoy it. Movies and TV shows are great, and who doesn’t like free stuff?
\n\nWhat I enjoy the most about Frame Game is the process of carefully selecting the frame and the crop such that people who are very familiar with the piece of media will be able to guess the answer, while people who are not will be absolutely dumbfounded that anyone was able to figure it out at all, let alone so quickly. The best example of this was when I posted a tiny, 64-pixel square from a 1920 x 800 frame that was guessed in one minute and four seconds.
\n\nHave some people figured out how to use computers or web searches to brute-force this game? Almost certainly. But it makes me happier to believe that most people are playing it legitimately. I’d like to humbly suggest that playing for real will make the players happier too.
\n\nFrame Game has taken place entirely on Twitter, and it’s meant to be played in real time. Unfortunately, the way I’ve chosen to chain the tweets does not make it particularly easy to follow in the Twitter archives. In an effort to better preserve the historical record, I’ve created my own archive, linked below.
\n\n \n\nThere is no score-keeping, but you can “play” the game by attempting to guess the answer before clicking to reveal the full frame. If you cheat now, you’re only cheating yourself! Some frames also have hints that show ever-larger portions of the frame. (Hold down the Option key when clicking the button to reveal the full frame immediately without seeing any hints.)
\n\nI’ve had to resort to posting hints a few times during Frame Game, but the history viewer contains all the hint frames that I had prepared, regardless of whether or not they were needed. I’ve also linked to the original tweet, the declaration of the winner, and the winning tweet itself, if available. (Some winning tweets have since been deleted.) The time elapsed since the question was posted is also shown.
\n\nIf you like this kind of thing and want to play something similar every day, check out the recently released, Wordle-inspired framed.wtf.
\n\nThere is no schedule for Frame Game, other than usually coinciding with one of ATP’s seasonal merchandise sales. I’m not even sure if it helps increase sales at all. It’s just something fun that I like to do for the handful of fans who like to participate. If you want to play, follow me on Twitter and watch for a tweet that begins with the magic phrase, “The first person to identify…”
\n\nFrame Game can start at any time, so be vigilant!
\n", "date_modified" : "2022-04-25T10:23:01-04:00", "date_published" : "2022-04-25T10:23:02-04:00", "id" : "http://hypercritical.co/2022/04/25/frame-game", "title" : "Frame Game", "url" : "http://hypercritical.co/2022/04/25/frame-game" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "\n\nWhen I graduated college in 1997, I started a full-time job with the same dot-com startup that I had been working for part time during my senior year. In the twenty-five years that have followed, I’ve had a series of jobs in the same field (\"full-stack web development,\" in today’s parlance).
\n\nI’ve worked for companies of all sizes, from tiny startups to enterprise businesses with billions of dollars and thousands of employees. I’ve worked in downtown Boston, in Cambridge, and in the western suburbs. I’ve commuted to work by train, by car, and not at all. (I worked remotely at my very first job, and I have done so on and off for many years since.) All these jobs have been in the typical nine-to-five mold, and I’ve usually gone from one to the next without even a single day off in between.
\n\nEver since my first job, I’ve also always done…something else—something besides my “day job,” something that at least had the potential to bring in some extra money. I did a little contract programming at the start, but I didn’t find it appealing to just do more of what I was already doing.\n\n
I started writing for Ars Technica in 1999, and I continued doing that for fifteen years. I also wrote for Macworld (for print and the web), for my own website, and for a few other small publications. I enjoyed writing, and I could get paid for it.
\n\nEleven years ago, as my writing tapered off, I started podcasting, which I also enjoyed and found I could get paid for. Two years ago, I wrote two small Mac apps to scratch a few of my own itches.
\n\nMeanwhile, outside of my work life, I got married, bought a house, and had two children. Over the years, I’ve had to learn how to balance these competing concerns. As the financial demands of my life have increased, I’ve had to find a way to increase my income. As my family responsibilities have grown, I’ve had to reduce my “extra” work to a manageable level.
\n\nAs part of this process, I’ve had to find what I think of as my “maximum capacity.” How much can I ask of myself before I fall apart? I learned some important lessons at my very first job, even before I had a house or kids, by slamming hard into the limits of my own body thanks to chronic RSI. Later, my children helped me plumb the depths of sleep deprivation while also entirely recalibrating my value system.
\n\nAt each decision point, I’ve adjusted my life to fit within my maximum capacity by curtailing “unnecessary” activities. My family and my day job were necessities. Everything else was optional. As I’ve gotten older, my maximum capacity has decreased, of course, and I have exceeded my limits on many occasions. But for the most part, I’ve been able to keep it together.
\n\nIt hasn’t always felt great to be running “at maximum capacity” (or slightly beyond) for two and a half decades, but it has always felt like the right thing to do during this critical part of my life.
\n\nOver the past few years, something has started to change. When I’ve been presented with interesting opportunities that I’ve had to turn down (“Sorry, I’m at my maximum capacity right now…”) it has started to feel less like disciplined life-management and more like disappointment. It’s felt similarly lousy when I’ve had to reject my own ideas for new things I’d like to try. And when I’ve ignored those feelings and said yes when I knew I should say no (e.g., when I decided to make two Mac apps in two months), I’ve quickly bumped into my limits yet again—both physical and mental.
\n\nA few years ago, I started to question some of my assumptions. My decades of work on my “second career” had slowly built it up to the point where it was plausibly viable on its own. Was my day job really necessary? I started formulating a plan to quit.
\n\nThen came COVID-19…and it kept coming. There was just too much uncertainty. My plans were put on hold. It’s been a rough few years for everyone, including my family. The whole experience recalibrated my value system one more time. I started to think more about the limited number of years I have left—with my kids, in good health, on this earth. How do I want to spend that time?
\n\nBy 2022, I had returned to thinking not only that it’s possible for me to quit my day job, but that it’s necessary for me to do so.
\n\nAnd so, on March 25, 2022, I left my “normal” job. I am now officially self-employed.
\n\n“Going indie” is what we used to call it in the early 2000s. Back then, in my circles, it usually meant creating and selling your own Mac (and, eventually, iPhone) apps, but each person’s road to independence is different.
\n\nI’m lucky to know so many people who have walked this same path before me. They’ve all taught me so much about what it means—and what it takes—to be independent. John Gruber took some huge risks when he went independent back in 2006. At that time, like John, I had recently had my first child, and the idea of quitting my “real job” was unthinkable to me. All my current podcast co-hosts are independent: Merlin Mann since 2002, Marco Arment since 2010, Casey Liss since 2018, and Jason Snell since 2014. And there are many more—too many to list here. When I think about the friends I’ve made as part of my second career, it often seems like they’re all independent. Now, finally, I’m ready.
\n\nI am thankful to have had such a conventional, largely successful career at my various day jobs. Like many people who entered the tech world in the late 1990s, I worked for several companies that were later acquired or went out of business. And, like most people, I did not strike it rich at any point via an IPO or similar “exit” event. But the regular salary from my day job did help pay for my house, my car, some nice vacations—a whole life for myself and my family, which is all I ever wanted.
\n\nI’m also thankful for everyone who has made my second career possible: all the people who have read my writing or listened to me on a podcast. Special thanks to those of you who have supported me by buying something from a sponsor or paying me directly for my work. I would not be able to do this without you.
\n\nFinally, I want to thank my wife, Tina, who has always supported my “weird hobbies,” even back when they took an amount of my time that was far out of proportion with the money they brought in. Each time I have exceeded my maximum capacity over the years, she has been there to pick up the slack, all while pursuing her own career. I would not be where I am today without her love and support.
\n\nYou can hear me talk more about this topic on episode 179 of Reconcilable Differences (starting at 50:47).
\n\nIf you want to know how you can best support my work, the answer right now is through podcast memberships. It’s not a coincidence that so many independent podcasts started paid membership programs shortly after COVID hit. Memberships provide reliable income in an uncertain market. Each of my podcasts has a membership program, linked below.
\n\n\n\nBoth monthly and annual memberships are available. The member benefits vary, but all include a version of the show without any ads, plus some amount of bonus content.
\n\nPodcasts are now literally how I make my living. (Boy, that’s weird to write. I’m not sure how I’m going to say it to people in person.) I hope you’ll all continue to listen. Wish me luck…
", "date_modified" : "2022-03-30T21:55:04-04:00", "date_published" : "2022-03-30T21:55:05-04:00", "id" : "http://hypercritical.co/2022/03/30/independence-day", "title" : "Independence Day", "url" : "http://hypercritical.co/2022/03/30/independence-day" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "My unsolicited streaming app spec has garnered a lot of feedback. I’m sure streaming app developers already gather feedback from their users, and I’m also sure that the tone of my post has skewed the nature of the feedback I received. Nevertheless, for posterity, here’s how people are feeling about the streaming video apps they use.
\n\nThe number one complaint, by far, was that streaming apps make it too difficult to resume watching whatever you were already watching. As I noted earlier, conflicting incentives easily explain this, but people still hate it. A reader who wished to remain anonymous sent this story of how customer satisfaction gets sacrificed on the altar of “engagement.”
\n\n\n\nThere was an experiment at Hulu last year to move “Continue Watching” below the fold (down 2 rows from where it was). This was done with a very small group of users. It was so successful that the increased engagement was projected to generate more than $20 million a year. The experiment was immediately ended and the new position was deployed to all users.
\n\nWhile I understand (and largely agree with) your frustration that your “in progress” show isn’t the top feature, you can argue that [making new content more prominent] provides the user more value as they discover content they wouldn’t have otherwise (hence the increased engagement).
This is definitely a case of “be careful what you measure.” I don’t doubt that whatever metric is being used to gauge “engagement” is indeed boosted by burying the “Continue Watching” section, but I must emphasize again, according to the feedback I received, people hate this practice with a fiery passion. It makes them dislike the app, and sometimes also the streaming service itself.
\n\nI don’t think any engagement-related metric is worth angering users in this way—even if it really does help users discover new content or stay subscribed longer. I’m reminded of the old saying, “People won’t remember what you said, but they will remember how you made them feel.” It applies to apps as well as people.
\n\n(Furthermore, given the fact that seemingly every popular streaming app does this to some degree, there’s an opportunity to seize a competitive advantage by becoming the first app to stop this user-hostile practice.)
\n\nThe second biggest category of feedback was about detecting, preserving, and altering state. Apps that do a poor job of deciding when something has been “watched” drew much ire. (Hint: most people don’t sit through all the ending credits.) Compounding this is the inability to manually mark something as watched or unwatched. Failure to reliably sync state across devices is the cherry on top.
\n\nPeople don’t feel like they are in control of their “data,” such as it is. The apps make bad guesses or forget things they should remember, and the user has no way to correct them. Some people told me they have simply given up. They now treat their streaming app as a glorified search box, hunting anew each time for the content they want to watch, and keeping track of what they’ve already watched using other means, sometimes even using other apps. (I imagine this flailing on each app launch may read as “increased engagement.”)
\n\nFinally, there was a long tail of basic usability complaints: text that’s too small; text that’s truncated, with no way to see more; non-obvious navigation; inscrutable icons and controls; and a general lack of preferences or settings, leaving everyone at the mercy of the defaults. Oh yeah, and don’t forget bugs, of course. Multiple people cited my personal most-hated bug: pausing and then resuming playback only to have it start playing from a position several minutes in the past. Have fun trying to fast-forward to where you actually left off without accidentally spoiling anything for yourself by over-shooting!
\n\nWhile again acknowledging how the nature of my original post (and my audience in general) surely affects the feedback I receive, I think it’s worth noting that no one—not a single person—wrote to tell me how much they loved using their streaming app. I didn’t expect to get much pushback on a post criticizing something so widely maligned, but I did expect to get some. I’m sure many people do enjoy their streaming app of choice, especially if it’s one of the more obscure, tech-oriented ones like Plex or Channels, but the overall sentiment is clear. Do streaming services care? I think they should.
", "date_modified" : "2022-02-17T14:28:55-05:00", "date_published" : "2022-02-17T14:28:56-05:00", "id" : "http://hypercritical.co/2022/02/17/streaming-app-sentiments", "title" : "Streaming App Sentiments", "url" : "http://hypercritical.co/2022/02/17/streaming-app-sentiments" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "I subscribe to a lot of streaming video services, and that means I use a lot of streaming video apps. Most of them fall short of my expectations. Here, then, is a simple specification for a streaming video app. Follow it, and your app will be well on its way to not sucking.
\n\nThis spec includes only the basics. It leaves plenty of room for apps to differentiate themselves by surprising and delighting their users with clever features not listed here. But to all the streaming app developers out there, please consider covering these fundamentals before working on your Unique Selling Proposition.
\n\nObviously, a list of even the most rudimentary features can’t help but also be opinionated. Though my tastes have surely influenced this list, I really do think that any streaming app that fails to implement nearly all of these features is failing its users. Again, these are not frills. These are the bare-bones basics.
\n\nOn launch, it must be immediately obvious how to resume watching whatever the user was watching previously. This may be the most important feature outside the video player itself.
\n\nIf the user was in the middle of watching an episode of a TV show, the most prominent thing on the screen should be a way to continue that episode. If the user just finished an episode, then “resuming” means watching the next episode, and so on.
\n\nResuming exactly where the user left off—for example, launching into the video player, paused at the exact moment the user stopped watching—is also acceptable, provided it is made obvious that this has happened. Launching into a completely black video playback screen is not a good experience.
\n\n(I am ignoring user profiles for now—that’s how basic this specification is. But a good app should support profiles in some way, and this may add a step for the user to select their profile before getting to the point where they can resume viewing.)
\n\nExpose and support the intrinsic information hierarchy of the media. TV shows have seasons. Seasons have episodes. Episodes are made by people (actors, writers, directors). Whatever other ways an app chooses to slice and dice the media it vends, it must also support the simple hierarchy that is most likely to match the user’s mental model.
\n\nThis hierarchy should exist both visually and navigationally. From an episode of a TV show, it should be obvious how to go up in the hierarchy to the season that the episode exists within, and from there to the list of seasons in the show, and then perhaps down into another season, then down into an episode of that season, and so on.
\n\nThough it’s often desirable to take shortcuts when navigating (e.g., to jump back to the home screen after completing the final episode of a TV series), that doesn’t mean the hierarchy shouldn’t exist at all. A shortcut is a way to skip levels in the hierarchy, not a way to erase it from the app entirely.
\n\nKeep track of what the user has done, and when. Which things has the user watched? Were they watched entirely or partially? How many times has something been watched? Were any parts skipped? This information is crucial for the functionality of the app, and it should be treated as precious. Preserve this state the same way a text editor preserves typed characters. Sync it across all instances of the app.
\n\nThe things the app knows should be communicated visually to the user. When viewing a list of episodes, put something on the screen to indicate which ones have been viewed and which ones haven’t. Consider showing a user’s progress within an episode as well. No one likes visual clutter, but a simple progress bar (for example) can show both of these things in a single, slim interface element.
\n\nSimilarly, when video is playing, it should be possible to find out what, exactly, is being played. The most straightforward way to do this is to show some text when the video is paused that identifies the TV show, season number, and episode number.
\n\nThe user has questions, and the app has the answers. It need only communicate them. What am I watching? How long is it? How much time is left? What is the name of this actor? What year was this movie made? When will the next episode of this TV show be released? Was this TV show cancelled? And on and on. This information is useless if it’s not exposed in the interface. Visual elements—well-placed in a sensible information hierarchy—are the key to solving this problem.
\n\nThe following playback controls must be one tap/click away and must have large, obvious targets.
\n\nThe following playback controls must be accessible without leaving the video player. They may be more than one tap/click away.
\n\nThe following information must be accessible without leaving the video player.
\n\nThere must be a way to pause the video and get an unobstructed view of a still frame. That means no playback controls on top of the video and no dimming or tinting of the video frame. It’s fine if it takes a few taps to get to this state, but it must be possible.
\n\nWhen a video ends, there must be a way to go to the next video, assuming there is an obvious choice for this (e.g., the next episode in a TV show).
\n\nThere must be a way for the user to manually create a list of media. In the common case, this is a list of media that the user intends to watch (eventually), but it can be used for any purpose. The important part is that the user makes the list intentionally. Nothing gets added to this list automatically.
\n\nAt a minimum, the list must accept top-level items in the hierarchy (e.g., TV shows, movies). The list could also accept more granular items, like individual TV episodes.
\n\nThis is the one feature that may seem the least “basic,” but it really is essential. There’s so much good content available today that we need our apps to help us keep track of it all, not just what we’re currently watching. If state preservation and visual communication are the app’s short-term memory, then “My List” is the app’s long-term memory.
\n\nThis is a pretty boring list, huh? A streaming app with only these features seems like it would be quite limited. But the sad fact is that few, if any, popular streaming apps reach even this extremely low bar. Let’s take a look at some examples.
\n\nThe last thing I did in the app was watch part of an episode of a TV show. On launch, after selecting my user profile, the show I was in the middle of watching is not visible anywhere on the screen. The “Continue watching for John” section, several screens lower down, contains buttons to resume many other shows, but not the one I was just watching. (Maybe it’s because I started watching it from “My List”? Who knows?)
\n\nWhen playing video, there is no way to toggle subtitles on and off with a single tap. (It takes three taps to turn them on and another three to turn them off.) There is also no way to skip to the beginning other than dragging the scrubber manually.
\n\nPausing the video shows the season number, episode number, and title, but not the name of the TV show.
\n\nThe duration of the video is not shown anywhere unless the video has just started. To get the duration, the user must add the time remaining (displayed at the end of the timeline) to the current play position (displayed when the scrubber is “grabbed” by holding a finger down on it).
\n\nThough there is limited access to the intrinsic hierarchy of the media (e.g., I can go from watching an episode of a TV show to a list of episodes in the current season), it is incomplete, and it does not expose all the available information. For example, there is no obvious way to get from the video player to the episode list and then to a detail screen for an individual episode that shows things like the cast and the date it was released. Instead, the video must be “closed,” which may lead to an episode detail page, provided that’s where you started when navigating to the episode in the first place. The information hierarchy, such as it exists, is quite a muddle, and it only sporadically intersects with the navigation hierarchy.
\n\nThe last thing I did in the app was watch the latest episode of a TV show. On launch, a promo for a show I have never watched fills most of the screen, and a small “Continue Watching” section is partially visible at the very bottom. It shows an episode of a TV show that I have already finished watching (complete with an entirely full progress bar) and a movie I skipped into the middle of to check something several months ago. The TV show I was watching is not listed, even though the only thing I’ve done in the HBO Max app for the past week is watch episodes of this show.
\n\nWhen playing video, there is no way to toggle subtitles on and off with a single tap. (It takes three taps to turn them on and another three to turn them off.)
\n\nThe duration of the video is not shown anywhere unless the video has just started. To get the duration, the user must add the time remaining (displayed at the end of the timeline) to the current play position (displayed at the start of the timeline).
\n\nThe last thing I did in the app was watch part of an episode of a TV show. On launch, after selecting my user profile, the show I was in the middle of watching is not visible anywhere on the screen. I had to scroll down two rows to get to the “Continue Watching” section, where my episode was listed.
\n\nWhen playing video, there is no way to toggle subtitles on and off with a single action. Instead, I have to swipe down to display a menu of options, swipe over to subtitles, swipe down to pick a language, and click to select it—then do the same steps again to turn subtitles off.
\n\nI could not find a way to get from the video player to either an episode list or a detail page for the episode I’m watching. Like the Netflix app (and many others), the relationship between the information hierarchy and the navigation hierarchy is tenuous at best.
\n\nThis is not an exhaustive exploration of any of these apps, let alone all streaming apps. And I’m sure some people will quibble with the particulars of my spec. For example, why place so much emphasis on quick access to subtitles? (It’s because being able to quickly skip backwards and briefly enable subtitles is something I do frequently, both on my own and at the request of others. Though keeping subtitles on all the time is surely the most common use case, briefly enabling them to clarify a few lines of dialogue is a close second.)
\n\nAnd, yes, I know that there are often other, “better” ways to accomplish these tasks in some apps on some platforms. For example, I can hold down the microphone button on my Apple TV remote and say “enable subtitles” or “disable subtitles” and it will usually work. Better still, I can ask “What did he say?” and the Apple TV will skip backwards, enable subtitles, play for a short duration, and then disable subtitles again, all on its own. Surprise and delight!
\n\nBut none of this changes the overall picture, which is that even the most popular, well-funded streaming video apps fail to get the basics right in a shocking number of ways. Conflicting incentives surely explain some of these failings (e.g., promoting new content rather than letting me quickly resume what I was already watching), but an explanation doesn’t make these shortcomings any less bothersome.
\n\nAnd then there are the gaps that seem unmotivated. Is there really no room on a giant iPad or TV screen to show me the name of the TV show I’m watching when the video is paused? Why is it so hard to go from viewing an episode of a TV show to a list of episodes for that show? Why is there sometimes no way other than voice control to enable subtitles or change the audio track while watching a video? There’s plenty of low-hanging fruit waiting to be picked.
\n\nI tried to limit myself to the basics to prove a point, but there is a vast world of good ideas that are just beyond the basics. These are simple, proven techniques like remembering which option a user picked from a menu the last time and bubbling that up as the top choice, or adding (gasp!) settings to let the user configure features according to their preferences, like how many seconds forward or backwards the skip buttons should travel, or which subtitle or audio track should be on by default, perhaps with per-show customizations.
\n\nAnd if you think this spec is just a list of my personal preferences, I can assure you that list is much longer. To give just one example, I wish every streaming app had a way to advance forward and backward by a single frame at a time. Trying to precisely manipulate the play/pause button or the timeline scrubber to get to the exact frame where I can read some bit of background text is not a game I enjoy playing. (Laggy, unresponsive apps make this even worse.)
\n\nAlso consider creating interface elements that are reusable. A good control for filtering and sorting lists, for example, could be used in many places within a streaming app. (Most offer no sorting options at all, which is criminal.) The same goes for iconography for status and actions: standardize it, and use it everywhere. It’s a sad state of affairs when the original TiVo on-screen interface bests most modern streaming apps in terms of predictability, legibility, and consistency.
\n\nAnd let’s not forget the tried-and-true practice of stealing features from competitors. How has no one yet copied Amazon’s X-Ray feature? Why doesn’t Apple TV+ have any way to manually curate a list of TV shows like seemingly every one of its competitors? Why don’t more apps provide multiple organizational views of the same content like the Disney+ app does? (E.g., release order vs. chronological order for movie series.)
\n\nMost streaming apps aim for mass-market appeal, so they can’t get too complex. But today, they’re at the far opposite end of the spectrum, missing basic functionality rather than being bogged down with fancy features and customization. These apps need to walk before they can run. I hope, someday, at least one or two of them can fly.
", "date_modified" : "2022-02-15T12:17:59-05:00", "date_published" : "2022-02-15T12:17:59-05:00", "id" : "http://hypercritical.co/2022/02/15/streaming-apps", "title" : "An Unsolicited Streaming App Spec", "url" : "http://hypercritical.co/2022/02/15/streaming-apps" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "Thanks to either my opinionated nature or the fact that I have voiced my opinions on various podcasts for years, people often ask me to recommend products. Which Mac should I buy? What’s the best microwave oven? What kind of car should I get for a family of four?
\n\nNow, I’m no Wirecutter or Consumer Reports. I’m just one person. With a few exceptions, I don’t have personal experience with more than a handful of individual products in a given category. But I know a good product when I see it (and use it).
\n\nThis page lists some products that I consider “good.” This may sound like a low bar, but sometimes “good” is as good as it gets for a certain type of product. Even with this lenient standard, the list is not long. As with my Great Games list, I will add products to this page over time. I may also remove or replace products if something better comes along.
\n\nIf you buy something after following a product link on this page, I may receive money through the seller’s affiliate program. (Not all retailers have affiliate programs, and not all products are eligible for affiliate payments.)
\n\nI love toaster ovens, and I’ve personally tested many of them over the years. Casey Liss, my friend and ATP co-host, tells the tale of the strange confluence of events that led me to try so many toaster ovens, and provides links to listen to my (audio) reviews of each one, if you want all the gory details. If you just want my recommendation, it’s (still) the Breville 650 XL. (It’s also available at Amazon.)
\n\nThere are two caveats about this toaster oven. First, it’s bigger than you might expect: 16.5 inches wide, 13 inches deep, and 9.5 inches high. Measure your counter space before purchasing this beast. Second, the knob-feel is terrible: loose, imprecise, unsatisfying.
\n\nAs a product, this is a good toaster oven. But if you can get past its user-interface foibles, it does a great job actually toasting (or cooking) things. I’ve had mine for a decade, and I’ve still not found anything better.
\n\nIf you have too little counter space for the Breville and want a toaster oven that can toast bread both well and quickly, consider the Panasonic FlashXpress. I think its user interface is subpar—confusing, poorly arranged buttons clustered below the door—but it’s a speed demon when it comes to making toast.
\n\nBreville also makes a smaller 450 XL model that is not quite as powerful as its big sister, and not quite as fast as the Panasonic, but it’s a good choice if you like the Breville’s proportions and UI.
\n\n(And, no, I don’t have any recommendations for slot toasters. Toaster ovens forever.)
\n\nThe OXO Good Grips Solid Stainless Steel Ice Cream Scoop is (probably) the world’s greatest ice cream scoop. I know it looks just like the ones you’ve used before that can’t make a dent in hard-frozen ice cream and end up forming ugly, rusty pits in the well of the scoop, but I can assure you that this is a different class of product entirely.
\n\nAs the name suggests, it’s made of solid stainless steel. It’s strong, uniform throughout (no coating to chip away), and pleasingly hefty. The pointed tip can defeat even the hardest ice cream. Soak it in warm water and the thermal mass of this heavy instrument will keep doing work, scoop after scoop, for as long as you need it. The handle is typical Oxo: soft, grippy rubber.
\n\nAs I am writing this, I am ordering myself a backup scoop just in case Oxo ever stops making this product. (The only thing I can imagine damaging the one I already have is a trip into the garbage disposal…but that is a thing that has been known to happen in my house, so better safe than sorry.)
\n\nUpdate (January 2023): Like seemingly all the Oxo products that I love, it looks like this one is no longer available. In its place, there’s this scoop, which matches the shape of mine, but not the material finish, and this scoop, which matches the material, but not the shape. People have reported getting scoops that don’t match either photo on Amazon, however, so beware. One person suggested this scoop from SUMO, which he said arrived looking very much like the Oxo that I recommend.
\n\nThe Victorinox Fibrox Pro Knife, 8-Inch is the best inexpensive chef’s knife I have ever used. There are better knives for (much) more money, but none in this price range come close. I own knives that cost twice as much and are not even half as good.
\n\nThe grip is not quite up to Oxo’s standards in terms of materials, but it follows the same philosophy: grippy and comfortable, with no concern for how it looks. The blade is shaped perfectly and stays sharp for much longer than you would expect. And it’s easy to clean and sharpen: no weird seams or chamfers.
\n\nLike the ice cream scoop, this is a product I love so much that I’ve purchased backup copies just in case it’s ever discontinued. I still routinely purchase more-expensive chef’s knives (I love kitchen tools), but so far, none has displaced this $35 wonder for all-around utility.
\n\nThe Breville BWM640XL Smart 4-Slice Waffle Maker is $350. This is a ridiculous amount of money to spend on a waffle maker. It’s huge and heavy. And I personally prefer thinner waffles with more, smaller squares. (The Breville makes four waffles that are over an inch thick, each with 25 squares.)
\n\nAll of that said, it does a pretty amazing job. The waffles are evenly cooked and release easily from the non-stick surface. The gutter around the edge, meant to catch excess batter, does actually work. The controls and the LCD screen are surely overkill for what boils down to a fancy way to set the cooking time, but they work well and are easy to understand.
\n\nYou might think the lack of removable heating surfaces would make it hard to clean, but cooked waffles leave almost nothing behind after they’re removed. Wiping the surfaces with a damp paper towel is usually all the cleaning that’s necessary. The permanently attached heating surfaces make the whole device feel sturdy, and they help prevent any batter from getting inside the machine.
\n\nI resisted buying this over-priced monstrosity for a long time. I purchased and returned several waffle makers that were just terrible. I could not find a reasonably priced model that was competent and consistent. I finally bit the bullet and bought the Breville. This price is (still) galling, and I (still) wish the waffles were thinner and had more, smaller squares. But within the size constraints inherent in its design, this damned thing makes perfectly cooked waffles every single time. It’s infuriating, really.
", "date_modified" : "2020-08-31T17:39:10-04:00", "date_published" : "2020-08-31T17:39:10-04:00", "id" : "http://hypercritical.co/2020/08/31/good-products", "title" : "Good Products", "url" : "http://hypercritical.co/2020/08/31/good-products" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "Ever since the story broke, I’ve had one overriding thought about the Hey.com App Store rejection controversy. It’s a point I’ve already tried to make on a recent episode of ATP and on Twitter. Before WWDC arrives with its own wave of Apple-related news, I’d like to take one more run at it. Here goes.
\n\nEveryone wants apps that are feature-rich, easy-to-use, secure, and have good customer support. Apple, developers, and customers all agree on this. Incentives diverge slightly from here. Both Apple and developers want to make money. Customers want app prices to be low, but also want apps that are well-supported and maintained.
\n\nApple, through its control of the App Store, dictates the terms that developers must agree to in order to distribute iOS apps to customers. Apple’s rules determine how the interests of all parties are balanced.
\n\nFor many years now, Apple has been aiming for an ambitious goal state: an App Store filled with feature-rich, easy-to-use, secure apps, sold at prices customers find attractive, and monetized in a way that keeps developers happy and profitable while also giving Apple a significant percentage of all app-related revenue: 30% for most things, 15% after the first year of subscriptions, and some other, usually non-public number that’s less than 30% if you happen to be a fellow tech giant like Netflix or Amazon.
\n\nThe App Store rules are the most powerful tool Apple can use to achieve its goal. To this end, the rules have been adjusted many times over the years. But throughout all these changes, Apple has never given up on its dream of an App Store filled with great apps that make everyone happy and make lots of money for both Apple and developers.
\n\nToday, Apple’s stance seems to be that if they just hold the line on a few key provisions of the App Store rules, companies will build their business models around Apple’s revenue cut in the same way companies built their business models around the costs of brick-and-mortar retail in the pre-Internet days. Apple seems to firmly believe that its ambitious goal state can be achieved with something close to the current set of App Store rules.
\n\nThis belief is not supported by the evidence. Years of history have shown that Apple is getting further away from its goal, not closer. Witness Netflix abandoning in-app purchase, Apple having to strike a special deal with Amazon, and all the apps skirting the existing rules as best they can, to the detriment of the user experience and both Apple’s and developers’ revenue. And this is before even considering the customer support situation, which has always been dire, or the existence of businesses like ebook sales that will never have an extra 30% handy to give to Apple.
\n\nApple’s App Store rules need to change not (just) because developers don’t like them. They need to change because time and experience have shown that there is no viable path to Apple’s goal state given the existing rules. The details of any particular App Store controversy can often distract from this larger reality. A hardline stance will not sway hearts and minds, and it has proven unable to change developers’ business models without sacrificing the user experience. Apple needs to decide if it wants to be “right,” or if it wants to be happy.
", "date_modified" : "2020-06-20T11:30:40-04:00", "date_published" : "2020-06-20T11:30:40-04:00", "id" : "http://hypercritical.co/2020/06/20/the-art-of-the-possible", "title" : "The Art of the Possible", "url" : "http://hypercritical.co/2020/06/20/the-art-of-the-possible" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "
\n\nWhen DragThing was finally left behind—after 24 years of service—by macOS Catalina’s lack of support for 32-bit apps, I knew I’d miss many of its features. I missed its (optional) modification of the Mac’s window-layering policy so much that I made my first Mac app, Front and Center, to replace it. My second Mac app, SwitchGlass, also replaces a feature I miss from DragThing. (Thank you, James Thomson, for unwittingly kickstarting my Mac development efforts.)
\n\nSwitchGlass adds a dedicated application switcher to your Mac. You can customize its appearance, size, and position on each attached display, including hiding it on selected displays. It pairs perfectly with Front and Center, supporting both click and Shift-click actions on app icons in the floating app switcher. SwitchGlass is available for $4.99 on the Mac App Store. To learn more, please read the FAQ.
\n\nI wrote SwitchGlass and Front and Center to satisfy my own needs. I run both apps all day, every day on my Mac. I’ve been a professional programmer for almost 25 years, but until this year, I’d never written anything for my favorite platform. It’s immensely satisfying to be able to scratch my own itch. And it’s even more satisfying to learn that there are other people out there who also appreciate my strange little apps.
\n\nThanks to everyone who has purchased one of my apps. And special thanks to Brad Ellis for creating the beautiful SwitchGlass icon.
\n\nP.S. - I may not be the only one who misses DragThing’s application switcher. The phenomenally powerful Mac automation app Keyboard Maestro recently added a similar feature. In fact, SwitchGlass’s default appearance is inspired by Keyboard Maestro’s app switcher. If you want a hugely capable Mac automation tool that just happens to have an (optional) app switcher palette built in, check out Keyboard Maestro. I highly recommend it.
", "date_modified" : "2020-02-12T20:12:44-05:00", "date_published" : "2020-02-12T20:12:44-05:00", "id" : "http://hypercritical.co/2020/02/12/switchglass", "title" : "SwitchGlass", "url" : "http://hypercritical.co/2020/02/12/switchglass" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "For a few years now, I’ve tracked the TV shows I’m watching using the iOS app Couchy, which integrates with the Trakt.tv service. Sadly, Couchy ceased development last year. I’ve kept using it since then, but in the past few weeks it’s finally started to fail.
\n\nI looked at (and purchased) many, many alternative apps back when Couchy’s demise was announced, but I could never find one that I liked as much. In particular, I haven’t found a match for the information density of Couchy’s main screen combined with its “smart” sort order.
\n\nCouchy’s main screen shows a scrollable grid of portrait-orientation poster images for each TV show, three to a row on my iPhone, each with text below it showing the show name, how many episodes behind I am, and the season, episode number, and title of the next episode. (I’d include a screenshot here, but poster images are no longer loading for me in Couchy, so it wouldn’t be much to look at.)
\n\nThe sort order determines how the shows are placed in the grid. Within the app, Couchy describes its smart sort as follows:
\n\n\n\n\nShows will be sorted in the following order:
\n\n\n
\n- Episodes airing today
\n- Missing episodes
\n- Awaiting episode
\n- Ended shows
\n
As I’ve tweeted about my search for a Couchy-replacement app, I’ve found it difficult to explain what I’m looking for in terms of sorting. And even Couchy’s sorting is sometimes not quite what I want. So I’d like to explain here instead, free from Twitter’s character limits.
\n\nI use an app like Couchy because I’m usually in the middle of watching many different TV shows. When I have some time to watch TV, I launch Couchy to remind myself what I’m currently watching, how far behind I am, and which shows have new episodes waiting for me. This is my most important use case: choosing a show to watch.
\n\nI have so many TV shows in my trakt.tv collection that sorting is essential to helping me select a show. I don’t want to scroll through dozens of shows to make a selection. I want to look at the top one or two screenfuls of shows on my phone and be sure that I’m seeing all the shows I’m most interested in watching now.
\n\nMost simple sort orders don’t work for my purposes. For example, consider sorting by the date of the latest episode. There are many shows in my collection that I’m not actively watching. Maybe I’ll get to them in the future, but for now, the unwatched episodes are just piling up. If those shows jump to the top of the sort order every time a new episode is released, it’s just noise to me. They’re obscuring the shows I actually want to watch.
\n\nSorting by the number of unwatched episodes has similar problems. Sorting by the date I last watched an episode of a show might seem like it’d work, but I might really want to know about a newly released episode of a show that I’m caught up on but that hasn’t released an episode in a while.
\n\nIf I had an actual, concrete algorithm in mind, I wouldn’t be writing all this. I could have explained it in a tweet. But I haven’t thought it through enough to nail it down at that level. What I can do instead is describe the desirable features of such an algorithm.
\n\nIf I’m not actively watching a show, it should be pushed down in the list. Deciding what “actively watching” means will surely involve some thresholds (e.g., “has watched an episode in the last N days”), and it would be nice if those were configurable.
Shows that I’m actively watching should jump to the front of the list when a new episode is released.
Shows that I really like but that are on a break (e.g., between seasons) should jump to the front of the list when a new episode or season is released. Again, determining which shows I “really like” is tricky. An easy out here is to just have me choose by marking them as favorites. A ranked list of favorites would be even better and would help with sorting decisions near the top of the list.
When sorting shows that I’m actively watching (or really like) that just had a new episode or season released, favor shows with the smallest backlog—except in cases where a whole new season just dropped for a favorite show. For example, let’s say I’m one episode behind on Homeland, two episodes behind on Fargo, and caught up on The Expanse, which is a favorite show. If Homeland and Fargo both release new episodes and The Expanse releases a whole new season, all on the same day, the sort order should be: The Expanse first (even though it has the largest backlog), Homeland second (because it has a shorter backlog than Fargo), and Fargo third.
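\n\nFor concreteness, here’s a rough sketch of the kind of tiered sort key I have in mind. Everything in it is a placeholder assumption on my part—the Show fields, the 30-day “actively watching” threshold—not a real trakt.tv or Couchy API, and it only approximates the desiderata above:

```python
from dataclasses import dataclass

# Hypothetical threshold for "actively watching"; ideally user-configurable.
ACTIVE_THRESHOLD_DAYS = 30

@dataclass
class Show:
    name: str
    days_since_watched: int   # days since I last watched an episode
    unwatched: int            # backlog of unwatched episodes
    favorite: bool = False    # manually marked as a favorite
    new_episode: bool = False # a new episode was just released
    new_season: bool = False  # a whole new season just dropped

def sort_key(show: Show):
    active = show.days_since_watched <= ACTIVE_THRESHOLD_DAYS
    if show.favorite and show.new_season:
        tier = 0  # a favorite with a whole new season jumps the queue
    elif (active or show.favorite) and (show.new_episode or show.new_season):
        tier = 1  # active/favorite shows with something new; smallest backlog wins
    elif active:
        tier = 2  # active shows with nothing new
    else:
        tier = 3  # everything else is pushed down
    return (tier, show.unwatched, show.days_since_watched)

shows = [
    Show("Homeland", 3, 1, new_episode=True),
    Show("Fargo", 5, 2, new_episode=True),
    Show("The Expanse", 10, 10, favorite=True, new_season=True),
]
for s in sorted(shows, key=sort_key):
    print(s.name)  # The Expanse, Homeland, Fargo
```

Running this on the example above produces exactly the order I described: The Expanse, then Homeland, then Fargo.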
I could go on, but I think I’m getting into the weeds. The four points above capture most of it. I’m sure other people have their own preferred sorting orders, but this one is mine. I’ve seriously considered writing a trakt.tv client app for iOS just to scratch my own itch, but I don’t think I’m ready to tackle a task that large quite yet.
\n\nIn the meantime, if you’re an author of one of the many trakt.tv client apps in the App Store, please consider implementing something like what I’ve described here. I’ve probably already purchased your app, but I’ll be extremely grateful on top of that.
", "date_modified" : "2020-01-29T12:03:28-05:00", "date_published" : "2020-01-29T12:03:28-05:00", "id" : "http://hypercritical.co/2020/01/29/sorting-my-tv", "title" : "Sorting My TV", "url" : "http://hypercritical.co/2020/01/29/sorting-my-tv" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "\n\nBy the time Mac OS X was first released in 2001, I had been using what would eventually be known as “classic” Mac OS for seventeen years. These were seventeen formative years for me, from the ages of 9 to 26. The user interface of classic Mac OS was as ingrained in me as Star Wars or any other cultural institution.
\n\nMy love for classic Mac OS is why I started researching and reviewing Mac OS X. Big changes were coming to the Mac, and I was going to feel them more than most. I needed to know what I was in for.
\n\nTo deal with some of the changes in Mac OS X, I ran apps and system extensions that restored some behaviors from classic Mac OS. Over the years, I weaned myself off most of these, but a few stuck. In particular, I found I did not want to live without the window layering policy from classic Mac OS.
\n\nIn classic, when you click on a window that belongs to an application that’s not currently active, all the windows that belong to that application come to the front. In Mac OS X (and macOS), only the window that you click comes to the front.
\n\nMy particular style of window management leans heavily on the classic behavior. I also appreciate the Mac OS X behavior in certain circumstances, so I was delighted to find apps that enable both behaviors, using Shift-click to override the default.
\n\nSadly, macOS Catalina’s lack of support for 32-bit apps finally killed the last of the apps that implemented this feature. I was alone in a cold, barren world where I had to click on a Dock icon to switch to an app and bring all its windows to the front.
\n\nI tried to get used to it, but I could not. Next, I tried to persuade a few of my developer friends to create a tiny Mac app that implements just this one feature. My friend Lee, a longtime Mac developer and user, eventually took up the challenge and created a simple app to do it.
\n\nIt was missing a few features I wanted (the Shift-click override, the ability to hide the Dock icon, a menu bar icon, etc.), but Lee shared the source code with me and I dove in and tried to help. I added the Shift-click feature and a mode-switch preference. I drew an app icon and a menu bar icon. The app was just about done. It even had a name: Front and Center.
\n\nThe app was written in Objective-C. I’d always wanted to do a real project in Swift, so I started a new project in Xcode and rewrote the entire (tiny) app in Swift. I’ve also always wanted to get some experience with the App Store, so Lee and I agreed that I would release it under my developer account (though we are sharing the profits).
\n\nFront and Center is a trivial app—so trivial that I was afraid it would be rejected for its limited functionality. But when running, it is used literally hundreds of times a day. And I obviously found it so essential that I was willing to help bring it into existence myself. I also wanted to get some experience with the financial side of the App Store.
\n\nAll of this contributed to the decision to make Front and Center a (cheap) paid app. It’s $4.99 on the Mac App Store. I don’t expect to make any significant money from sales, but I’ve already gained a huge amount of experience just going through the process of development and distribution.
\n\nI also view the price as a kind of deterrent. The increase in downloads it would receive as a free app would just be an unwanted support burden. The (few) people who actually want this app know who they are, and I’m betting they are not just willing but happy to pay for it.
\n\nAre you a classic Mac diehard who still misses some of the old ways? Or maybe you just want to try it to see what it’s like? Even if you don’t want the classic behavior to be the default, you can switch to “modern” mode and use Shift-click to trigger the “classic” behavior. It beats mousing down to a Dock icon, right?
\n\nI’m just glad this app exists. I had a ton of fun working on it. Thanks to Lee for being a kindred spirit when it comes to classic Mac OS, and to all my other Mac-nerd friends who offered advice and code during development—especially Gus Mueller, maker of many fine Mac apps, who provided a surprising performance enhancement for our tiny app. I’m excited to finally be able to use this badge on my website.
\n\n", "date_modified" : "2020-01-08T19:31:10-05:00", "date_published" : "2020-01-08T19:31:11-05:00", "id" : "http://hypercritical.co/2020/01/08/front-and-center", "title" : "Front and Center", "url" : "http://hypercritical.co/2020/01/08/front-and-center" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "The upcoming sequel to the 1986 classic Top Gun has reminded me of a favorite memory from my youth. When I was a kid, I spent a lot of time looking over the TV listings. Each daily newspaper had the TV listings for that day, but there was also a weekly TV guide that came with the Sunday paper. This was the one I’d pore over while eating breakfast each morning.
\n\nThe weekly guide had a section where it listed all the movies that were airing on TV that week. Each movie was accompanied by a short, plain-spoken description of the plot. In addition to the star ratings (where the maximum was four stars, I believe), the descriptions also sometimes included a few words about the quality of the movie or performances. Something like this:
\n\nOne day in the late ’80s or early ’90s, I recall seeing the following entry for the movie Top Gun in the weekly TV guide:\n\n“Trivializes war by turning it into a music video.”\n\nThere was no description at all, just this frank assessment. After spending years of my life reading these movie summaries, it was as if the author had finally broken through and had spoken with a clear voice for one brief, shining moment.
\n\nIt’s now several decades later and I still remember this movie review word-for-word. I have no idea who the author was, or how many similar gems were hidden in the pages of that weekly TV guide over the years. But I credit this tiny act of defiance with inspiring me in multiple ways.
\n\nIt taught me the power of well-chosen words to shake people out of their daily routines and patterns of thought. It showed me that all jobs, no matter how seemingly dull, can be an outlet for self-expression and excellence. And it reminds me, to this day, that each work of art can be—deserves to be—considered from multiple points of view, not all of which will be comfortable.
\n\nNote: This post is not a polemic against Top Gun or war movies in general. I have always loved jet fighter planes, and I enjoyed Top Gun when I saw it. This review did not make me hate it. (That said, like most older media, I suspect a modern rewatching will reveal a whole host of problems.) My memory of this capsule review is one of surprise, subversiveness, and delight. The review is a slam on Top Gun, yes, but it’s also a celebration of the indomitable human spirit. Four stars.
", "date_modified" : "2019-08-08T14:11:05-04:00", "date_published" : "2019-08-08T14:11:05-04:00", "id" : "http://hypercritical.co/2019/08/08/top-gun", "title" : "Top Gun", "url" : "http://hypercritical.co/2019/08/08/top-gun" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "According to any reasonable set of quantifiable measures, Jony Ive departs Apple as the greatest product designer who has ever lived. His hit products sold in vast numbers and were fundamentally transformative to both the company he worked for and the world at large. We all know their names: iMac, iPod, iPhone, iPad. Together, these products helped set the direction for the most consequential industry of the last century.
\n\nAs the leader of design at Apple, Ive inevitably receives acclaim for work done by other people on his team. This is what it means to be the public face of a collaborative endeavor involving hundreds of people. Ive himself is the first to credit his team, always using the word \"we\" in his appearances in Apple's design videos. One gets the impression that Ive has historically used \"we\" to refer to the design team at Apple, rather than Apple as a whole, but he certainly never meant it to refer to himself.
\n\nWhile the iPhone is obviously the most important product in Ive's portfolio, his most significant and lasting contribution to Apple and the tech industry in general is embodied by a product that he worked on much more directly, and with far less help: the original iMac.\n\n
Aside from dramatically reversing Apple's slide into obscurity, the iMac finally pushed the industry over the hill it had been climbing for decades. Nearly overnight, it went from an industry primarily concerned with technical specifications to one that more closely matches every other mainstream consumer business—one where fashion and aesthetics are not just a part of the appeal of a product, they are often the dominant factor. As much as any individual product design, this is Ive's legacy.
\n\nThere is a certain predictable progression in the career of creative professionals. In the beginning is the acquisition of basic skills and experience—the tools needed on the road to mastery. Work done in this phase is more likely to be constrained by the orthodoxy of a given industry. The first step to making a great product is to make a competent product. One must know the rules before breaking them.
\n\nThe lives of creative people are often animated by a few deeply held notions. These may be philosophical, aesthetic, fanciful—anything that stirs the soul. Early creative work often fails to embody these ideals to the satisfaction of the creator. Perhaps one's skills are not yet adequate. Perhaps one lacks the confidence to defy convention to the degree required. An early-career creative professional is surrounded by constraints.
\n\nWith the acquisition of greater skill and authority comes more freedom. If you're Jony Ive, working in a company where that skill has led to world-changing hit products and their associated fortune and well-deserved corporate promotion, you may find yourself with very few limitations indeed. Everything has come together to finally give you a chance to do it right for once—to get closer than ever to that deeply held notion, that ideal.
\n\nIt's not hard to guess what animates Ive's design philosophy. He's repeated some variation of it in nearly every Apple product design video. Ive wants to get to the essential nature of a thing. By stripping away the extraneous, we are left with the intrinsic truth of a thing. A successful design should seem obvious in retrospect. It should seem inevitable.
\n\nThis philosophy has been embodied in the products themselves, and its potency has tracked Ive's career. Early on, technical, financial, and authoritative limits led to designs that today's Ive would likely view as over-complicated: a jigsaw of decorative exterior panels fastened to an inner framework housing a hodgepodge of components.
\n\nContrast this with latter-day products like the unibody Apple laptops, where a single slab of machined aluminum replaced dozens of individual parts and their associated fasteners, seams, squeaks, and rattles. Or look at products like AirPods and the Apple Pencil that seem not to be assembled at all, but rather to have sprung into existence as complete entities. When introducing each similar product or manufacturing advance—each further simplification—Ive's joy has been apparent, even through his usual understated demeanor.
\n\nAnd so we come to the most common criticism of Ive's work. With so few limitations on his power and skills, the spark that animates his creative philosophy has been allowed to burn so brightly that it has overwhelmed everything else. Symmetry overrides utility. Simplicity overrides flexibility. Purity of form overrides quality of function.
\n\nThis creative arc is dramatized in spectacular fashion in Zima Blue, an animated short that's part of the Netflix anthology series Love, Death & Robots. I don't want to spoil the ending; suffice it to say that I doubt Jony Ive's career beyond Apple will lead to quite such a dramatic conclusion. But the dogged pursuit of a core animating belief rings true to me.
\n\nIf Ive has overstayed his usefulness at Apple, it is only by a little. Few careers in any field will ever match his run at Apple. His designs changed the tech industry forever, and he hit home run after home run on the playing field that he built.
\n\nIt's often said that the best creative work requires limitations. In this case, another piece of industry wisdom also applies: success hides problems. But in the years to come, when I look back on Jony Ive's work at Apple, I doubt I'll dwell much on the tail end, when he very nearly caught that thing he'd been pursuing for his entire career. Will he ever catch it? Does anyone? I'm not sure it matters to me. After all, it's the chase that I love.
\n\nThese are some of my favorite video games. They also happen to be truly great games, though they vary widely in terms of the required time commitment and gaming experience.
\n\nMany of these games are old enough to have spawned “remastered” versions. The remasters are usually easier to find, and are often—but not always—the versions I recommend playing. See the descriptions for more details.
\n\nThis list is not exhaustive. It’s mostly limited to games that it’s possible to play today without too much trouble. As the games get older (and therefore harder to find and play), the selection criteria get stricter. I don’t go much further back than the 1990s, which ends up excluding my beloved classic Macintosh games. Maybe I’ll do a separate list of those someday.
\n\nThe Destiny series of games is omitted because it’s very difficult to go back and play this kind of multiplayer online game after the community has moved on. But I do love Destiny…even if it doesn’t always love me back.
\n\nI could write many thousands of words about each game, but my failure to do so has prevented me from making this list for too long. In an effort to get the ball rolling, this list does not feature much commentary. It’s mostly just a list, with some information about how and where to play each game. This list is updated as new games warrant inclusion.
\n\nThe games are listed in no particular order.
\n\nAvailable on PS3 as a download, on PS4 as a download and a collector’s edition disc bundled with two other games, on PC via the Epic Games Store or Steam, and on iOS.
\n\nThis is the most accessible game on the list. It only takes two hours to play from start to finish, and it costs just $15 on console/PC and $5 on iOS. I recommend playing it alone, in the dark, with no interruptions, in a single sitting. A good sound system (or headphones) really enhances the experience.
\n\nTo avoid spoilers, finish the game before reading the article I wrote or listening to the podcast I recorded about the game.
\n\nAvailable on PS5, PS4, Switch, Xbox One, Xbox Series X and S, and PC.
\n\nThis game is nearly as accessible as Journey, and is similarly a good choice for someone who doesn’t have much experience with modern video games. (Some familiarity with first-person 3D controls helps.) Though it is a bit longer than Journey, there are natural intermission points within the game. I recommend playing it in a few uninterrupted sittings.
\n\nTo avoid spoilers, finish the game before listening to this podcast where I talk about it.
\n\nAvailable on many platforms. I strongly recommend playing on a system with a controller. (Don’t forget that you can pair many modern console controllers with iPhones and iPads via Bluetooth.)
\n\nAvailable on many platforms. I recommend playing on a system with a controller.
\n\nTo avoid spoilers, finish the game before listening to this podcast where I talk about it.
\n\nAvailable on PS4, PS3, Xbox 360, Xbox One, PC, and iOS. There’s also a remake available on PS5, Xbox Series X/S, and PC.
\n\nTo avoid spoilers, finish the game before listening to this podcast where I talk about it.
\n\nAvailable on Mac, PC, PS4, Xbox One, and Switch.
\n\nTo avoid spoilers, finish the game before listening to this podcast where I talk about it.
\n\nAvailable on PS2, and PS3 as a download and a disc bundled with Shadow of the Colossus.
\n\nIt’s worth the effort to dig out an old console (or borrow one or buy a used one) to play this game. The PS3 version is a remaster with better graphics and no downsides. Prefer it if you have a choice.
\n\nTo avoid spoilers, finish the game before reading my review.
\n\nAvailable on PS2, PS3 bundled with Ico, and PS4 as a disc and a download.
\n\nBoth the PS3 and PS4 versions are remasters. The PS4 version substantially changes the art style of the game. It’s not worse or better than the original art style, but it is different. I recommend either the PS3 version or the PS4 version, depending on your tolerance for dated graphics.
\n\nThough it is not a direct sequel (or prequel), it helps to have played Ico before playing this game.
\n\nAvailable for PS4 as a disc and download.
\n\nThough it is not a direct sequel (or prequel), it helps to have played both Ico and Shadow of the Colossus before playing this game.
\n\nTo avoid spoilers, finish the game before reading my review.
\n\nAvailable for PS3 as a disc and download, for PS4 as a disc and download, and as an excellent remaster for PS5 as a disc and download.
\n\nThe PS4 and PS5 versions both come bundled with the Left Behind expansion. I strongly recommend the PS5 version, but you should be sure to play both the main game and the Left Behind expansion—in that order—whichever version you get.
\n\nTo avoid spoilers, finish the game before listening to this podcast where I talk about it or watching the HBO show.
\n\nAvailable for PS4 on disc and as a download.
\n\nThough it helps to have played the previous three installments of the Uncharted series, doing so is not necessary to both understand and enjoy this game.
\n\nAvailable for the Wii U on disc, and for the Switch as a cartridge or download (optionally including expansions).
\n\nThis game alone is worth the purchase price of a Switch. I recommend playing on a Pro Controller with the Switch connected to a TV.
\n\nIf you want to hear over two hours of my spoiler-filled thoughts on Breath of the Wild and the entire Zelda series, listen to episode 91 of the Pragmatic podcast.\n\n
Available for the GameCube, Wii, and Wii U.
\n\nThe Wii U version is a remaster that includes both enhanced graphics and some streamlined quest mechanics. It is the version I recommend. I strongly recommend against the Wii version due to the clunky motion controls, which are absent (or optional) on the other two versions.
\n\nAvailable for the GameCube and the Wii U.
\n\nThe Wii U version is a remaster that subtly changes the art style of the game. I prefer the art style in the GameCube original, but the Wii U version is certainly more palatable to modern players. The Wii U version also streamlines a few of the game’s quests.
\n\nAvailable for N64, GameCube, and 3DS as a cartridge and download.
\n\nThe 3DS version is a remaster with much-improved graphics, but I prefer to play Zelda games on a big TV. The GameCube version is a straight port of the N64 original with no significant improvement to the graphics. It’s a tough call, but I guess I recommend going back in time to 1998 and playing the N64 original when its graphics were cutting-edge. (Doing so would also be very on-brand for the game.)
\n\nAvailable on Mac, PC, Switch, PS3, and Xbox 360.
\n\nThe PS3 and Xbox 360 versions come bundled with Half-Life 2 and Team Fortress 2, both of which are also great games.
\n\nTo avoid spoilers, finish the game before listening to this podcast where I talk about it.
\n\nAvailable on Mac, PC, Switch, PS3, and Xbox 360.
\n\nThis is the rare sequel that matches or improves upon its fantastic predecessor in nearly every way. You should play Portal before playing this game.
\n\nTo avoid spoilers, finish the game before listening to this podcast where I talk about it.
\n\nAvailable for N64, DS, and Switch. The DS version is a remaster, but I’m not sure the improved graphics are enough to make up for the smaller screen of the handheld platform. The Switch version is also a (newer) remaster, but you can play it on your TV if you have the Switch hardware to do so. Plus, the Switch version is part of the Super Mario 3D All-Stars bundle that comes with two other Mario games. Good deal.
", "date_modified" : "2019-03-01T16:32:04-05:00", "date_published" : "2019-03-01T16:32:05-05:00", "id" : "http://hypercritical.co/2019/03/01/great-games", "title" : "Great Games", "url" : "http://hypercritical.co/2019/03/01/great-games" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "\n\nFive years ago, I sold t-shirts commemorating my first podcast, Hypercritical, which ran for 100 episodes in 2011–2012. The shirts also celebrated this website, which is updated nearly once per year. Thanks to everyone who purchased a shirt all those years ago.
\n\nSince then, I've gotten many requests to sell the shirts again, either to replace old shirts or because someone missed the previous sale entirely. Today, the time has come for the triumphant return of the Hypercritical t-shirt. The sale ends on Friday, June 29th at 8 p.m. EDT, so if you want a shirt, don't delay. It may be five years—or longer—before they're sold again.
\n\nThe shirts are available in men's and women's styles and in light and dark colors:\n\n
My sincere thanks to everyone who has purchased a shirt, past and present, and to all the people who continue to listen to my podcasts and read this site.
", "date_modified" : "2018-06-15T09:56:58-04:00", "date_published" : "2018-06-15T09:56:58-04:00", "id" : "http://hypercritical.co/2018/06/15/hypercritical-t-shirts-return", "title" : "Hypercritical T-Shirts Return", "url" : "http://hypercritical.co/2018/06/15/hypercritical-t-shirts-return" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "\n\nFumito Ueda’s first game, Ico, was a beautiful, moody masterpiece. Its spare depiction of a boy attempting to escape from a vast castle with the help of a mysterious companion discarded the gameplay and interface conventions of its day, delivering an almost meditative sense of immersion. Ueda’s next game, Shadow of the Colossus, added the bare minimum of status indicators to the screen to support its complex boss battles that required the player to clamber up and onto a succession of giant creatures.
\n\nIn terms of both gameplay and mood, Ueda’s latest game, The Last Guardian, is a straightforward combination of its predecessors. It features a boy attempting to escape from a mysterious castle with the help of a giant creature. Like Ico, it eschews a conventional HUD, save system, inventory management, power-ups, and nearly every other modern gaming convention. And as in Shadow of the Colossus, players will find themselves scrambling up the back of a large, often uncooperative, incredibly life-like beast (cheekily named Trico).
\n\nIco was able to deliver on the promise of its design by reducing complexity in other areas. It’s set in a largely rectilinear castle that the player navigates on foot. It has a small number of enemies. Its environmental puzzles are mechanically and conceptually simple. Similarly, Shadow of the Colossus manages to pull off its extremely ambitious boss battles by removing nearly everything from the game except those creatures.
\n\nWhile The Last Guardian attempts to combine the strengths of its predecessors, it’s burdened by the combination of their features. The environment and the player’s movement through it are far more complex than in Ico. The puzzles play fast and loose with their own rules at a few critical points. The giant creature, no longer confined to a limited engagement in a boss arena, sometimes pushes the game mechanics past their limits.
\n\nNothing kills immersion more than an acute awareness of the game engine itself. In The Last Guardian, the camera often gets stuck on walls or briefly shows the view from inside Trico. (Spoiler alert: like all your favorite 3D-rendered characters, he’s hollow.) Arguably, Shadow of the Colossus had an even more frustrating camera and control scheme, but that game was released eleven years ago on a far less powerful console. The Last Guardian has made tremendous strides since then, but it’s still not quite enough to avoid illusion-breaking lapses.
\n\nThese shortcomings are compounded by an uncharacteristic lack of faith in its design. Traditional (read: oppressive) on-screen prompts describing the control scheme mar the opening of the game and are impossible to completely banish. A voice-over extends beyond its narrative role to provide a dynamic hint system that is often too quick to reveal solutions. Several brief cutscenes in quick succession at the start of the game undercut player agency. It's tempting to attribute these lapses to Ueda’s departure from the project several years before its release, but the reason is less important than the result.
\n\nAll of that said, it’s important to remember the context of these criticisms. Ico and Shadow of the Colossus are two of the greatest video games ever created. Both pushed the limits of the hardware they were released on, and both have influenced video game designers, filmmakers, and other creative professionals far out of proportion with their modest sales numbers. That The Last Guardian fails to resoundingly best its distinguished parents is only disappointing because of how close it comes.
\n\nLet’s start with the obvious. The Last Guardian is a gorgeous game. The world design is in line with Ico and Shadow of the Colossus, but the increased fidelity of the PlayStation 4 really makes it shine. (PlayStation 4 Pro running at 1080p is recommended for best frame rates.) Lighting effects that Ico could only dream of add a poignancy to already majestic vistas. At so many points, I wished this game had the photo mode from Uncharted 4.
\n\nTrico is an amazing achievement: a building-sized NPC that truly feels alive. Its animations rarely feel canned or repetitive. Its behavioral inscrutability is completely in keeping with its character. Learning to read Trico’s moods and signals is a core part of the game. The experience smoothly transitions from frustration to a deep, intuitive understanding by the end.
\n\nAnyone who has finished Ico and Shadow of the Colossus will have no trouble completing The Last Guardian. I found the environmental puzzles a bit more challenging than those in Ico, but I never had to go to the Internet to look up a solution. Anyone who got stuck in Ico will almost certainly be even more stymied by The Last Guardian, however. The hand-eye coordination required is substantially lower than in Shadow of the Colossus, but the camera management and overall control-scheme finesse are much more demanding than in Ico.
\n\nAlso keep in mind that these are comparisons to the difficulty of two much older games. The Last Guardian has a significant skill-barrier to enjoyment when compared to contemporary console games, especially those with such an artistic bent. Inexperienced gamers looking for a better match for their skills should try Journey instead.
\n\nLongtime console gamers who have never played Ico or Shadow of the Colossus should definitely do so, preferably before playing The Last Guardian. High-definition remasters of both games are available for the PlayStation 3 on a single game disc for a combined price of $25. If your taste in games is anything like mine, it is absolutely worth buying or borrowing a PlayStation 3 console just to play these two games. (Plus Journey for just $15 more.) [Update: Both games are also available on the PS4 and Windows PC via the PlayStation Now cloud gaming service, though I have not tried playing them this way.]
\n\nIf you loved Ico and Shadow of the Colossus, The Last Guardian is well worth playing, but it bears the scars of its nearly decade-long development. Like The Force Awakens, there’s almost no way The Last Guardian could have lived up to the expectations accumulated during the long wait for its release. In the end, its reach exceeds its grasp, if only slightly. But, oh, what a reach it was. Like its star creature, The Last Guardian occupies a lofty perch—defiantly idiosyncratic and occasionally inscrutable, but a towering achievement nonetheless.
", "date_modified" : "2016-12-18T14:55:20-05:00", "date_published" : "2016-12-18T13:45:20-05:00", "id" : "http://hypercritical.co/2016/12/18/the-last-guardian", "title" : "The Last Guardian", "url" : "http://hypercritical.co/2016/12/18/the-last-guardian" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "These are the canonical bagel flavors:
\n\nAlso:
\n\nNearly 15 years ago, I wrote my first review of Mac OS X for a nascent “PC enthusiast’s” website called Ars Technica. Last fall, I wrote my last. Though Apple will presumably announce the next major version of OS X at WWDC this coming June, I won’t be reviewing it for Ars Technica or any other publication, including the website you’re reading now.
\n\nThose who listen to ATP, the weekly podcast I host with Marco Arment and Casey Liss, know that I’ve been contemplating hanging up my OS X reviewer’s hat for some time now. Producing thousands of words (and hundreds of screenshots) about each major release of OS X was my first real claim to fame on the Internet. The prospect of stopping has made me reconsider my public identity and sense of self. Who am I if I’m not “that guy who writes those OS X reviews”? But when I finally decided, the relief I felt let me know I’d made the right choice.
\n\nThere is no single, dramatic reason behind this. It’s an accumulation of small things—the time investment, the (admittedly, self-imposed) mental anguish, the pressure to meet my own expectations and those of my readers year after year—but it all boils down to a simple, pervasive feeling that this is the time to stop. I’ve done this. It is done.
\n\nWhen I started, I was at the forefront of long-form nerd-centric tech writing. Today, the world has moved on. I might have stopped with my OS X 10.9 review in 2013 if not for my love of round numbers and my expectation that OS X 10.10 would bring a complete interface overhaul that I really wanted to write about.
\n\nWhile OS X reviews were my public debut, the Hypercritical podcast brought me to a new audience starting in 2011. Hypercritical ran for 100 episodes, and in the years that followed I’ve recorded at least one podcast every week. (I’m currently a co-host of the weekly Accidental Tech Podcast and a regular guest on The Incomparable.) The one, long article I wrote about OS X for Ars Technica every year or two has long since been dwarfed by the volume of my audio output.
\n\nI still love OS X—and I still have many complaints about it. I will certainly talk about OS X 10.11 (whatever it’s called) at length on ATP, and I’ll read the many great reviews written by others when it’s released. But neither podcasting nor writing has ever been a full-time job for me. I’ve always had to fit them into my life alongside my actual job and my family. Right now, I’m looking forward to my first summer in many years that won’t be dominated by stolen daytime minutes and long, sleepless nights in front of a screen with a noisy air conditioner blowing behind me. I’m content to have reviewed 10.0 through 10.10. Someone else can pick up the baton for the next 15 years.
\n\nI reviewed OS X 10.10 Yosemite for Ars Technica. This is the eleventh major release of OS X, and I've reviewed them all. There are several ways to read my review.
\n\nHere are my thoughts on the various reading options. This is mostly a repeat of last year’s post about Mavericks, with some text carried over verbatim, but there is some new information.
\n\nThe web version of my review is the canonical version. It has the best formatting, the biggest images, and includes mouse-over image toggle effects that can't be done in an ebook. It's also the most up-to-date. I believe that good writing for the web includes many links. A web browser is the best place to inspect and follow those links.
\n\nAll the images in my review are Retina resolution. To see all the detail in the images, read the review on a screen with at least 1,920 “native” pixels of horizontal resolution. Most images are 1,280 pixels wide (presented to the browser with a width value of 640), but the “full-width” images are 1,920 pixels wide (presented to the browser with a width value of 960).
\n\nThe free web version has ads, and it’s split up into multiple “pages” (which are usually much longer than a single printed page). This kind of pagination annoys some people. I actually like it for very long articles because it helps me keep my place across multiple reading sessions. I can remember I was on page 8 instead of remembering the exact point in a very long, scrolling web page.
\n\nThat said, I also really like how an Ars Premier subscription eliminates all ads from the Ars Technica website and gives me the option to view any article on a single page. I use single-page view on very long articles when I’m searching for some text using my web browser’s “Find…” feature. I use it all the time on short articles.
\n\nSome people think Ars Technica forces me to break my article up into many tiny pages. That’s not the case. I choose how to paginate the article. I like to break it up on logical section boundaries, which means that the “pages” vary widely in length. I do try to keep any single “page” from being too short, however.
\n\nMy review is available on Apple’s iBookstore as well as Amazon.com.
\n\nThe Kindle and iBooks readers for OS X and iOS have their own strengths and weaknesses, but I think the iBooks version of my review has a slight edge over the Kindle version. Amazon adds a “delivery” charge of $0.15 per megabyte (varying a bit for different countries). This can really eat into the profits from a $4.99 book. Like the web version, both ebook versions include Retina-resolution images, making them quite large. To control the size of the Kindle ebook, I used JPEG images throughout.
\n\nUnlike Amazon, Apple does not charge a per-megabyte fee in its ebook store. Since both ebooks are the same price, this means I make slightly more money from each iBookstore purchase than I do from each Kindle purchase. But there’s something in it for you, too. The iBookstore version of my review uses lossless PNG images throughout. (Kindle version: 5 MB; iBookstore version: 25 MB.) In practice, I doubt most people will be able to tell the difference between the JPEG and PNG images, but I know which one I’d choose.
\n\nI've tried to make both ebooks available for purchase in as many countries as possible, but there are some limits on this that are beyond my control. If the ebook is not available in your country, remember that you can get both versions of the ebook by subscribing to Ars Premier.
\n\n
My sincere thanks to everyone who reads the review, in any form, in whole or in part. You’re the reason I’ve been doing this for the past fifteen years.
", "date_modified" : "2014-10-16T15:47:55-04:00", "date_published" : "2014-10-16T15:02:14-04:00", "id" : "http://hypercritical.co/2014/10/16/yosemite", "title" : "About My Yosemite Review", "url" : "http://hypercritical.co/2014/10/16/yosemite" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "Most of the nonfiction books I read these days fall into two broad categories: books about people I admire and books about the creation of things I admire. Good books about the latter often turn into the former by the end.
\n\nThe book I just finished, Creativity, Inc. by Ed Catmull, co-founder of Pixar, had a head start on both counts. My love of Pixar is not surprising or uncommon. As for Ed Catmull, I’ve been aware of him and his contemporaries for decades (I had an Alvy Ray Smith quote in my .sig for a while in the ’90s), but my nerd crush really stepped into high gear when I saw a video of Catmull’s talk at the Stanford Graduate School of Business in 2007.
\n\nIt’s difficult for me to describe my reaction to that talk—and to his new book—without sounding absurdly self-aggrandizing, but I’m going to give it a shot. Saying what other people are thinking is a proven formula for mass-market appeal employed by everyone from talk radio hosts to stand-up comedians. But as someone whose thoughts and interests have always been outside the norm, I’ve rarely heard excerpts from my own inner dialog voiced on a broader stage.
\n\nEd Catmull does that for me. If you’ve listened to my Hypercritical podcast or read the article that inspired it, you will find many familiar topics and themes in Creativity, Inc. Now, believe me, I harbor no illusions about this overlap. I am not the guy who hears Louis C.K. tell a joke and thinks he could be just as funny because he had a similar thought once. But shared values and the fulfillment of common aspirations are at the heart of all hero worship.
\n\nEd Catmull’s dream was to create the first fully computer-animated feature film. As a child, I also dreamed of such a thing; Catmull and the rest of the people at Pixar actually made it happen. Similarly, as an adult, I’ve clung to the notion that critical thinking can be both useful and powerful. Creativity, Inc. explains just how powerful it can be when practiced by a handful of the most brilliant technical and creative people alive today.
\n\nAy, there’s the rub. It’s so easy to hear the vaguest echo of your own thoughts expressed by someone fantastically smart and accomplished and view that as a cosmic endorsement of your approach to life. But that absolutely would not be in keeping with the message of the book—a message Catmull tries again and again to communicate to readers he knows will resist it.
\n\nIndeed, Catmull most often uses himself as an example of someone who has failed to see through to the heart of a problem. This is the true strength of the book. Unlike so many other tech-industry memoirs and business books, Creativity, Inc. is not an abstract exploration of a philosophy, nor is it a list of accomplishments interspersed with bold commandments. Instead, it is a deep, thoughtful investigation of a never-ending series of failures—and the reactions to those failures that eventually led to success.
\n\nThink of it: the man who invented texture mapping, made computer-animated films possible, and led his studio to release a string of amazing, Oscar-winning examples of the form decides to write a book…and then builds it around an examination of his own mistakes. Ed Catmull may not be your kind of hero, but he sure is mine.
", "date_modified" : "2014-04-30T08:52:34-04:00", "date_published" : "2014-04-27T20:53:17-04:00", "id" : "http://hypercritical.co/2014/04/27/creativity-inc", "title" : "Creativity, Inc.", "url" : "http://hypercritical.co/2014/04/27/creativity-inc" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "\n\nThirty years ago today, Steve Jobs introduced Macintosh. It was the single most important product announcement of my life. When that upright beige box arrived in my home, it instilled in me an incredible sense of urgency. I greedily consumed every scrap of information about this amazing new machine, from books, magazines, audio cassettes, and any adult whose ear I could bend. This was the future—my future, if I could help it.
\n\nThe death of Steve Jobs in 2011 brought back a lot of these same memories. What I wrote then echoes my thoughts on the Mac’s 30th anniversary.
\n\n\n\nI was 9 years old at the time. That year, my grandfather had changed my life by purchasing a Macintosh 128K, and convincing my parents to do the same. My grandfather also had a subscription to Macworld magazine, including multiple copies of issue #1, two of which I took home with me. I cut the Macintosh team picture out of one [see above] and left the other intact. (I still have both.)
\n\nI pored over that magazine for years, long after the technical and product information it contained was useless. It was the Macintosh team that fascinated me. That’s why I’d chosen to cut out this particular picture, not a photo of the hardware or software. After seeing the Macintosh and then reading this issue of Macworld, I had an important realization in my young life: people made this.
\n\nThat last part is the most important. It wasn’t just the product that galvanized me; it was the act of its creation. The Macintosh team, idealized and partially fictionalized as it surely was in my adolescent mind, nevertheless served as my north star, my proof that knowledge and passion could produce great things.
\n\nMemories are short in the tech industry. For most people, Apple and Steve Jobs will always be synonymous with the iPhone, an uncontested inflection point in our computing culture. For me, the introduction of the Macintosh will always be more important. Though people who didn’t live through it might not feel it as keenly as I do, the distance between pre-2007 smartphones and the iPhone is much smaller than the distance between MS-DOS and the Mac.
\n\nOn a personal level, nothing will ever replace my tanned-plastic beauty, the greatest electronic gift I had ever received, or would ever receive. My attachment to the Mac explains why, in the late 1990s, I was desperate to know everything possible about the fate of Apple and the future of the Mac operating system. Almost fifteen years later—half the Mac’s life—I’ve reviewed every major release of OS X and zero releases of iOS. Don’t get me wrong, I love my iPad and iPod touch, but you never forget your first.
\n\nI’m eternally grateful to the people who created the Mac, and to the countless others who kept it alive and shepherded its rebirth. In this age of iOS, it’s heartening to hear Phil Schiller say, “Our view is, the Mac keeps going forever.” That’s just fine with me.
", "date_modified" : "2014-01-24T08:49:59-05:00", "date_published" : "2014-01-24T08:36:10-05:00", "id" : "http://hypercritical.co/2014/01/24/macintosh", "title" : "Macintosh", "url" : "http://hypercritical.co/2014/01/24/macintosh" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "\n\nAsk a room of computer geeks how they came to deserve this appellation and you’ll likely hear many similar stories. “I got my first computer when I was very young. By the time I was a teenager, I’d logged thousands of hours at the keyboard doing everything imaginable with my computer: gaming, programming, networking, upgrades, the works.”
\n\nThat’s certainly my story. I was lucky enough to get a Macintosh in 1984, and it changed my life. I spent so many hours in front of that computer, I often look back in wonder at how I found so much to do with so little. This was years before I had an Internet connection. I had very little software and no convenient way to get more. My dollar-a-week allowance didn’t go very far. The only other person I knew with a Mac was my grandfather who lived two hours away. Nevertheless, I put in the hours—willingly, joyfully—and became the seasoned Mac geek you see before you today.
\n\nMy Macintosh origin story is part of who I am. Being there from the beginning (and staying with the Mac, even through the dark times) gives me a useful historical perspective on the platform. But this is not the only road to geekdom.
\n\nThe Mac is actually one of the few things I’m a geek about that I’ve been in on since the start. Geekdom is not defined by historical entry points or even shared experiences. A geek must possess just two things: knowledge and enthusiasm.
\n\nI became interested in remote control cars in high school after seeing a friend drive one in his backyard. He’d been building and racing RC cars since he was in elementary school. I was fascinated by these machines, but I worried I’d never be a “real” RC car geek like my friend.
\n\nI saved my money, bought a car, built it (badly) myself—and then crashed it. Undaunted, I bought replacement parts, fixed it, learned to drive it with far less crashing, and eventually bought a better car. Most importantly, I subscribed to Radio Control Car Action magazine and read every issue from cover to cover as soon as they arrived at my house.
\n\nA year or so later, I found myself in my local hobby shop answering another customer’s questions about his car. It started to dawn on me that I now knew more about RC cars than the average hobby shop patron. I was no longer an outsider looking in.
\n\nAround the same time, I was engaged in one of those cheap-music-for-membership marketing schemes that led to me having to select some CDs on a whim. I ended up getting Achtung Baby, and it knocked my socks off. I’d been aware of U2 for years and had probably heard the hits from The Joshua Tree on the radio dozens of times, but I’d never really been into the band—or any band, for that matter. Achtung changed that.
\n\nI started to work my way backwards through U2’s catalog, buying as many CD long boxes as I could get my hands on. I bought and read biographies of the band. At my local library, I devoured reviews of all their past albums in Rolling Stone and Spin. I found every magazine with a cover story about U2. When I couldn’t find anything else in the stacks of back issues, I turned to the library’s microfiche collection.
\n\nIn college, I finally had easy access to singles, b-sides, and bootlegs, allowing me to complete my collection. I also had a fast, reliable Internet connection for the first time. This was beyond the local hobby shop; I was communicating with other U2 fans across the entire planet.
\n\nI learned to play the guitar (badly) and downloaded tab for my favorite U2 songs. Dissatisfied with the state of lyrics websites (some things haven’t changed), I transcribed every U2 album, single, b-side, and rarity, leading to the creation of my first public website, The U2 Lyrics Archive. This was my first claim to fame on the net. (The site is gone now, but when the official u2.com website launched a few years after mine, it contained lyrics copied from my site, typos and all.)
\n\nRemote control cars existed for decades before I got my first kit. Achtung Baby was U2’s seventh album. Yet I was once a serious RC car geek and an unassailable U2 geek. It started with enthusiasm. Given the opportunity, I channeled that energy into a dogged pursuit of knowledge.
\n\nYou don’t have to be a geek about everything in your life—or anything, for that matter. But if geekdom is your goal, don’t let anyone tell you it’s unattainable. You don’t have to be there “from the beginning” (whatever that means). You don’t have to start when you’re a kid. You don’t need to be a member of a particular social class, race, sex, or gender.
\n\nGeekdom is not a club; it’s a destination, open to anyone who wants to put in the time and effort to travel there. And if someone lacks the opportunity to get there, we geeks should help in any way we can. Take a new friend to a meetup or convention. Donate your old games, movies, comics, and toys. Be welcoming. Sharing your enthusiasm is part of being a geek.
\n\nAnyone trying to purposely erect border fences or demanding to see ID upon entry to the land of Geekdom is missing the point. They have no power over you. Ignore them and dive headfirst into the things that interest you. Soak up every experience. Lose yourself in the pursuit of knowledge. When you finally come up for air, you’ll find that the long road to geekdom no longer stretches out before you. No one can deny you entry. You’re already home.
", "date_modified" : "2014-01-14T01:42:08-05:00", "date_published" : "2014-01-14T00:26:58-05:00", "id" : "http://hypercritical.co/2014/01/14/the-road-to-geekdom", "title" : "The Road to Geekdom", "url" : "http://hypercritical.co/2014/01/14/the-road-to-geekdom" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "At the beginning of last year, I posted a list of things Apple can and should do during 2013. It’s time to settle up. Because I’m feeling scholastic, I’ll give a letter grade to each item.
\n\nShip OS X 10.9 and iOS 7. Done and done, with only a few minor bumps in the road. A-
Diversify the iPhone product line. “There needs to be more than one iPhone,” I wrote. This is a drum I’ve been beating for many years. Apple finally made it happen in 2013 with the cleverly conceived iPhone 5C. I’m disappointed that the 5C doesn’t have more internal changes beyond a slightly larger-capacity battery, and I’m still anxiously awaiting an iPhone with a larger screen, but Apple got the important parts right. The 5C is a good phone, and it’s easily distinguished from the 5S. B+
Keep the iPad on track. The iPad Air is impressive, and the mini finally went Retina. On the downside, the creaky old iPad 2 lives on, the iPad Air really deserves more RAM, and a larger “iPad Pro” is still off in the hazy future. The iPad is “on track,” for sure, but exciting times are still ahead. A-
Introduce more, better Retina Macs. The latest Retina MacBook Pro has Intel’s Iris Pro 5200 graphics, finally giving the integrated GPU enough muscle to handle all those pixels. Apple also kept around an option for a discrete GPU on the high-end model. But the MacBook Air and iMac are still excluded from the Retina club, and even the mighty Mac Pro has extremely limited high-DPI options. We’ll get ’em next year, right Tim? B-
Make Messages work correctly. It’s difficult to measure the scope and frequency of problems in Messages based solely on blog posts and tweets, but I feel safe in saying that weird behavior still exists and is likely to be seen by anyone who uses Messages every day. Hope is fading. D
Make iCloud better. The iCloud Core Data team got a chance to regroup in Mavericks. It may be too little, too late, but at least it’s a step in the right direction. More broadly, iCloud still doesn’t have a good reputation for reliability, and debugging problems related to it remains difficult. If the only user-accessible control for a service is a single checkbox, it had better “just work.” iCloud has yet to earn that label. C
Resurrect iLife and iWork. Be careful what you wish for, I suppose. Apple did finally release new versions of the applications formerly known as the iLife and iWork suites, but the focus on simplicity and feature parity with the web and iOS versions left Mac users wanting more. It does not feel like an upgrade worthy of the years that have passed since the last major revisions of these applications. B-
Reassure Mac Pro lovers. Apple was thoroughly convincing in its rededication to the Mac Pro, presenting a dramatic introduction video at WWDC for its radical new high-performance hardware. It’s not for everyone, but it represents a hell of a turnaround for a once-neglected product. Let’s hope it doesn’t take 18 months for the next revision to appear. A
Do something about TV. Sigh. F
Out of the 10 items on my to-do list, Apple did 8 of them well enough to earn a checkmark. (The TV thing was always a bit of a reach, anyway.) I’d call that a solid year.
", "date_modified" : "2014-01-02T17:11:02-05:00", "date_published" : "2014-01-02T15:19:12-05:00", "id" : "http://hypercritical.co/2014/01/02/apples-2013-scorecard", "title" : "Apple’s 2013 Scorecard", "url" : "http://hypercritical.co/2014/01/02/apples-2013-scorecard" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "On two recent episodes of Accidental Tech Podcast, I talked about calibrating my new TV. The reactions of my co-hosts and the feedback from listeners have made it clear that the entire concept of calibrating a home TV is foreign to most people.
\n\nWhile a full-zoot ISF HDTV calibration is expensive and unnecessary for most people, there are some important steps that every TV owner should take to improve image quality. If you have an iOS device plus either an HDMI output cable (Lightning or 30-pin) or an Apple TV, you can use the simple THX tune-up application to dial in your color, contrast, brightness, and other basic settings.
\n\nBefore calibrating, don’t forget to turn off all the “image enhancement” features of your TV. These are the things with names like Vivid Color, Color Remaster, Motion Interpolation, Brilliance Enhancer, Black Extension, C.A.T.S., AGC, and so on. Check your TV’s manual for explanations of what each setting does, if you’re curious, but you really do want to turn them all off. They all mess with the image in ways not intended by the creator, and they will make proper calibration more difficult or impossible.
\n\nThere’s one setting in particular that anyone can adjust without requiring any skill or special software. Let’s say you buy a new 1080p HDTV with a native resolution of 1920×1080. Out of the box, that TV will most likely be configured to never show you a full 1920×1080 pixels of information. In computer parlance, it’s running at a non-native resolution by default, like a 1024×768 LCD display set to a resolution of 800×600.
\n\nImagine this test image exactly matches the native resolution of your HDTV. (It doesn't, so please don't use it to test your actual TV. Use a real calibration app or image instead.)
\n\nIf you’re viewing this post on a Retina display, the thin lines extending from the squares in the corners should be crisp and pixel-perfect. Send this image to your HDTV, however, and this is what you’re likely to see:
\n\nThe green box is no longer visible; the squares in the corners are now rectangles; the fine lines are now blurred together, producing an unpleasant moiré pattern. You can read all about the origins of this terrible behavior in the Wikipedia entry on “overscan,” but all you need to know is that it’s no longer necessary in the age of HDTV.
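\n\nTo put a rough number on what overscan costs, here’s a back-of-envelope sketch. The 5% factor below is an assumption on my part (a common default), not a measured value; actual TVs and picture modes vary:

```python
# Rough sketch of what overscan costs, using an assumed 5% factor
# (a common default; the exact amount varies by TV and picture mode).
def visible_resolution(width, height, overscan_pct=5.0):
    """Pixels of the source image actually shown after the TV crops
    overscan_pct percent off each dimension and rescales the remainder."""
    scale = 1 - overscan_pct / 100
    return round(width * scale), round(height * scale)

print(visible_resolution(1920, 1080))  # (1824, 1026)
```

Cropping a 1920×1080 source down to roughly 1824×1026 and then stretching it back out to fill the panel is exactly the kind of resampling that blurs fine lines together.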
\n\nYou paid for all 1920×1080 pixels of your fancy new HDTV—use them! Most HDTVs have a setting somewhere to correct this problem. It may be called “Overscan,” “1:1 Pixel Mapping,” “Native,” “Screen Fit,” “Just Scan,” or something even more generic like “Size 1” or “Size 2.” Consult your TV’s manual to find out. (If you can’t find your paper manual, a Google search for your TV’s model number followed by “manual PDF” will usually lead to an online version.) Don’t give up; the setting is almost always there somewhere. For TVs with no dedicated setting, you may have to change the input label to “PC” or similar to force the issue.
\n\nThe nerd-rage I feel at the thought of a display running in non-native resolution may not be something you can relate to, but everyone can appreciate a sharper image that shows more information. This holiday, after you’re done fixing all your relatives’ computer problems and updating their software, take a moment to correct the image size on their HDTV as well. Your relatives might not thank you for it, but I will.
", "date_modified" : "2013-12-22T18:49:19-05:00", "date_published" : "2013-12-22T14:53:21-05:00", "id" : "http://hypercritical.co/2013/12/22/fill-your-tv", "title" : "Fill Your TV", "url" : "http://hypercritical.co/2013/12/22/fill-your-tv" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "I reviewed OS X 10.9 Mavericks for Ars Technica. I’ve been reviewing OS X since 1999, and this is the tenth major release. There are several ways to read my review.
\n\nHere are my thoughts on the various reading options. This is mostly a repeat of last year’s post about Mountain Lion, with some sections carried over verbatim, but there is some new information.
\n\nThe web version of my review is the canonical version. It has the best formatting and the most features. It's also the most up-to-date. I believe that good writing for the web includes many links. A web browser is the best place to inspect and follow those links.
\n\nThis year, all the images in my review are Retina resolution. To see all the detail in the images, read the review on a Retina iPad, Mac, or other device with at least around 1,400 “native” pixels of horizontal resolution. (The “full-width” images are 1,280 pixels wide, presented to the browser with a width value of 640, but there are also margins around the content column.)
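\n\nIf the “at least around 1,400 pixels” figure seems arbitrary, here’s the arithmetic behind it, sketched out. The 120-pixel margin allowance is my own assumption for illustration, not a value from the site’s stylesheet:

```python
# Back-of-envelope for the "at least around 1,400 native pixels" figure.
# The 120-pixel margin allowance is an assumption, not a measured value.
css_width = 640               # the width value the browser is given
device_pixel_ratio = 2        # a Retina display maps each CSS pixel to 2x2 device pixels
image_device_px = css_width * device_pixel_ratio  # 1280 device pixels for a full-width image
margin_allowance = 120        # assumed room for the content-column margins
print(image_device_px + margin_allowance)  # 1400
```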
\n\nThe free web version has ads, and it’s split up into multiple “pages” (which are usually much longer than a single printed page). This kind of pagination annoys some people. I actually like it for very long articles because it helps me keep my place across multiple reading sessions. I can remember I was on page 8 instead of remembering the exact point in a very long, scrolling web page.
\n\nThat said, I also really like how an Ars Premier subscription eliminates all ads from the Ars Technica website and gives me the option to view any article on a single page. I use single-page view on very long articles when I’m searching for some text using my web browser’s “Find…” feature. I use it all the time on short articles.
\n\nSome people think Ars Technica forces me to break my article up into many tiny pages. That’s not the case. I choose how to paginate the article. I like to break it up on logical section boundaries, which means that the “pages” vary widely in length. I do try to keep any single “page” from being too short, however.
\n\nFor the first time, my review is available on Apple’s iBookstore as well as Amazon.com. The new iBooks application bundled with Mavericks means you can also read the iBookstore version on your Mac.
\n\nThe Kindle and iBooks readers for OS X and iOS have their own strengths and weaknesses, but I think the iBooks version of my review has a slight edge over the Kindle version. Amazon adds a “delivery” charge of $0.15 per megabyte (varying a bit for different countries). This can really eat into the price of a $4.99 book. Like the web version, both ebook versions include Retina-resolution images this year, making them much larger than in past years. To control the size of the Kindle ebook, I used JPEG images throughout. (Last year’s Kindle ebook used a mix of JPEG and PNG images for the same reason.)
\n\nUnlike Amazon, Apple does not charge a per-megabyte fee in its ebook store. Since both ebooks are the same price, this means I make slightly more money from each iBookstore purchase than I do from each Kindle purchase. But there’s something in it for you, too. The iBookstore version of my review uses lossless PNG images throughout. (Kindle version: 5.5 MB; iBookstore version: 30.5 MB.) In practice, I doubt most people will be able to tell the difference between the JPEG and PNG images, but I know which format I’d choose.
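\n\nThe per-megabyte math explains the JPEG-versus-PNG choice. This sketch uses the file sizes and the $0.15/MB rate quoted above; it’s a straight multiplication, and Amazon’s exact rounding rules may differ:

```python
# Delivery-fee arithmetic using the $0.15/MB rate and the two file
# sizes from the post. A straight multiplication; Amazon's exact
# rounding rules may differ.
def kindle_delivery_fee(size_mb, rate_per_mb=0.15):
    """Approximate per-sale delivery charge for a Kindle ebook."""
    return size_mb * rate_per_mb

print(kindle_delivery_fee(5.5))   # ~ $0.83 on the JPEG Kindle build
print(kindle_delivery_fee(30.5))  # ~ $4.58 -- a PNG build would eat nearly the whole $4.99 price
```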
\n\nThis year is the first time I haven’t known the price and release date of a major OS X release well in advance. The lead times dictated by the ebook stores (anywhere from 12 hours to a week) meant that I had to submit the ebooks before I knew how much Mavericks would cost. The ebooks are now updated, but Amazon in particular does not make downloading updates easy or convenient. Updates to the web version are visible instantly, of course.
\n\nMy sincere thanks to everyone who reads the review, in any form, in whole or in part. You’re the reason that I’ve been doing this for the past fourteen years.
", "date_modified" : "2014-10-16T19:54:44-04:00", "date_published" : "2013-10-22T14:27:05-04:00", "id" : "http://hypercritical.co/2013/10/22/mavericks", "title" : "About My Mavericks Review", "url" : "http://hypercritical.co/2013/10/22/mavericks" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "When Apple was on the ropes sixteen years ago, there was no shortage of advice about what the company should do to save itself, much of it fueled by a deep love for Apple’s products. It takes a diehard Apple fanatic to create something like the iconic “Pray” cover from the June 1997 issue of Wired magazine—coupled with the faith that there are enough like-minded readers to appreciate the sentiment. A decade later, those of us who spent the 1990s worrying about Apple felt relieved, and maybe even a little nervous about Apple’s newfound power. It was a hell of a ride.
\n\nNintendo engenders the same kind of affection and loyalty. Like Apple, it has a recent history of defeat followed by unlikely triumph. Nintendo’s dark times were not as bad as Apple’s; the N64 and GameCube were outgunned by the PlayStation and PlayStation 2, but Nintendo wasn’t days away from bankruptcy at any point, nor did it have to buy another company to save itself.
\n\nNow the roles appear reversed. Apple is in a bit of a slump (or so the narrative goes), but it’s a comparatively mild crisis of expectations. Apple’s products are still in demand and selling in large numbers. Nintendo, meanwhile, is experiencing one of the most disastrous console launches in its history—and that’s not even the worst news, according to some observers. It’s the handheld market where Nintendo is in the most trouble, they say.
\n\nAs expected, people who don’t want to live in a world without a successful, thriving Nintendo feel compelled to offer their heartfelt suggestions for saving the company. It’s this same compulsion that has briefly driven me out of my months-long Mavericks-review-writing haze to offer my own perspective.
\n\nI agree that Nintendo is in trouble. Before considering possible solutions, I’m forced to ask a tougher question: can it be saved? Some say no, that it’s only a matter of time. I think it comes down to this. As long as there continues to be a market for devices that are primarily designed to play games, then it’s possible for Nintendo to live to fight another day.
\n\nIf not, then I fear the worst. Nintendo is not equipped to produce and maintain a long-lived, general-purpose software platform. Precious few companies have ever done it. You know all their names: Microsoft, Apple, Google. I don’t expect to ever see Nintendo on that list.
\n\nI think there is still a market for game-only (or at least “game-mostly”) hardware products. I’m not sure how long it will last, but I’m betting this upcoming generation of consoles will sell well enough in the aggregate to maintain the status quo, at the very least.
\n\nAssuming I’m right, Nintendo has all the tools it needs to pull itself out of its current tailspin. To understand how, just look at how Nintendo has always done it: with hardware and software working together to provide new, fun experiences.
\n\nThe NES was Nintendo’s first big video game success. After the game console crash of the 1980s, home video game software alone was not going to lead Nintendo to riches. Personal computers were still expensive and wouldn’t have mass-market penetration for years. Any attempt to field an Atari-2600-like hardware product would surely be met with skepticism.
\n\nNintendo’s solution required hardware and software. The hardware: an Atari-like game console, yes, but also…a robot? Yep, and a light gun, too. Very few games used these accessories, but you can be sure they were featured heavily in all the initial advertising for the NES. They were hardware decoys, misdirections. They existed to get the NES into homes. Once there, a tiny mustachioed trojan plumber spilled out of the belly of the beast and conquered a generation of gamers.
\n\nNow consider the Nintendo 64, the company’s first 3D console. The Saturn and the PlayStation beat it to market by years, and both had the good sense to use optical discs instead of cartridges. Though the PlayStation came to dominate that generation, it was the Nintendo 64 that transformed 3D gaming forever with the potent combination of Super Mario 64 and the Nintendo 64 controller—hardware and software products that were designed together, and it showed.
\n\nMario 64 taught the world how to make a good 3D game. Though it couldn’t save the N64 from an ignominious fate in the market, it left its mark on gaming history and perhaps singlehandedly kept Nintendo relevant. The idea of releasing a 3D gaming system today without a standard analog stick is absurd, but that’s just what Sega and Sony did in 1994. After the N64 was revealed to the world, analog sticks quickly appeared on both the Saturn and the PlayStation—hastily tacked onto the existing controller, in the latter case, but I’m sure that was only a temporary condition, right? (Sigh.)
\n\nThen there’s the Wii. Nintendo sacrificed hardware power for a novel input method and low price, then paired it with software that explained the value proposition to the world. After two generations of defeat at the hands of Sony, Nintendo put itself back on the top of the game console market.
\n\nNone of these examples would have been possible if Nintendo didn’t make both the hardware and the software. And I didn’t even mention the Game Boy product line or the dual-screened DS, two of the top three best-selling gaming platforms of all time. Again, impossible without hardware and software synergy. This is how Nintendo succeeds.
\n\nWhen I read the current crop of advice for Nintendo, much of it focused on how to survive in a world where iOS comes to dominate portable gaming, I think about how it would have helped Nintendo at its previous low points. Nintendo should make games for iOS, some say. If you can’t beat ’em, join ’em.
\n\nAt the tail end of the GameCube’s life, Sony had sold many times more consoles and games than Nintendo over the course of a decade. Should Nintendo have started writing games for the overwhelmingly dominant Sony platform? Would that have helped Nintendo achieve Wii-like success? I don’t think so; no amount of software alone could have done that.
\n\nThe game software business is tough. It’s hit-driven, like Hollywood. Most games lose money or break even. A few big winners fund all the others—if you’re lucky. A game development studio going out of business shortly after releasing a critically acclaimed game is not unheard of. (Hell, the best game released last year bankrupted its developer.)
\n\nConsolidation is rampant in game development. Small players are routinely snatched up by behemoths that have a better capacity to absorb the inevitable losses that come with games that are not monster sales successes.
\n\nThis is not a world that Nintendo should aspire to enter. Better to stick with hardware platforms that it controls, profiting from both the hardware sales and the fees collected from third-party games sold on its platforms. That’s the kind of steady (and potentially enormous) income that will keep Nintendo afloat as it works on the next big thing.
\n\nEven if Nintendo sticks to its guns, and even if the market for game-focused hardware continues to exist, Nintendo still faces some big challenges. A gaming platform doesn’t have to compete with iOS on its own terms, but it does have to at least match it in the areas that are relevant to gaming.
\n\nRight now, Apple is crushing Nintendo when it comes to the software purchase, installation, and ownership experience. Hell, even Steam—a PC gaming platform—embarrasses Nintendo’s e-commerce efforts. My Nintendo games should not be tied to a piece of hardware. My purchases should transfer seamlessly to any new Nintendo device I purchase. Illegal emulation should not be the easiest way (or only way) to play classic Nintendo games. Nintendo needs to get much, much better at this stuff—fast.
\n\nApple is also winning when it comes to market access. It’s much easier for a two-person team to write an iOS game and put it up for sale than it is for that same team to get a game onto a Nintendo platform. Expensive, formal, limited developer access has no place in the modern gaming world. Nintendo needs to wake up and smell the App Store.
\n\nA lot of things have to go right for Nintendo to get its mojo back. It’s worth reiterating: if the market for dedicated gaming hardware disappears, I fear it’s game over for Nintendo as we know it.
\n\nBut if the time of the game console is not yet at an end (handheld or otherwise), then Nintendo has a lot of work to do. It needs to get better at all of the game-related things that iOS is good at. It needs to produce software that clearly demonstrates the value of its hardware—or, if that’s not possible, then it needs to make new hardware.
\n\nAny advice that leads in a different direction is a distraction. There’s no point in any plan to “save” Nintendo that fails to preserve what’s best about the company. Nintendo needs to do what Nintendo does best: create amazing combinations of hardware and software. That’s what has saved the company in the past, and it’s the only thing that will ensure its future.
", "date_modified" : "2013-09-02T20:25:58-04:00", "date_published" : "2013-09-02T15:00:45-04:00", "id" : "http://hypercritical.co/2013/09/02/nintendo-in-crisis", "title" : "Nintendo in Crisis", "url" : "http://hypercritical.co/2013/09/02/nintendo-in-crisis" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "\n\nLet's try this again. Last month, inspired by Marco and bolstered by the drop-dead-simple Teespring website, I put the first Hypercritical t-shirt up for sale. The response from fans was amazing, vastly exceeding my expectations. Unfortunately, that sale was aborted due to my unauthorized use of copyrighted artwork. All orders were refunded and no t-shirts were printed.
\n\nNow the Hypercritical t-shirt is back, with a new design. At a glance, it may look exactly like the previous shirt, but this version features new artwork. (It's the same image that appears next to this site's title and as its favicon.)
\n\nEverything else is the same as last time: the shirt is available in men's and women's styles and in four colors. Teespring requires two separate pages for the two different ink colors used on the light and dark shirt. Here are the links:\n\n
Once again, my sincere thanks to everyone who has purchased a shirt, past and present, and to all the people who continue to read this site.
", "date_modified" : "2013-06-20T20:53:53-04:00", "date_published" : "2013-06-20T20:53:53-04:00", "id" : "http://hypercritical.co/2013/06/20/hypercritical-t-shirts-2", "title" : "Hypercritical T-Shirts 2.0", "url" : "http://hypercritical.co/2013/06/20/hypercritical-t-shirts-2" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "\n\nNow that the Xbox One has been revealed, joining the already-released Wii U and the previously announced PlayStation 4, we can finally get a sense of what the next generation of game consoles will look like.
\n\nThis used to be a simple business. Cutthroat and fiercely competitive, yes, but at least all the players were racing for the same prize. Every handful of years, we’d get a new crop of consoles, each claiming to be the most powerful and to have the best games.
\n\nSeven years ago, after being outsold by Sony in the two previous console generations, Nintendo broke from the pack and went after a new market: people who were not interested in—or were too intimidated by—traditional game consoles.
\n\nThe Wii was startlingly less powerful than the other consoles in its generation. This helped make it the least expensive and the smallest, which only increased its appeal to non-gamers. The coup de grâce was the Wii’s novel control scheme, which let your dad, who couldn’t get past World 1-1 back in the 80s, make an improbable transformation into a hardcore gamer…of a sort.
\n\nAnd if the idea of “winning” a console generation with laughably underpowered hardware wasn’t enough, the Wii and its contemporaries also put an end to the idea of a game console that just plays games. Just a few years after launch, all of the consoles—even the dainty, standard-definition Wii—supported some kind of social networking, photo viewing, and one or more video streaming services.
\n\nArguably, this movement started to gain momentum with the original PlayStation’s ability to play music CDs, and continued with the PlayStation 2’s secondary role as a DVD player. But the Wii, PS3, and Xbox 360 definitively moved the entire product category beyond gaming. In fact, the PlayStation 3 ended up as the most popular way to view Netflix on a TV.
\n\nThis was all a natural consequence of the decreased cost of storage and computation combined with the ubiquity of wireless networking. It was inevitable that any TV-connected box would eventually support these features. But it also means the Xbox One, PlayStation 4, and Wii U lack the clarity of purpose enjoyed by the previous generations of game consoles. Here’s how things look to me at the dawn of the next generation.
\n\nStop me if you’ve heard this one before. The Wii U is dramatically less powerful than the Xbox One and PlayStation 4. In place of hardware power, Nintendo is offering an unconventional multi-screen gaming experience using a tablet-style controller. Although pricing has not been announced for its competitors, there’s a reasonable chance the Wii U will end up being the least expensive console in this generation.
\n\nIt sure looks like the Wii formula all over again, but there’s a difference this time. The Wii U’s GamePad controller is significantly more intimidating to non-gamers than the familiar-looking Wii remote. Wii accessories (and games) also work with the Wii U, which is nice, but the GamePad is the face of the new system to consumers. For former Wii buyers who are intimidated by the GamePad, Wii hardware and software compatibility may only make them further question what the new system really offers beyond the Wii. And though the Wii U expands on the Wii’s non-gaming features, its TV integration feels half-hearted and has thus far failed to impress.
\n\nThe end result has been dismal Wii U sales coming out of the 2012 holiday season. Nintendo’s rumored consideration of allowing smartphone apps to run on the Wii U seems uncharacteristically desperate.
\n\nThanks to the novelty and accessibility of the Wii remote and the universal appeal of launch titles like Wii Sports, the Wii sold in such huge numbers that third-party developers couldn’t afford to ignore it. They dutifully cut down the features and graphics quality of their most popular games to get them to run on the Wii. These games were often terrible, but at least they existed, giving the Wii’s game library “checkbox parity” with the rest of the market.
\n\nLike the Wii, the Wii U is not powerful enough to run the same games as its competitors. Unlike the Wii, the Wii U’s sales numbers aren’t high enough to motivate cut-down ports of new games. That leaves the Wii U with Nintendo’s franchise titles (many of which are not yet available), a scant few Wii U exclusives from third-party developers, and several ports of previous-generation games that Nintendo’s new hardware is finally able to run.
\n\nIt’s still too early to call this race, but the Wii U certainly looks like it’s in trouble. It may be that Nintendo has just built the wrong machine. For the most part, the Wii succeeded despite its underpowered hardware, not because of it. Choosing to produce another “next-generation” console with previous-generation power isolates Nintendo.
\n\nNew multi-platform titles can easily target the Xbox One, the PlayStation 4, and the PC simultaneously. The Wii U isn’t even in the running—unless it sells so well that a hobbled port is justified. The same goes for exclusives built around the Wii U’s unique features. No third-party developer wants to invest in a game that can only ever be sold on a single platform with a tiny installed base.
\n\nI own a Wii U, and I’m convinced that it really does offer new, fun gaming experiences not available on any other platform. I’m also a diehard fan of several of Nintendo’s popular franchises. But I’m not the kind of customer that carried the Wii to the head of the class in the previous generation. I’m the kind that would gladly pay twice the price of a Wii U for the ability to play a Zelda game on a console with the power of the PlayStation 4. The Wii U is not built for me. Whatever kind of customer it is built for, there sure don’t seem to be many of them.
\n\nSony is the reigning king of overblown hardware hype, famously promising that the PS2’s emotion engine and the PS3’s Cell processor would change the face of computing forever. And maybe they did, in a tiny way. But their power was notoriously difficult to unlock. They became the standard-bearers for the gaming version of the ancient Chinese proverb: “May you develop for interesting hardware.”
\n\nHardware eccentricity has been part and parcel of console development for decades. And the weirder the hardware, the more likely it is that a straightforward implementation of a game engine will run up against bottlenecks. The developer laments are familiar. “If only there were more bandwidth between the CPU and main memory.” “If only I had just 10% more RAM.” “If only this console had a much more powerful programmable GPU instead of a ring bus studded with custom SIMD processors, each with its own tiny local storage.”
\n\nThe PlayStation 4 aims to repent for the sins of both its father and grandfather—and then some. Unlike its predecessors, it was designed in close cooperation with game developers. During the design process, new revisions of the PS4 architecture were presented to developers along with a challenge: find the bottleneck. Every aspect of the system was put through a similar gauntlet, from the shape and travel of the controller triggers to the accuracy of the gyroscopes.
\n\nAll game consoles go through some version of this process, but the PlayStation 4 is defined by it. The hubris of the PS2 and PS3 is nowhere to be found in the PS4. This is a product of a newly humbled and rededicated Sony.
\n\nAnd the thing that Sony is rededicated to is gaming, plain and simple. Sony was the first console maker to really push the idea of a gaming system that does much more than just play games, but now it’s returning to its roots.
\n\nThe PlayStation 4 is exactly the sort of thing that a hardcore gamer might have envisioned if presented with the product name back in the days when the original PlayStation reigned supreme. It’s got more of everything, and the vast majority of its resources are bent towards being the best system for developing and playing games. In this generation of consoles, that’s actually a radical notion.
\n\nThe final entrant in this round of the console wars is the most ambitious. No longer content to walk the old paths blazed by Nintendo, Sega, and Sony, Microsoft is finally making its play for the entire living room.
\n\nTake a peek at the back of the box—a box that looks for all the world like a futuristic VCR—and you’ll find the hardware incarnation of this ambition: an HDMI input. Any form of entertainment that does not spring from the Xbox One is invited to at least flow through it, to be mediated and controlled by it. It’s all right there in the name: One box to rule them all.
\n\nThe Xbox One announcement was unabashedly focused on everything but games. Microsoft promised more at E3, relying on the substantial goodwill it’s earned with gamers over the past decade to stave off any anxiety about the One’s gaming bona fides.
\n\nIndeed, at first glance, the core hardware architecture looks nearly identical to the PS4. But a closer look reveals a system designed to accommodate a much broader vision of home entertainment.
\n\nWhere the PS4 uses high-speed GDDR5 RAM, the Xbox One opts for slower—but also less power-hungry—DDR3. And in the Xbox, that RAM is shared between two separate operating systems running simultaneously: one for games, and one for everything else.
\n\nThese hardware features express two very different usage models. The PS4 expects to be turned on when in use, then turned “off” afterwards, entering a super-low-power mode during which a tiny auxiliary processor handles housecleaning chores like downloading game content and applying software updates.
\n\nThe Xbox One, with its HDMI input and non-game-related OS and apps, expects to be fully powered whenever the television is on. Thus, Microsoft’s focus on idle power consumption—even at the cost of gaming performance.
\n\nTo mitigate this disadvantage, the Xbox One includes 32MB of low-latency embedded SRAM right on the SoC. This is a common technique, but it leads to increased complexity. Game developers must now take care to ensure that the right data is in the tiny local eSRAM pool exactly when it’s needed. A single pool of uniformly fast memory (albeit with higher latency), as in the PS4, is a much simpler arrangement. Different priorities, different trade-offs.
\n\n(The eSRAM also consumes die space, which, along with power consumption and cost, may have contributed to Microsoft's decision to give the Xbox One 33% fewer GPU cores than the PS4.)
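\n\nThe bookkeeping this imposes on developers can be sketched in miniature. The following is a hypothetical, simplified Python illustration (real console code would be C++ against a proprietary SDK, and the names and sizes here are invented): a large buffer is processed in tiles small enough to fit a tiny fast local pool, with each tile copied in and copied back out by hand.

```python
# Hypothetical sketch of scratchpad tiling; not from any console SDK.
ESRAM_WORDS = 8  # illustrative capacity; the real eSRAM pool is 32MB

def process_tiled(main_memory):
    out = []
    for start in range(0, len(main_memory), ESRAM_WORDS):
        esram = main_memory[start:start + ESRAM_WORDS]  # stage a tile into the "fast" local pool
        esram = [x * 2 for x in esram]                  # compute against the local copy
        out.extend(esram)                               # write results back to "main memory"
    return out
```

Choosing tile sizes and scheduling the copy-in/copy-out traffic is exactly the complexity that a single uniform pool of memory, as in the PS4, lets developers skip.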
\n\nThen there’s the Xbox One’s companion hardware, the next iteration of Microsoft’s Kinect motion control system. The first version of this technology, released as an add-on for the Xbox 360, was the proverbial dancing bear: it didn’t work well, but it was amazing that it worked at all.
\n\nThe new incarnation comes bundled with every Xbox One, and it dances like a furry Fred Astaire. It surpasses its predecessor by many multiples in every specification: resolution, depth perception, motion tracking, latency, noise cancellation, local computation. This technology is no joke.
\n\nBut does it make games more fun? Or, failing that, is it a better way to control a television than a remote control? Microsoft is betting a lot, in terms of both hardware cost and software support, that the new Kinect will be an essential component of at least one of these activities in a way that the first Kinect was not.
\n\nWhen I’m feeling optimistic about the Kinect, I think back to the many generations of terrible touch-screen devices that preceded the iPhone. The history of touch-based interfaces on consumer electronics wasn’t a gradual ramp up to acceptable quality. The iPhone wasn’t just the next iteration; it was a discontinuity. Once the technology passed some critical threshold of responsiveness and reliability, it went from a nerdy curiosity to completely mainstream in the blink of an eye.
\n\nI don’t know where that threshold is for multi-sensor full-body motion control and voice recognition, but I do believe it’s out there. Microsoft does too. Of course, that belief will be of little consolation to Xbox One owners if the “iPhone moment” is still many years in the future.
\n\nLast generation, Nintendo did something crazy—and it worked. This generation, everyone is taking big risks.
\n\nNintendo tried to play the same hand that it won with in the last round, but now finds itself stranded with previous-generation hardware in a next-generation market. Like Apple in the 90s, Nintendo is a sentimental favorite. But it took more than just the iMac and the iPod to transform Apple. The Wii U still has the potential to be an excellent platform for Nintendo’s beloved first-party games, and a low-cost alternative to the PS4 and Xbox One. Nintendo should milk it for all it’s worth, and get busy on the next great thing.
\n\nSony is betting that the market for game consoles made by and for hardcore gamers has not yet peaked. If it’s right, Sony is well-positioned to dominate this generation. If it’s wrong, the PS4 could be Sony’s Spruce Goose: the ne plus ultra of game consoles, remembered in equal parts as a technical marvel and a cautionary tale.
\n\nFinally, there’s Microsoft, offering us a brief glimpse of the boundless hunger that once defined the company. But as Microsoft knows all too well, the living room is littered with the bones of past suitors.
\n\nI applaud the technical prowess of the Xbox One’s software, particularly the focus on responsiveness. The demonstrated performance when switching between live TV, gaming, and other apps puts all previous efforts at “smart” TV interfaces to shame.
\n\nThat said, I seriously question the public’s appetite for displaying any additional content alongside a TV show or movie. The “second screen” experience is already well established, and it happens with a device that’s in your hand or on your lap. Grabbing one third of a large, communal TV screen to look up an actor on IMDb isn’t just unappealing and cumbersome, it’s downright rude.
\n\nThere are other contexts where the Xbox One’s unique abilities might shine: jumping in and out of a game to check a sports score, for example, or quickly hitting the web to watch an extended version of an interview after finishing an episode of The Daily Show. Yes, I can see that.
\n\nBut will it be enough to crown the Xbox One the king of the living room? As with all TV-connected devices, content is the key. The Xbox One has games, live TV, and video streaming services covered, but it appears to lack any form of time-shifting functionality. Given how much popular content remains locked up in broadcast and cable TV packages, there’s no way any box without DVR-like functionality can ever be the One True Interface to “watching television.”
\n\nLuckily for all three companies, things change quickly in this industry. If a critical mass of programming becomes available on streaming services a few years down the road, the Xbox One could finally fulfill its destiny.
\n\nOn the other hand, Microsoft’s new focus could be a giant turn-off to gamers who were expecting an “Xbox 720,” not a Kinect-powered “media center.” However brief and anecdotal it may be, a Wii U sales spike accompanying the Xbox One announcement has to have Microsoft at least a bit worried. If the gamers who bought the Xbox 360 don’t show up in the expected numbers to buy the Xbox One, I have a hard time believing this monstrous, sensor-festooned device will pull a Wii and capture the imaginations—and dollars—of non-gamers on a grand scale.
\n\nNo matter what happens, I don't envision a future where the market is evenly divided between these three very different products. Game on.
\n\nIf you’d like to hear an expanded audio discussion of these topics, including my take on the TV-related efforts of Apple and Google, check out episode 3 of the Ad Hoc podcast with Guy English and Rene Ritchie.
", "date_modified" : "2013-05-28T14:14:07-04:00", "date_published" : "2013-05-28T10:50:33-04:00", "id" : "http://hypercritical.co/2013/05/28/next-generation", "title" : "Next Generation", "url" : "http://hypercritical.co/2013/05/28/next-generation" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "\n\nUpdate - May 14, 2013: I regret to report that the Hypercritical t-shirt has been canceled due to the unauthorized use of an icon from a past version of the Macintosh operating system. All purchases will be refunded in full. This situation is entirely my fault. I'm sorry for disappointing everyone. Thanks to all of you for your support.
\n\n(My original post about t-shirts appears below, for historical purposes.)
\n\nI’ve wanted to create a Hypercritical t-shirt for a while now. When I saw that my friend and podcast co-host Marco Arment had created a t-shirt for Marco.org using a new Kickstarter-like website called Teespring, I was intrigued. When I looked at the shipping dates for Marco’s shirt, I realized that it was now or never if I wanted to get Hypercritical shirts into people’s hands in time for WWDC.
\n\nThe Teespring website made it incredibly easy to get a shirt up for sale. In hindsight, doing this in the middle of the night on a Friday was perhaps not the best idea, but that’s what I did. In less than 30 minutes, I’d created the artwork, uploaded it, and started the sale. The rest of the weekend was a bit of a blur. I worked with the Teespring staff (and people on Twitter) to improve the artwork I’d created in haste, and to make more colors and styles available.
\n\nTeespring is a relatively young company, and the user-facing interface on the site doesn’t yet support adding multiple colors and styles. The Teespring staff made all these changes for me behind the scenes—on a weekend. The site’s limitations still necessitated the creation of two separate t-shirt “campaigns” for the two different ink colors used on the dark and light shirts.
\n\nIf you follow @hypercritical or @siracusa on Twitter and were online this weekend, this is all probably old news to you. My dual t-shirt sales have already far surpassed their goals, thanks to the amazing response of my Twitter followers.
\n\nIf you’d like to support my writing on this site, the t-shirt sale will continue until May 14th. According to Teespring's shipping estimates, all orders should arrive by June 4th at the latest, including international orders. Here are the two links for the dark and light t-shirt sales:
\n\nMy sincere thanks to everyone who has already purchased a shirt, and to all the people who continue to read this site.
", "date_modified" : "2013-05-14T14:42:00-04:00", "date_published" : "2013-05-06T08:31:32-04:00", "id" : "http://hypercritical.co/2013/05/06/hypercritical-t-shirts", "title" : "Hypercritical T-Shirts", "url" : "http://hypercritical.co/2013/05/06/hypercritical-t-shirts" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "\n\nThe prevailing wisdom about software design at Apple is that the pendulum has swung too far in the direction of simulated real-world materials, slavish imitation of physical devices, and other skeuomorphic design elements, producing a recent crop of applications that suffer from an uncomfortable tension between the visual design of the software and its usability and features. After the executive reshuffle six months ago, we Apple fans have been hoping that Jony Ive, now in charge of Human Interface for both hardware and software, will end this destructive conflict and bring order to the galaxy.
\n\nWith iOS 7 and OS X 10.9 looming, we’re left to wonder exactly what kind of software designer Ive will turn out to be. Certainly, Apple’s software has been influenced by Ive’s hardware designs in the past—and perhaps vice versa—but this will be the first time Ive is officially in charge of the virtual bits as well as the physical ones.
\n\nWe may not have much to go on when predicting Ive’s software tastes, but we do know a heck of a lot about his opinions on hardware design. Though Ive has historically spent his time at Apple keynotes in the audience rather than on the stage, he’s starred in many, many videos wherein he explains why Apple’s great new hardware product looks and works the way it does. In these videos, his message has been remarkably consistent.
\n\nIve demands that the hardware be true to itself—its purpose, its materials, the way it looks, and the way it feels. Here’s a quote from one of Ive’s rare appearances outside an Apple press event, talking about hardware design at Apple.
\n\n\n\n\nWhen we’re designing a product, we have to look to different attributes of the product. Some of those attributes will be the materials that it’s made from and the form that’s connected to those materials. So for example, with the first iMac that we made, the primary component of that was the cathode ray tube, which was spherical. We would have an entirely different approach to designing something like that than the current iMac, which is a very thin, flat-panel display. […]
\n\nA lot of what we seem to be doing in a product like [the iPhone] is actually getting design out of the way. And I think when forms develop with that sort of reason, and they’re not just arbitrary shapes, it feels almost inevitable. It feels almost undesigned. It feels almost like, well, of course it’s that way. You know, why wouldn’t it be any other way?
\n
Steve Jobs also subscribed to this philosophy. Witness his explanation of the design of the first iMac with an LCD display at Macworld New York in 2002. Here’s how Jobs described Apple’s solution to the inherent compromises (in 2002 technology) of putting an optical drive in a vertical orientation and trying to pack an entire computer behind an LCD display.
\n\n\n\nThe big idea was that rather than glom these things all together and ruin them all—a lower-performance computer and a flat screen that isn’t flat anymore—why don’t we let each element be true to itself? If the screen is flat, let it be flat. If the computer wants to be horizontal, let it be horizontal.
It’s interesting that Jobs and Ive saw eye to eye on hardware design and yet seemed far apart, at least in Jobs’s final years, when it came to software design. While Jobs was reportedly a champion of rich Corinthian leather, Ive could only wince when asked about it in an interview.
\n\nI’m confident that we’ll see less leather, wood, felt, and animated reel-to-reel tapes in Apple’s future software products, but the question remains: what does it mean for an application or an OS to be true to itself?
\n\nI’m not sure how Ive will express that concept, but Loren Brichter, creator of Tweetie and Letterpress, offers one possible interpretation on an episode of the Debug podcast (starting at 6:10, and again at 1:02:26, specifically mentioning Ive). Letterpress is an exemplar of the so-called “flat design” aesthetic (and it’s also currently featured on the front page of Apple.com). Brichter designed the look and feel of Letterpress based on the things that modern graphics hardware is naturally good at doing: drawing and manipulating flat planes of mostly solid colors.
\n\nA design philosophy so tightly linked to nitty-gritty details of silicon chips and OpenGL APIs is unlikely to resonate with Ive as much as it does with a programmer like Brichter, but the end results may be similar. I expect Ive to focus on harmony between the look and feel of the software, the materials and finish of the hardware, and most importantly, the intended purpose of each specific application. (It’s kind of a shame that Apple’s already used the “Harmony” code name.) This is my message to Jony Ive and my hope for iOS 7, OS X 10.9, and each bundled application: to thine own self be true.
", "date_modified" : "2013-05-07T11:14:34-04:00", "date_published" : "2013-05-03T21:59:48-04:00", "id" : "http://hypercritical.co/2013/05/03/beauty-truth-and-jony-ive", "title" : "Beauty, Truth, and Jony Ive", "url" : "http://hypercritical.co/2013/05/03/beauty-truth-and-jony-ive" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "In a recent podcast, I rejected the idea of a lottery system for selling WWDC tickets as too random. I wanted to preserve at least some aspect of the process that rewarded the most enthusiastic Apple fans: the people who are willing to be roused from bed at 2 a.m. and rush to their computers to buy tickets; the crazy ones; the people who just want it more.
\n\nAfter yesterday’s experience of watching WWDC tickets sell out in what I measured to be less than 2 minutes, I’ve changed my mind. If the tickets had sold out in, say, 10 minutes (and assuming no server errors—more on that in a moment), then dedicated buyers would have been rewarded. If you couldn’t be bothered to be online until more than 10 minutes after the tickets went on sale, well, tough luck. Someone else wanted it more.
\n\nBut tickets selling out in less than 2 minutes does not reward anyone’s dedication. We were all online at 10 a.m. PDT sharp, all ready to purchase, all equally dedicated. It was a de facto lottery, with an extra layer of pointless stress added on top.
\n\nApple’s servers performed admirably…for about the first 5 seconds after tickets went on sale. After that, it was a crapshoot. Even if the tickets had sold out in an hour, it’d still effectively be a lottery if that hour was filled with server errors. You’d “win” if you happened to get through the purchase process with no errors.
\n\nAn actual lottery, pre-announced, with no time pressure for entry, would be more equitable than what happened yesterday. That’s what I recommend for next year.
\n\nMany more people want to attend WWDC than the conference can accommodate. There has been no shortage of interesting suggestions for how to fix this. Broadly speaking, WWDC has not changed in decades. Apple and its developer ecosystem, on the other hand, are radically different than they were just five years ago. Something has to give.
\n\nI’ve heard many non-developers discuss the rush to get WWDC tickets as if the big draw is the keynote presentation, where Apple typically reveals new products. That is the most interesting part of the conference for the public, but it’s not why WWDC sells out so fast.
\n\nDevelopers flock to WWDC because it’s a rare opportunity to communicate with Apple directly, human to human. The best way to decrease the demand for WWDC tickets is for Apple to increase its communication with developers throughout the year. And by communication I don’t mean throwing documentation or even video presentations over the wall to developers; I mean staffing up for more real, personal, timely, informal contact with developers outside the court-like atmosphere of the App Store review process or the artificial scarcity of Technical Support Incidents.
\n\nApple’s decision to release WWDC session videos to all registered developers during the conference was long overdue, but it clearly didn’t decrease demand for WWDC tickets enough to make a difference. Maybe next year, after developers have experienced their first tape-delayed WWDC, it will make a dent. But I really believe that increased, improved communication between Apple and developers on all fronts is the best long-term solution.
", "date_modified" : "2013-04-26T10:15:36-04:00", "date_published" : "2013-04-26T09:22:17-04:00", "id" : "http://hypercritical.co/2013/04/26/the-lottery", "title" : "The Lottery", "url" : "http://hypercritical.co/2013/04/26/the-lottery" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "\n\nWhen Apple decided to make its own web browser back in 2001, it chose KHTML/KJS from the KDE project as the basis of its rendering engine. Apple didn’t merely “adopt” this technology; it took the source code and ran with it, hiring a bunch of smart, experienced developers and giving them the time and resources they needed to massively improve KHTML/KJS over the course of several years. Thus, WebKit was born.
\n\nIn the world of open source software, this is the only legitimate way to assert “ownership” of a project: become the driving force behind the development process by contributing the most—and the best—changes. As WebKit raced ahead, Apple had little motivation to help keep KHTML in sync. The two projects had different goals and very different constraints. KDE eventually incorporated WebKit. Though KHTML development continues, WebKit has clearly left it behind.
\n\nWhen Google introduced its own web browser in 2008, it chose WebKit as the basis for its rendering engine. Rather than forking off its own engine based on WebKit, Google chose to participate in the existing WebKit community. At the time, Apple was clearly the big dog in the WebKit world. But just look at what happened after Google joined the party. (Data from Bitergia.)
\n\n \n\n \n\nGiven these graphs, and knowing the history between Apple and Google over the past decade, one of two things seemed inevitable: either Google was going to become the new de facto “owner” of WebKit development, or it was going to create its own fork of WebKit. It turned out to be the latter. Thus, Blink was born.
\n\nGoogle has already proven that it has the talent, experience, and resources to develop a world-class web browser. It made its own JavaScript engine, its own multi-process architecture for stability and code isolation, and has added a huge number of improvements to WebKit itself. Now it’s taken the reins of the rendering engine too.
\n\nWhere does this leave Apple? All the code in question is open-source, so Apple is free to pull improvements from Blink into WebKit. Of course, Google has little motivation to help with this effort. Furthermore, Blink is a clearly declared fork that’s likely to rapidly diverge from its WebKit origins. From Google’s press release about Blink: “[W]e anticipate that we’ll be able to remove 7 build systems and delete more than 7,000 files—comprising more than 4.5 million lines—right off the bat.” (There’s some streamlining in the works on the other side of the fence too.)
\n\nDoes Apple—and the rest of the WebKit community—have the skill and capacity to continue to drive WebKit forward at a pace that matches Google’s grand plans for Blink? The easy answer is, “Of course it does! Apple created the WebKit project, and it got along fine before Google started contributing.” But I look at those graphs and wonder.
\n\nThe recent history of WebKit also gives me pause. Google did not want to contribute its multi-process architecture back to the WebKit project, so Apple created its own solution: the somewhat confusingly named WebKit2. While Google chose to put the process management into the browser application, Apple baked multi-process support into the WebKit engine itself. This means that any application that uses WebKit2 gets the benefits of multi-process isolation without having to do anything special.
\n\nThis all sounds great on paper, but in (several years of) practice, Google’s Chrome has proven to be far more stable and resilient in the face of misbehaving web pages than Apple’s WebKit2-based Safari. I run both browsers all day, and a week rarely goes by where I don’t find myself facing the dreaded “Webpages are not responding” dialog in Safari that invites me to reload every single open tab to resume normal operation.
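\n\nThe underlying idea—isolating each page in its own process so that one crash can’t take down the whole browser—can be sketched in a few lines. This is a hypothetical Python toy (neither Chrome nor WebKit2 is written in Python, and the `renderer` here is invented); it uses POSIX `fork` so the “browser” parent survives even when one “renderer” child dies.

```python
import os

def render_in_isolation(page):
    """Run a toy 'renderer' in a child process. A crash there
    kills only that child; the parent (the 'browser') lives on."""
    r, w = os.pipe()
    pid = os.fork()
    if pid == 0:                      # child: the renderer process
        os.close(r)
        if page == "bad":
            os._exit(1)               # simulate a renderer crash
        os.write(w, f"rendered {page}".encode())
        os._exit(0)
    os.close(w)                       # parent: the browser process
    _, status = os.waitpid(pid, 0)
    data = os.read(r, 1024)
    os.close(r)
    if os.waitstatus_to_exitcode(status) != 0:
        return None                   # one page died; we can reload just it
    return data.decode()
```

In this toy, `render_in_isolation("bad")` returns `None` instead of crashing the caller—the essence of the stability argument for per-page processes, wherever the process management happens to live.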
\n\nHaving the development talent to take control of foundational technologies is yet another aspect of corporate self-reliance. Samsung’s smartphone business currently relies on a platform developed by another company. Leveraging the work of others can save time and money, but Samsung would undoubtedly be a lot more comfortable if it had more control over the foundation of one of its most profitable product lines.
\n\nThe trouble is, I don’t think Samsung has the expertise to go it alone with a hypothetical Android fork. Developing a modern OS and its associated toolchain, documentation, developer support system, app store, and so on is a huge task. Only a handful of companies in history have done it successfully on a large scale—and Samsung’s not one of them. Sure, it’s possible to staff up and build that expertise, but it’s not easy and it requires years of commitment. I’d bet against Samsung pulling it off.
\n\nFacebook Home can also be viewed through the lens of developer-based self-reliance. Facebook clearly wants to make sure it’s an important part of the future of mobile computing, but that’s not easy to do when you’re “just a website.” Home lets Facebook put itself front and center on existing Android-based smartphones.
\n\nIt seems unwise for Facebook to build its mobile strategy on the back of a platform controlled by its mortal enemy, Google. But perhaps Home is just the first step of a long-term plan that will eventually lead to a Facebook fork of Android. If so, the question inevitably follows: can Facebook really take ownership of its own platform without help from Google?
\n\nFacebook has proven that it can expand its skill set. Over the past few years, it’s been hiring talented designers and acquiring companies with proven design chops. Facebook Home is the first result of those efforts, and by all accounts, the user interface exhibits a level of polish more commonly associated with Apple than Facebook.
\n\nStill, a lock screen replacement is a far cry from a full OS. Maybe Facebook just plans to ride the bear, relying on Google to do the grunt work of maintaining and advancing the platform for as long as it can, while Facebook slowly takes over an increasing amount of the user experience.
\n\nSome people wonder how Google can possibly have any power in the Android ecosystem if the source code is free. Facebook Home has been cited as an example of Google’s ineffectualness. Look at how one of Google’s fiercest enemies has played it for a fool, they say. Google did all the hard work, then Facebook came in at the last minute and co-opted it all for its own purposes.
\n\nBut look again at the graphs above. Now imagine similar graphs for the Android source code. Any company with Android-based products that wants to be truly free from Google’s control has to be prepared—and able—to match Google’s output. Operating systems don’t write themselves; platforms don’t maintain themselves; developers need tools and support; technology marches on. It’s not enough just to fix bugs and support new hardware. To succeed with an Android fork, a company has to drive development in the same way that Apple did when it spawned WebKit from KHTML, just as Google is doing as it forks Blink from WebKit.
\n\nThis is not a real-time strategy game. Companies like Samsung and Facebook can’t just mine for more resources and build new developer barracks. Building up expertise in a new domain takes years of concerted effort—and a little bit of luck on the hiring front doesn’t hurt, either.
\n\nFacebook may already be a few years into that process. Its recent acquisition of the mysterious, possibly-OS-related startup osmeta provides another data point. Samsung, meanwhile, has just joined an exploratory project to develop a new web rendering engine.
\n\nGoogle certainly has its own share of problems, but what may save it in the end is its proven ability to tackle ambitious software projects and succeed. The challenge set before Facebook, Samsung, and other pretenders to the Android throne is clear. And as a wise man once said, you come at the king, you best not miss.
", "date_modified" : "2013-10-23T12:52:43-04:00", "date_published" : "2013-04-12T19:53:27-04:00", "id" : "http://hypercritical.co/2013/04/12/code-hard-or-go-home", "title" : "Code Hard or Go Home", "url" : "http://hypercritical.co/2013/04/12/code-hard-or-go-home" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "Technology can be a surprisingly ideological topic. In politics, the spectrum of belief is right on the surface: conservative/liberal, right/left. In tech, that same spectrum exists, but it’s rarely discussed. What’s more, unlike political beliefs, I’m not sure most people are even aware of their own core ideas about technology.
\n\nAnyone who’s read the past three months of posts on this site could be forgiven for pegging me as a technological ideologue. Though I draw the line at outright dogmatism, railing against technological conservatism has indeed been a recurring theme of mine.
\n\nTo illustrate the concept, I’ll use myself as an example. Back in the early days of the operating system now known as OS X, I was not happy that the user-customizable Apple menu from classic Mac OS had been replaced with an anemic, non-customizable incarnation. In classic Mac OS, the Apple menu was how I quickly found and launched commonly used applications and Desk Accessories. Apple removed this feature in Mac OS X and replaced it with…nothing, really. The Dock attempted to cover some of the same bases, but the Apple menu could comfortably hold many more items, and in a much more compact form.
\n\nIn this situation, a technological-conservative position is that Mac OS X needs something like the classic customizable Apple menu. It wouldn’t necessarily have to be an Apple icon in the upper-left corner of the screen. It could be a hierarchical menu spawned from the Dock or another screen corner. (This was actually a popular request back in the days before the Dock supported any form of hierarchy.) The old OS had a feature like this, and it was useful. The new OS needs a similar feature, or it will be less useful.
\n\nBeneath what seems like a reasonable feature request lurks the heart of technological conservatism: what was and is always shall be.
\n\nIn my review of the public beta, I was self-aware enough to moderate my position, merely asking for “some sort of mechanism that equals or betters the functional merits of the Apple Menu.” But what my conservatism prevented me from seeing was that things like LaunchBar, Quicksilver, and (later) Spotlight would provide similar functionality in an entirely different way, and with far more efficiency and elegance.
\n\nNo one wants to think of themselves as a Luddite, which is part of what makes technological conservatism so insidious. It can color the thinking of the nerdiest among us, even as we use the latest hardware and software and keep up with all the important tech news. The certainty of our own tech savvy can blind us to future possibilities and lead us to reject anything that deviates from the status quo. We are not immune.
\n\nConsider four of my recent posts, each of which, in its own way, pressed uncomfortably against the dark matter of technological conservatism among tech nerds.
\n\nIn response to The Case for a True Mac Pro Successor, a few readers insisted that there’s no longer anything technically interesting about high-performance personal computers. A new Mac Pro would just be a pair of the latest Xeons, some ECC RAM, a few SSDs and/or hard drives, and a big, hot video card.
\n\nThat’s what the Mac Pro has been, so that’s what it will always be, right? And there it is.
\n\nEven explicitly listing several technologies that debuted on Apple’s high-end Macs did not derail the people whose feedback was based on the premise that the Mac Pro will never be anything that it is not already. This assumption is counter to the entire purpose of a product like the Mac Pro. It’s meant to push the envelope, to seek out new frontiers of computing power.
\n\nIn Don’t Stop Thinking About Tomorrow, I tackled technological conservatism head on—though without naming it—by addressing the surprisingly widespread notion that the iPhone 5 is “too light.” This criticism leans heavily on the seductive view of the present as an endpoint, rather than just another step in a journey towards something radically different. (For a long time, I avoided writing the post you’re reading now because it felt like a retread of this older one. But I eventually decided that these ideas bear repeating. Do not be surprised when both posts arrive at a similar conclusion.)
\n\nFear of a WebKit Planet was a celebration of what turned out to be the tail end of peacetime in the browser wars. (Well, maybe it was really just a cold war turning hot again.) The post addressed the fear that “WebKit everywhere” would lead us into another dark age of web development. Even before Google’s fork of WebKit, I noted that WebKit was a lot more like Linux than IE6, and that “the products built with WebKit are as varied as those built with Linux.” Pondering that variety, the idea of a homogenous, stagnating WebKit monoculture seemed extremely unlikely. I didn’t have to wait long for confirmation.
\n\nFinally, the point of Annoyance-Driven Development was completely blotted out in the minds of a few readers by the audacious suggestion that a beloved service remains ripe for further improvement. This post revealed technological conservatism in its most virulent form: not only is the current state of affairs satisfactory, but wanting more is evidence of a character flaw, perhaps even a moral failing.
\n\nI find this idea absurd in its present-day context, and numerous analogous historical contexts immediately spring to mind as a means to persuade those who don’t. The trouble is, I can also imagine those same people taking the same technological-conservative positions in all the historical contexts as well. How far back in time do I have to go before it finally clicks?
\n\nPoor baby, you have to wait a whole day after a new episode airs on cable before it magically appears on your silent, $99, network-connected TV box.
\n\nWalking to the mailbox, unsealing an envelope, and sticking a disc into a slot under your TV is too much work, is it? Now you need to be able to start watching a movie without even picking your lazy ass up off the couch?
\n\nOh no! There are rooms in your house where you don’t have instant access to the sum of all human knowledge! And running wires is just so hard, isn’t it? Those few cents for zip ties to keep yourself from tripping over the wires will obviously break the bank. The prince demands radio-based networking everywhere in his castle!
\n\nI guess it’s just too much work to walk out the front door five steps, pick up the newspaper that was delivered while you slept, and then bring it back to your kitchen table each morning to read the news of the world. Now you want it to appear instantly on your computer screen. OK, Mr. Fancypants Bigshot.
\n\nYeah, pressing seven buttons in sequence is so much work. You need a faster way to call someone. Pressing just one button instead will be such a big change in your life, won’t it? You’ll finally have time to write that novel.
\n\nYou’ve got a way to send a piece of paper from your home to anywhere in the entire country for literal pocket change, but that’s just too much work for you. You need to talk to someone right now, hearing an actual voice as if it’s in the same room instead of miles away.
\n\nYou are warmed by the sun for nearly all your waking hours, but I guess that’s not good enough for you. No, you’re so important that you need to have light and heat at night as well. What you need, you precious snowflake, is a miniature artificial sun that’s under your control—obviously!
\n\nAt some point, we’re all guilty of looking down upon things that have changed since our own formative years, but this attitude has no place in technology criticism—and it’s absolute poison for anyone trying to create great tech products and services. Not all new ideas represent progress. (Do I really need to spell this out? It seems so.) But ideas should not be rejected based merely on a lifetime of having lived without them. Today’s “unnecessary” frill is tomorrow’s baseline.
\n\nAs the famous saying goes, the reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man.
\n\nEvery great scientific and engineering triumph in human history has been a slap in the face of technological conservatism—the little ones, perhaps even more so. And yet each new step forward, no matter what the size, is inevitably met with a fresh crop of familiar objections. “Just look at what you have already, and it’s still not enough for you. Where does it end?”
\n\nIt doesn’t. It never ends. Keep moving or get out of the way.
", "date_modified" : "2013-04-07T21:30:25-04:00", "date_published" : "2013-04-07T13:04:05-04:00", "id" : "http://hypercritical.co/2013/04/07/technological-conservatism", "title" : "Technological Conservatism", "url" : "http://hypercritical.co/2013/04/07/technological-conservatism" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "\n\nThe mobile market, everyone agrees, is the technology industry’s future. What’s not so clear is which company is best positioned to thrive in that future.
\n\nFor smartphones in particular, the traditional metrics are confusing. Android has 70% market share, but Apple is taking 70% of the profit. Google, meanwhile, is not benefiting from Android’s market share dominance as much as Samsung, which recorded $4 billion in profit from its cellphone and telecom business in Q4 2012. In the same quarter, Google made less—$2.89 billion—from all its businesses combined. And when it comes to selling actual smartphones, the only two companies making any money are Apple and Samsung.
\n\nSo who’s winning? When pondering this, I find myself thinking about dependencies. What is each company doing for itself, and in what ways does each company rely on others? I think this balance, much more than profits or market share, is what will determine long-term success. Let’s see how the players stack up.
\n\nGoogle’s Android strategy looks a lot like Microsoft’s Windows strategy of yore—minus the part where you collect all the money. Google got the other parts right, though: create a viable platform, support it, evangelize it, and get as many other companies as possible to use it. That last part is made a lot easier when the OS is free and open source, of course.
\n\nIn the PC’s heyday, Compaq, Dell, HP, Gateway, and others all killed each other selling PC hardware, grinding their profit margins down to almost nothing, leaving only a few players (barely) standing in the end. Microsoft, meanwhile, sat back and collected the same fat software margins from all of them (and from nearly all of their customers, as well).
\n\nWith Android, Google seemed to posit that there was value inherent in being the platform “owner,” even if hardware makers didn’t pay for each copy of the OS. Android was filled with connections to Google’s (also free) services. More people using Android meant more people seeing Google ads, which meant more money for Google.
\n\nIn the early days of Android, this theory looked promising. As in the PC era, hardware makers jockeyed for position in the nascent Android market. Individual fortunes rose and fell, but the number of Android activations just kept growing. So far, so good.
\n\nBut unlike the early PC market, the Android market hasn’t produced a group of strong competitors duking it out at the top. As previously noted, only one company, Samsung, is making any money at all selling Android smartphones—and it’s making more from them than Google itself.
\n\nFrom the beginning, Google has shrewdly hedged its bets by fielding its own line of Android hardware. More recently, Google purchased Motorola, giving it its very own bona fide handset maker. Thus far, none of these efforts have produced Samsung-like numbers. But it’s clear that Google is unwilling to be entirely dependent on other companies to create the hardware that its mobile OS needs to be a complete product.
\n\nSamsung seems like an Android success story. Previously better known in the US for its TVs than its smartphones, Samsung combined its hardware manufacturing prowess (and its shameless willingness to copy other companies’ design cues) with Google’s mobile OS to produce profitable phones that customers love.
\n\nThough the Galaxy line of devices would not be possible without Android, Samsung is far from Google’s ideal of a dutiful Android licensee, selflessly ferrying customers to Google’s services.
\n\nJust as PC makers used to insist on adding their own graphical shell or other brand-specific “enhancements” to their Windows PCs, most companies selling Android-based hardware products feel compelled to put their own stamp on the vanilla Android experience. Samsung is no different, steadily papering over the underlying Android OS with each new release of its TouchWiz user interface.
\n\nAnd why not? If Android is a money-loser for every other smartphone maker, Samsung is obviously doing something right. In its recent ill-conceived Galaxy S4 launch event, Android was barely mentioned at all. Samsung’s dependence on Android is clearly chafing.\n\n
In truth, Apple has been bitten more than once by its dependence on other companies. The viability of the Mac once depended on Microsoft’s willingness to produce a decent version of Office for it. Later, the Mac faltered multiple times when IBM and Motorola were unwilling or unable to produce competitive desktop and laptop CPUs. When Apple wanted to revamp its OS, Adobe and Microsoft were unwilling to port their software, forcing Apple back to the drawing board. Then there was that time when Apple asked another company to make a phone.
\n\nLike a lover who’s been betrayed one too many times, Apple has hardened its corporate heart against any form of true partnership. If it’s important, Apple wants to own and control it. When Apple does work with others, it insists on having the upper hand. iOS developers serve at the pleasure of Apple. Manufacturing partners must fight for the privilege of building Apple’s products, often using equipment Apple purchases for them. And, of course, Apple has its own mobile OS that runs exclusively on its own hardware. As God is its witness, Apple will never be hungry again!
\n\nSteve Jobs personified this attitude, which is why he felt so deeply betrayed when Google, his partner on stage during the iPhone introduction, remade Android in iOS’s image. After that, Apple’s reliance on Google for essential parts of its mobile experience simply could not stand.
\n\nThe trouble is, online services have not historically been Apple’s strength. That’s why it partnered with Google, Yahoo, and others in the first place. It took Apple several years (and several acquisitions) to finally replace Google maps—and the results were not ideal.
\n\nThere’s an old saying in business: don’t outsource your core competency. Or, as Joel Spolsky originally put it, “If it’s a core business function, do it yourself, no matter what.” This guideline makes it easy for a software developer to decide to outsource, say, catering and landscaping services. But what about Apple, with its historically well-founded paranoia about relying on outside companies for anything related to its actual products? What happens when everything starts to look like a “core business function?”
\n\nEven among just these three companies, there are more than enough dependencies to go around. Google depends on other companies to make and sell the vast majority of the products that run its mobile OS. Samsung depends on Google to make and support the most important software component of its flagship mobile devices. Even the fiercely independent Apple still depends on Samsung to manufacture many of its mobile processors (for now…) and Google to provide web search services—and perhaps to give a little help with maps as well.
\n\nBack to the original question: who has the upper hand? Yes, there are dependencies in all directions—but not all dependencies are created equal.
\n\nDespite its recent success, Samsung remains in the weakest position. It clearly doesn’t want to remain beholden to Google, and that’s the right instinct. But I’m not confident in Samsung’s ability to completely divorce its mobile platform from Android. I just don’t think it has the experience or expertise to be a real platform owner.
\n\nFurthermore, while Android’s market share may be overwhelming, Samsung’s is not. Even if Samsung had the skills to take the reins of its software stack, it’d have to maintain compatibility with present and future versions of Android, lest it become just another low-volume also-ran smartphone platform.
\n\nGoogle’s present position looks weak, but it has two big trump cards. First, Google has proven to be one of the few companies capable of creating, popularizing, and supporting a platform. Despite all the skinning and branding by handset makers, Google is still the driving force behind Android. This power can only be negated by another company that’s willing and able to match Google’s Android efforts on all fronts: OS development, app store, developer tools, evangelism, the works. That’s a tall order.
\n\nSecond, Google is still the king of online services. Apple, the biggest technology company in the world, just tried to replace maps, one of Google’s second-tier services, and barely avoided disaster. Microsoft, the former undisputed ruler of the tech sector, has been trying for years to challenge Google for the web-search crown, with little success. Maps and search are not obscure or obsolete services. If you can’t create equal or better alternatives—and so far, no one has—then you’re stuck relying on Google.
\n\nGoogle still needs hardware partners to maintain its Android empire, but we already have a model for how a software-focused platform owner can dominate a market. It’s harder to imagine a hardware maker dominating while relying on a software platform controlled by someone else.
\n\nFinally, there’s Apple, the jilted lover, feverishly working to eliminate any dependency that puts it at the mercy of a potential competitor. Apple remembers when Samsung was a great source of mobile CPUs and Google provided network services for iOS. Now look at those two traitors. No partnership is safe!
\n\nAnd so, in addition to developing its own OS, designing its own hardware, producing many of its most popular applications (built in its own IDE using its own compiler and language), Apple now has its own mapping service, is designing its own mobile CPUs, and is trying to get someone other than Samsung to manufacture them—all the while presumably eyeing its other parts suppliers and software partners warily.
\n\nDespite the bumps, Apple’s position remains strong. It’s got the best app ecosystem, competitive, trend-setting hardware, great adoption of each new version of its OS, and double the margins of the only other company making money selling smartphones. Oh yeah, and it dominates the tablet market too. There’s a lot for Apple to do in 2013, but at least it’s poised to succeed or fail on its own merits.
\n\nLooking out further than a year, the picture gets fuzzier. An unfortunate side effect of doing everything yourself is that every other company starts to look like an enemy. Realistically, Apple can’t do everything—or can’t do everything well, anyway. Online services are only going to become more important with time, so it’s understandable that Apple wants to be the master of its own destiny in this area. But it needs to improve much more quickly if it wants to even remain competitive, let alone catch up to Google. Failing that, it needs to find some partners that aren’t mortal enemies. (I’m sure Marissa Mayer would take Tim Cook’s call.)
\n\nIn general, Apple needs to engage in more balanced partnerships that produce sustainable benefits on both sides. The switch to Intel CPUs is a good example, especially given how the situation has changed since the deal was first struck. In business, no strategic partnership is forever, but that’s no reason to avoid them entirely. And who knows? Perhaps Apple’s good relations with Intel will lead to its next great mobile SoC being manufactured at 22 or even 14nm.
\n\nLet’s just hope Tizen doesn’t come up during the meeting.
", "date_modified" : "2013-03-20T16:36:03-04:00", "date_published" : "2013-03-19T19:58:14-04:00", "id" : "http://hypercritical.co/2013/03/19/self-reliance", "title" : "Self-Reliance", "url" : "http://hypercritical.co/2013/03/19/self-reliance" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "\n\nThe xMac has been back in the news lately—the idea, if not necessarily the name. Whether it’s called a “Mac minitower\" or a “Mac Pro mini,” we long-suffering Mac Pro fans are all looking forward to the “really great” thing Tim Cook told us to expect this year.
\n\nWhat almost no one expects is another straightforward revision of the existing Mac Pro, a gargantuan tower-style computer built with server-grade CPUs and RAM that pushes the limits of computing performance. Very few people want that kind of computer these days, and even fewer people actually need one.
\n\nOn paper, the Mac Pro may no longer be a viable product, but it would be a mistake for Apple to abandon the concept that it embodies. Like the Power Mac before it, the Mac Pro was designed to be the most powerful personal computer Apple knows how to make. That goal should be maintained, even as the individual products that aim to achieve it evolve.
\n\nWhy is this important? If Apple produces a new Mac that’s faster than any of its current models by leaps and bounds, will people suddenly buy it in huge numbers, choosing it over the laptops, tablets, and phones they prefer today? No. Is it because a very fast Mac can be sold for such a high price that its huge margins will make its profits significant, despite the expected low number of sales? No, that won’t happen either. Is a new, insanely fast Mac even guaranteed to make any money at all for Apple? Sadly, no.
\n\nSo why bother creating a true Mac Pro successor at all? Good riddance, right?
\n\nIn the automobile industry, there’s what’s known as a “halo car.” Though you may not know the term, you surely know a few examples. The Corvette is GM’s halo car. Chrysler has the Viper.
\n\nThe vast, vast majority of people who buy a Chrysler car get something other than a Viper. The same goes for GM buyers and the Corvette. These cars are expensive to develop and maintain. Due to the low sales volumes, most halo cars do not make money for car makers. When Chrysler was recovering from bankruptcy in 2010, it considered selling the Viper product line.
\n\nWhy wouldn’t a company want to get a low-volume, money-losing product line off its books, bankruptcy or no bankruptcy? If you can’t think of a reason, you may be what is known in the auto industry as a “bean counter.” Luckily for Viper fans, Chrysler had a few car guys left. Here’s a passage from Car and Driver’s preview of the 2013 SRT Viper—the Viper that almost didn’t exist.
\n\n\n\n“I knew the very last thing Chrysler needed during our bankruptcy was a 600-hp sports car,” says Ralph Gilles, the 42-year-old president and CEO of SRT and senior V-P of Chrysler Product Design. “But I’m an optimist. I wanted to fight for a chance. We discussed it for a year. I got Sergio [Marchionne, Chrysler CEO] to drive one of the last Vipers. He jumped in and disappeared to God knows where. He came back 15 minutes later and said, ‘Ralph, that’s a lot of work.’ He meant it was a brutal car. But he didn’t say, ‘Good riddance,’ or anything. Then in late ’09, I showed him a video of a Viper breaking the Nürburgring record. He watched all of it and was impressed. I gave him a list of the supercars the Viper had put away.”
The car guys won; Chrysler chose to keep the Viper.
\n\nApple is not yet in bankruptcy, but every other reason that Chrysler should have run screaming from the Viper applies equally to the Mac Pro (except perhaps the lack of profitability; Apple doesn’t share that information about individual Mac lines). To understand Chrysler’s decision, let’s consider why halo cars exist at all.
\n\nOne reason is prestige. Though few people can afford to buy a Viper, its mere existence makes the affordable cars from the same manufacturer that have even the mildest bit of sporting pretension slightly more attractive to buyers. Yes, this makes little logical sense, but it’s a very real phenomenon. (There’s a reason the term “halo effect” reportedly dates back to at least 1938.)
\n\nHalo cars also push car makers to their limits. Engineering teams must use all their powers and all their skills to create the very best car possible. This exercise inevitably leads to the exploration of new technologies. The failed experiments are forgotten, but the winners eventually find their way into more prosaic cars from the same manufacturer.
\n\nThe Mac Pro is Apple’s halo car. It’s a chance for Apple to make the fastest, most powerful computer it can, besting its own past efforts and the efforts of its competitors, year after year. This is Apple’s space program, its moonshot. It’s a venue for new technologies to be explored.
\n\nConsider Larrabee, Intel’s project to create a massively multi-core x86-based GPU. Rumor has it that Apple was working on integrating the technology into a Mac Pro. Intel eventually scuttled the project, but consider what would have happened if it had taken off, reshaping the GPU market in the process. Apple would have had a head start on integrating the technology into its OS and application frameworks. Its drivers would have had their kinks worked out. When it became feasible to incorporate Larrabee technology into the rest of its product line, Apple would have been ready.
\n\nI intentionally chose a (rumored) failure as an example because that’s part of the point. Better to experiment on your niche product than your high-volume money-maker. There are plenty of success stories as well.
\n\nThink of all the technologies that debuted on Apple’s high-end Macs: hard drives, color, FireWire, multiple CPUs, multi-core CPUs, 64-bit CPUs, programmable GPUs, real-time video processing. All these features had a chance to get shaken out on machines that most people don’t buy. When they trickled down to “normal” Macs, Apple had enough experience under its belt to implement them competently.
\n\nAs for prestige, perhaps you think the existence of the Mac Pro has precisely zero influence on the average MacBook buyer. The existence of the Corvette probably doesn’t affect the behavior of Chevy Malibu buyers either. But things change as you creep up the respective product lines, edging closer to the high end. The Titanium PowerBook G4 was all the more impressive for incorporating the CPU previously only available on Apple’s “supercomputer” Power Mac G4.
\n\nI used the present tense earlier when I said that the Mac Pro is Apple’s halo car, but that hasn’t actually been true for a while. By allowing the Mac Pro line to languish for so long, Apple has negated any possible prestige effect and abandoned an arena where it could safely push the limits of PC performance.
\n\nI know what you’re thinking. That was then, this is now. The age of the high-end PC is over! But halo cars are even more absurd than high-end PCs. There are some pretty hard limits on car performance. Anything that carries a human around can only pull so many Gs before its fragile cargo gives up the ghost.
\n\nCompare this to computing power, which has no apparent useful limit. While car performance has increased by perhaps a factor of 5 in the past 50 years (and that’s being generous), humanity has absorbed a million-fold increase in computing power during that same period without sating its appetite for more. (And that factor gets quite a bit larger if I add GPUs to the mix.) Computers are not “fast enough.” They weren’t when they were invented, nor when they got 10x faster, nor when they got 100,000x faster still. They never will be.
\n\nTo be clear, absolute performance is not the only worthy technological frontier. Apple continues to push the limits on many other fronts: miniaturization, power efficiency, manufacturing processes, materials, and, of course, user experience. The same is true for car manufacturing, where fuel efficiency, safety, reliability, and even comfort are arguably more important axes of innovation than absolute performance (the limits of which can’t be legally explored on public roads anyway). And yet there they all are, those absurd halo cars, laughing in the face of logic.
\n\nThis brings us to the final, and perhaps most important, reason that halo cars exist, and that the Mac Pro—or its spiritual equivalent—should continue to exist. Let’s talk about the Lexus LFA, a halo car developed by Toyota over the course of ten years. (Lexus is Toyota’s luxury nameplate.) When the LFA was finally released in 2010, it sold for around $400,000. A year later, only 90 LFAs had been sold. At the end of 2012, production stopped, as planned, after 500 cars.
\n\nThose numbers should make any bean counter weak in the knees. The LFA is a failure by nearly every objective measure—including, I might add, absolute performance, where it’s only about mid-pack among modern supercars.
\n\nThe explanation for the apparent insanity of this product is actually very simple. Akio Toyoda, the CEO of Toyota, loves fast cars. He fucking loves them! That’s it. That’s the big reason. It’s why the biggest car maker in the world spent ten long years and well over a billion dollars developing a car that almost no one will ever own—or even know about, for that matter. It explains why Toyota scrapped the LFA’s frame design and essentially started over with carbon fiber midway through the development process. (Talk about a Steve Jobs move.)
\n\nAnd perhaps it also explains why the famously cantankerous Jeremy Clarkson of Top Gear, a man who has driven nearly every supercar produced in the last several decades, recently called the LFA “the best car I’ve ever driven.”
\n\nI’m not here to convince you that the LFA is a good car, that you should trust Jeremy Clarkson’s opinions on cars (or anything, really), or that you should buy a Mac Pro. All the common reasons you’ve heard for Apple to abandon the market for high-end PCs are logically and financially sound. They also don’t matter.
\n\nApple should keep pushing the limits of PC performance because it’s a company that loves personal computers. If Apple can’t get on board with that, then all the other completely valid, practical reasons to keep chasing those demons at the high end are irrelevant. The spiritual battle will have already been lost.
", "date_modified" : "2013-03-08T20:42:12-05:00", "date_published" : "2013-03-08T16:09:06-05:00", "id" : "http://hypercritical.co/2013/03/08/the-case-for-a-true-mac-pro-successor", "title" : "The Case for a True Mac Pro Successor", "url" : "http://hypercritical.co/2013/03/08/the-case-for-a-true-mac-pro-successor" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "I must confess, I was neither surprised nor disturbed by last month’s announcement that the Opera web browser was switching to the WebKit rendering engine. But perhaps I’m in the minority among geeks on this topic.
\n\nThe anxiety about the possibility of a “WebKit monoculture” is based on past events that many of us remember all too well. Someday, starry-eyed young web developers may ask us, “You fought in the Web Standards Wars?” (Yes, I was once a Zeldi Knight, the same as your father.) In the end, we won.
\n\nAs someone whose memory of perceived past technological betrayals and injustices is so keen that I still find myself unwilling to have a Microsoft game console in the house, my lack of anxiety about this move may seem incongruous, even hypocritical. I am open to the possibility that I’ll be proven wrong in time, but here’s how I see it today.
\n\nAs much as I despised Internet Explorer for Windows, and what its simultaneous stagnation and dominance did to the web, I don’t think it’s the correct historical analog in this case. WebKit is not a web browser. It’s not even a product. It’s much more analogous to Linux, an open-source project that any company or individual is free to build on and enhance.
\n\nLinux, once a personal project created just for fun, now dominates the data center. It’s also in phones, tablets, game consoles, set-top boxes, and even (sometimes) PCs.
\n\nIs there a “Linux monoculture”? In some ways, yes. These days, it’s surprising if a startup creates a hardware product sophisticated enough to need an operating system and that operating system isn’t Linux. And let’s not forget that Linux has all but wiped out the proprietary Unix-based operating systems that once ruled the high-end.
\n\nLinux is the canonical open source success story. It succeeded for reasons that are now so boring they’re accepted as common sense. There’s still plenty of room for variation and innovation, but now all the significant achievements are shared with the world. If a company improves Linux, it’s not just improving its own products; it’s making Linux better for everyone. Linux let us “put all the wood behind one arrowhead” (to borrow one of Scott McNealy’s favorite sayings), but on a global—instead of merely a corporate—scale. (Funny how things turn out, eh, Scott?) Linux solved the Unix problem—for everyone.
\n\nWebKit fills a similar role. Thanks to WebKit, anyone who needs a world-class web rendering engine can get one—for free. And the products built with WebKit are as varied as those built with Linux. Even products in the same category vary wildly. Chrome and Safari, for example, have different features, different extension mechanisms, different JavaScript engines, different process models, and very different user interfaces. Opera adds yet more variation. And these are all just standalone web browsers. Consider all the embedded applications of WebKit, from game consoles to theme-park kiosks, and the idea of a homogenous, stagnating WebKit monoculture seems even more unlikely.
\n\nI haven’t forgotten the past. A single, crappy web browser coming to dominate the market would be just as terrible today as it was in the dark days of IE6. But WebKit is not a browser. Like Linux, it’s an enabling technology. Like Linux, it’s free, open-source, and therefore beyond the control of any single entity.
\n\nWeb rendering engines are extremely complex. There are very few companies that have the expertise to create and maintain one on their own. (Again, the similarity to Linux is strong here.) I’m glad all those developers at Apple and Google are working on improving the same open-source web rendering engine, rather than dividing their efforts between two totally different, proprietary engines. Adding Opera’s developers can only make things better. The proliferation of WebKit will be a rising tide that lifts all boats.
", "date_modified" : "2013-03-04T13:15:39-05:00", "date_published" : "2013-03-04T13:15:39-05:00", "id" : "http://hypercritical.co/2013/03/04/fear-of-a-webkit-planet", "title" : "Fear of a WebKit Planet", "url" : "http://hypercritical.co/2013/03/04/fear-of-a-webkit-planet" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "\n\nI’ve been watching House of Cards, the new TV series available exclusively on Netflix, which reportedly outbid HBO, Showtime, and others for the rights to the show. This is part of Netflix’s ongoing effort to “become HBO faster than HBO can become us.” That quote, from Netflix’s chief content officer Ted Sarandos, neatly draws the battle lines between the old and new worlds of TV.
\n\nOnce the upstart, HBO now finds itself playing catch-up with Netflix in terms of pricing and distribution. Netflix, meanwhile, is shelling out its own money to try to overcome its historic inability to offer the very best content.
\n\nI’m not ready to predict a winner in this race—though the two-year wait for HBO to add AirPlay support to its HBO Go iOS app does not inspire confidence in the old guard. I’m more interested in what Netflix offers that HBO doesn’t.
\n\nThe answer is obvious to anyone who has used the service. For a fixed, low monthly fee, Netflix lets customers watch TV shows and movies whenever they want, wherever they want, on phones, tablets, “smart” TVs, game consoles, streaming media boxes, Blu-ray players, even personal computers—remember those?
\n\nNetflix’s decision to release the entire first season of House of Cards all at once is in keeping with its disregard for the traditional limitations of TV. This is how products and services endear themselves to consumers: remove everything that gets in the way of what we want. We want to be entertained. We don’t want to arrange our schedules around your TV show. We don’t want to watch commercials. We don’t want to be forced to use a particular device. We just want it the way we want it.
\n\nBut even Netflix has been unable to escape some of the trappings of the days of video past. A TV series like House of Cards that’s released a season at a time naturally lends itself to multi-episode viewing sessions. But as I recently tweeted, watching a minute and a half of opening credits before each episode can get tiresome.
\n\nThis position proved somewhat controversial on Twitter. Hard-working people deserve credit, some said. Others said that the credits set the mood for the show. Some people just plain liked the credits, with no qualifiers.
\n\nBut there were also people who agreed with me, people who routinely skip the opening credits (often lamenting the limited content-skipping tools provided by their chosen Netflix viewing device). One person even read my tweet while killing time as the House of Cards credits ran in another browser tab.
\n\nTo be fair to Netflix, the existence of opening credits may not be entirely under its control, even when it’s paying for a series itself, given existing union contracts for actors, directors, writers, etc. But getting bogged down in the details of this debate misses the point.
\n\nYes, opening credits are a longstanding part of traditional TV—but so were fixed broadcast schedules, commercial breaks, and viewing all TV shows on a television set. As the delivery mechanism changes, the content itself must also adapt to its changing context.
\n\nNot everyone binges on House of Cards four episodes at a time, but the people who do really love Netflix for making it possible. Every time I fast-forward through those 90-second opening credits (made more difficult by the occasional variable-length pre-credits scene), I get the opposite feeling about Netflix. It’s an unhappy reminder of the old world of TV. No explanation of contractual obligations or artistic credit is going to convince me that I’m mistaken about my own desires. I just want it the way I want it!
\n\nThis may sound comically selfish, but true innovation comes from embracing this sentiment, not fighting it. For companies looking to get the best bang for their buck out of technology, this is the way forward. Find out what’s annoying the people you want to sell to. Question the assumptions of your business. Give people what they want and they will beat a path to your door.
\n\nThis brings us, perhaps surprisingly, to the PlayStation 4, the newly announced successor to the six-year-old PlayStation 3. Six years is an eternity in the world of technology. For the first few decades of console gaming, each new hardware platform surpassed the capabilities of its predecessor by leaps and bounds. There was little question about what to do with technology. More, better, faster was an end in and of itself. If you build it, the games will come.
\n\nThe Wii was the first console to break that cycle, directing a large chunk of its innovation toward a novel control scheme, sacrificing raw computing power to do so. It worked. The Wii became the best-selling console of its generation, and its competitors soon followed with non-traditional control schemes of their own.
\n\nBased on what’s been announced about the PlayStation 4 so far, it seems that Sony has learned at least some of the lessons of the Wii. While the PS4 will indeed be substantially more powerful than the PS3 (and embarrassingly more powerful than its competitor from Nintendo, the Wii U), Sony has not chosen to sink millions into developing a radical new CPU architecture like the PS3’s Cell processor in the hopes that raw MIPS will inexorably lead to market dominance.
\n\nInstead, Sony has built the PS4 using a nicely balanced arrangement of existing technology. All the time, money, and energy that would have otherwise gone toward a true Cell successor has been refocused on ensuring that the PS4 does things that make Sony’s customers happy.
\n\nGame developers are one kind of customer. There may not be many of them relative to the number of people Sony hopes will buy its products at retail, but developers can make or break a game console by choosing which games to develop for which platform, and when. And developers sure weren’t happy with the PS3, which was unlike any piece of gaming hardware that had come before it. Thanks to its familiar combination of an x86 CPU and an ATI GPU, the PS4 will be much easier to write games for.
\n\nSony feels gamers’ pain as well. The PS4 appears to have been designed by identifying the parts of the PS3 experience that are annoying and deploying technology to eliminate them. Deciding to play a game and being delayed by 30 minutes of mandatory system updates is not fun, so Sony added a dedicated processor to handle background downloads, and a low-power state for the entire system to allow this to happen unattended. Resuming an interrupted gaming session only to find yourself back at the last checkpoint in the game is not fun, so Sony promises the ability to suspend a game’s state in its entirety and resume later at the instant you left off. Waiting an hour for a multi-gigabyte game to download before you can start playing it is not fun, so the PS4 will allow games to be played as they download.
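\n\nThe “play while it downloads” idea in particular is a textbook producer–consumer pattern. Here’s a minimal sketch (my own illustration, not Sony’s actual implementation): a background thread streams chunks into a buffer while the “player” consumes whatever has already arrived, instead of blocking until the entire download completes.

```python
import queue
import threading
import time

def downloader(chunks, buf):
    # Background "download": chunks trickle in over time.
    for chunk in chunks:
        time.sleep(0.01)  # simulated network delay
        buf.put(chunk)
    buf.put(None)  # sentinel: download finished

def play_progressively(buf):
    # Start consuming as soon as the first chunk arrives,
    # rather than waiting for the whole file to be present.
    played = []
    while (chunk := buf.get()) is not None:
        played.append(chunk)
    return played

buf = queue.Queue()
t = threading.Thread(target=downloader, args=(list(range(5)), buf))
t.start()
result = play_progressively(buf)
t.join()
print(result)  # [0, 1, 2, 3, 4]
```

On a real console the heavy lifting is offloaded to dedicated silicon rather than a CPU thread, but the shape of the problem—an ordered stream consumed as it arrives—is the same.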
\n\nSony is providing new features as well. A dedicated video encoder allows gameplay to be recorded in real time with no loss of performance, and a “share” button on the controller allows that video to be uploaded (in the background, naturally), without leaving the game. That same video encoding hardware plus Sony’s game-focused social network will allow players to invite their friends to watch them play in real time. Sony even promises the ability to play games remotely. If a player is having trouble with some part of a game, he could invite one of his friends to remotely assume control for a bit to help out.
\n\nNow, anyone who remembers Sony’s promises about the PlayStation 3 knows all too well how far they can be from the eventual reality. I’m very skeptical about Sony’s ability to deliver all the announced PlayStation 4 capabilities in a competent and timely manner. And then there are all the areas where the interests of gamers and game developers may conflict (e.g., the market for used games).
\n\nBut when I look at the PlayStation 4 hardware itself, I see a shrewd acknowledgement of the true nature of innovation. It doesn’t cost much to add dedicated silicon to handle background network transfers and video encoding and decoding, and it sure isn’t sexy, technologically speaking. Low-power sleep states, instant suspend/resume, progressive downloads, and remote play are all features that are a giant pain to implement and do precisely nothing to make games look, sound, or perform better. But it’s these things, not the number of CPU/GPU cores or the amount of RAM, that really have a chance of making the PS4 gaming experience stand head and shoulders above what has come before.
\n\nWe nerds love technology for its own sake. Indeed, there’s always something to be gained by advancing the state of the art and providing more of a good thing. But the most profound leaps are often the result of applying technology to historically underserved areas. By all means, make everything better and faster, but also find the things that seem like minor annoyances, the things that everyone just accepts as necessary evils. Go after those things and you’ll really make people love you. Accentuate the positive. Eliminate the negative.
", "date_modified" : "2013-03-04T08:24:32-05:00", "date_published" : "2013-02-24T20:46:59-05:00", "id" : "http://hypercritical.co/2013/02/24/annoyance-driven-development", "title" : "Annoyance-Driven Development", "url" : "http://hypercritical.co/2013/02/24/annoyance-driven-development" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "The iPhone 5 caught some flak for being “too light.” Similarly, some consider the latest revision of the iMac to be “too thin.” You’ll find some incredulity in the articles that address this topic. It’s a little silly, right? After all, what’s the alternative? Thicker and heavier? Stagnation? But these complaints are not entirely unreasonable.
\n\nWhen it comes to electronics, density is often a signal of quality. A product that feels like an empty metal box seems cheap. A tiny item with surprising heft seems expensive. For handheld items, higher density can also help produce stronger, more concentrated pressure on the hand. This helps to more clearly delineate the sensations of a securely held item and an item that’s about to slip out of the hand. I’ve heard this complaint about the iPhone 5 many times: “It’s so light, I’m afraid I’m going to drop it!”
\n\nNo one is holding an iMac while using it, so there’s no fear of dropping it. But if it’s not being held, why the rush to slim down? Dissatisfaction with the ever-slimming iMac is exacerbated by the removal of the optical drive in the latest revision. In all likelihood, that optical drive was going away regardless of the thickness of the iMac’s edge. (Apple’s been steadily dropping optical drives from the Mac line for years.) Still, some people can’t help but infer a cause-and-effect relationship, blaming the loss of the slot for the spinning shiny things on Apple’s seemingly pointless drive for thinness.
\n\nIn the past, I’ve voiced my own complaints about the edge of the latest iMac and how the iPhone 5 feels in the hand. But though I might disagree with the timing and details of these changes, I fully support the broader long-term trend towards lighter, thinner hardware. Here’s why.
\n\nIn technology, things that can be measured appear to exist on a smooth continuum: large to small, slow to fast. But the experiences provided by these measurable quantities often have sharp discontinuities.
\n\nConsider touch-screen user interfaces. They’ve existed for decades, but it wasn’t until the iPhone arrived that they entered widespread usage. Yes, there are many non-tech factors that contributed to this, but the responsiveness of the iPhone’s interface was an essential factor. With the iPhone, touch interfaces finally crossed the threshold from frustrating to joyful.
\n\nI’m not sure where the threshold is, or even what quantities it applies to (e.g., frames-per-second of animation, input lag, finger pressure), but it’s definitely there. It’s not a steady ramp from unacceptable to acceptable. It’s a perceived discontinuity—a leap.
\n\nMost measurable qualities of tech products have experiential discontinuities like this. In fact, there are usually multiple discontinuities. It’s human nature to think that we’re at the pinnacle of useful achievement, but it’s never actually true. Watch what happens to the experience of using a touch-screen when we go in search of the next discontinuity—what the Microsoft researcher in this video calls “a perceptual cliff.”\n\n
This phenomenon is not limited to performance measurements. It extends to every aspect of a product, including size, weight, and even shape. Let’s reconsider the iPhone. The change in thickness and weight between the iPhone 4S and the iPhone 5 was very small. Using an iPhone 5 does not feel dramatically different than using a 4S. Clearly, the iPhone 5 has not yet reached the next perceptual cliff—but it’s out there.
\n\nConsider a distant-future iPhone roughly the same width and height as the iPhone 5, but as thin and as durable as a credit card. Accidentally drop such a phone and it’d flutter harmlessly to the ground. Now maybe this would be a terrible design—the edges might dig into your hand, and it might be even less secure-feeling when held—but it’d clearly change the equation when it comes to fear of dropping your iPhone (not to mention where and how to carry it, and so on).
\n\nDon’t get distracted by the details. I’m not arguing for or against a particular design. My point is that it’s important to keep making progress towards the next discontinuity, wherever it may be.
\n\nApple has its compass trained on “thinner and lighter,” a direction that’s proven fruitful in the past. But as much as we’d all like to jump right to the next big win, you can’t just skip to the end. The original iPhone was never going to be followed by the credit-card-thin iPhone—again, ignoring whether this is actually a good idea; stay with me! Instead, it was followed by the 3G (thicker in the middle, but thinner-feeling on the edge), then the 4 (thinner overall), then the 5 (thinner still), and so on.
\n\nThe same goes for the iMac, with the same caveats about the direction and endpoint. How does the iMac change as a product when it’s as thin as an iPad, or a cafeteria tray, or a credit card? Does it even need to exist at that point? Maybe the distant-future iMac is “just a big iPad.” Or maybe some new I/O device makes all of this moot.
\n\nMistakes will be made in the march towards the future. But the worst possible mistake is neglecting to do the work required to get there because you think we’ve already arrived. There is no destination; there is only the journey. Pick a direction or get out of the way.
", "date_modified" : "2013-02-08T15:21:56-05:00", "date_published" : "2013-02-08T15:08:57-05:00", "id" : "http://hypercritical.co/2013/02/08/dont-stop-thinking-about-tomorrow", "title" : "Don’t Stop Thinking About Tomorrow", "url" : "http://hypercritical.co/2013/02/08/dont-stop-thinking-about-tomorrow" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "I didn’t just lead Apple to a record quarterly profit of $13.1 billion on sales of $54.5 billion, so I don’t expect to be consulted. But were Tim to ask me, here’s what I would tell him Apple should do in 2013—in broad strokes, and in no particular order. (We’ve got people to work out the details—right, Tim?) This is not a fantasy wish list. These are things I think Apple can and should do this year. This list is not exhaustive.
\n\nShip OS X 10.9. Last year, Apple announced OS X’s move to an annual release cycle. Lion was released in 2011; Mountain Lion followed in 2012. Two points may make a line, but it’ll take three points to fulfill this promise. As tired as I get just thinking about writing another OS X review, it’s time to do it all over again. (Big cat name optional.)
Ship iOS 7. Apple’s mobile platform started out way ahead of the competition, and it’s stayed ahead thanks to relentless iteration: six releases in six years. Apple can’t let up now. What’s left to do in iOS? Plenty.
Diversify the iPhone product line. There needs to be more than one iPhone. Selling models from previous years at a discount is no longer good enough. Apple can make more attractive phones at similar prices if they’re purpose-built using modern parts and processes. Margins may go down, but sales will go up. Apple has done this before, with the Mac, the iPod, and now the iPad. It’s the iPhone’s turn. Cheaper, smaller, bigger, or multiple combinations of these attributes—it doesn’t matter. Write it down, Tim: more new iPhones in 2013.
Keep the iPad on track. Ship some new, slimmer, faster, lighter iPads, just like everyone expects. Cheaper wouldn’t hurt either. The mini was a great start. Now ditch the iPad 2 and make a new model to fill that role, if necessary. (A larger, more powerful “iPad Pro” would also be great, but this year is probably too soon.)
Introduce more, better Retina Macs. The first Retina MacBook Pro had a GPU that could barely handle all the pixels it was asked to push. Burn-in was also an issue. This year, the available CPU, GPU, and display options should make the existing 13- and 15-inch Retina MacBook Pros look like the first-generation MacBook Air: technical marvels, but also compromises that we’ll soon be happy to forget. Oh, and a Retina display on a non-laptop Mac would be nice too.
Make Messages work correctly. Apple’s iMessage service is rapidly approaching MobileMe levels of undesirable brand association. Fix it in 2013, or be ready for an iCloud-like rebrand/relaunch in 2014. Speaking of which…
Make iCloud better. iCloud beats the pants off MobileMe, but it’s still got plenty of room for improvement. Google should be the reliability and performance target. Decide which technologies and APIs under the giant umbrella term “iCloud” are working well, and fix or deprecate the ones that are not.
Resurrect iLife and iWork. Both application suites are in desperate need of some serious attention. The last new release of iLife was two years ago; iWork hasn’t had a major revision in four years. People still use these apps. Abandoning them is not an option (yet).
Reassure Mac Pro lovers. Fans of the Mac Pro did not get the new machine they wanted in 2012. After WWDC 2012, Tim Cook said, “Although we didn’t have a chance to talk about a new Mac Pro at today’s event, don’t worry as we’re working on something really great for later next year.” As I’ve frequently noted, this statement is not a promise for a new Mac Pro, but merely for something that customers disappointed in the stagnant Mac Pro will consider “really great.” 2013 has not gotten off to a good start on that front, but the year is young. Wow me, Tim.
Do something about TV. After years of steadily ramping up its rhetoric, it’s time for Apple to put up or shut up about TV. Make an actual Apple TV set; allow third-party apps on a massively revised Apple TV box; buy Netflix; whatever—you decide, Tim. I agree, it’s a hard problem and a tough market. But it’s time for action.
Should be a cinch, right? Too bad there are only two items on this list that will help Apple’s stock price recover from its calamitous 35% drop over the past four months. Uneasy lies the head that wears a crown.
", "date_modified" : "2013-02-02T23:19:04-05:00", "date_published" : "2013-02-02T22:39:51-05:00", "id" : "http://hypercritical.co/2013/02/02/apples-2013-to-do-list", "title" : "Apple’s 2013 To-Do List", "url" : "http://hypercritical.co/2013/02/02/apples-2013-to-do-list" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "\n\nThe highlight of Nintendo’s video presentation this week was the announcement of a Wii U remake of The Legend of Zelda: The Wind Waker, a GameCube game originally released in the US a decade ago. As a dedicated Zelda fan, my reaction was predictably enthusiastic.
\n\nElsewhere on the net, fretting about the content and appearance of the game started immediately. It made me think about why I’m such a fan of video game remakes while my default position on movie remakes is to turn up my nose at them. How can I hate the Star Wars special editions but love the HD remakes of Ico and Shadow of the Colossus? I think both sentiments have the same underlying motivation: I don’t want to lose the things I love.
\n\nIn the case of Star Wars, I’m frustrated not so much by the existence of alternate versions of the movies, but by the disappearance of the original theatrical releases. I discussed this at length in episode 45 of the Hypercritical podcast (the topic starts at 35:57), but here’s a summary: Artists are often not the best stewards of their own work. Once an artistic creation reaches a certain level of cultural significance, it belongs to society at large more than it belongs to the creators—philosophically, if not legally. Cultural touchstones belong to all of us, and they deserve to be treasured and preserved, regardless of the creator’s wishes.
\n\nVideo games are an odd art form in many ways, one of which is that they’re extremely dependent on their delivery platform. More established kinds of art like paintings, books, video, and audio recordings have all proven resilient to changes in technology. The novels of Charles Dickens did not disappear as book technology evolved. Most filmmakers have been vigilant about preserving and (eventually) digitizing movies that were shot on film. (Again, Star Wars stands out as a sad exception.) All these art forms have a clear path to move forward in time; they’ll always be with us.
\n\nVideo games are a different story. Historically, video game platform owners have been unwilling or unable to preserve the works of art originally delivered on their platforms. When the Wii, PS3, and Xbox 360 all launched with some ability to play games made for the consoles they replaced, I was optimistic about the future. But the PS3’s ability to play PS2 games rapidly diminished, first losing dedicated hardware support and then disappearing completely. Similarly, the latest iteration of the Wii can’t play GameCube games. Hoarding and preserving console launch hardware started to make a lot more sense.
\n\nToday, Nintendo sells its own emulated versions of many of its classic games. Presumably this will extend to Wii U games when that hardware is eventually phased out. But I have little faith in Nintendo’s motivation to preserve its past beyond its function as an income source. And let’s not forget all the important video game makers that have gone out of business—or been acquired and re-acquired so many times that they might as well have.
\n\nAgain, as in the case of Star Wars, it has fallen to the fans to preserve classic games, sometimes by preserving the original hardware, but most often through emulation. This doesn’t just apply to video games that are 30 years old. Games are becoming inaccessible so rapidly that even platforms created just a handful of years ago already have active emulation projects.
\n\nThat’s the fear that HD remakes tap into. Though there are many things that can go wrong when an older video game is ported and “improved” for release on a newer hardware platform, the risks are vastly outweighed in my mind by the playable-lifespan extension that a remake bestows on a beloved game.
\n\nRight now, I can play Wind Waker on my GameCube and my Wii. Newer Wiis (and the Wii U) don’t play GameCube games. Both the GameCube and the Wii send their video signal over a component cable, at best. I suspect TVs will stop shipping with component video inputs in a few years, which will leave me at the mercy of video converter boxes. Eventually, no matter how well I care for them, my 12-year-old GameCube and my 7-year-old Wii will break. (The optical drives will probably go first.) But when that happens, my Wii U, with its HDMI connection and 2012 manufacture date, will probably still be working. Time extended!
\n\nAlas, things get even more complicated when you consider not just the software but also the controller hardware and the details of the display device. I’ve still got my N64 in the attic, but my son experienced Ocarina of Time by playing the GameCube port on the Wii connected to a plasma HDTV. Was it the same as playing the original using an N64 controller and an old CRT television? Well, not quite. This problem only gets worse as the hardware gets more novel.
\n\nIn the end, I’m content to at least preserve the software in some playable form, even if the controller and display are slightly different. Just doing this is turning out to be enough of a fight. I hope my purchase of the Wii U remake of Wind Waker will help convince Nintendo and other game makers that older titles are valued by gamers long past the death of their original platforms.
\n\nI’m also a little afraid that remakes like this will delay or prevent the original version of the game from appearing in an officially sanctioned emulated form. But for now, I’ll take what I can get. I’m glad my son has already played the original GameCube version of Wind Waker—twice. I’m also excited to replay Wind Waker with him on the Wii U in HD. It won’t be exactly the same as it was, but I think it’ll still be great. Most importantly, I hope he can share both of these experiences with his children someday.
", "date_modified" : "2014-08-13T21:14:37-04:00", "date_published" : "2013-01-25T14:05:00-05:00", "id" : "http://hypercritical.co/2013/01/25/we-can-remember-it-for-you-wholesale", "title" : "We Can Remember It for You Wholesale", "url" : "http://hypercritical.co/2013/01/25/we-can-remember-it-for-you-wholesale" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "Watching the CES coverage out of the corner of my Internet eye, I’m reminded of exactly how bad most hardware makers are at writing software. Mat Honan summed it up nicely last month: No One Uses Smart TV Internet Because It Sucks. Amen to that. But it’s not just TVs. Who really likes the “software” in their car, microwave, or blu-ray player?
\n\nAll of this software is terrible in the same handful of ways. It’s buggy, unresponsive, and difficult to use. I actually think the second sin is the worst one, especially when it comes to appliances and consumer electronics. Dials and knobs respond to your touch right now. Anything that wants to replace them had better also do so. But just try finding and watching a YouTube video on your TV and see how far you get before your brain checks out. It’s faster to get up off the couch and walk to a computer—or, you know, whip out your iPhone.
\n\nThe companies out there that know how to make decent software have been steadily eating their way into and through markets previously dominated by the hardware guys. Apple with music players, TiVo with video recording, even Microsoft with its decade-old Xbox Live service, which continues to embarrass the far weaker offerings from Sony and Nintendo. (And, yes, iOS is embarrassing all three console makers.)
\n\nCompanies that make physical products that have only recently started sprouting sophisticated software features all find themselves in a similar bind. The obvious solution is to just make better software. If only. I have little faith that these companies are willing and able to transform themselves in the radical ways required to produce and support great software. Here’s what I see happening instead.
\n\nThe long-term success of these companies now hinges on how difficult it is to create the hardware product that’s wrapped around their crappy software. Car makers, for example, are probably safe from software upstarts (if not from other car makers). The barrier to entry in the auto industry is immense, and the remaining successful car makers have deep expertise in their craft. If Tesla succeeds, for example, it won’t be because MyFord Touch is slow and unintuitive.
\n\nTV makers, on the other hand, should be worried. Most of the hardware they make is already a component of the industries dominated by the software guys. The proliferation of “smart” TV features is fueled by the fear of becoming a mere component supplier. Unfortunately for the companies involved, the terrible quality of these features may actually end up hastening the transition from “TV maker” to “panel maker.”
\n\nAt this point, the only thing keeping the hounds at bay is the reality that a TV with non-crappy software requires a much deeper cooperation with content providers. So while Apple can whip up a TV running iOS in its sleep, giving that software something useful to do requires talking to content owners—and possibly also cable companies and ISPs, who are even more keen to keep the content owners in their camp, and who have barriers to entry that the auto industry would die for. And this is before even considering the fragmentation of TV and Internet access in the US and around the world.
\n\nThe hardware barriers that protect ISPs and car makers will probably hold up (much to our detriment, in the case of US ISPs), but I think the TV content owners will eventually come around—or be routed around. When that happens, the market for formerly “software-neutral” hardware devices like TVs will rapidly follow the same path as the mobile phone market. If it happens soon enough, it may even be the same familiar handful of companies that gobble up all the losers: Apple, Samsung, Google, maybe even Microsoft.
\n\nUntil then, we’ll all just have to suffer through—or find a way to ignore—this avalanche of software that’s slowly making our a/v equipment, appliances, and vehicles more annoying to use.
", "date_modified" : "2013-01-07T16:31:00-05:00", "date_published" : "2013-01-07T16:31:00-05:00", "id" : "http://hypercritical.co/2013/01/07/ces-worse-products-through-software", "title" : "CES: Worse Products Through Software", "url" : "http://hypercritical.co/2013/01/07/ces-worse-products-through-software" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "\n\nThis article originally appeared in issue 2 of The Magazine on October 25, 2012.
\n\nJourney for the PlayStation 3 is the best video game I’ve played in a long time. I’m going to use it to illustrate a larger point about technology, and in doing so, I’m going to spoil the game. If you have any interest in video games at all, I strongly recommend that you do not read any further until you’ve played it.
\n\nOnline discourse can be harsh. Nowhere is this more true than in multiplayer video games. It’s nearly impossible to play a popular online game without being exposed to — or worse, being the target of — the most vile kinds of behaviors and insults, including sexist, racist, and homophobic slurs.
\n\nThis problem is not confined to video games. Even something as seemingly benign as a comment form on a popular technology blog can trigger profoundly bad behavior. A well-known Penny Arcade comic sums up the phenomenon nicely in the form of John Gabriel’s Greater Internet Fuckwad Theory, which states: Normal Person + Anonymity + Audience = Total Fuckwad.
\n\nMany remedies have been tried: moderation, the use of “real names” (whatever that means), increasingly complex privacy settings, user voting, karma scores, etc. Sometimes these things help, but often only a little — and they all require constant vigilance.
\n\nIn frustration, many users and content creators choose to take out the big hammer and end discourse entirely. Eliminate blog comments. Mute all voice chat. Disable communication between players on opposing teams. The only winning move is not to play.
\n\nSo goes the conventional wisdom. But then there’s Journey, a $15 video game for the PlayStation 3. When you start playing Journey, it’s not even obvious that it’s a multiplayer game. When other players appear, they are not announced in any way, nor are you directed to interact with them. Some players choose to ignore them and complete the game on their own. Others dismiss them as computer-controlled NPCs. This is the first part of Journey’s solution: interaction with others is optional.
\n\nThose who choose to engage with others have only a few choices. Players can move, jump, and “sing” by pressing a single button, causing a musical note to play and a unique glyph to appear on screen. The glyph is not selected or drawn by the player; it’s automatically chosen by the game (so penis-themed griefing is out of the question). There is no text or voice chat. Singing is the only way to communicate, and the only control the player has over the note that’s played is the volume and duration.
\n\nMost critically, none of these actions can harm other players. Even movement can’t be used as a weapon; players simply pass through each other, making it impossible to bump other players off a high ledge or otherwise perturb their progress. Movement can’t even be used to race ahead and steal a desirable in-game item before another player can get to it, because power-ups are not consumed when acquired: they remain in place for future players to receive.
\n\nAll of this may sound like it stops just short of banning communication entirely. Will players even bother to interact with each other? Surely, such a limited palette of options will render the multiplayer aspects of Journey trite and inconsequential.
\n\nBut that’s not what happens at all. Instead, Journey players find themselves having some of the most meaningful and emotionally engaging multiplayer experiences of their lives. How is this possible?
\n\nThough players can’t harm each other, they can help each other. Touching another player recharges the power used to leap and (eventually) fly. In cold weather, touching warms both players, fighting back the encroaching frost. More experienced players can guide new players to secret areas and help them through difficult parts of the game.
\n\nJourney players are not better people than Call of Duty players or Halo players. In fact, they’re often the same people. The difference is in the design of the game itself. By so thoroughly eliminating all forms of negative interaction, all that remains is the positive.
\n\nPlayers do want to interact; real people are much more interesting than computerized entities. In Journey, players inevitably find themselves having positive interactions with others. And, as it turns out, many people find these positive, cooperative interactions even more rewarding than their usual adversarial gaming experiences.
\n\nDoes this mean that playing Journey turns players into relaxed, peace-loving, spiritually enlightened beings? Certainly not — but the limited communication system works in more ways than one.
\n\nIn the same way that you can imagine the actors in a subtitled film (speaking in a language you don’t understand) are all giving Oscar-worthy performances, it’s natural to assume that every other Journey player has only the best intentions. After all, while we may judge ourselves by our motivations, we tend to judge others by their actions. The actions in Journey are all either neutral or positive, so that’s how players perceive each other.
\n\nJourney players are also anonymous during the game. The unique player glyphs are only shown next to PlayStation Network account names when the game is over, and they change on each play-through. Again, this plays into that subtitled-movie optimism. It’s much easier to believe that the anonymous player with the winged glyph is the most caring, thoughtful person in the world when you don’t know his PSN account name is K1LLSh0t99.
\n\nIf you want some evidence of the deep feelings triggered by this game, look no further than the Journey Apologies thread in the official forum for the game. Here, players apologize to the anonymous others they feel they have disappointed in the game. It’s like missed connections for gamers. Here’s an example post:
\n\n\n\nTo my friend in the fifth area: I never wanted to leave you. I just whiffed really badly on a jump. I miss you. And I’m sorry.
Journey may be just a game, but the lessons it teaches about ourselves and the things we’re capable of creating can be applied to all of human endeavor.
\n\nThroughout history, we humans have invented many different sets of rules for ourselves. Some have worked better than others, but all of them have been exploited. As anyone with children knows, if there’s one thing humans are good at, it’s finding loopholes.
\n\nWhen a system of rules is applied to many people, thoroughly codified, and consistently enforced, you have something approaching a government. But for governments, even the most successful change occurs slowly and often happens painfully. This can lead even the most optimistic person to despair.
\n\nHuman history is long, but how many different sets of rules have really been tried? In meatspace, it’s so difficult to establish a new set of rules or change the existing ones that the rate of design iteration is severely limited.
\n\nThis is not so in the relatively consequence-free worlds of video games and the Internet. In the digital realm, wild experimentation and rapid iteration are the norm. It’s also much easier to establish and enforce an iron-clad set of rules in a virtual world than in the real one. This is the environment that created Journey, and its rarity is why it’s such a joy.
\n\nThe lesson of Journey is that success is possible, even in an area like online multiplayer interaction, which has seemed so hopeless for so long over so many thousands of iterations. Success is possible.
\n\nBut let’s go further. Our digital lives increasingly affect our real lives. Consider Twitter, another system for online interaction that has succeeded in large part thanks to its novel set of rules and limitations. There’s a whole world of bad behavior that doesn’t fit into 140 characters and doesn’t work when producer/consumer relationships are asymmetrical. Twitter isn’t just a game; its influence extends into the real world, in ways we don’t yet fully understand.
\n\nAs another US presidential election season grinds on and I become freshly disillusioned with the seemingly intractable problems in our system of government, Journey and Twitter give me hope. They make me believe that maybe, just maybe, the digital world can be both a laboratory for new ideas and, eventually, a giant lever with which to change the formerly unchangeable.
", "date_modified" : "2012-11-27T09:42:00-05:00", "date_published" : "2012-11-27T09:42:00-05:00", "id" : "http://hypercritical.co/2012/11/27/strange-game", "title" : "Strange Game", "url" : "http://hypercritical.co/2012/11/27/strange-game" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "As I have for the past 13 years (yikes!), I wrote a review of the latest major release of the Mac operating system, OS X 10.8 Mountain Lion, for Ars Technica. There are several ways to read it.
\n\nHere are my thoughts on the various reading options.
\n\nI consider the web version to be the canonical version, and the version with the best formatting and the most features. I believe that good writing for the web includes a lot of links. A web browser is the best place to inspect and follow those links.
\n\nThe free web version has ads, and it’s split up into multiple “pages” (which are actually much longer than a single printed page). This kind of pagination annoys some people. I actually like it for very long articles because it helps me keep my place across multiple reading sessions. I can remember I was on page 8 instead of remembering the exact point in a very long, scrolling web page.
\n\nThat said, I also really like how an Ars Premier subscription eliminates all ads from the Ars Technica website and gives me the option to view any article on a single page. I use single-page view on very long articles when I’m searching for some text using my web browser’s “Find…” feature. I use it all the time on short articles.
\n\nSome people think Ars Technica forces me to break my article up into many tiny pages. That’s not the case. I choose how to paginate the article. I like to break it up on logical section boundaries, which means that the “pages” vary widely in length. This year, Ars Technica actually asked me to merge several pages together to reduce the total number of pages (and I did).
\n\nThis year, I created the Kindle and EPUB versions of the article myself. They’re both generated from the canonical HTML version of the article. Both ebook formats have severe limitations, most of which are imposed by the reader software.
\n\nReading the Kindle version using a device or application that supports Kindle Format 8 provides the best experience of any of the ebook formats. Kindle Format 8 readers support amazing new technologies such as text that flows around images and the ability to tie a caption to an image. Yes, that was sarcasm.
\n\nUnfortunately, many Kindle reading devices and applications don’t support Kindle Format 8. Most notably, the iOS Kindle app still does not support Kindle Format 8. The Mac version does, however, as does the Kindle Fire.
\n\nThe Kindle ebook is a single file that contains two versions of the content: one in Kindle Format 8, and one in the older Kindle format. Open the same ebook file in both the Mac and iOS Kindle reader applications and you’ll see two very different appearances.
\n\nApple’s iBooks app displays the EPUB version of the book almost as well as the Kindle Format 8 readers, but it has an annoying habit of stretching the content to fit the vertical space of the page when a large image causes a mid-page break. This can cause the image captions to be separated from their associated images by a big swath of whitespace.
\n\nLesser reader applications and devices display the Kindle and EPUB files in progressively more depressing ways. Most (all?) ebook reader applications also don’t provide a nice way to have a text link briefly display an image on top of the content, or to show a larger, un-cropped version of an inline image. I really wish ebook readers had the same capabilities and behaviors as a modern web browser.
\n\n(I was not involved in the creation of the PDF version, but I imagine I’d find the limitations of the PDF format similarly frustrating.)
\n\nThe Mountain Lion Kindle ebook is $4.99, which is the same as last year’s Lion ebook. I considered a lower price, but Amazon’s ebook royalty system is definitely geared towards higher-priced (or maybe just smaller) ebooks. Even at $4.99, more than half the purchase price is going to Amazon. You can read Amazon’s pricing page and do the math for yourself for a 7.5 MB Kindle ebook.
\n\nAt various times, people have asked me if I have a flattr account or something similar through which they can send me money. It’s always felt weird to me for anyone to be sending me money “just because.” I’m much more comfortable creating something and then selling it to people who want it. My Mountain Lion review provides just such an opportunity.
\n\nLast year was the first year that Ars Technica tried selling ebook versions of my writing. The results certainly exceeded my expectations, but I didn’t get any part of the ebook profits. This year, I will.
\n\nSo if you’re one of those people who has asked about sending money to thank me for my writing, my podcast, or whatever, only to be rebuffed by my discomfort with receiving “money for nothing,” now’s your chance to pay money for something: buy the Kindle ebook or subscribe to Ars Premier for a month or a year.
\n\nIn an earlier post, Highlights from 2011, I worried that the audience for my brand of tech writing was an ever-shrinking portion of a much larger, broader market. I often feel the same way about my podcast, Hypercritical—the third thing to share this name. (In order: 2009, 2010, 2011.)
\n\nBut the web traffic and ebook sales from last year’s Lion review showed me that, at the very least, my audience is still growing in absolute numbers even as it may be shrinking as a percentage of the whole. For that, I continue to be very grateful, and I hope this year turns out just as well. Thanks to all of my fellow nerds for allowing me to continue to do this.
", "date_modified" : "2014-10-17T08:59:48-04:00", "date_published" : "2012-07-25T08:40:00-04:00", "id" : "http://hypercritical.co/2012/07/25/mountain-lion", "title" : "About My Mountain Lion Review", "url" : "http://hypercritical.co/2012/07/25/mountain-lion" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "I like pasta. I’d like to help people make better pasta. It pains me to think about all the poorly prepared pasta being served and eaten in America. My advice will focus on plain old store-bought dried pasta. Nothing fancy. You’ve probably made some yourself.
\n\nI’m specifically not talking about preparing or cooking fresh pasta, how to execute any particular pasta recipe, or why you should never, ever buy pasta sauce in a jar. (You really shouldn’t, though.) This is just about the basics: how to cook and serve dried pasta as part of some larger recipe, the details of which are out of scope, for now.\n\n
Here’s my advice, in no particular order.
\n\nDo not overcook your pasta.
\n\nPlease, I beg you, do not overcook your pasta. Every time you serve a pile of starchy, gelatinous mush, an Italian grandmother sheds a single, silent tear. Overcooking is by far the most common pasta sin in America. (As evidence, consider that Olive Garden, the gold standard for incorrectly prepared Italian food, intentionally overcooks its pasta.)
\n\nThese days, the cooking times on most boxes of dried pasta are in the ballpark, but there are exceptions. Boxed macaroni and cheese and other “children’s” pasta products routinely have cooking times that should be cut in half. But even in the best case, cooking times are just estimates. The actual cooking time will depend on the temperature and humidity of your kitchen, the mass and thermal conductivity of your cookware, the power of your cooktop, and on and on.
\n\nAs you gain experience, you’ll be able to tell when pasta is ready by “feel” (with a pair of tongs or a stirring spoon). But the old fashioned way is still the most reliable: taste a piece. Drop the pasta in the boiling water (see the next section for more on that), set a timer for 1-2 minutes less than the time on the box of your trusted dried pasta brand, and start tasting when it goes off.
\n\nThere’s an old saying about cooking eggs: done in the pan, overdone on the plate. The same goes for pasta. It will continue to cook after you remove it from the pot, and even more so when you put it directly into another hot pan or combine it with other hot, moist ingredients.
\n\nDried pasta in hot water cooks from the outside in. The very last part to be cooked is the part that’s the least accessible to the hot water (e.g., the “knot” in the middle of a farfalle bow tie). Once the pasta is “cooked through,” meaning there’s no longer any trace of hard, dried pasta at the center, you’ve probably already waited too long to take it out of the water.
\n\nHere’s a good heuristic for string-shaped pasta like spaghetti. Fold the pasta back on itself and pinch it near the end, forming a small loop where it makes a u-turn. If that loop closes easily and completely collapses on itself, leaving no hole at all, you’ve waited too long to remove it from the water.
\n\nOne last tip on cooking times. Pasta with a lot of surface area (e.g., rotini) cooks faster, and it also overcooks faster. It can take only a few seconds to go from “just right” to “too late.” Be aware of your pasta shape. The more surface area, the smaller the margin for error.
\n\nI’m going to continue to my next point, but cooking time will come up again. If you learn only one thing from reading this, it should be that doneness is the most complicated, difficult, and important aspect of cooking pasta.
\n\nCook your pasta in a sufficient amount of boiling, salted water.
\n\nHow much is a “sufficient” amount? A good rule of thumb is 4-6 quarts of water for each pound of dried pasta. (Most boxes of dried pasta are 1 pound.) You can probably get away with using less, but I think that leads to a pot that feels too crowded.
\n\nFill your pot with cold water from the tap. Hot water is more likely to pick up unpleasant stuff from the pipes. Salt the water until it tastes like the ocean¹. (If you don’t know what ocean water tastes like, please take a break now and find out. This blog post will be here when you return.) Nothing other than salt needs to be in the water. Do not add oil.
\n\nI’ve heard people say they add oil to the water to prevent the pasta from sticking to itself. This is misguided on multiple levels. First, the pasta will spend most of its time below the surface of the water, far from the oil, which will all stay on the surface. Second, you want pasta’s natural, starchy surface to be exposed upon exiting the water so the pasta can absorb the flavorful ingredients you’re about to combine it with. An oil coating would impair that.
\n\nAs with most kitchen myths, there is a kernel of truth behind the notion of oil in the pasta water: pasta that sticks together is bad. You do not want pasta to stick to other pieces of pasta, or to any part of the pot you’re boiling it in. But the solution to this problem is simple: stir the pasta at a few key points during the cooking process.
\n\nStir right after you dump the pasta into the water. Adding the pasta will decrease the temperature of the water, and may even take it off the boil. This is fine, but it does mean that the bubbling action won’t be there to keep the pasta from settling to the bottom and sticking to itself or the hot surface of the pot.
\n\nStir again as the boil comes back, to confirm that the pieces really are all separate and not sticking to each other. With any luck, the bubbles will keep everything moving and all the pieces of pasta separated for the rest of the cooking time.
\n\nLong, stringy pasta shapes require the most stirring later in the cooking process because you can’t agitate them well until they become pliable, and at that point they may have been pressing up against their neighbor strands in hot water for a while. Be vigilant. If a few get away from you, tongs can help separate strands once the boil is rolling along again.
\n\n(And please, do not break long, stringy pasta. Cook and eat it at its natural length. You’ll figure out the fork-twirling thing with a little practice.)
\n\nFinish cooking your pasta in the sauce.
\n\nPasta should go directly from the hot water where it (mostly) cooked into a vessel where it will be combined with the rest of the ingredients in the finished dish. It could be a traditional tomato sauce, olive oil with garlic, or a complicated multi-ingredient mixture. Whatever it is, the pasta must immediately meet it.
\n\nYou should use a colander if it will take more than 15 seconds to fish out the pasta with tongs or other utensils. Remember, it’s still cooking! If you do use a colander, do not rinse your pasta. Just think of the colander as a really large utensil for separating the pasta from the water and bringing it to its next vessel.
\n\nWhen combining the pasta with the other ingredients, try to coat each and every piece of pasta. If possible, undercook the pasta slightly (i.e., leave a tiny bit of uncooked dry pasta at the center) and really finish cooking it in the sauce. This is most practical when combining a small amount of pasta with a sauce prepared in a very wide pan, preferably one that contains some liquid. If liquid is lacking, a bit of the water that the pasta cooked in can be added. (A splash of starchy pasta water is a common liquid thickener in many simple pasta recipes.)
\n\nSauce your pasta, but don’t over-sauce it.
\n\nIn case this doesn’t go without saying, if there’s a large volume of sauce, like a giant simmering pot of tomato sauce, don’t dump the pasta into it. You will need some other pot or pan in which to mix the pasta and just the right amount of sauce.
\n\nOnce the hot water has been removed from it, the pot the pasta cooked in makes the perfect mixing vessel (and you won’t have to dirty another pot). You may want to put a ladle full of sauce in the bottom of the pot before you dump the freshly drained pasta into it, lest a few pieces stick to the hot bottom. Ladle in more sauce a bit at a time and mix until every piece of pasta is coated.
\n\nIt seems to be the inclination of Americans to put on too much sauce, so when in doubt, under-do it. Sauce should touch every piece of pasta, but that doesn’t mean every piece should be covered with an opaque red coating.
\n\nAt the opposite end of the spectrum is the bowl of pale, virgin pasta with a giant mound of tomato sauce on top of it—a tasteless starch ball with a red hat. This is almost as big a sin as overcooking (and is usually combined with it, naturally).
\n\nRemember, sauce (or oil or whatever) must touch every piece. You have to mix it in before serving. Yes, even if you plan to provide more sauce on the side for people to add. If you learn only two things from reading this, let the second be that you must never, ever serve a single piece of pasta that looks like it just came out of hot water and never touched another ingredient.
\n\nPasta should be served in warm bowls.
\n\nIf you plan to put the pasta in a large serving bowl, warm that bowl, and also warm all the individual bowls for each place setting. The easiest way to warm bowls is to pour the hot pasta water into them. If using a colander, line the bottom of your sink with bowls (stacking if necessary) and put the colander into one of them. Then pour the pasta water into the bowls, ending by pouring the last of the water and the pasta itself into the colander. If you have a fancy “warming drawer,” that works too. But you’re going to have a bunch of hot water on hand anyway, so you might as well use it.
\n\nThis may all sound crazy—warm bowls? really?—but trust me, it makes a difference. Putting hot, freshly sauced pasta into a massive, cold, ceramic dish will instantly suck the life out of it. Warm bowls. Seriously.
\n\nServe and eat immediately.
\n\nBaked pasta dishes are an exception; they almost always need to rest a while before serving. But hot pasta mixed with sauce or other ingredients and not put into an oven must be served and eaten as soon as it’s ready. This usually means that the pasta shouldn’t even be dropped into the hot water until everyone is in the process of coming to the table. Some dishes can stand up to a few minutes on the table in a (warm) serving bowl, but the clock is ticking.
\n\nMaintain perspective.
\n\nIf this all sounds pedantic and overwrought, well, it is. But like anything else in cooking, it all becomes second nature if repeated enough times. Just note your mistakes each time and try to do the opposite next time.
\n\nI’m sure there are people reading this who have literally never undercooked pasta in their lives. Try that next time. See if you can intentionally undercook some pasta. You may find it harder than you think. Once you’ve done that, go back in the other direction. Eventually, you’ll home in on “just right.”
\n\nIt’s often the case that the simpler the food, the more important the ingredients and the preparation techniques become. This is true for eggs, and it’s definitely true for pasta.
\n\nAnd speaking of ingredients, please do buy the best you can afford when making pasta dishes. Dried pasta itself is incredibly inexpensive, and you shouldn’t be smothering it in sauce. Spend your money on a little bit of good olive oil, fresh garlic, and real cheese. Yes, parmigiano reggiano is over $20 per pound these days, but a little goes a long way. And when that freshly grated cheese hits the hot surface of that perfectly cooked pasta sitting in its warmed bowl, you’ll know it’s all been worth it.
\n\nBonus tip: pasta in soups.
\n\nMany soup recipes include pasta: elbow macaroni, tiny stars, wide noodles, etc. Pasta will overcook in soup just as easily as it will overcook in water. To prevent this, cook the pasta ahead of time, undercooking it slightly. After removing the pasta from the water, do something I just told you never to do: rinse the pasta in cold water to stop the cooking process, coat it in olive oil to prevent it from sticking to itself, then set it aside.
\n\nWhen the time comes to serve the soup, add just the right amount of pasta to each individual bowl. The (relatively) cool pasta will warm up quickly in the hot soup, and finish cooking through by the time the first bite is taken. It will also help lower the temperature of the soup slightly, making it easier to eat with less blowing and potential tongue burning.
\n\n¹ Well, not really. The “salty as the ocean” rule works because most people under-salt their pasta water, and the right amount tastes like their memory of the sea. For actual percentages, see this article at seriouseats.com. ↩
\nThis past year was an eventful one for someone like me who has already passed most of the common milestones of adulthood (college, marriage, home ownership, children). The highlights:
\n\nI started a weekly podcast with Dan Benjamin, named after this blog (which, in turn, was named after something I wrote for Ars Technica in 2009). I’ve been amazed by the popularity of the show and the quality of the listener feedback and participation. Special thanks to Jeremy Mack, creator of showbot.me, and Justin Michael, creator of 5by5illustrated.com.
\n\nI’ve also become a devoted fan of several other podcasts on the 5by5 network, co-hosted by Dan Benjamin: Back to Work with Merlin Mann, Build and Analyze with Marco Arment, The Ihnatko Almanac with Andy Ihnatko, and The Talk Show with John Gruber. And for dessert, Roderick on the Line with John Roderick and Merlin Mann.
\nThough it started in 2010, The Incomparable, a geek ensemble podcast on which I’m proud to be a semi-regular guest, really hit its stride in 2011, with some great episodes about Star Wars (ANH part 1 and part 2; ESB part 1 and part 2), Pixar (part 1 and part 2), giant fantasy novels (The Name of the Wind and The Wise Man’s Fear), plus a bushel of episodes about Doctor Who and other TV shows and movies.
\n\nI enjoy being on this podcast all out of proportion to the number of listeners it’s managed to gather. If you have even a fraction of the fun listening as I do recording this show, you should definitely give it a try. (And if you’re already a listener, why not rate it or write a review in iTunes?)
In June, I made my first trip to WWDC in San Francisco, which was also my first trip farther west than Colorado. Ostensibly, I made the trip because I was afraid that Mac OS X 10.7 Lion would be released after WWDC but before Apple published videos of the sessions for non-attendees. (I rely on the information presented at WWDC when writing my Mac OS X reviews for Ars Technica.) But really, going to WWDC is something I’d always wanted to do.
\n\nThe trip was expensive, and I had to take time off work to do it, but it was so worth it. I saw what turned out to be Steve Jobs’s final keynote presentation. I met tons of people in person that I’d known for years online, and made several new friends. I also got to talk to a handful of famous (well, “nerd famous”) people in the Apple community that I’d never imagined I’d ever have any contact with. I refuse to name-drop them, lest it cheapen the experience (and no, sadly, Steve Jobs was not one of them), but suffice it to say that it exceeded all my expectations. I’m not sure when or if I’ll make it to WWDC again, but it’ll be extremely hard to top my first time.
Apple’s release of Mac OS X 10.7 Lion in July meant that my trip to WWDC was indeed a wise choice. In the two years since my last Mac OS X review at Ars Technica, the site has grown tremendously. Amazing feature stories on all sorts of subjects were pulling in huge traffic numbers, well beyond what my past Mac OS X reviews had drawn. I worried that the audience for my brand of tech writing was no longer significant enough to matter.
\n\nWhen my Lion review was published, I was grateful to be proven wrong. Thanks to everyone who continues to read what I write. Thanks for indulging my idiosyncrasies and continuing to hold me to the same high standards that I demand of the things I write about. And thanks to everyone at Ars for so many years of loyalty and for building an amazing publication that I’m proud to be even a small part of.
Steve Jobs died in October, and it affected me more than I’d expected it to. I wrote about it on Ars, talked about it on my podcast, and still think about it pretty regularly.
Some smaller 2011 milestones:
\n\nMy seven-year-old son finished Ico, his first three Zelda games (Wind Waker, Ocarina of Time, and Twilight Princess) and is deep into his fourth (Skyward Sword), with only a little help from dad on the harder bosses. His gaming education is coming along nicely.
Hardware upgrades: MacBook Pro 15-inch replaced with a 13-inch MacBook Air and a 27\" Thunderbolt display; 4th generation iPod touch replaced with an iPhone 4S; Canon PowerShot S3-IS replaced with a Canon PowerShot S100. Hardware firsts: first SSD, first camera that can shoot RAW, first iPhone. (Note: the iPhone is my wife’s, not mine.)
I almost posted more than one thing to this blog.
The following movies were released in the summer of 1982.
\n\nIs it just nostalgia, or does that lineup positively trounce any summer in recent memory? What a perfect blend of popcorn summer blockbusters, kid-friendly films, and just plain great movies. Can anyone find a summer that beats this one?
", "date_modified" : "2011-01-02T13:37:00-05:00", "date_published" : "2011-01-02T13:37:00-05:00", "id" : "http://hypercritical.co/2011/01/02/summer-movies-1982", "title" : "Summer Movies: 1982", "url" : "http://hypercritical.co/2011/01/02/summer-movies-1982" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "Here’s my brief entry in the speculation derby surrounding the departure of Mark Papermaster from Apple. Assuming Papermaster is out at least partially due to the iPhone 4 antenna and not some completely unrelated matter, and assuming Apple really did know about the iPhone 4’s antenna problems even before Papermaster was hired, it may seem strange or even unfair that he’s ended up as the fall guy. I won’t comment on the fairness of the decision, but I can certainly imagine a scenario where his ouster is well within the expectations of a job as a high-level executive in a big corporation.
\n\nImagine the following events. Papermaster is hired by Apple and put in charge of the iPhone 4 hardware. He’s brought up to speed on the project, including the unique characteristics of the external antenna. At some point later, a final decision has to be made on the design: go or no go?
\n\nWhile it’s clear that the buck stops with Steve Jobs on all decisions at Apple, that doesn’t mean he makes all the decisions. This is why Apple hires people like Mark Papermaster in the first place. It’s reasonable to expect that Jobs would defer to the guy he fought to hire when it came to this question. And so Jobs would ask Papermaster, is the design ready to go or not? And what about that antenna touching issue? Is that a big deal, or will most people not even notice?
\n\nNow imagine that Papermaster tells Jobs that, yes, it’s a real limitation in the antenna design, but that the advantages—increased range and room for a bigger battery—more than make up for it. Now imagine Jobs pushes further: “While you may feel that way, Mark, will the public agree? Will this end up being an issue?” And now suppose Papermaster says no, it won’t be an issue.
\n\nEither implicitly or explicitly, Papermaster would be putting his reputation on the line. This is what his job is all about: making decisions. This particular decision is not about technology or manufacturing; it’s a judgment call about how the public (and press) will react to something. But that’s part of his job too. And the harder he fought for this particular decision, the more he’d have on the line when he turned out to be wrong.
\n\nAnyway, like I said, this is all just speculation. I really have no idea why Mark Papermaster left Apple. But I find the scenario described above eminently plausible. Furthermore, if it were true, I don’t think it would speak ill of Papermaster. Executive management at this level is a high-stakes endeavor. The rewards are big, but so are the risks—and no one can be right all the time. If you’re the new guy and this is your first big call on the biggest project in the company, well, you can end up back in the job market much sooner than you expected. C’est la vie.
", "date_modified" : "2010-08-08T16:01:00-04:00", "date_published" : "2010-08-08T16:01:00-04:00", "id" : "http://hypercritical.co/2010/08/08/papermaster", "title" : "Papermaster", "url" : "http://hypercritical.co/2010/08/08/papermaster" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "Many years ago, I recall talking with some of my Mac-nerd friends about how strange it was, after Apple’s near-death experiences of the late 1990s, to be living in a world where it’s just assumed that any tech luminary will mostly likely use a Mac. A year or two later, Tim O’Reilly gave a name to this prognostication technique: watching the “alpha geeks.”
\n\nThis trend of Mac adoption among alpha geeks was a sign of good things to come for Apple, and generally a bad sign for its competitors. Today, James Gosling’s departure from the remains of Sun brought to mind a similar trend—one that’s not so good for Apple.
\n\nThese days, when a high-profile technical professional leaves his position at the company where he’s done his most important work, everyone’s first guess as to where he’ll end up is…well, do I really have to name the place? The point is, it’s not Apple.
\n\n(This mostly applies to programmers and other engineers. People on the more creative side of the technology world are much harder to predict. But then, who can truly fathom the mind of an artist?)
\n\nThere are many trend lines that contribute to a company’s overall trajectory, and nearly all of Apple’s are still pointing in the right direction. But the emergence of Google as a huge gravitational sink for engineering talent in the past five years has definitely put a kink in at least one of those graphs.
", "date_modified" : "2010-04-11T11:20:00-04:00", "date_published" : "2010-04-11T11:20:00-04:00", "id" : "http://hypercritical.co/2010/04/11/black-hole-sun", "title" : "Black Hole Sun", "url" : "http://hypercritical.co/2010/04/11/black-hole-sun" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "2012 is an awful movie. I knew this when I added it to my Netflix queue, but I wanted to stay up to date on the latest in computer-generated apocalyptic destruction. I’m a fan of special effects in general and stories about the end of the world in particular.
\n\nAll the boxes were ticked: absurd “science,” impossible escapes, a nonsensical plan to save humanity, familial and romantic problems resolved during the crisis, unintentionally slapstick character deaths, etc. What I didn’t expect was how upsetting it would be—which is to say, that it was upsetting at all.
\n\nThe most heartless, lizard-brained humans are pre-teen boys. Teens and young adult men have usually built up a tough emotional core, but are generally too distracted by puberty to ever match the hardness of their unenlightened, toad-exploding youths. As men age, they become progressively more sensitive. The biggest spike (or dip?) in the graph occurs when a man becomes a father.
\n\nIn my experience, this manifests itself most noticeably in a reduced ability to enjoy any story where children are in peril. And so it was for me with 2012. As bad as the movie was, I was still bothered by the repeated use of children in danger as a dramatic device. This, despite the fact that there is never any mystery about who will live and who will die in any given scene. My brain understood, but my body still twinged.
\n\nSo let this be a lesson to you, young men. You may feel tough now, and you may remain rational and intelligent your entire lives. But you will age, and someday you may even become a father. When you do, watch out. You too—yes, even you, you, and you—will someday become an unintentional victim of your own emotions. (A “mush,” as I’ve heard it called.)
\n\nI always ponder this situation when I see a movie or read a book. It seems to me that our ability to enjoy a story depends on our personal experiences to a degree that people don’t want to consider. For example, a common occurrence on this Internet of ours is to encounter an impassioned screed condemning some work of fiction as offensive. Like clockwork, this is followed by a retaliatory condemnation of the offended party as “too sensitive” or “crazy.” The phrase “give me a break” is featured.
\n\nThe overall point that the inherent worth of a work of art is not determined by the bad reactions of a few people is pretty solid. But the glib denigration of the offended party is definitely on shaky ground. The unfortunate truth is that, through no fault of the artist or the viewer, entire avenues of entertainment can be closed off by life experiences.
\n\nIf your wife died in a car accident, you may find yourself unable to enjoy movies that feature car crashes. If you had an abusive parent, you may be upset by any scene where a parent yells at a child. And yes, if you simply have one or more happy, healthy children, you may not even be able to smirk your way through a comically bad disaster movie which happens to feature children.
\n\nNone of this has to reach the level of trauma (e.g., a veteran being unable to watch war movies). In fact, it’s most insidious when it’s much less dramatic, just a mild pin-prick of discomfort happening entirely outside—and often in opposition to—your conscious mind.
\n\nAnd is this the fault of the artist? Is the comedy actually less funny because there’s a gag involving turbulence on an airplane? And on the other side, can you really blame the viewer? I say no on all counts, as long as everyone involved has a clear head about the situation. For the viewer, that means no blanket denunciation of a work of art based solely on your own unexamined emotional reaction. For the artist, that means understanding that some people will be legitimately upset by your creation for reasons beyond your ability to predict or control.
\n\nSo yeah, thumbs down on 2012, but not because I’m a father of two and a giant mush. It’s bad for all the usual reasons a movie is bad: script, story, characters, etc. Maybe if you don’t have kids, you can appreciate it as a \"good ‘bad movie.’\" Maybe.
\n\nFinally, lest you young men get depressed about your inevitable futures as wussy old(er) men, there is actually an upside. A good movie that happens to intersect with your newly altered emotional landscape can be made all the better by the interaction. For example, I enjoyed reading The Road, which is a much more intense story of the apocalypse and a child in danger than 2012. Here’s hoping the movie adaptation doesn’t suck.
", "date_modified" : "2013-02-05T19:47:24-05:00", "date_published" : "2010-03-15T12:35:00-04:00", "id" : "http://hypercritical.co/2010/03/15/no-movie-for-old-men", "title" : "No Movie for Old Men", "url" : "http://hypercritical.co/2010/03/15/no-movie-for-old-men" }, { "author" : { "name" : "John Siracusa", "url" : "http://hypercritical.co" }, "content_html" : "I’ve never considered Obama a very good speaker. It may be because he speaks slowly and pauses a lot, all of which drives my fast-talking-Italian-New-York-native self up a wall. Whatever the reason, my low opinion of his speaking ability meant that I was willing to believe that the Obama teleprompter gibes could very well be indicative of a real problem. Those jokes fed my fear that Obama lacked substance, that he was just a pretty voice able to dazzle people (though not me, apparently) with speeches he didn’t write or fully understand.
\n\nThat fear was put to rest by Obama’s recent performance in front of a gathering of Republicans. No teleprompter, no questions received ahead of time, no softballs. I was amazed at how well he did when I read the transcript. When I watched the video, I still didn’t like his delivery (maybe I should have watched it at 1.5x) but it’s good to know that our president has a brain in his head.
\n\nThat’s what was important to me regarding the teleprompter issue, and that’s why I care little about what Sarah Palin does unless it changes my existing opinion of her. Learning that she wrote notes on her hand before a speech doesn’t do that, and it sure as hell has no effect on what I think Obama’s use of the teleprompter does or doesn’t signify, regardless of which situation is more likely to resonate with the American people.
", "date_modified" : "2010-02-07T14:04:00-05:00", "date_published" : "2010-02-07T14:04:00-05:00", "id" : "http://hypercritical.co/2010/02/07/obamas-teleprompter", "title" : "Obama’s Teleprompter", "url" : "http://hypercritical.co/2010/02/07/obamas-teleprompter" } ], "title" : "Hypercritical", "version" : "https://jsonfeed.org/version/1" }