MacBook Pro 16-inch (Review)

Om du inte får den du älskar, får du älska den du får.

Setting

My previous MacBook Pro was a mid-2014, 15-inch model, purchased shortly after the introduction of the ill-fated USB-C MacBook in 2015. In the first few months, I saw the butterfly keyboard essentially crash and burn in the press, the USB-C “dongle life” get ridiculed, and the Force Touch trackpad also land in a MacBook Pro revision.

Then, the following year, the first MacBook Pro with a Touch Bar was introduced, and the butterfly keyboard and USB-C were adopted wholesale. That’s when I got the sinking feeling that, barring serious backlash, there was a significant risk Apple had made up its mind and turned a corner. Very rarely are hardware changes, once introduced, reverted. The floppy drive and pre-USB I/O go, and they’re gone. Unibody manufacturing, glass trackpads? Kept forever. Occasionally, a whole line is dropped, like the Power Mac G4 Cube, or the rare first-time-around aluminium MacBook is retconned into a MacBook Pro and a unibody plastic MacBook introduced. Change is possible, but was it likely?

With the odds against me, having just gotten a new one, I decided to sit this one out. (I’m also sitting out developing for iOS while they only have an App Store, so, you know, I’m that guy.)

Fevered

I wasn’t the only one to miss the old MacBooks Pro, though. Even if the whole keyboard debacle can be elided at this point (it was strangely underreported by the non-Mac parts of the web, maybe under the impression that being failed by a keyboard was a mark of the parodically effete, and that surely not even Apple could so botch a keyboard), the picture of the old MacBook Pro generation as a stalwart warrior of comfort and utility earned some cultural purchase and nodding agreement. As much as you may like Thunderbolt 3 or USB-C, no one likes the dongle life per se. And as updates didn’t seem to address much aside from bumping specs (and prices), thermals cast yet more shadows, leaving recent Intel chips to throttle in a sweltering aluminium prison.

Earlier this year, rumors started spreading about another model which would finally clean up some of the weaknesses, and the days that followed filled with silent prayer: let this bastard hold until the new one’s out. The rubber feet cracked and peeled off, and the battery bloated to push up against the trackpad, rendering it cumbersome to press and starting its own countdown to a required exchange (which would have been my third such exchange for that generation of MacBook Pro, a reminder that they had their issues too). But my fingers now bounce on a reborn scissor-switch keyboard inside a Space Grey chassis; this is the very first MacBook Pro 16”. Our long national nightmare is over.

Leap

I find myself in the fresh embrace of multiple new technologies that aren’t really new to this model. The SSD is now NVMe-based, lapping the previous model three or four times over in both read and write performance. The display is capable of going almost disgustingly bright and covers the DCI-P3 color space, although I’ve never been able to discern the difference going to or from such a display.

Also new-but-not-really are several advances in input and human-computer interaction.

The Touch Bar

The idea behind the Touch Bar is, in isolation, admirable. F1 through F12 are remnants of terminal clients from the ’70s, and we deserve a clearer instrument for what is now often labeled “commanding”: the act of issuing an instruction to an application. Although I disagree with its implementation, and my immediate reaction to the physical Escape key being added back was that they were now ¹⁄₁₃ of the way to a solution, I have tried to give it its due.

To me, there are two major usability flaws. The first is that the commanding is constantly changing and dependent on state. This may sound strange, since this dynamism is the primary observation from which the Touch Bar springboards. But my eyes, at least, are constantly on the screen. The Touch Bar morphs to offer commands as you make changes or make your way across windows and input states. Knowing that some button is in some place will take repeated relearning.

Combined with the second flaw – that any sense of tactile or haptic feedback is missing – it adds up to an environment hostile to pattern forming. When you look at a tablet or a phone, your eyes are already looking at where your fingers are tapping. There’s no second step needed to acquire a new target and verify that it’s what you thought – you just do. The Touch Bar, at least set to the app-contextual “App Controls” commands, constantly needs these affirmations, landing you in the odd situation of diverting your attention from a screen roughly 1120 points tall to one that is 30 points tall, and from a tactile lattice of keys that we all know how to operate from muscle memory built up over decades to a shallow strip with no feedback except the visual. Since effective commanding is often about being able to do things blindfolded, this gets in the way.

The Touch Bar does open a few doors – it can display colors (and therefore also emoji), it can offer fine motor control without affecting the mouse cursor or requiring precision in two axes. If you can fit your workflow into scrubbing or sliding or panning, you can operate it without needing to look, allowing a new kind of gesture not offered by the trackpad with both precision and agility, and at least local efficiency gains.
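For the curious, the morphing isn’t magic; it falls out of how apps feed the Touch Bar. Below is a hypothetical AppKit sketch (the class, identifier and command names are all invented for illustration): the system walks the responder chain asking for an NSTouchBar, so the bar follows whatever currently has focus, which is exactly why the strip keeps changing under your eyes.

```swift
import AppKit

// A hypothetical sketch (identifiers and names invented): the system asks
// the responder chain for an NSTouchBar, so the visible bar tracks focus.
class EditorViewController: NSViewController, NSTouchBarDelegate {
    static let boldItem = NSTouchBarItem.Identifier("com.example.bold")

    override func makeTouchBar() -> NSTouchBar? {
        let bar = NSTouchBar()
        bar.delegate = self
        bar.defaultItemIdentifiers = [Self.boldItem]
        return bar
    }

    func touchBar(_ touchBar: NSTouchBar,
                  makeItemForIdentifier identifier: NSTouchBarItem.Identifier) -> NSTouchBarItem? {
        guard identifier == Self.boldItem else { return nil }
        let item = NSCustomTouchBarItem(identifier: identifier)
        item.view = NSButton(title: "Bold", target: self, action: #selector(toggleBold))
        return item
    }

    @objc private func toggleBold() {
        // Apply bold to the current selection in the hypothetical editor.
    }
}
```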

A Matter of Control

I have been consciously keeping track of all F-key presses, and so far they add up to zero. (In sharp contrast to Escape key presses, which I lost count of on the first day.) I don’t use the function keys themselves at all in my daily routine; many people do – especially developers, gamers, professional application users (which this thing is ostensibly for) and double/triple-booting multi-platform users – but not me. What I do use are the global control commands: controlling volume, music playback and brightness (both screen and key backlighting – the new backlighting looks much better). I use enough of these commands that they don’t fit well into the condensed right-edge “Control Strip” partition, which essentially kills the App Controls view for me, since the extra command to show the full Control Strip means tapping the smallest target on the whole Touch Bar: an expander chevron sitting right next to one and often two commands I don’t wish to activate.

Set to show the Expanded Control Strip at all times, the predictability helps, but at what cost? An opportunity lost – App Controls tossed to the wolves because of how they shadowed the, for me, much more common global controls; the Touch Bar existing only to poorly and lossily reincarnate what was previously there. “So don’t get a Touch Bar” is a valid retort, and if I could have gotten a model with plain function keys, I certainly would have. Or, better yet, a model with a Touch Bar set in a row beyond the function keys. Certainly in the 16” model, there’s room for another row up top, maybe at the cost of being slightly uncomfortable to reach.

If you live your computing life under the auspices of the Touch Bar’s functionality and manage to use it to the hilt, you may well feel fonder of its flexibility than I do, especially if it adds significantly to your personal workflow. For me, it just plain doesn’t – and I miss the utility, tactility, predictability and dependability of what was there before. It is tempting but harsh to say “the reason it’s called a bar is because it makes you want to drink”, but it nevertheless sums up my experience with it, even after giving it a chance.

Touch ID

Touch ID exists and works. It works well, it works fast, it’s integrated in many places, it’s available to third-party applications and it’s convenient, even making double-clicking the Apple Watch side button (which also serves as authentication) feel like a comparative struggle. It is the best kind of feature, implemented in the best kind of way; it is also incredibly boring in its perfection.
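To illustrate that third-party availability, here is a minimal sketch using the LocalAuthentication framework, which is how apps reach Touch ID on the Mac; the function name and reason string are my own placeholders.

```swift
import LocalAuthentication

// A minimal sketch: ask the system to authenticate the user with Touch ID.
// The localizedReason string is what the system prompt displays.
func unlockWithTouchID(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    // Bail out if no biometrics are available or enrolled on this machine.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) else {
        completion(false)
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "unlock your documents") { success, _ in
        completion(success)
    }
}
```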

Force Touch trackpad

The main conceit of the Force Touch trackpad – besides having an eyebrow-raising name – is the observation that administering haptic feedback from below (using the Taptic Engine linear actuator) feels, to the finger, indistinguishable from the trackpad button actually depressing, in the way of traditional “diving board” trackpads. For this to work, force has to be sensed and the click simulated beyond a threshold, and the same sensing is also used to give “deeper” presses alternate meanings.
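As a rough sketch of what that sensing looks like to software: AppKit reports pressure in stages, where stage 1 is the ordinary simulated click and stage 2 the deep press (the view class here is invented for illustration).

```swift
import AppKit

// A rough sketch: AppKit delivers pressure updates in stages. Stage 1 is the
// ordinary (simulated) click; crossing the second threshold yields stage 2,
// the "deep" press with its alternate meaning.
class PressureAwareView: NSView {
    override func pressureChange(with event: NSEvent) {
        if event.stage == 2 {
            // Deep press detected, e.g. trigger a lookup under the cursor.
            print("deep press at pressure \(event.pressure)")
        }
    }
}
```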

As a side note: being used to a trackpad misbehaving with a swelling battery underneath it, my clicks during the setup process were forceful and deep enough to trigger deep presses naturally unless I took care to go feather-light and feel for the activation point, which led me to activate the “deep press lookup” feature on many links instead of merely following them – a fate I posit is both common and confusing, since the force touch capability is neither obvious nor explained along the way.

The trackpad allows configuring the click stiffness, as well as turning off deep presses entirely. In theory I like the idea of deep pressing to trigger lookups, and also of “pressing harder” to fast-forward even faster, but I have switched the lookup feature to the old three-finger tap instead, since it was too easy to activate by accident; similarly, dragging files in the Finder is likely to accidentally trigger renaming. I am hopeful that, since there is some haptic feedback, there is room for muscle memory to relearn these parameters, unlike with the Touch Bar.

Lastly, the trackpad is also comically huge at first glance. The palm rejection is stellar – I have not once produced an errant click. As a consequence of its virtual nature, it also allows clicking at any point on its surface, which comes in handy but doesn’t yet occur readily to me.

Running with Scissors

Aside from the Escape key, the previously described features were all present and in full force on the models I resigned myself to staying away from. Conspicuously absent, however, is the butterfly mechanism. The scissor switch is nicked almost wholesale from the standalone Magic Keyboard, with the notable adjustments of slightly lower-profile keys and the reintroduced inverted-T layout of the arrow keys, which serves as a necessary locating feature. It feels different from the previous scissor-switch mechanism, but is so close that it will take weeks to game out whether there even is a winner in the comparison.

San Francisco ably replaces VAG Rounded in the key legends (and for what it’s worth, Apple maintains its decision to keep separate keyboards for Swedish/Finnish, Danish and Norwegian lettering, bucking the trend of laptop manufacturers smearing all Nordic languages onto the same keycaps in an unreadable mess that leaves some keys looking like an explosion in a glyph factory). Overall, it’s just plain good, and returns to the Apple laptop keyboard tradition of yore: unremarkable in its silent exceptionalism.

Brawn

I am not the right person to objectively judge the performance of this model, due to my upgrade path. A lot happens in five years. But thanks to the ability to load it up with significant GPU memory (a presumed choke point; individual Safari tabs would often glitch out massively after the Catalina upgrade), RAM and SSD, there’s once again a lot of headroom for many medium-sized tasks, and I haven’t run into a bottleneck in much of anything. It plows through everything with ease. This makes sense: the thermal envelope has been raised by reconfiguring the internals to free up more airflow and let the fans do a better job, leading the CPU to throttle less.

The SSD upgrade pricing is extortionate, but notably less so than before a price adjustment earlier this year, and the 8 TB upgrade option is somewhat reasonable in its ridiculousness due to the lack of such options across the industry. I expect this model to last at least as long as the previous one, and to deliver solid performance for at least the first four years.

The model also has much-improved speakers and microphone; I won’t dwell on them, since I don’t listen to the speakers directly a lot. But the going consensus is that they are genuine technical achievements and worth reading up on. On the flip side, the 720p camera is laughable, and a pro laptop launched at this date should include Wi-Fi 6 (802.11ax), already present in this year’s iPhones.

USB-C

There are four Thunderbolt 3 (USB-C) ports and one 3.5 mm headphone jack, and that’s still it. USB-C is more commonplace now, so the market has moved the needle and improved Apple’s justification.

It still carries all the previous problems of a budding universal connector, though; it connects directly to almost nothing, even for the many people who have made the deliberate transition. Not only early 2015 MacBook Pro models but also most laptops on the market today come with additional ports for USB-A (the traditional plug), HDMI and, not unusually, an SD card slot. The MacBook Pro also used to host the magnetically attached MagSafe 2 power connector, whose absence is sorely felt not only in ease of detachment but in the charging LED indicator. (Its cable shroud was not known for its reliability, but I predict the identical form used on the USB-C power cables won’t be either.) There are third-party alternatives available, but all of them look significantly clunkier, and almost none provide the full 96 W offered by the included AC adapter.

Phil Schiller offers a reasonable-ish defense when pushed – that in their place, the connectors that deliver the widest landscape of opportunities are the Thunderbolt 3 ports – an argument clothed in the “giving the pros what they wanted” angle Apple chose for the release of this model. To me, it sounds off – Apple already makes plenty of sacrifices in the name of a more compact, cleaner product. If giving the pros what they wanted was a high priority, delivering at least one of MagSafe, a USB-A port, an HDMI port or an SD card slot should have made the cut. (As would making the Touch Bar optional.) And the workaround dongles everyone will have to acquire, if not carry regularly, do nothing to bolster a compact setup with a clean profile, and everything to highlight its inadequacy.

Displays, near and far

The major determinant of the product, the 16-inch display, has mostly gone unmentioned. Like Touch ID, it performs admirably in a boring way; unlike it, there are interesting details to mention. The slimmer bezels do a lot to make every other MacBook model look dated. True Tone (adjusting the white point to the ambient lighting) is present and accounted for, and works as well here as on every other Apple device it’s been implemented in (including the Touch Bar). The leap in resolution is not tremendous, but the added space is welcome. Although Apple has yet to bring its ProMotion variable refresh rate (or high refresh rates of any kind) to any MacBook, the display does offer manually selectable additional refresh rates that are integer multiples of common content frame rates, and thus avoid pulldown.
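The arithmetic behind that last point is simple enough to sketch (a toy illustration, not any Apple API): if the refresh rate is an integer multiple of the content’s frame rate, every frame is held on screen for the same number of refreshes; if not, frames are held for alternating counts, the 3:2 pulldown judder familiar from 24 fps film on 60 Hz displays.

```swift
// A toy illustration (no Apple API involved): whether a display mode can
// show content at a given frame rate without uneven pulldown.
func cadence(displayHz: Double, contentFps: Double) -> String {
    let ratio = displayHz / contentFps
    if ratio.truncatingRemainder(dividingBy: 1) == 0 {
        return "even: every frame held for \(Int(ratio)) refreshes"
    }
    return "uneven: pulldown needed (ratio \(ratio))"
}

print(cadence(displayHz: 60, contentFps: 24)) // uneven: pulldown needed (ratio 2.5)
print(cadence(displayHz: 48, contentFps: 24)) // even: every frame held for 2 refreshes
```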

The Catalina/iOS 13 feature Sidecar – using a recent iPad as a secondary display – is not new to this device, but it is the first time all the requirements have lined up for me to be able to test it. It’s a wireless connection powering a Retina-level display; knowing AirPlay’s performance, it shouldn’t work, but it is undeniably smooth, with no discernible lag, and it runs rings around AirPlay in my brief testing (I’m not in the target audience). Look for matching wireless technology in a future Apple TV to upgrade AirPlay’s performance. I did see the Touch Bar have issues showing up, and also trigger another glitch, bringing us swiftly to the well-known nugget:

“Never buy an Apple first-generation product”

I have seen two glitches, both related to waking the MacBook Pro up. Some of the time, the display color reproduction gets out of whack with a saturated posterization; toggling True Tone, the refresh rate or automatic brightness adjustment is usually enough to get it to snap out of it. (The Sidecar tie-in is that changing its settings could trigger this condition, as well as cause the MacBook Pro display to forget its resolution and revert to the default.)

Worse, a significant fraction of the time, waking it from sleep will show either a login window that hasn’t finished initialization (no placeholder label in the login password field, for example), or a partially intact scene from one of the full-screen spaces, atop which a cursor forever beachballs, blissfully ignoring clicks, keys, Touch ID or Apple Watch unlocks. There’s no indication that this hang ever resolves.

(Both of these issues are present on macOS Catalina 10.15.1, Supplemental Update included.)

Perfectly Conflicted

A machine is more than the sum of its parts. It can also be a way of communicating changed priorities. There is no contest that this is the most powerful MacBook Pro ever offered; it would be mad for any new MacBook Pro to be less powerful, but how this model manages to be that much more powerful than its immediate predecessor while still using the same CPUs is noteworthy.

It also reconsiders blind alleys, some of which I was starting to give up on as lost causes. It makes decisions that I love, and it makes decisions, still, that make me roll my eyes. In the ways that previous MacBooks Pro have been sub-par, it is thoroughly, stupendously good, but it is also shockingly expensive compared to what it should cost. And at least during testing, it had impactful, worrying glitches, my recommendation hinging on their timely resolution.

In a sentence, it is both perfect and stupid, rolled into one. It is not the computer I wished for, but it has things in it that I wished for, and it is enough to get me through the next few years and feel like charting that course wasn’t a fool’s errand — money given to a company that’s still willing to listen to its customers, even if it could work on its humility and attitude.

As the old Swedish saying goes: If you don’t get what you love, you get to love what you get.

(As tested: 2.4GHz 8‑core Intel Core i9; 64GB of DDR4 RAM; 4TB SSD; AMD Radeon Pro 5500M with 8GB of GDDR6 memory.)

Let a Hundred Flowers Bloom

South China Morning Post:

“By allowing its platform to clear the way for an app that incites illegal behaviour, [does Apple] not worry about damaging its reputation and hurting the feelings of consumers?” said a bellicose commentary published on the app of People’s Daily, the Chinese Communist Party mouthpiece.

Apple’s current situation between a Communist dictatorship/market-that-the-stock-market-would-rather-it-found-its-future-growth-in and an increasingly concerned user base is entirely their own fault.

When the iPhone App Store first launched, downloadable and installable applications had been documented fact for several years, and across several generations of smartphones. What Steve Jobs - and there is significant evidence that, as the last holdout while everyone else was foaming at the mouth to give developers the permission and tools to make apps and games, it really was literally him - wrought upon the world was a form of developer platform that dressed up the closed market of an authoritarian state as a convenience and an enabler of security and trust.

I have discussed the flaws of the App Store at length, but looking at it through the prism of the current situation, an alternate universe emerges, where apps were possible to plainly and easily distribute, and Apple, along with other platforms, could have played the role it looks to define for itself as a maker of tools for the misfits, the rebels and the square pegs in round holes. The alternate universe was the default, and Apple bent the arc of history towards the current situation.

Apple may not have created the first App Store in existence, but just as there was a before and after the iPhone, there was a before and after the App Store, and this model has now been picked up by most other platforms, making life easier not for the developer or the customer, but for the platform runner – and certainly for illegitimate, repressive regimes like the one holding a sixth of the world’s population hostage.

Put a dent in the universe, indeed.

Fan-spec

CUPERTINO, Calif.

For immediate release.

Apple Inc. today announced its all-new revolutionary line of MacBook Pro notebooks, headlined by the brand new 16-inch MacBook Pro. This model has a 120 Hz ProMotion HDR Retina display, powered by a built-in Vega II graphics processor, and fits one more inch of display into the device dimensions of the 15-inch MacBook Pro.

“This is what our customers have been dreaming of for years,” said Senior Vice President of Worldwide Marketing Phil Schiller.

All MacBook Pro models sport a stunning, time-proven, industry-leading scissor-switch keyboard, improving upon the radical butterfly mechanism. Additionally, customers get a choice of a full Touch Bar + Touch ID configuration or function keys + Touch ID.

“This is what our customers have been dreaming of for years,” said Senior Vice President of Worldwide Marketing Phil Schiller.

The Thunderbolt 3, USB-C ports and 3.5 mm headphone jack are joined by two USB-A ports, an SDXC card slot and an HDMI 2.1 port, capable of fully powering a 4K external display at 120 Hz or an 8K display at 60 Hz.

“This is what our customers have been dreaming of for years,” said Senior Vice President of Worldwide Marketing Phil Schiller.

The new MacBook Pro line also contains numerous other improvements, like the latest Intel processors, a new user-serviceable battery with 20% longer all-day battery life, a 4K FaceTime camera with a built-in privacy slider, even faster flash storage, up to 128 GB RAM and a matte display configuration.

“This is what our customers have been dreaming of for years,” said Senior Vice President of Worldwide Marketing Phil Schiller.

While no one was looking, Apple also announced a 5K Pro HDR Display for $999 including a Pro Stand, and a modular Mac Pro mini starting at $1499 with an 8-core Intel i7 processor, 8 GB of RAM, 256 GB of NVMe M.2 storage and three PCI Express expansion slots.

“We think the significant fraction of our customers who don’t work at movie lots with functionally infinite resources will appreciate our new additions to the pro—“ attempted Vice President of Special Projects and previous head of Mac Hardware Engineering Bob Mansfield, yelling from outside a third-level Apple Park press briefing room window, standing on a metal ladder that appeared to be indistinctly yanked away by an autonomous vehicle.

“This is what our customers have been dreaming of for years,” said mildly perspiring Senior Vice President of Worldwide Marketing Phil Schiller.

No Escape

Apple, to the extent they ever were, has stopped being a company that can move quickly. They have long pipelines, characterized even by Tim Cook as a “treadmill of innovation”. They know which product they will put out, roughly, in a year, which OS it will align with, and which new standards will be ready by then to take advantage of. I think this is why I am having such conflicted feelings about what’s going on now, as different products transition in and out.

The new Mac Pro was introduced, and it again embraces what a fully loaded computer can be and the power it gives its user, going as far as to bake the “cheese grater” worship into the visuals. (They knew.) But it also starts at a wallet-melting $5999. (You know how much money that is? That’s, like, six Pro Stands!) The grater everyone was wishing for started at $2499, and was within range of many more Mac users.

The (12-inch, one-USB-C-port) MacBook was scrapped today, but so was the MacBook Pro “Escape”, the 13-inch model without a Touch Bar. The new models are upgraded and better, unless you want to do advanced things like press down a key and be reasonably sure which letter shows up on the screen and how many of them.

The last four or five years have been like a walk in the desert for Apple. They are exceptionally good at some things, like miniaturization and betting on new standards and “skating to where the puck is going to be”, and in a world where you risk getting stranded atop local maxima, it’s a good tool to have in your belt. But that’s all it is - it’s a tool.

Starting roughly around the 12-inch MacBook, they let it be their only virtue. The problem is that they are the only vendor on their own platform, and have an increasing number of people with a wide range of problems to solve. There’s nothing wrong with having a laptop with only USB-C ports, but if several years on people haven’t dropped the other ports, it’s quite possible it’s a good idea to have a computer with both USB-C and other ports on it. It’s quite possible you could shrink the Mac Pro down a bit so it’s not quite so monstrous, sell it with a moderately powerful i7/i9 (or AMD Ryzen, once USB4 comes around and Thunderbolt support doesn’t have to be dropped) at less than half the price, and rule the galaxy. It’s quite possible you could offer MacBook Pros both with and without Touch Bars, and let people choose which they like.

None of this means they’ll have to stop doing what they were previously doing - which shouldn’t matter, but since to Apple “not being completely right” seems to be heart-aching, world-view-shattering anathema, maybe it helps. I notice that in the grand scale of things, a more capable computer in the MacBook Air won out either in the marketplace or in Apple’s plans (probably both) over the sleek-for-sleekness-sake 12” MacBook, and that seems promising.

Simplification is a useful tool, too. Having fewer products is better. But it’s only better as long as you end up making the right computer for your user base.

(Edit: Another positive sign I missed - the SSD upgrades have gone from armed robbery to mere pickpocketing. When $1400 can be shaved off an upgrade and prices are still high, at least you know they were extortionate to begin with.)

Exit, Not Pursued by a Bear

There are many things to say about WWDC, and I may say some of them in other posts, but the more I look at SwiftUI, the more I like it. Marzipan/Project Catalyst/UIKit on Mac/“iPad apps on Mac” is still as much of a stop-gap money grab as it ever was, but I was wrong to assume that it was the totality of what was up the Cupertonian sleeve.

SwiftUI is in software what so many of the hardware hits have been - a hundred small things that have individually been done before, but put together in a coherent package and seemingly done well. Whether you find precedents in Elm, Svelte, React or WPF/XAML, SwiftUI is an amalgam of sane, well-chosen ideas, mixed with some new ones, like the ostensible compaction of wrapper views down to a sparse and efficient rendered layer. And for once, SwiftUI isn’t a misnomer. It builds on years of wrangling a new language to the point where it allows something like it - like the pervasiveness of a deep and dependable mutability model, without which the checks signed by all the features couldn’t be cashed.
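For a taste of what that amalgam looks like in practice, here is a minimal counter (my own toy example, not from any Apple sample): the view is declared as a function of its state, and mutating the @State property is what drives the re-render.

```swift
import SwiftUI

// A minimal toy example of the declarative model: body describes the UI as
// a function of state, and changing @State re-renders what depends on it.
struct CounterView: View {
    @State private var count = 0

    var body: some View {
        VStack {
            Text("Count: \(count)")
            Button("Increment") {
                self.count += 1
            }
        }
        .padding()
    }
}
```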

So many of Apple’s decisions, especially for operating systems and frameworks, have been made from a position of weakness and under the pressure of deadlines. If there had been no SwiftUI, this would still have been the biggest WWDC in many years. But SwiftUI looks like a new idea that’s been allowed to grow and mature; built after a long, hard think, driven by exploration and ideas instead of forced deadlines.

I rhetorically asked for a “Cocoa X” rethink. SwiftUI is only the UI, but it is a fundamental rethink of that problem. It’s additive and incremental, and still looks and feels native, because it is; no iPad-looking concoction transplanted into the middle of your Mac app, or vice versa.

James Joyce said: “History is a nightmare from which I am trying to awake.” Show last week’s Objective-C code written against the iOS 12 SDK’s UIKit to a NeXTSTEP developer, and they might still recognize most things. Code doesn’t rot, but new ideas do come around. It is worth looking back at the past, looking at the timeless gist of the problem, and wondering if, 32 years later, we don’t have a better way to get where we need to go.

This is The Moment Everything Changes

Preface

Anyone who has been reading Waffle as a matter of course for the past dozen years or so knows that most of it has been aimed at discussing Apple, or changes in the technology landscape, or preferably the overlap thereof. Most of what I write tends to come out saying the same things, which could be pointedly but not indefensibly summed up as change is wrong. I am a strange creature, both pouncing on the new and the exciting, rushing to extol its virtues and celebrate seeing things from a new angle, but also demanding that what’s good about the old is maintained, lessons not lost like drawings in the sand.

In raving about the new, I am rarely alone, and in supplying critique, I am rarely the best, so it would not surprise me to be viewed as a one-trick pony, an obsessed kook. I should choose wider subjects, but I write about these things because the potential (and all too often actual) downsides affect all of us, and I see so few attempts to cover them end to end, over time, in depth, with the focus of someone who for better or worse thinks about it all the time. Not only do I make these kinds of decisions for the software I work on, everyone else’s decisions have consequences for everyone, me included. I don’t think I can talk back the tide, but if the tide is going to swallow something I love, at least it shouldn’t go silently.

WWDC

On Monday, June 3rd, Apple holds its annual Worldwide Developers Conference, WWDC, in San José, and is expected to set out its direction on a number of issues. It will be a momentous day. I have at times made a series of predictions; this time I thought it better to talk through them instead.

Marzipan

First, Apple is supposed to complete the curveball it threw all of us at last year’s WWDC, when it started by saying that iOS apps were not coming to the Mac, and followed up by saying that instead, UIKit was coming to the Mac, presenting four new apps that looked like straight-up ports of their new iOS incarnations but had in fact not even been ports. Supposedly, UIKit on Mac is codenamed “Marzipan”, and this codename leaked well ahead of last year’s WWDC.

I am not wild about Marzipan. Steve Troughton-Smith, who has at least a five-year record of believing UIKit is the future of all UIs anywhere, likes to frame this as a fear of change. Of course there’s a point there: if you start a new application framework design in 2005–2006, have the learnings of Mac OS X in fresh memory and are restrained from doing wild stuff by weak hardware and few resources, you’ll probably end up with a lot of fat trimmed, and many mistakes not repeated anew.

The reason I’m not wild about Marzipan is that wanting to use a Mac in the first place has always been about liking the way things are subtly different and subtly better. The Marzipan apps so far have been completely bled of this quality. They make the same mistake “Universal” Windows applications did, which is to believe that taking a touch interface and sprinkling keyboard-and-mouse adaptiveness on top of it is “enough”. It is “enough” for a dropdown menu to be one of those scrollable list pickers - the ones designed for a finger to swipe through on a constrained display, with haptic feedback guiding you. (This was UI that Apple actually shipped, in an app that wasn’t just a major feature of an OS update but a flagship app of a new framework.) At least the UWP applications can more readily expect the screen on a laptop to respond to touch.

The thought of Marzipan being capable of delivering something Mac users will recognize and praise as Mac-like is laughable; the thought of it subsuming Cocoa to become the recommended default is offensive. Cocoa eclipsed Carbon because it was better at providing a Mac-like experience. For all the recent iOSsification of macOS, I still don’t see this being the case without extensive surgery. If anything, the way forward should have been a “Cocoa X”, designed from scratch with the learnings of both UIKit and AppKit/Cocoa in mind. The current Marzipan apps are abominations, not aspirations.

Mac Pro

The Mac Pro timeline for the past eight years is nearly comical. The classic big honkin’ tower Mac Pro (the only one with a traditional desktop form factor and support for expansion cards) was left without updates long enough that people worried where it was going, until finally Apple revealed a cylindrical Mac Pro with almost no built-in expansion, under the promise of more to come, only to leave it, too, hung out to dry with no updates for several years, until they invited a few journalists to leak that they were working on a new “modular” Mac Pro. That was now more than two years ago.

The Mac Pro doesn’t affect me personally - but it affects me insofar as it defines the bounds of the platform. If Apple wanted to give exactly zero figs about professional usage (such as it traditionally applied to the platform: primarily scientific work and media production), the time for them to silently drop their involvement, Xserve-style, has come and gone. They have dug in, and with the iMac Pro have produced a stopgap model that, while not perfect, is at least a milk bone tossed to this demographic.

My prediction for the Mac Pro is that the opaque talk about “modular” doesn’t mean they have vectored back to their customers’ wishes, despite those wishes being inconveniently fueled by actual needs and requirements. They are going to produce a computer that is physically somewhere between a Mac mini and a “Shuttle PC”, with extremely minimal, if any, internal expansion, and most use cases still routed to external (and expensive) Thunderbolt 3 (or possibly extremely early USB4, which subsumes it to some extent) devices and chassis. The Mac tower will remain dead, “modular” will refer only to not including the display in the body of the computer, and opinions driven by facts will be vented and dismissed as “emotional” due to their inclusion in the ongoing facepalm saga that is the modern Mac Pro era.

The Future of the Mac

Marzipan will be the banner headline of a macOS release that will, years later, scarcely be remembered for anything aside from this. The way the Mac Pro goes will also give a clear signal of what’s most important to Apple at the end of the day. It has become apparent over the past few years that Apple is more interested in what is sleek and minimalist than what is actually useful, usable and powerful.

Apple’s first advertisement announced that “simplicity is the ultimate sophistication”; regular Mac users can enumerate many cases where the decision has gone for the simplistic or the sophistic instead. People are holding on to their several-year-old laptops, hoping they don’t break, because the new keyboard is such a marvel of engineering that it can’t successfully do things asked of every other keyboard on the planet. Steve Jobs once said that Apple doesn’t know how to build a $500 computer that isn’t a piece of shit - it’s now a worrying possibility that it has forgotten how to supply a keyboard at any price that isn’t worse than the cheapest Dell pack-in.

But the key word is “forgotten”. There used to be a time when Apple had no trouble pumping out regular updates to its Macs, saving the generational upgrades for every few years and just putting out a spec bump now and then. Over the past few months they have seemingly been trying to get back into that habit. It’s nothing a company jousting for the position of the world’s most highly valued should beat its chest over, but when a sign of health has been missing, its recurrence is appreciated.

The maligned butterfly keyboard is still there on the MacBook Pro bumped just a few weeks back, which was bizarrely added, the same day, to the eligibility list for the keyboard repair program. The favorable interpretation is that the keyboard is there to calm customers, but that by definition this can’t be a status quo that lasts forever, so it’s a tacit confirmation that a keyboard free of all these issues is again being planned for future products. (When that’s the favorable interpretation, you know you’ve fucked up.)

Between the increasingly tightened application environment (in the name of security), the inscrutable hardware decisions, the software quality issues and the increasing lack of a long-term roadmap, Mac users have been stuck in a time loop for years now. New OS versions bring few new features but many incompatibility worries, and applications not updated recently risk falling by the wayside, as do developers not ready to jump onto whichever incremental feature or user interface fashion refresh will ultimately not benefit macOS users as much as a good old focus on bringing productivity, usability and flexibility up.

Apple is a big company, devoting medium-company resources to a small-company mindset. Being a startup in terms of being agile and willing to take risks is great, but it’s now juggling both the macOS Mac and the iOS iPad as competing computing visions, where both can be said to be troubled: stymied by hardware and increasingly unwilling to let developers unleash their own creativity for the benefit of their users’ productivity and flexibility.

Whether I’ll like the outcome or not, the cards are stacked for Apple to weigh in heavily on all these things (including possibly by inaction, to focus much more on iOS) come Monday. If optimism left me easily, I would be typing this on a capable PC laptop instead (although possibly swearing equally at a UWP Windows future). But I am holding my breath, because one way or another, when all of WWDC has been summed up, we’ll be able to look back at it and say that it was the moment where everything finally, ultimately, irrevocably changed.


(Postscript, five minutes before the keynote: I see via Twitter that I have left out contact details on this weblog. Since I am indeed a Comic Book Guy-like curmudgeon who can be dismissed as such, you should not feel the need to send any emails, since it would probably be a pointless exercise. Better yet, write up the reasons why I’m wrong and post them somewhere! That way you’ll inform more people than me. It’s okay if it takes more than 280 characters. And yes, it’s also okay if someone halfway across the Internet is wrong, puerile or misinformed.)

Tiny

There’s no real way to look at the Panic Playdate and see hard-edged, economically shrewd value. That metric leads you astray, overvaluing the 8000-in-one white-label thingama-SouljaGames that far better fit the predictable accusations of hipster indulgence.

What I love about it is a recently recurring theme that, amidst a polarized and increasingly dehumanized society, has been easy to disregard: the glimmers of hope. A group of under a dozen people can still create a little thing like this, including its own damn OS, just because they love the feel of technology built by those who care.

There were a thousand reasons to not build it. There were a thousand reasons to run in the opposite direction, to give up, to completely cede the ground to consoles and touch and game streaming, to things that can be screen captured to Twitch.

But there are also a thousand reasons to do it. The reason our world is crap is the funneling of everything into gargantuan seas of milky mediocrity. The biggest entrant wins by subsuming everyone else, by swallowing and outspending and walking all over the competition. The only way out is for life to be a puzzle again, a challenge, for someone’s charming ideas and pet projects to be valued beyond digits on a bank balance readout.

Technology, science and human progress all exist so that we may stand on the shoulders of our forebears. For decades we have bent semiconductors and materials to our will, but to see that a collection of people who could fit inside the average kitchen can build something more or less from the ground up, with so much character, and so significantly outside their comfort zone, is truly inspiring. In a world full of cynicism and derivative madness, what could be better?

Who knows if I’ll get one, but I’m on their side.

The Ones Who See Things Differently

Think Different was about respect for creators. It was about creativity, unconventional thinking and real courage, to change people’s lives, turn the tide, bring education and humanity and a better understanding of the world to the masses. Think Different was about having people like that as your heroes, and wanting them and other people to have a tool that met them halfway and let them focus on what mattered to them.

It was a justification to do something differently than the behemoth.

Today’s event was about an Apple that may still have their sights on some important values compared to other behemoths, but where the focus is on the fervent belief that whole-banana-ism needs to extend to every corner of everyone’s life. Behemoths like Apple, Amazon, Google and Microsoft act as if they need to have fingers in every pie, provide solutions to every problem, build complete stacks.

Even people who loved Think Different and who still love Apple know there’s more to reading news than Apple News+ (or indeed the regular Apple News app), more to games than Apple Arcade (indeed, indie games have been turning conventions inside out for decades) and more to TV than Apple TV+. Regardless of whether they include good products, Apple is starting to insult both the people who use its products and the heroes they hopefully still respect.

The Internet has torn down walls and connected people, and even though everyone has a full stack and a streaming platform, no single place is a catch-all any longer. Every bucket of “exclusives” is a dated prayer for a dream of control and containment and world domination. What we all crave is for a world that understands interests and respects choice and diversity, where you choose what you want without juggling worries of incompatibility. Not snooty, self-important “curators”, claiming themselves the world’s greatest in fields they have not entered into before, when they can’t even keep fake “antivirus” apps out of their own decade-old App Store, stabilize spiraling software quality years in the making, and when their user interface vocabulary has you pressing a “Share” button to use a “Find on page” command.

Desktop

The conceit

Picking a desktop PC platform right now is a classic case of picking your poison. Apple cares less about their desktop platform with every passing day, but so, apparently, does the rest of the world. I use Windows 10 every day, and I wish I didn’t - if Microsoft was still interested in advancing their platform the way they did between Windows Vista and Windows 7, I might switch to Windows tomorrow and make my life much easier.

It’s frighteningly clear that no one at Apple or Microsoft values the way macOS and Windows respectively have worked enough to not see turning parts of them into a tablet OS – irrespective of the fact that tablets have not gone beyond gimmicks for most desktop users – as progress.

The third place

The Linux crowd is what’s left, and they cook their own punch, so they seem like an obvious refuge. I’ve been giving them short shrift for reasons that make sense to me, but may not be obvious.

The phrase “user experience” is thrown around a lot these days in place of “UI”, and sadly also in place of “usability”. I’ve always been curious and have tried using many platforms. I cut my teeth learning to use computers in the System 6 era of the Macintosh, during the period when there was a clear chasm between the Macintosh OS and Windows 3.11. The Mac was a coherently designed platform, with cohesion, with a sense of nuance and personality. It, and Windows 95 after it, had a culture that became synonymous with the OS, that put clear expectations in your head as a user and that you could fulfill as a developer. You don’t need a single company to do this, but you need a place of authority that’s open to criticism and change, that will adapt and survive, that exudes longevity.

Unix is a platform like this, and it has a coherent and cohesive user experience. It has clear rules and ideals that make sense - they’re all about doing one thing well and being reusable. The problem isn’t Unix. The problem is that if you don’t want to live your life inside a command prompt (no offense to the people who do), it starts to fall apart. Package managers work great. X is a trainwreck; fixing it is impossible, and getting people to move to Wayland is still an uphill battle. The Unix idioms and ideas are being applied to graphical UIs, where they don’t make sense, and what results is a cacophonous mess, not conducive to a pleasant and effortless user experience; things will look different and work differently and be largely programmed, structured and maintained by people who do not care about usability. There are people who do care about it, but in this culture they tend to move on to other things.

All this said, the significant time I’ve spent in various forms of GNOME and KDE over two decades has not once instilled confidence in me. (Again, aside from package managers. Although with systemd now being a sprawling, hot mess prone to extreme security flaws yet still heading towards near-universal adoption, maybe I just have no idea what’s going on, and all of this can be ignored.)

The continuation

The previous section was not simply a detour. You may notice that other platforms have their own signature viewed through the platform cohesion prism. Windows 95 and Windows XP were both very cohesive. iOS started out being incredibly cohesive, has gone here and there, and is arriving at a point where I may not agree with all the decisions, but at least the apps in the OS feel reasonably consistent.

The Web is a difficult beast. Web pages, by and large, are easily understood. They have an easier job because they’re not asking for a lot of interaction. They are mostly vehicles for information (text, video) with limited interactivity. Web pages are cohesive, for the same reason reading a magazine is cohesive. There’s no learning curve.

Web apps? It depends on the app itself. The original version of Gmail looked like someone had made a UI toolkit out of Google’s own web site, but it was easy enough to use and solved more problems than it created.

Web apps as a way to implement the range of functionality most often associated with full-on desktop applications? Only very rarely done well, and I can’t think of a good example off the top of my head.

Web apps are part of the problem, but they’re not the only part of the problem. The other part is the copy-that-floppy road of lazily emergent (read: lack of) user experience design. People wanted to make early mobile apps that put many choices at a user’s fingertips, so they invented the hamburger button, opening a sidebar menu, as a reduced form of a menu bar. Other people didn’t want to change user paradigms, so they put the same interface on tablet apps. Yet other people wanted to make desktop apps look modern, so they made them look like tablet apps. And so what we now have are web implementations of 3.5”-ish touch screen assumptions for a desktop platform. If your goal is to make an application that users of desktop applications will find familiar, you couldn’t start from a worse place if you tried.

It’s not that desktop apps should look the same if you leave the planet and come back ten years later. It’s that they should at least not completely give up everything that was put in place to make them understandable and efficient; and that if they do replace those things, the replacements should be designed with desktop applications in mind. Shrugging your shoulders and saying “but everyone has a phone and everyone uses the web” is like arguing that the door on a microwave should look like an actual house door, or the lid of a toilet.

The future

So what’s the answer?

Wipe the slate clean. I don’t mean of influences - I mean take everything that has worked at some point or another, everything that wasn’t just ported over from somewhere else in the name of expediency, and build something new from those parts. Flat design for things that are interactive is a usability disaster, because not being able to tell interactive things apart from static ones will slow you down. (If nothing is interactive, you don’t have a problem that a magazine art director from 1965 couldn’t solve; go about your day.)

We’re at a saddle point in history right now, where the road back makes you look old, and the road forward is daunting because who even talks about desktop PC environments anymore, right? But that line of thinking has gotten us 10-15 years of a desktop rat king, made from tablets, phones, web pages and a little backported, misappropriated good old magazine layout. It’s time for someone to sit in a hammock for a year and work this out.

Like Giving a Glass of Hell to Somebody in Ice Water

Being one of the world’s highest valued companies means you can be brilliant at some things, completely useless at other things and have your head up your ass about most things.

So if all you can talk about here is how wrong they are, what’s keeping you?

This is a complicated question with a complicated answer. (If it makes things more expedient for you, feel free to just call me a shill, cult member and/or idiot.)

The desktop environment

Picking a desktop PC platform right now is a classic case of picking your poison. Apple cares less about their desktop platform with every passing day, but so, apparently, does the rest of the world. I use Windows 10 every day, and I wish I didn’t - if Microsoft was still interested in advancing their platform the way they did between Windows Vista and Windows 7, I might switch tomorrow.

At the risk of disappearing up my own butt for a moment, it’s frighteningly clear that no one at Apple or Microsoft values the way macOS and Windows respectively have worked enough to not see turning parts of them into a tablet OS – irrespective of the fact that tablets have not gone beyond gimmicks for most desktop users – as progress.

So just switch to Linux

The Linux crowd is what’s left, and they cook their own punch, so they seem like an obvious refuge. I’ve been giving them short shrift for reasons that make sense to me, but may not be obvious.

The phrase “user experience” is thrown around a lot these days in place of “UI”, and sadly also in place of “usability”. I’ve always been curious and have tried using many platforms. I cut my teeth learning to use computers in the System 6 era of the Macintosh, during the period when there was a clear chasm between the Macintosh OS and Windows 3.11. The Mac was a coherently designed platform, with cohesion, with a sense of nuance and personality. It, and Windows 95 after it, had a culture that became synonymous with the OS, that put clear expectations in your head as a user and that you could fulfill as a developer. You don’t need a single company to do this, but you need a place of authority that’s open to criticism and change, that will adapt and survive, that exudes longevity.

Unix is a platform like this, and it has a coherent and cohesive user experience. It has clear rules and ideals that make sense - they’re all about doing one thing well and being reusable. The problem isn’t Unix. The problem is that if you don’t want to live your life inside a command prompt (no offense to the people who do), it starts to fall apart. Package managers work great. X is a trainwreck; fixing it is impossible, and getting people to move to Wayland is still an uphill battle. The Unix idioms and ideas are being applied to graphical UIs, where they don’t make sense, and what results is a cacophonous mess, not conducive to a pleasant and effortless user experience; things will look different and work differently and be largely programmed, structured and maintained by people who do not care about usability. There are people who do care about it, but in this culture they tend to move on to other things.

All this said, the significant time I’ve spent in various forms of GNOME and KDE over two decades has not once instilled confidence in me. (Again, aside from package managers. Although with systemd now being a sprawling, hot mess prone to extreme security flaws yet still heading towards near-universal adoption, maybe I just have no idea what’s going on, and all of this can be ignored.)

The desktop environment (cont’d)

The previous section was not simply a detour. You may notice that other platforms have their own signature viewed through the platform cohesion prism. Windows 95 and Windows XP were both very cohesive. iOS started out being incredibly cohesive, has gone here and there, and is arriving at a point where I may not agree with all the decisions, but at least the apps in the OS feel reasonably consistent.

The Web is a difficult beast. Web pages, by and large, are easily understood. They have an easier job because they’re not asking for a lot of interaction. They are mostly vehicles for information (text, video) with limited interactivity. Web pages are cohesive, for the same reason reading a magazine is cohesive. There’s no learning curve.

Web apps? It depends on the app itself. The original version of Gmail looked like someone had made a UI toolkit out of Google’s own web site, but it was easy enough to use and solved more problems than it created.

Web apps as a way to implement the range of functionality most often associated with full-on desktop applications? Only very rarely done well, and I can’t think of a good example off the top of my head.

Web apps are part of the problem, but they’re not the only part of the problem. The other part is the copy-that-floppy road of lazily emergent (read: lack of) user experience design. People wanted to make early mobile apps that put many choices at a user’s fingertips, so they invented the hamburger button, opening a sidebar menu, as a reduced form of a menu bar. Other people didn’t want to change user paradigms, so they put the same interface on tablet apps. Yet other people wanted to make desktop apps look modern, so they made them look like tablet apps. And so what we now have are web implementations of 3.5” touch screen assumptions for a desktop platform. If your goal is to make an application that users of desktop applications will find familiar, you couldn’t start from a worse place if you tried.

It’s not that desktop apps should look the same if you leave the planet and come back ten years later. It’s that they should at least not completely give up everything that was put in place to make them understandable and efficient; and that if they do replace those things, the replacements should be designed with desktop applications in mind. Shrugging your shoulders and saying “but everyone has a phone and everyone uses the web” is like arguing that the door on a microwave should look like an actual house door, or the lid of a toilet.