Let a Hundred Flowers Bloom

South China Morning Post:

“By allowing its platform to clear the way for an app that incites illegal behaviour, [does Apple] not worry about damaging its reputation and hurting the feelings of consumers?” said a bellicose commentary published on the app of People’s Daily, the Chinese Communist Party mouthpiece.

Apple’s current situation - caught between a Communist dictatorship/market-that-the-stock-market-would-rather-it-found-its-future-growth-in and an increasingly concerned user base - is entirely of its own making.

When the iPhone App Store first launched, downloadable and installable applications had been a documented fact for several years, across several generations of smartphones. What Steve Jobs wrought upon the world - and there is significant evidence that it really was literally him, the last holdout while everyone else was foaming at the mouth to give developers the permission and tools to make apps and games - was a form of developer platform that dressed up the closed market of an authoritarian state as a convenience and an enabler of security and trust.

I have discussed the flaws of the App Store at length, but looking at it through the prism of the current situation, an alternate universe emerges, where apps could be plainly and easily distributed, and Apple, along with other platforms, could have played the role it likes to define for itself: maker of tools for the misfits, the rebels and the square pegs in round holes. That alternate universe was the default, and Apple bent the arc of history towards the current situation.

Apple may not have created the first App Store in existence, but just as there was a before and after iPhone, there was a before and after App Store, and this model has now been picked up by most other platforms, making life easier not for the developer or customer, but for the platform runner, and certainly for illegitimate, repressive regimes like the one holding a sixth of the world’s population hostage.

Put a dent in the universe, indeed.

Fan-spec

CUPERTINO, Calif.

For immediate release.

Apple Inc. today announced its all-new revolutionary line of MacBook Pro notebooks, headlined by the brand new 16-inch MacBook Pro. This model has a 120 Hz ProMotion HDR Retina display, is powered by a built-in Vega II graphics processor, and fits one more inch of display into the device dimensions of the 15-inch MacBook Pro.

“This is what our customers have been dreaming of for years,” said Senior Vice President of Worldwide Marketing Phil Schiller.

All MacBook Pro models sport a stunning, time-proven, industry-leading scissor-switch keyboard, improving upon the radical butterfly mechanism. Additionally, customers get a choice of a full Touch Bar + Touch ID configuration or function keys + Touch ID.

“This is what our customers have been dreaming of for years,” said Senior Vice President of Worldwide Marketing Phil Schiller.

The Thunderbolt 3, USB-C ports and 3.5 mm headphone jack are joined by two USB-A ports, an SDXC card slot and an HDMI 2.1 port, capable of fully powering a 4K external display at 120 Hz or an 8K display at 60 Hz.

“This is what our customers have been dreaming of for years,” said Senior Vice President of Worldwide Marketing Phil Schiller.

The new MacBook Pro line also contains numerous other improvements, like the latest Intel processors, a new user-serviceable battery with 20% longer all-day battery life, a 4K FaceTime camera with a built-in privacy slider, even faster flash storage, up to 128 GB RAM and a matte display configuration.

“This is what our customers have been dreaming of for years,” said Senior Vice President of Worldwide Marketing Phil Schiller.

While no one was looking, Apple also announced a 5K Pro HDR Display for $999 including a Pro Stand, and a modular Mac Pro mini starting at $1499 with an 8-core Intel i7 processor, 8 GB of RAM, 256 GB of NVMe M.2 storage and three PCI Express expansion slots.

“We think the significant fraction of our customers who don’t work at movie lots with functionally infinite resources will appreciate our new additions to the pro—” attempted Vice President of Special Projects and previous head of Mac Hardware Engineering Bob Mansfield, yelling from outside a third-level Apple Park press briefing room window while standing on a metal ladder, which was then indistinctly yanked away by an autonomous vehicle.

“This is what our customers have been dreaming of for years,” said a mildly perspiring Senior Vice President of Worldwide Marketing Phil Schiller.

No Escape

To the extent it ever was one, Apple has stopped being a company that can move quickly. They have long pipelines, characterized even by Tim Cook as a “treadmill of innovation”. They know which product they will put out, roughly, in a year, which OS it will align with, and which new standards will be ready by then to take advantage of. I think this is why I have such conflicted feelings about what’s going on now, as different products transition in and out.

The new Mac Pro was introduced, and it again embraces what a fully loaded computer can be and the power it gives to its user, going as far as to bake the “cheese grater” worship into the visual design. (They knew.) But it also starts at a wallet-melting $5999. (You know how much money that is? That’s, like, six Pro Stands!) The grater everyone was wishing for started at $2499, and was within range of many more Mac users.

The (12-inch, one-USB-C-port) MacBook was scrapped today, but so was the MacBook Pro “Escape”, the 13-inch model without a Touch Bar. The new models are upgraded and better, unless you want to do advanced things like press down a key and be reasonably sure which letter shows up on the screen and how many of them.

The last four or five years have been like a walk in the desert for Apple. They are exceptionally good at some things, like miniaturization and betting on new standards and “skating to where the puck is going to be”, and in a world where you risk getting stranded atop local maxima, it’s a good tool to have in your belt. But that’s all it is - it’s a tool.

Starting roughly around the 12-inch MacBook, they let it be their only virtue. The problem is that they are the only vendor on their own platform, and serve an increasing number of people with a wide range of problems to solve. There’s nothing wrong with having a laptop with only USB-C ports, but if, several years on, people haven’t dropped the other ports, it’s quite possible it’s a good idea to have a computer with both USB-C and other ports on it. It’s quite possible you could shrink the Mac Pro down a bit to not be quite so monstrous, sell it with a moderately powerful i7/i9 (or AMD Ryzen, once USB 4 comes around and Thunderbolt support doesn’t have to be dropped) at less than half the price and rule the galaxy. It’s quite possible you could offer MacBook Pros both with and without Touch Bars, and let people choose which they like.

None of this means they’d have to stop doing what they were previously doing - which shouldn’t matter, but since to Apple “not being completely right” seems to be heart-aching, world-view-shattering anathema, maybe it helps. I notice that in the grand scale of things, a more capable computer in the MacBook Air won out - in the marketplace, in Apple’s plans, or probably both - over the sleek-for-sleekness’-sake 12-inch MacBook, and that seems promising.

Simplification is a useful tool, too. Having fewer products is better. But it’s only better as long as you end up making the right computer for your user base.

(Edit: Another positive sign I missed - the SSD upgrades have gone from armed robbery to mere pickpocketing. When $1400 can be knocked off an upgrade and prices are still high, at least you know they were extortionate to begin with.)

Exit, Not Pursued by a Bear

There are many things to say about WWDC, and I may say some of them in other posts, but the more I look at SwiftUI, the more I like it. Marzipan/Project Catalyst/UIKit on Mac/“iPad apps on Mac” is still as much of a stop-gap money grab as it ever was, but I was wrong to assume that it was the totality of what was up the Cupertonian sleeve.

SwiftUI is in software what so many of the hardware hits have been - a hundred small things that have individually been done before, but put together in a coherent package and seemingly done well. Whether you find precedent in Elm, Svelte, React or WPF/XAML, SwiftUI is an amalgam of sane, well-chosen ideas, mixed with some new ones, like the ostensible compaction of wrapper views down to a sparse and efficient rendered layer. And for once, SwiftUI isn’t a misnomer. It builds on years of wrangling a new language to the place where it allows something like it, like the pervasiveness of a deep and dependable mutability model, without which the checks signed by all the features couldn’t be cashed.
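To make that concrete, here is a minimal sketch of the declarative model being described - my own illustration, not code from Apple or this post; the view and property names are invented:

```swift
import SwiftUI

// A hypothetical counter: the UI is declared as a function of its state,
// and the framework compacts this description into the rendered views.
struct CounterView: View {
    // @State hands ownership of this value to the framework; mutating it
    // invalidates the body, which is recomputed and diffed automatically.
    @State private var count = 0

    var body: some View {
        VStack {
            Text("You have tapped \(count) times")
            Button("Tap") {
                self.count += 1  // no manual view updates; the Text follows along
            }
        }
    }
}
```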

So many of Apple’s decisions, especially for operating systems and frameworks, have been made from a position of weakness and under the pressure of deadlines. If there had been no SwiftUI, this would still have been the biggest WWDC for many years. But SwiftUI looks like a new idea that’s been allowed to grow and mature; built after a long, hard think, driven by exploration and ideas instead of forced deadlines.

I rhetorically asked for a “Cocoa X” rethink. SwiftUI is only the UI, but it is a fundamental rethink of that problem. It’s additive and incremental, and still looks and feels native, because it is; no iPad-looking concoction transplanted into the middle of your Mac app, or vice versa.

James Joyce said: “History is a nightmare from which I am trying to awake.” Show last week’s Objective-C code, written against UIKit in the iOS 12 SDK, to a NeXTStep developer, and they might still recognize most things. Code doesn’t rot, but new ideas do come around. It is worth looking back at the past, looking at the timeless gist of the problem, and wondering whether, 32 years later, we don’t have a better way to get where we need to go.
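For contrast, a sketch of the same counter in the imperative style this paragraph refers to - again my own illustration with invented names, written in Swift rather than Objective-C, but using the same target-action idiom that dates back to the NeXT era:

```swift
import UIKit

// A hypothetical counter, imperative style: allocate views, configure them,
// wire up target-action, and keep state and UI in sync by hand.
final class CounterViewController: UIViewController {
    private var count = 0
    private let label = UILabel()
    private let button = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        label.text = "You have tapped 0 times"
        button.setTitle("Tap", for: .normal)
        // Target-action: the idiom a NeXTStep developer would recognize.
        button.addTarget(self, action: #selector(tapped), for: .touchUpInside)
        view.addSubview(label)
        view.addSubview(button)
        // (Auto Layout constraints elided for brevity.)
    }

    @objc private func tapped() {
        count += 1
        label.text = "You have tapped \(count) times"  // manual synchronization
    }
}
```

Every step - allocate, configure, wire, synchronize - is explicit bookkeeping, which is precisely what the declarative sketch above does away with.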

This is The Moment Everything Changes

Preface

Anyone reading Waffle as a matter of course for the past dozen years or so knows that most of it has been aimed at discussing Apple, or changes in the technology landscape, or preferably the overlap thereof. Most of what I write tends to come out saying the same things, which could be pointedly but not indefensibly summed up as change is wrong. I am a strange creature, both pouncing on the new and the exciting, rushing to extol its virtues and celebrate seeing things from a new angle, but also demanding that what’s good about the old be maintained, lessons not lost like drawings in the sand.

In raving about the new, I am rarely alone, and in supplying critique, I am rarely the best, so it would not surprise me to be viewed as a one-trick pony, an obsessed kook. I should choose wider subjects, but I write about these things because the potential (and all too often actual) downsides affect all of us, and I see so few attempts to cover them end to end over time, in depth, with the focus of someone who for better or worse thinks about it all the time. Not only do I make these kinds of decisions for the software I work on, everyone else’s decisions have consequences for everyone, me included. I don’t think I can talk back the tide, but if the tide is going to swallow something I love, at least it shouldn’t go silently.

WWDC

On Monday, June 3rd, Apple holds its annual Worldwide Developers Conference, WWDC, in San José, and is expected to set out its direction on a number of issues. It will be a momentous day. I have at times made a series of predictions; this time I thought it better to talk through them instead.

Marzipan

First, Apple is supposed to complete the curveball it threw all of us at last year’s WWDC, when it started by saying that iOS apps were not coming to the Mac, and followed up by saying that, instead, UIKit was coming to the Mac, presenting four new apps that looked like straight-up ports of their new iOS incarnations but that had in fact not even been ports. Supposedly, UIKit on Mac is codenamed “Marzipan”, and this codename leaked well ahead of last year’s WWDC.

I am not wild about Marzipan. Steve Troughton-Smith, who has at least a five-year record of believing UIKit is the future of all UIs anywhere, likes to frame this as a fear of change. Of course there’s a point: if you start a new application framework design in 2005-2006, have the learnings of Mac OS X fresh in memory and are restrained from doing wild stuff by weak hardware and few resources, you’ll probably end up with a lot of fat trimmed and many mistakes not repeated anew.

The reason I’m not wild about Marzipan is that wanting to use a Mac in the first place has always been about liking the way things are subtly different and subtly better. The Marzipan apps so far have been completely bled of this quality. They make the same mistake “Universal” Windows applications did, which is to believe that taking a touch interface and sprinkling keyboard-and-mouse adaptiveness on top of it is “enough”. It is “enough” for a dropdown menu to be one of those scrollable list pickers - the ones designed for a finger to swipe through on a constrained display, with haptic feedback guiding you. (This was UI that Apple actually shipped, in an app that wasn’t just a major feature of an OS update but a flagship app of a new framework.) At least UWP applications can more readily expect the screen on a laptop to respond to touch.

The thought of Marzipan being capable of delivering something Mac users will recognize and praise as Mac-like is laughable; the thought of it subsuming Cocoa to become the recommended default is offensive. Cocoa eclipsed Carbon because it was better at providing a Mac-like experience. For all the recent iOSsification of macOS, I still don’t see this being the case without extensive surgery. If anything, the way forward should have been a “Cocoa X”, designed from scratch with the learnings of both UIKit and AppKit/Cocoa in mind. The current Marzipan apps are abominations, not aspirations.

Mac Pro

The Mac Pro timeline for the past eight years borders on comical. The classic big honkin’ tower Mac Pro (the only one with a traditional desktop form factor and support for expansion cards) was left without updates long enough that people worried where it was going, until finally Apple revealed a cylindrical Mac Pro with almost no built-in expansion, under the promise of more to come, only to leave it, too, hung out to dry with no updates for several years, until they invited a few journalists to leak that they were working on a new “modular” Mac Pro. That was now more than two years ago.

The Mac Pro doesn’t affect me personally - but it affects me insofar as it defines the bounds of the platform. If Apple wanted to give exactly zero figs about professional usage (as it has traditionally applied to the platform: primarily scientific work and media production), the time to silently drop their involvement, Xserve-style, has come and gone. They have dug in, and with the iMac Pro have produced a stopgap model that, while not perfect, is at least a milk bone for this demographic.

My prediction for the Mac Pro is that the opaque talk about “modular” doesn’t mean that they have vectored back towards their customers’ wishes, despite those being inconveniently fueled by actual needs and requirements. They are going to produce a computer that is physically somewhere between a Mac mini and a “Shuttle PC”, with extremely minimal, if any, internal expansion, and most use cases still routed to external (and expensive) Thunderbolt 3 (or possibly extremely early USB4, which subsumes it to some extent) devices and chassis. The Mac tower will remain dead, “modular” will refer only to not including the display in the body of the computer, and opinions driven by facts will be vented and dismissed as “emotional” due to their inclusion in the ongoing facepalm saga that is the modern Mac Pro era.

The Future of the Mac

Marzipan will be the banner headline of a macOS release that will, years later, scarcely be remembered for anything aside from this. The way the Mac Pro goes will also give a clear signal of what’s most important to Apple at the end of the day. It has become apparent over the past few years that Apple is more interested in what is sleek and minimalist than what is actually useful, usable and powerful.

Apple’s first advertisement announced that “simplicity is the ultimate sophistication”; regular Mac users can enumerate many cases where the decision has gone for the simplistic, or for sophistry, instead. People are holding on to their several-year-old laptops, hoping they don’t break, because the new keyboard is such a marvel of engineering that it can’t successfully do things asked of every other keyboard on the planet. Steve Jobs once said that Apple doesn’t know how to build a $500 computer that isn’t a piece of shit - it’s now a worrying possibility that it has forgotten how to supply a keyboard at any price that isn’t worse than the cheapest Dell pack-in.

But the keyword is “forgotten”. There used to be a time when Apple had no trouble pumping out regular updates to its Macs, saving the generational upgrades for every few years and just putting out a spec bump now and then. Over the past few months they have seemingly been trying to get back into that habit. It’s nothing a company jousting for the position of the world’s most highly valued should thump its chest over, but when a sign of health has been missing, its recurrence is appreciated.

The maligned butterfly keyboard is still there on the MacBook Pro that was bumped just a few weeks back - and was, bizarrely, added the same day to the eligibility list for the keyboard repair program. The favorable interpretation is that the keyboard is there to calm customers, but that by definition it can’t be a status quo that lasts forever, so it’s a tacit confirmation that a keyboard free of all these issues is again being planned for future products. (When that’s the favorable interpretation, you know you’ve fucked up.)

Between the increasingly tightened application environment (in the name of security), the inscrutable hardware decisions, the software quality issues and the increasing lack of a long-term roadmap, Mac users have been stuck in a time loop for years now. New OS versions bring few new features but many incompatibility worries, and applications not updated recently risk falling by the wayside, as do developers not ready to jump into whichever incremental feature or user interface fashion refresh ultimately won’t benefit macOS users as much as a good old focus on bringing productivity, usability and flexibility up.

Apple is a big company, devoting medium-company resources to a small-company mindset. Being a startup in terms of being agile and willing to take risks is great, but it’s now juggling both the macOS Mac and the iOS iPad as competing computing visions, where both can be said to be troubled, stymied by hardware and increasingly unwilling to let developers unleash their own creativity for the benefit of their users’ productivity and flexibility.

Whether I’ll like the outcome or not, the cards are stacked for Apple to weigh in heavily on all these things (including possibly by inaction, to focus much more on iOS) come Monday. If optimism left me easily, I would be typing this on a capable PC laptop instead (although possibly swearing equally at a UWP Windows future). But I am holding my breath, because one way or another, when all of WWDC has been summed up, we’ll be able to look back at it and say that it was the moment where everything finally, ultimately, irrevocably changed.


(Postscript, five minutes before the keynote: I see via Twitter that I have left out contact details on this weblog. Since I am indeed a Comic Book Guy-like curmudgeon who can be dismissed as such, you should not feel the need to send any emails, since it would probably be a pointless exercise. Better yet, write up the reasons why I’m wrong and post them somewhere! That way you’ll inform more people than me. It’s okay if it takes more than 280 characters. And yes, it’s also okay if someone halfway across the Internet is wrong, puerile or misinformed.)

Tiny

There’s no real way to look at the Panic Playdate and see hard-edged, economically shrewd value. The metric itself leads you astray, overvaluing 8000-in-one White Label thingama-SouljaGames - devices that fit the predictable accusations of hipster indulgence far better than the Playdate does.

What I love about it is a recently recurring theme that’s, amidst a polarized and increasingly de-humanized society, been easy to disregard: the glimmers of hope. A group of under a dozen people can still create a little thing like this, including its own damn OS, just because they love the feel of technology built by those who care.

There were a thousand reasons to not build it. There were a thousand reasons to run in the opposite direction, to give up, to completely cede the ground to consoles and touch and game streaming, to things that can be screen captured to Twitch.

But there are also a thousand reasons to do it. The reason our world is crap is the funneling of everything into gargantuan seas of milky mediocrity. The biggest entrant wins by subsuming everyone else, by swallowing and outspending and walking all over the competition. The only way out is for life to be a puzzle again, a challenge, for someone’s charming ideas and pet projects to be valued beyond digits on a bank balance readout.

Technology, science and human progress all exist so that we may stand on the shoulders of our forebears. For decades we have bent semiconductors and materials to our will, but to see that a collection of people who could fit inside the average kitchen can build something more or less from the ground up, with so much character, and so significantly outside of their comfort zone, is truly inspiring. In a world full of cynicism and derivative madness, what could be better?

Who knows if I’ll get one, but I’m on their side.

The Ones Who See Things Differently

Think Different was about respect for creators. It was about creativity, unconventional thinking and real courage, to change people’s lives, turn the tide, bring education and humanity and a better understanding of the world to the masses. Think Different was about having people like that as your heroes, and wanting them and other people to have a tool that met them halfway and let them focus on what mattered to them.

It was a justification to do something differently than the behemoth.

Today’s event was about an Apple that may still have its sights on some important values compared to other behemoths, but where the focus is on the fervent belief that whole-banana-ism needs to extend to every corner of everyone’s life. Behemoths like Apple, Amazon, Google and Microsoft act as if they need to have fingers in every pie, provide solutions to every problem, build complete stacks.

Even people who loved Think Different and who still love Apple know there’s more to reading News than Apple News+ (or indeed the normal Apple News app), more to games than Apple Arcade (and indeed indie games have been turning conventions inside out for decades) and more to TV than Apple TV+. Regardless of whether they include good products, Apple is starting to insult both the people who use their products and the heroes they hopefully still respect.

The Internet has torn down walls and connected people, and even though everyone now has a full stack and a streaming platform, no single place is a catch-all any longer. Every bucket of “exclusives” is a dated prayer for a dream of control and containment and world domination. What we all crave is a world that understands interests and respects choice and diversity, where you choose what you want without juggling worries of incompatibility. Not snooty, self-important “curators” claiming to be the world’s greatest in fields they have never entered before, when they can’t even keep fake “antivirus” apps out of their own decade-old App Store, stabilize spiraling software quality years in the making, or avoid a user interface vocabulary that has you pressing a “Share” button to reach a “Find on page” command.

Like Giving a Glass of Hell to Somebody in Ice Water

Being one of the world’s highest valued companies means you can be brilliant at some things, completely useless at other things and have your head up your ass about most things.

So if all you can talk about here is how wrong they are, what’s keeping you?

This is a complicated question with a complicated answer. (If it makes things more expedient for you, feel free to just call me a shill, cult member and/or idiot.)

The desktop environment

Picking a desktop PC platform right now is a classic case of picking your poison. Apple cares less about their desktop platform with every passing day, but so, apparently, does the rest of the world. I use Windows 10 every day, and I wish I didn’t - if Microsoft were still interested in advancing their platform the way they did between Windows Vista and Windows 7, I might switch to Windows tomorrow and make my life much easier.

At the risk of disappearing up my own butt for a moment: it’s frighteningly clear that no one at Apple or Microsoft values the way macOS and Windows respectively have worked enough to not see turning parts of them into a tablet OS – irrespective of the fact that tablets have not gone beyond gimmicks for most desktop users – as progress.

So just switch to Linux

The Linux crowd is what’s left, and they cook their own punch, so they seem like an obvious refuge. I’ve been giving them short shrift for reasons that make sense to me, but may not be obvious.

The phrase “user experience” is thrown around a lot these days in place of “UI”, and sadly also in place of “usability”. I’ve always been curious and have tried using many platforms. I cut my teeth learning to use computers in the System 6 era of the Macintosh, during the period when there was a clear chasm between the Macintosh OS and Windows 3.11. The Mac was a coherently designed platform, with cohesion, with a sense of nuance and personality. It, and Windows 95 after it, had a culture that became synonymous with the OS, that put clear expectations in your head as a user and that you could fulfill as a developer. You don’t need a single company to do this, but you need a place of authority that’s open to criticism and change, that will adapt and survive, that exudes longevity.

Unix is a platform like this, and it has a coherent and cohesive user experience. It has clear rules and ideals that make sense - they’re all about doing one thing well and being reusable. The problem isn’t Unix. The problem is that if you don’t want to live your life inside a command prompt (no offense to the people who do), it starts to fall apart. Package managers work great. X is a trainwreck, fixing it is impossible and getting people to move to Wayland is still an uphill battle. The Unix idioms and ideas are being applied to graphical UIs, where they don’t make sense, and what results is a cacophonous mess, not conducive to a pleasant and effortless user experience; things will look different and work differently, and be largely programmed, structured and maintained by people who do not care about usability. There are people who do care about it, but in this culture they tend to move on to other things.

All this said, the significant time I’ve spent in various forms of GNOME or KDE over two decades has not once instilled confidence in me. (Again, aside from package managers. Although with systemd now being a sprawling, hot mess prone to extreme security flaws but still heading to near universal adoption, maybe I just have no idea what’s going on, and all of this can be ignored.)

The desktop environment (cont’d)

The previous section was not simply a detour. You may notice that other platforms have their own signature when viewed through the platform-cohesion prism. Windows 95 and Windows XP were both very cohesive. iOS started out being incredibly cohesive, has gone here and there, and is arriving at a point where I may not agree with all the decisions, but at least the apps in the OS feel reasonably consistent.

The Web is a difficult beast. Web pages, by and large, are easily understood. They have an easier job because they’re not asking for a lot of interaction. They are mostly vehicles for information (text, video) with limited interactivity. Web pages are cohesive, for the same reason reading a magazine is cohesive. There’s no learning curve.

Web apps? It depends on the app itself. The original version of Gmail looked like someone had made a UI toolkit out of Google’s own web site, but it was easy enough to use and solved more problems than it created.

Web apps as a way to implement the range of functionality most often associated with full-on desktop applications? Only very rarely done well, and I can’t think of a good example off the top of my head.

Web apps are part of the problem, but they’re not the only part of the problem. The other part is the copy-that-floppy road of lazily emergent (read: absent) user experience design. People wanted to make early mobile apps that put many choices at a user’s fingertips, so they invented the hamburger button, opening a sidebar menu, as a reduced form of the menu bar. Other people wanted not to change user paradigms, so they put the same interface in tablet apps. Yet other people wanted to make desktop apps look modern, so they made them look like tablet apps. And so what we now have are web implementations of 3.5” touch screen assumptions for a desktop platform. If your goal is to make an application that users of desktop applications will find familiar, you couldn’t start from a worse place if you tried.

It’s not that desktop apps should look the same if you leave the planet and come back ten years later. It’s that they should at least not completely give up everything that was put in place to make them understandable and efficient; and that if they do replace those things, that they are replaced with things designed with desktop applications in mind. Shrugging your shoulders and saying “but everyone has a phone and everyone uses the web” is like arguing the door on a microwave should look like an actual house door or the lid of a toilet.

The future

So what’s the answer?

Wipe the slate clean. I don’t mean of influences - I mean to take everything that has worked at some point or another, everything that wasn’t just ported over from somewhere else in the name of expediency, and build something new from those parts. Flat design for things that are interactive is a usability disaster, because not being able to tell what is interactive and what isn’t slows you down. (If nothing is interactive, you don’t have a problem that a magazine art director from 1965 couldn’t solve; go about your day.)

We’re at a saddle point in history right now, where the road back makes you look old, and the road forward is daunting because who even talks about desktop PC environments anymore, right? But that line of thinking has gotten us 10-15 years of a desktop rat king, made from tablets, phones, web pages and a little backported, misappropriated good old magazine layout. It’s time for someone to sit in a hammock for a year and work this out.

Predictions for the Microsoft Build 2018 Keynote

  • Absolutely nothing that in any way challenges the grab-first-ask-questions-later approach to privacy in Windows 10, or the constant disrespect of user opinions and decisions: pushing Edge, Cortana and Microsoft Store, bundling gobs of uninstallable apps and a smaller number of installable but unwanted apps (Candy Crush on a server, anyone?), and then redoing it all again with every semi-annual Update.
  • Continued dead reckoning on the Windows 8 course: a) imagining we all love touch, b) pretending that dipping all of Windows in Metro, sorry, Modern Design, sorry, Fluent was not a mistake and c) continuing to let the ~100% of Windows 10 users who do not use it on tablets, phones or mixed reality devices put up with this bullshit, because Microsoft would rather light its own intestines on fire with a firecracker than listen to its users and speak to their concerns.
  • Fucks given about introducing a persistent timeline of user activities, syncing to their cloud and activated without explicit user consent within three weeks of GDPR taking legal effect in Europe: roughly zero.
  • Assurances that all the above is okay because some (reasonable) things are now open source and hey, you can now pay them money to run Linux stuff in Azure, how ‘bout that.