Tiny

There’s no real way to look at the Panic Playdate and see hard-edged, economically shrewd value. The metric itself leads you astray, overvaluing 8000-in-one White Label thingama-SouljaGames while the Playdate lands far closer to the predictable accusations of hipster indulgence.

What I love about it is a recently recurring theme that’s, amidst a polarized and increasingly de-humanized society, been easy to disregard: the glimmers of hope. A group of under a dozen people can still create a little thing like this, including its own damn OS, just because they love the feel of technology built by those who care.

There were a thousand reasons to not build it. There were a thousand reasons to run in the opposite direction, to give up, to completely cede the ground to consoles and touch and game streaming, to things that can be screen captured to Twitch.

But there are also a thousand reasons to do it. The reason our world is crap is the funneling of everything into gargantuan seas of milky mediocrity. The biggest entrant wins by subsuming everyone else, by swallowing and outspending and walking all over the competition. The only way out is for life to be a puzzle again, a challenge, for someone’s charming ideas and pet projects to be valued beyond digits on a bank balance readout.

Technology, science and human progress all exist so that we may stand on the shoulders of our forebears. For decades we have bent semiconductors and materials to our will, but to see that a collection of people who can fit inside the average kitchen can build something more or less from the ground up, with so much character and so significantly outside of their comfort zone, is truly inspiring. In a world full of cynicism and derivative madness, what could be better?

Who knows if I’ll get one, but I’m on their side.

The Ones Who See Things Differently

Think Different was about respect for creators. It was about creativity, unconventional thinking and real courage, to change people’s lives, turn the tide, bring education and humanity and a better understanding of the world to the masses. Think Different was about having people like that as your heroes, and wanting them and other people to have a tool that met them halfway and let them focus on what mattered to them.

It was a justification to do something differently than the behemoth.

Today’s event was about an Apple that may still keep sight of some important values compared to other behemoths, but where the focus is on the fervent belief that whole-banana-ism needs to extend to every corner of everyone’s life. Behemoths like Apple, Amazon, Google and Microsoft act as if they need to have fingers in every pie, provide solutions to every problem, build complete stacks.

Even people who loved Think Different and who still love Apple know there’s more to reading News than Apple News+ (or indeed the normal Apple News app), more to games than Apple Arcade (and indeed indie games have been turning conventions inside out for decades) and more to TV than Apple TV+. Regardless of whether they include good products, Apple is starting to insult both the people who use their products and the heroes they hopefully still respect.

The Internet has torn down walls and connected people, and even though everyone has a full stack and a streaming platform, no single place is a catch-all any longer. Every bucket of “exclusives” is a dated prayer for a dream of control and containment and world domination. What we all crave is a world that understands interests and respects choice and diversity, where you choose what you want without juggling worries of incompatibility. Not snooty, self-important “curators” proclaiming themselves the world’s greatest in fields they have never entered before, when they can’t even keep fake “antivirus” apps out of their own decade-old App Store or stabilize a software-quality spiral years in the making, and when their user interface vocabulary has you pressing a “Share” button to use a “Find on page” command.

Like Giving a Glass of Hell to Somebody in Ice Water

Being one of the world’s highest valued companies means you can be brilliant at some things, completely useless at other things and have your head up your ass about most things.

So if all you can talk about here is how wrong they are, what’s keeping you?

This is a complicated question with a complicated answer. (If it makes things more expedient for you, feel free to just call me a shill, cult member and/or idiot.)

Desktop

The conceit

Picking a desktop PC platform right now is a classic case of picking your poison. Apple cares less and less about their desktop platform with every passing day, but so, apparently, does the rest of the world. I use Windows 10 every day, and I wish I didn’t - if Microsoft was still interested in advancing their platform the way they did between Windows Vista and Windows 7, I might switch to Windows tomorrow and make my life much easier.

It’s frighteningly clear that no one at Apple or Microsoft values the way macOS and Windows respectively have worked enough not to see turning parts of them into a tablet OS – irrespective of the fact that tablets have not gone beyond gimmicks for most desktop users – as progress.

The third place

The Linux crowd is what’s left, and they cook their own punch, so they seem like an obvious refuge. I’ve been giving them short shrift for reasons that make sense to me, but may not be obvious.

The words “user experience” are thrown around a lot these days in place of “UI” and, sadly, also in place of “usability”. I’ve always been curious and have tried many platforms. I cut my teeth learning to use computers in the System 6 era of the Macintosh, during the period when there was a clear chasm between the Macintosh OS and Windows 3.11. The Mac was a coherently designed platform, with cohesion, with a sense of nuance and personality. It, and Windows 95 after it, had a culture that became synonymous with the OS, that put clear expectations in your head as a user and that you could fulfill as a developer. You don’t need a single company to do this, but you need a place of authority that’s open to criticism and change, that will adapt and survive, that exudes longevity.

Unix is a platform like this, and it has a coherent and cohesive user experience. It has clear rules and ideals that make sense - they’re all about doing one thing well and being reusable. The problem isn’t Unix. The problem is that if you don’t want to live your life inside a command prompt (no offense to the people who do), it starts to fall apart. Package managers work great. X is a trainwreck, fixing it is impossible and getting people to move to Wayland is still an uphill battle. The Unix idioms and ideas are being applied to graphical UIs, where they don’t make sense, and what results is a cacophonous mess, not conducive to a pleasant and effortless user experience; things look different and work differently and are largely programmed, structured and maintained by people who do not care about usability. There are people who do care about it, but in this culture they tend to move on to other things.

All this said, the significant time I’ve spent in various forms of GNOME or KDE over two decades has not once instilled confidence in me. (Again, aside from package managers. Although with systemd now being a sprawling, hot mess prone to extreme security flaws but still heading to near universal adoption, maybe I just have no idea what’s going on, and all of this can be ignored.)

The continuation

The previous section was not simply a detour. You may notice that other platforms have their own signature when viewed through the platform-cohesion prism. Windows 95 and Windows XP were both very cohesive. iOS started out being incredibly cohesive, has gone here and there, and is arriving at a point where I may not agree with all the decisions, but at least the apps in the OS feel reasonably consistent.

The Web is a difficult beast. Web pages, by and large, are easily understood. They have an easier job because they’re not asking for a lot of interaction. They are mostly vehicles for information (text, video) with limited interactivity. Web pages are cohesive, for the same reason reading a magazine is cohesive. There’s no learning curve.

Web apps? It depends on the app itself. The original version of Gmail looked like someone had made a UI toolkit out of Google’s own web site, but it was easy enough to use and solved more problems than it created.

Web apps as a way to implement the range of functionality most often associated with full-on desktop applications? Only very rarely done well, and I can’t think of a good example off the top of my head.

Web apps are part of the problem, but they’re not even the only part of the problem. The other part is the copy-that-floppy road of lazily emergent (read: lack of) user experience design. People wanted to make early mobile apps that put many choices at a user’s fingertips, so they invented the hamburger button, opening a sidebar menu, as a reduced example of a menu bar. Other people wanted to not change user paradigms, so they put the same interface on a tablet app. Yet other people wanted to make desktop apps look modern, so they made them look like tablet apps. And so what we now have are web implementations of 3.5”-ish touch screen assumptions for a desktop platform. If your goal is to make an application that users of desktop applications will find familiar, you couldn’t start from a worse place if you tried.

It’s not that desktop apps should look the same if you leave the planet and come back ten years later. It’s that they should at least not completely give up everything that was put in place to make them understandable and efficient; and that if they do replace those things, the replacements should be designed with desktop applications in mind. Shrugging your shoulders and saying “but everyone has a phone and everyone uses the web” is like arguing the door on a microwave should look like an actual house door or the lid of a toilet.

The future

So what’s the answer?

Wipe the slate clean. I don’t mean of influences - I mean to take everything that has worked at some point or another, everything that wasn’t just ported over from somewhere else in the name of expediency, and build something new from those parts. Flat design for things that are interactive is a usability disaster, because not being able to tell interactive elements apart from everything else will slow you down. (If nothing is interactive, you don’t have a problem that a magazine art director from 1965 couldn’t solve; go about your day.)

We’re at a saddle point in history right now, where the road back makes you look old, and the road forward is daunting because who even talks about desktop PC environments anymore, right? But that line of thinking has gotten us 10-15 years of a desktop rat king, made from tablets, phones, web pages and a little backported, misappropriated good old magazine layout. It’s time for someone to sit in a hammock for a year and work this out.

Predictions for the Microsoft Build 2018 Keynote

  • Absolutely nothing that in any way challenges the grab-first-ask-questions-later approach to privacy in Windows 10, or the constant disrespect of user opinions and decisions by pushing Edge, Cortana and Microsoft Store, bundling gobs of uninstallable apps and a smaller number of installable but unwanted apps (Candy Crush on a server, anyone?) and then redoing it all again on every semi-annual Update.
  • Continued dead reckoning on the Windows 8 course to a) imagine we all love touch, b) pretend dipping all of Windows in Metro, sorry, Modern Design, sorry, Fluent was not a mistake and c) continue letting the ~100% of Windows 10 users who do not use it on tablets, phones or mixed reality devices put up with this bullshit, because Microsoft would rather light its own intestines on fire with a firecracker than listen to its users and speak to their concerns.
  • Fucks given about introducing a persistent timeline of user activities, syncing to their cloud and activated without explicit user consent within three weeks of GDPR taking legal effect in Europe: roughly zero.
  • Assurances that all the above is okay because some (reasonable) things are now open source and hey, you can now pay them money to run Linux stuff in Azure, how ‘bout that.

Human

I am coming to terms with the fact that what I’m really missing most in apps, web sites, software, interface design and so on is the allowance for me to be human.

Being human sometimes means being gruff and conservative, sometimes wide-eyed and progressive, sometimes strict, sometimes random, sometimes irreverent. What it really boils down to is that you are many different things, often contradictory, all at the same time.

Take John

For example: When my sometimes acquaintance John Gruber writes the latest instalment in his personal bet that the 3.5 millimeter headphone jack is not long for this world, insisting that sentimentalism must not impinge on the need for progress, he is doing so on a Movable Type installation that has been tweaked over years, on top of a background color that’s been set so long it had a Flickr group before it had a Twitter hashtag.

There’s good reason for John to do this. He writes for a living. He invented (together with the late Aaron Swartz) his own formatting language, Markdown, to minimize the distance between the raw text and the desired formatting, and before that the quote curler SmartyPants, to keep typography alive on a platform that seems to have forgotten it. His site has looked more or less exactly the same for at least 10 years now, and his writing process seems to be identical too. (My guess is that he’s still using BBEdit, a piece of software that debuted in 1992 for System 6.)

He has molded his tools to fit him like a glove, the way you do with something you care about. You pick through alternatives, you try new things, you settle on what works well for you.

50 Shades of Grey

Recently I attempted to survive a week with my phone set to the greyscale color filter, under the premise that it might seem less appealing and cool down the desire to activate it during every idle moment and bounce around between the apps. I am more hooked than I thought, but it also drove home just how dysfunctional flat design (or “iOS 7 Thought”) can be.

The primary differentiator in iOS for an active button is color. Wash out the colors and you’re hanging on by trying to discern the level of saturation, which varies not only from app to app, but from screen to screen. The colors are all different, since color is intended to be a primary method of personalizing and branding apps. For me, this hell is self-imposed - for the color blind or visually impaired, it is constant. There are color filters for improving life for people living with color blindness, and there are accessibility options to turn on “button shapes”, just as there are options to minimize motion and translucency, make text bold and set the font size.
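To make the color-as-the-only-affordance point concrete, here’s a minimal sketch - drawn in SwiftUI purely because it’s compact, not because any of this is lifted from iOS. The first button’s only claim to being tappable is its tint, which a greyscale filter erases; the second carries an explicit shape, roughly what the “button shapes” option restores.

```swift
import SwiftUI

// Illustration only: two ways of drawing the same action.
struct AffordanceDemo: View {
    var body: some View {
        VStack(spacing: 20) {
            // Borderless: "button-ness" is communicated by color alone,
            // so under a greyscale filter this reads as plain text.
            Button("Done") { print("tapped") }

            // Bordered: an explicit shape says "tappable" even without color,
            // which is roughly what the accessibility option gives back.
            Button("Done") { print("tapped") }
                .buttonStyle(.bordered)
        }
        .padding()
    }
}
```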

I was going to write this post partially about Agenda, a promising app that seems to take an approach to continuous note taking that I’ve long requested. But when I downloaded it and opened it and looked around, even though it got the mental model closer to what I’d like, it was too flat, too barren. And while iOS has a single dial for font size, Agenda has none, and no way to zoom in or increase the font size. I tried typing, but it did not feel right to me. When I type posts or articles, I want my text a bit bigger; when I type other things, I want it smaller; when I read, I go back and forth.

I’m instead writing this post in Visual Studio Code, of all things, because it is a competent Markdown experience that also keeps related files within reach. It’s not particularly important for this post, but it’s worth noting that it also has a flat design motif without ever leaving it in question what’s clickable. If it’s there, it’s either text in the editor or clickable - it is most likely selectable, and the state of the tabs is never confusing. There are issues with the app as a whole, but they got this more right than iOS.

Physical

The phone or tablet iOS runs on is a physical object. It was considered in endless variants. The volume buttons, the lock button, the home button (something else possibly not long for this world) have to work for everyone. The user interface, the software, the soul of the device, does not have to. It is adaptive. All of iOS can change with a software update. We can go from having buttons that are clearly demarcated as buttons to buttons that we just know are buttons because of contextual cues and because the text is blue.

It’s not worth imagining this is a fundamental insight or new information, but iOS’s flat soul was inspired by advertising materials, by carefully laid-out brochures. The hurdle is that a brochure has very few, if any, interactive parts, while software is almost all interactive parts. Wanting to eliminate computer administrative debris is worthwhile, even enviable. Wanting it to the point of eliminating usability, utility and cognitive agility is less so.

Needing bolder text and bigger fonts and inverted colors is fine. Needing fundamental tweaks to the aesthetics to make the user interface tractable at all means that you have failed at your job as a user interface designer. Apple’s WWDC theme is intriguing because it suggests a return, in some form, to user interface elements as semi-physical objects with weight, with depth, with importance and character. Laying aside the red herring of loupe-inspected linen textures, hints that inform the character of each element make the interface a thousand times smoother.

Flat Jack

I am not going to (successfully) unite the frayed ends into a red thread. But there is something fundamentally similar in a working headphone jack being thrown out, in a flat design aesthetic with little to no backing in user interface lore being forced on an entire platform, and in the feeling programmers, pro users and everyday people alike get when something they use has all of a sudden been “updated”. So often, the new does not maintain the core of what was useful and expedient. So often, the desire to simply progress shortchanges the priorities of the people of flesh and blood who rely on this tool every day.

By all means, introduce new things, and by all means get rid of stuff when it’s old and busted. But new iPhones don’t make it impossible to use wired headphones - they make it impossible to use good, common, reasonably priced wired headphones without an adapter. Bluetooth headphones may be great, but they’re not for everyone, and we are left high and dry.

And iOS’s flat aesthetic gives it an airy and maybe subjectively less dated feel, but one which still has reasonably basic usability issues even four versions in. In particular, its options may be confused for personalization at a distance, but amount to escape hatches unworthy of the world’s second most used mobile operating system. It should all be much clearer than this, and this kind of upheaval says, first and foremost, that respecting the tool’s ability to fit you like a glove is not as high a priority as, say, Apple’s desire to dip everything in new paint, some of which is vibrantly acrylic, or to ham-handedly copy the dynamic Material Design headers.

Stuck

Brent Simmons:

We could be excused for thinking that Micro.blog is like App.net — a Twitter alternative greeted with enthusiasm but that eventually closed.

It’s not the same thing, though, and I’ll explain why.

[reasonable explanation worth reading on its own merits elided]

You might think this is too difficult for normal people, that it’s all too nerdy, and that it won’t make headway against Twitter, so who cares.

My reply: it’s okay if this is a work in progress and isn’t ready for everybody yet. It’s okay if it takes time. We don’t know how it will all work in the end.

We’re discovering the future as we build it.

I don’t fundamentally have a problem with this line of reasoning. But I do have a problem with where we end up.

Everything Old is New Again

You don’t need to tell me that Twitter was created as a sidecar to other projects, and was intended as a way to “check in” so that the rest of the original bunch of people knew what you were up to. The problem with Twitter isn’t that. The problem with Twitter isn’t the role it’s played in the past few years, nor is it a set character limit.

The problem with Twitter is that it encourages conversations with people you don’t know in a medium that is almost uniquely poorly designed to follow a conversation. It is the software equivalent to setting up a conversationalist convention, and holding it in a dark, smelly, crowded room with loud music playing and having a blood alcohol level requirement before you can enter.

And furthermore, the problem with Twitter being created in the first place, the reason it was created, is that time and time again the simple solutions get reinvented.

Rewind the tape to the start of the century. “Everyone” had a weblog. (Since I still use that term, I was obviously present.) “Everyone” hosted it themselves and could duct-tape a Movable Type installation together or debate the merits of Textile. In retrospect, it is easy to understand that this wasn’t some Sheikah-like ancient civilization where every interesting person on the face of the Earth knew these things - it’s just that it was chiefly open to them.

The Medium is The Message

What’s happened since then? If you’re tempted to say Facebook and Twitter, I don’t disagree with that. Tumblr also happened. Tumblr was the first, and the most brazen, about taking the very simplest expressions and pulling them into individual posts, and then gradually turning itself into a “social network” with reposts.

The problem isn’t that Tumblr handles the “Mac-and-cheese” of publishing. The problem is that it is so successful at doing so that it moves the spectrum. If people have Tumblr, they are more likely to feed it with found items and paraphernalia, and less likely to write engaging things. If people have Twitter, they are more likely to feed it with one-liners and retweets and the briefest of observations. And if people have a crafted Movable Type installation, they are more likely to publish posts.

Giving a damn

Facebook and Twitter and dozens of other platforms hooked us on the idea that the web does not have to be funny shaped and unruly and different. Twitter’s Bootstrap is helping out with the normalization process, but we’re all drawn to use the simplest thing we can figure out to share our own thoughts and the most voluble thing we can figure out to ingest those of others.

RSS and its related technologies are in some way at the core of all this. RSS had to be refined into podcasting and brought into a semi-benevolently maintained directory (Apple’s) to reach the success and adoption that it has. It’s not that no one listened - it’s that for everyone to listen, it has to come down to pointing and clicking.

There have been precious few technologies like this for personal publishing. Those that have not had an agenda have been providing their value in a shrinking ecosystem. Even if ten useful, new and fully developed solutions popped up tomorrow, they would be subject to the same realities. And this is one half of the problem of the Do It With Open Technologies path.

The other half is that they are trying to pull together some aspect that the closed platforms handle. Most problems that the closed platforms handle are legitimate research efforts. Navigating the friend graph on Facebook, indexing tweets for hashtags, even deciding on a tweet’s ID are Hard Problems. And we’re trying to do it in a distributed way. Ask anyone doing distributed systems if this way is easier. We’re not literally trying to reconstruct and solve every problem, of course. But the result is that whatever we build, whatever our solution looks like, when viewed from a distance, is not going to be as coherent as the closed platforms.

If you’re yelling at your screen that this is the point, and that we’re not trying to remake Facebook, you’re correct. But someone moved the goal posts on us.
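To make the tweet-ID example concrete: Twitter’s own answer was Snowflake, which packs a timestamp, a machine number and a per-machine counter into a single 64-bit ID. The sketch below is a rough Swift approximation of that idea, not Twitter’s code - and the thing it quietly assumes, a central authority handing out unique worker IDs, is exactly the kind of coordination a distributed, open reimplementation has to recreate some other way.

```swift
import Foundation

// A rough sketch of a Snowflake-style ID generator, following the commonly
// described layout (41-bit timestamp, 10-bit worker, 12-bit sequence).
// Illustration only; not Twitter's implementation.
struct SnowflakeGenerator {
    private let epoch: Int64 = 1_288_834_974_657   // Twitter's custom epoch, in milliseconds
    private let workerID: Int64                    // unique per machine, 0...1023 (assigned centrally)
    private var sequence: Int64 = 0
    private var lastTimestamp: Int64 = -1

    init(workerID: Int64) {
        precondition((0...1023).contains(workerID))
        self.workerID = workerID
    }

    private func currentMillis() -> Int64 {
        Int64(Date().timeIntervalSince1970 * 1000)
    }

    mutating func next() -> Int64 {
        var now = currentMillis()
        if now == lastTimestamp {
            // Same millisecond: bump the per-machine sequence (wraps at 4096).
            sequence = (sequence + 1) & 0xFFF
            if sequence == 0 {
                // Sequence exhausted for this millisecond; wait for the clock to move.
                while now <= lastTimestamp { now = currentMillis() }
            }
        } else {
            sequence = 0
        }
        lastTimestamp = now
        return ((now - epoch) << 22) | (workerID << 12) | sequence
    }
}

// Usage: var generator = SnowflakeGenerator(workerID: 1)
//        let id = generator.next()
```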

The App

I am not going to touch on actual apps. But being relegated to a 3.5” prison meant that clarity had to come to the fore. Things had to be easy. Getting around had to be obvious. (And it was, before someone started making hamburgers.) We were all taught to think in terms of the unit, and to deliver the unit, to concentrate on the morsel, which possibly could be expanded into the short scroll view.

The app for Twitter is obvious. The app for Facebook is obvious. The apps for Tumblr, Reddit, Medium and Instagram are all obvious. We could all close our eyes and imagine how they would work. The app for weblog-shaped objects is stuck at being the RSS reader. There’s nothing wrong with the RSS reader, but the focus is entirely on the technology. Maybe you could argue that it has to be to some extent because in a decentralized system you have to be able to discover things and add in other things.

Consider the web browser - if people still relied on typing in web addresses, we would all browse much, much less every day. But we don’t. We have our favorite search engine and we have links. They all depend on, but abstract, the technology below. The RSS reader is best described, when masking away particular technologies, as a “feed” reader. People understand what a web is. Most people don’t like that they have to tend to, curate, maintain a list of feeds. Maybe they wisely consider that the friction of adding a feed, not only in finding the subscribe button but in thenceforth reading more stuff, leads them to listen to fewer or more palatable opinions.

The apps that exist and are somewhat successful are the platform apps of individual actors, like Tumblr, Medium, Reddit and news aggregation apps like Apple News. (These all tend to be local, so it’s hard to exemplify them.) They all have the podcasting problem too. But being decentralized, like the web, means that everyone has to run a little piece of infrastructure wherever they host what they write. And it means that there will have to be a few open vendors to collect or refine the result of this infrastructure (Technorati then, Disqus now).

Seven red lines, all perpendicular to each other

This is where I’m stuck as I think about this. I am less inclined than ever to imagine that people will start demanding sharper crayons from their hosting providers - or to consider whichever way they publish their thinking a “hosting provider”. I am less inclined than ever to imagine that individual vendors will pop up to solve the hard problems or to provide the bird’s eye view that would bring more clarity to the process.

Medium is interesting because it proves that people haven’t stopped wanting to think and to publish. They’ll do so given the tools available to them. But Medium itself is still just another closed platform that would rather you read more from its network. It’s not in Medium’s best interest to point readers to things hosted outside of Medium, and it’s not even obvious to me how Medium will survive the next few years.

Decentralization tends to lead to parts that survive on their lonesome no matter what. It would be great if this resiliency, this freedom, could form an open platform. But like asking for seven red lines, all of them strictly perpendicular, some with green ink and some with transparent, there are boulders in our path that I don’t know how to work around, that make me think that what I remember wasn’t so much a golden age to which we can return as a fortuitous pocket in history, a reality that was bound to be lost and that we would have to work very hard to bring back.

Look for something else, something big, something that changes the equation, something that makes it easier to write and less frustrating to follow and discover. For lack of anything new, we may as well all be wearing red hats.

Up Top

This is an attempt to conceptualize the previously hypothesized pins in a more approachable and concrete way. That post was almost like a manifesto; this will hopefully be more down-to-earth and reasoned, and sell you on the benefits.

As always, there’s a risk in taking a bunch of ideas and putting them together in a concrete way: someone might say “I don’t like that, therefore the whole thing is crap”, and that’s fine. But there’s also a chance it can get people thinking “why don’t we have anything like this now?” or “why is it that no one is thinking in these terms?”, and that’s what I want, more than Apple or Google or Samsung or HTC or Huawei taking this thing and literally implementing it.

So let’s start describing Up Top as if it existed and was already implemented. (This is not a rumored feature. This is just me dreaming.)

What is Up Top?

Up Top is a new part of using your phone. Up Top is where your ongoing things end up. Just like Siri is a person you can ask, Up Top is a place you can look.

It’s called Up Top because it’s above the status bar. Pull down on the status bar (or the left ear of the iPhone X) and you pull down Up Top. Pull even further to pull down the lock screen and see your notifications.

What’s in Up Top?

Anything you put there. We’ll call the things you put in Up Top items for now. There should be a snappier name for them, I just haven’t been able to think of one yet.

In a transit app, you can put the countdown to a bus or train there. In your calendar, you can put an event there, for quick access. Turn-based games like Words with Friends or Chess can put an ongoing match there.

Each of these items is the size of two app icons - you can fit two of them side by side, and if there’s a lot of information you’d like to see, you can let an item be twice as wide and fill the screen.

Sounds like notifications to me.

Notifications are a bit messy - they show up out of nowhere, go away, all show up in the same list from which they have to be cleared out. Items in Up Top are more focused. They are each associated with a thing, so they are persistent, and they’re like a little part of the app you add them from. Each item is like a widget dedicated to the thing you added.

So if you added a bus or train ride, you see the countdown in minutes, and you see how far away you are, and you can tap the item to hop straight into the app and go directly to the ride to see the rest of the info. And once you’re done with it, you can toss it.
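If a framework like this existed, the transit app’s side of that example might look something like the sketch below. To be clear, every type and name in it is made up - this is me giving the daydream a concrete shape in Swift, not describing any real API.

```swift
import Foundation

// Everything here is hypothetical: Up Top is a daydream, not an Apple framework.

/// The footprint of an item: two app icons wide, or the full row.
enum UpTopItemSize {
    case standard   // two app icons side by side
    case wide       // fills the width of the screen
}

/// One persistent, user-added item: a little dedicated widget for a single thing.
struct UpTopItem {
    let identifier: String      // stable, so the app can update it in place
    var title: String
    var detail: String          // e.g. "Leaves in 7 min · 350 m to the stop"
    var size: UpTopItemSize
    var deepLink: URL?          // tapping the item jumps straight to this spot in the app
}

/// A stand-in for the system service an app would hand its items to.
protocol UpTopCenter {
    func add(_ item: UpTopItem)
    func update(_ item: UpTopItem)
    func remove(identifier: String)
}

// Usage, from the transit app's point of view:
func publishRide(to center: any UpTopCenter) {
    let ride = UpTopItem(
        identifier: "ride-bus-12-today",
        title: "Bus 12 to Work",
        detail: "Leaves in 7 min · 350 m to the stop",
        size: .standard,
        deepLink: URL(string: "transitapp://rides/today/12")
    )
    center.add(ride)   // the user sees it the next time they pull down Up Top
}
```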

So this is just for things that are about to happen?

No - you can also add places and people to Up Top and associate/tag other items or other things with them. You can add a store across town, and tag it with a review or shopping list. You can add a friend, and tag them with something you want to show them the next time you see them, like a link or a photo album.

That must get very crowded.

Yes and no. You can put a lot of items there, but it won’t be as crowded as you might think.

By default, what you see is the current items (like the bus countdown) and the items that are relevant to you right now.

If you’ve added a friend to Up Top and you pull it down while you have a conversation with them open in an app, their items will show up there. Same with places - if you’re physically close to a place and pull down Up Top, the items you linked with the place will show up. The friend and the place act like folders of items - you tap them to show everything.

And of course, you can choose to see everything you’ve put in Up Top, and search through it.

So I have to pull down Up Top to look at things?

Yes and no.

For one thing, Up Top shows up on the lock screen too, for the same reason your recent notifications do. They’re relevant to you right now. But Up Top items also integrate with notifications.

Now, if you have a countdown or a stock quote there, it would be very distracting if they sent notifications every time it changed, every time there’s a new minute, every time there’s a new stock market tick, so that’s not what happens. But for example, the bus countdown could send you a notification when it’s about time to leave. A game match could send you a notification when it’s your turn. And you could have a stop or limit or alert for the stock quote, so that it would tell you when the market price has reached a certain point.

And when you get a notification from an item, both the item itself and the notification message will slide down and give you much more information and context than if just the notification message had appeared - which means that the notification message itself can be shorter. You can pull down Up Top and take a look at your items as they are, and when something notable happens, like your ride is here or your order has shipped, the item slides down along with the notification.
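Continuing the made-up sketch from before, the notification side could be modeled as conditions attached to an item - again, all of it hypothetical:

```swift
import Foundation

/// A condition attached to an item; when it fires, the system shows a notification
/// and slides the full item down along with it for context.
enum UpTopTrigger {
    case timeToLeave(walkingTime: TimeInterval)  // fire when departure minus walking time is now
    case yourTurn                                // fire when the opponent has moved
    case priceCrosses(Double)                    // fire when a quote passes this level
}

struct UpTopAlert {
    let itemIdentifier: String
    let trigger: UpTopTrigger
    let message: String     // can stay short, because the item supplies the context
}

// The transit app attaches a "time to leave" alert to the ride published earlier.
let leaveAlert = UpTopAlert(
    itemIdentifier: "ride-bus-12-today",
    trigger: .timeToLeave(walkingTime: 5 * 60),
    message: "Time to leave for Bus 12"
)
```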

Okay, so this sounds fine, but what about the more fleeting items? I’ll have to go into apps to re-add these things all the time.

You could. But when the app lets you add an item to Up Top, it can also recommend a pattern or attach to the people and places you already have there.

For example: If you have your home as a place in Up Top, you could have a template of sorts for keeping track of the next ride with the bus you take to work. It wouldn’t send you any notifications or be visually conspicuous, but if you have to head off to the bus, you slide down Up Top, tap Home and tap the template item for your bus. It turns into the real item, you see the countdown, you start getting any related notifications, and so on - all without you having to set it up again from scratch.

So what’s the difference between this and a widget then?

A widget is one thing that summarizes the state of an entire app. It’s an overview (like the next few upcoming calendar events), or it’s a mirror of the most important function in an app (like a mini calculator, or a directory app looking up whatever number is on your clipboard). Widgets are still useful for what they are. Up Top items are what you would get if the particular thing you cared about inside an app got its own widget.

A Letter to the Responsible Individuals at Major Internet Publishers

This message is brought to you by the new Federal Motors 2019 Hoodwinker Saloon. Experience unparalleled joy with industry leading mileage* and up to eight** cupholders! Pay your local dealer a visit for a test drive and receive two tickets to Star Wars®: The Last Jedi™***! Starting at just $12,345****. Click here to sk–

* Mileage may vary.
** Available with sports package.
*** While supplies last. Limited to two per household. Star Wars is a registered trademark of the Walt Disney Corporation.
**** Suggested retail price.

Our experts investigated your site - you won’t believe what they found!

Today’s edition of Waffle is brought to you by Netmix! The video content you enjoy, now on every screen. Season 10 of House of Fjords now available.

Dear senior executive,

You may also enjoy: Fictional letters to the editor throughout history

I sat down to read an article shared to me by my grandmother on FACEBOOK, INC. [FB; 184.33; -0.34 (-0.18%); at close: 4:00PM EST; stock prices provided by Varyzen for informational uses only] and I was simply put, flooredfor quality flooring, enjoy Lowe’s new line of walnut parquet – by the clarity and efficacy of your original contentcreate your own weblog now at WordPress.com.

In related news: Net Neutrality: worst idea in American history or World history? We survey ISP marketing departments

Not only was it relevant and insightful, I was unable to locate any promotional consideration. As a concerned customer, I must insist that you include more advertisements, so as to ensure your continued survivalstock your go bag and bunker with quality MREs at The End is Nigh.

Around the web – provided by Hoogleboot Content Services
Remember these celebrities from the 70’s? Here’s what they’re up to today
This one weird trick solves hunger forever
Russia is planning to attack the US – read the secret plans

Once again, my warmest applause for your friction-freestop slipping this winter with Gary’s Gravel – reading experience. Truly, we are living in the golden age of journalism.

Autoplaying in two seconds: Why there is not ever a reason for anyone who still maintains ownership of a soul to engage a content blocker

Regards,
The Average Internet Content Consumer

Chunky

There’s nothing wrong with Apple today that can’t be solved with a bit of Howard Moskowitz.

Basis

If you are one of the five people now living who have not yet seen one of the first TED talks to break big when TED talks started being published, allow me to summarize the Malcolm Gladwell talk on spaghetti sauce:

Howard Moskowitz is a psychophysicist - an expert in what satisfies us. One of his clients is Pepsi, which is developing Diet Pepsi and asks him to find the right level of sweetness within a small allowed range. He does the obvious thing of making small batches of Pepsi at each interval of sweetness, but there’s no dominant, obvious perfect match. This bedevils him long past the end of the contract, until he figures out that there’s not necessarily one perfect, platonic Diet Pepsi.

He sets out to broadcast this notion, at significant cost to his career. Eventually he is hired by Prego spaghetti sauce, where he tries out his theories by varying many constituent parts, uncovering the need (and untapped market demand) for chunky spaghetti sauce, which launches to great commercial success. With products that cater better to many different tastes, customers will be happier, because they get something closer to what they desire.

Essential

The arrow of Apple, over time, is to strip down products to bare essentials. Fewer parts. Design that does not hit you over the head with how someone sweated over something. (You may think that Apple products look ostentatious, but go pick up a “gaming” oriented product and count the number of discrete surfaces, or angles, or glowing LEDs, or trackpads placed above the keyboard, and consider how much worse anodizing the whole thing rose gold really would have been.)

This is fine and good, but it also means that they want to remove as much as they can possibly remove, and make as much as they possibly can thinner. Recently, this desire has been fighting actual purposefulness.

Apple has to keep around last generation’s MacBook Pro and MacBook Air for the increasing number of people who do not want to live the dongle life, or worry that keys will be put out of commission by pieces of dust. And Apple had to publicly admit defeat in having bet on the wrong, minimalistic, two-GPUs-bolted-to-the-side-of-a-cooling-core horse and remake the Mac Pro from scratch.

What I’m proposing is that it doesn’t have to be a tug of war. There is a way to solve this. It’s called introducing another product.

Pro Proposal Prose

Slide the current Touch Bar MacBook Pro down and call it MacBook Pro Mini. (This or Pro Air. It’s not a perfect name, but they currently sell both the iMac Pro and the Mac Pro and they’re both pricey Pro desktops that differ by four years and one letter, so get off my back.) Leave it as is - allow it to get even thinner, even.

Introduce a new MacBook Pro, that is as “thick” as the previous MacBook Pro. Put MagSafe 2 on it, put Thunderbolt 3 on it, put USB-A 3.1 on it, put an SD card slot on it, put a headphone jack on it. Put its existing keyboard on it, function keys and all. Upgrade it to some variation of the butterfly mechanism if the improvements involve figuring out a way for it not to be disabled by individual specks of dust.

What about the Touch Bar? Make it a customizable option and put it above the keyboard (between it and the hinge) where it rests outside of the keyboard area.

Stuff it full of batteries. Stuff it full of discrete graphics capable of running two 5K displays. Stuff it full of 2 TB worth of SSD and 32 GB worth of RAM.

Slice off some of it for the 13” version, to make it all fit. For extra credit, make a 17” version again and go nuts.

Simplicity is the Ultimate Sophistication

There’s nothing controversial about this. (Maybe the Touch Bar.) I know the traumas and the war stories, but this is not a return to Steve Jobs coming to Apple and finding 15 lines of Macs whose strengths he couldn’t figure out. This is the same number of Mac laptops Apple already sells today, and with significantly improved messaging. This is a single spectrum: thinner means less capable but also less power hungry, and, well, thinner. Thicker means more capable. And even the thickest model isn’t thicker than what Apple is comfortable selling today.

If you are comfortable living the dongle life and not having function keys, pick the Pro Mini. If you need the power, pick the Pro. If your needs are the computational equivalent of subsisting on water and plankton and you never plug anything in, pick the MacBook. No labyrinthine flowcharts required. No need to axe products that already exist. Just offer more beside them, so that those of us who care can still do our job.

Apple isn’t uncomfortable making tradeoffs; it’s uncomfortable admitting to them. Don’t keep around last year’s model - simply make more models with clear and meaningful distinctions that appeal to more people, and put the desire for optimization to good use in making fewer debilitating compromises for each of them. Don’t pretend that you can only make one.