- Absolutely nothing that in any way challenges the grab-first-ask-questions-later approach to privacy in Windows 10, or the constant disrespect of user opinions and decisions: pushing Edge, Cortana and Microsoft Store, bundling gobs of uninstallable apps and a smaller number of installable but unwanted apps (Candy Crush on a server, anyone?), and then redoing it all again with every semi-annual Update.
- Continued dead reckoning on the Windows 8 course: a) imagining we all love touch, b) pretending that dipping all of Windows in Metro, sorry, Modern Design, sorry, Fluent was not a mistake and c) letting the ~100% of Windows 10 users who do not use it on tablets, phones or mixed reality devices put up with this bullshit, because Microsoft would rather light its own intestines on fire with a firecracker than listen to its users and speak to their concerns.
- Fucks given about introducing a persistent timeline of user activities, syncing to their cloud and activated without explicit user consent within three weeks of GDPR taking legal effect in Europe: roughly zero.
- Assurances that all the above is okay because some (reasonable) things are now open source and hey, you can now pay them money to run Linux stuff in Azure, how ‘bout that.
I am coming to terms with the fact that what I'm really missing most in apps, web sites, software, interface design and so on is the allowance for me to be human.
Being human sometimes means being gruff and conservative, sometimes wide-eyed and progressive, sometimes strict, sometimes random, sometimes irreverent. What it really boils down to is that you are many different things, often contradictory, all at the same time.
For example: When my sometime acquaintance John Gruber writes the latest instalment in his personal bet that the 3.5 millimeter headphone jack is not long for this world, insisting that sentimentalism must not impinge on the need for progress, he is doing so on a Movable Type installation that has been tweaked over years, on top of a background color that has been set so long it had a Flickr group before it had a Twitter hashtag.
There’s good reason for John to do this. He writes for a living. He invented (together with the late Aaron Swartz) his own formatting language, Markdown, to minimize the distance between the raw text and the desired formatting, and before that the quote curler SmartyPants, to keep typography alive on a platform that seems to have forgotten it. His site has looked more or less exactly the same for at least 10 years now, and his writing process seems to be identical too. (My guess is that he’s still using BBEdit, a piece of software that debuted in 1992 for System 6.)
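To make the quote-curling idea concrete: a toy Python sketch of the kind of transformation SmartyPants performs, turning straight typewriter quotes into typographer's quotes. This is not Gruber's implementation and covers only the simplest cases; it's illustration, nothing more.

```python
import re

def curl_quotes(text: str) -> str:
    # A toy sketch of what SmartyPants does: straight quotes in,
    # typographer's quotes out. The real SmartyPants handles many
    # more cases (dashes, ellipses, edge cases) - this does not.
    # A double quote at the start of the string or after whitespace opens.
    text = re.sub(r'(^|(?<=\s))"', '\u201c', text)
    # Any remaining double quote closes.
    text = text.replace('"', '\u201d')
    # Same logic for single quotes; mid-word apostrophes curl rightward.
    text = re.sub(r"(^|(?<=\s))'", '\u2018', text)
    text = text.replace("'", '\u2019')
    return text

print(curl_quotes('"Hello," she said. It\'s fine.'))
```

Even this crude version shows why it matters: the raw text stays plain ASCII, and the typography happens on the way out.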
He has molded his tools to fit him like a glove, the way you do with something you care about. You pick through alternatives, you try new things, you settle on what works well for you.
Recently I attempted to survive a week with my phone set to the greyscale color filter, under the premise that it might seem less appealing and cool down the desire to activate it during every idle moment and bounce around between apps. It turns out I am more hooked than I thought, but the week also drove home just how dysfunctional flat design (or “iOS 7 Thought”) can be.
The primary differentiator in iOS for an active button is color. Wash out the colors and you’re left trying to discern the level of saturation, which varies not only from app to app but from screen to screen. The colors are all different, since color is intended to be a primary method of personalizing and branding apps. For me, this hell is self-imposed - for the color blind or visually impaired, it is constant. There are color filters for improving life for people living with color blindness, and there are accessibility options to turn on “button shapes”, just as there are options to minimize motion and translucency, make text bold and set the font size.
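The washing-out is easy to quantify. A grayscale filter reduces each color to something like its luminance; using the common BT.601 luma weights (one plausible mapping - whether Apple's filter uses exactly these weights is an assumption I'm not making), iOS's saturated system blue and a plain mid gray land on nearly the same value:

```python
def luminance(r, g, b):
    # ITU-R BT.601 luma weights - one common way a grayscale
    # filter flattens a color down to a single brightness value.
    return 0.299 * r + 0.587 * g + 0.114 * b

# Two very different hues on a color screen:
ios_blue = (0, 122, 255)     # iOS system blue (approximate sRGB values)
mid_gray = (100, 100, 100)   # an unremarkable inactive gray

# In grayscale they become nearly indistinguishable.
print(round(luminance(*ios_blue)))
print(round(luminance(*mid_gray)))
```

A vividly "tappable" blue and a dead gray collapse to within a point or so of each other, which is the whole problem when color is the only cue.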
I was going to write this post partially about Agenda, a promising app that seems to take an approach to continuous note taking that I’ve long requested. But when I downloaded it, opened it and looked around, even though it had the mental model closer to what I’d like, it was too flat, too barren. And while iOS has a single dial for font size, Agenda has none, and no way to zoom in or increase the font size. I tried typing, but it did not feel right to me. When I type posts or articles, I want my text a bit bigger; when I type other things, I want it smaller; when I read, I go back and forth.
I’m instead writing this post in Visual Studio Code, of all things, because it is a competent Markdown experience that also keeps related files in reach. It’s not particularly important for this post, but it’s worth noting that it also has a flat design motif without once leaving it up for question what’s clickable. If something is there, it’s either text in the editor or clickable, it is most likely selectable, and the state of the tabs is never confusing. There are issues with the app as a whole, but they got this more right than iOS.
The phone or tablet iOS runs on is a physical object. It was considered in endless variants. The volume buttons, the lock button, the home button (something else possibly not long for this world) have to work for everyone. The user interface, the software, the soul of the device, does not have to. It is adaptive. All of iOS can change with a software update. We can go from having buttons that are clearly demarcated as buttons to buttons that we just know are buttons because of contextual cues and because the text is blue.
It’s not worth imagining this is a fundamental insight or new information, but iOS’s flat soul was inspired by advertisement materials, by page-layout brochures. The hurdle is that a brochure has very few, if any, interactive parts, while software is almost all interactive parts. Wanting the elimination of computer administrative debris is worthwhile and even enviable. Wanting it to the point of eliminating usability, utility and cognitive agility is less so.
Needing bolder text and bigger fonts and inverted colors is fine. Needing fundamental tweaks to the aesthetics to make the user interface tractable at all means that you have failed at your job as a user interface designer. Apple’s WWDC theme is intriguing because it suggests a return in some form to user interface elements as semi-physical objects with weight, with depth, with importance and character. Laying the red herring of loupe-inspected linen textures aside, hints to inform the character of each element make the interface a thousand times smoother.
I am not going to (successfully) unite the frayed ends into a red thread. But there is something fundamentally similar in a working headphone jack being thrown out, in a flat design aesthetic with little to no backing in user interface lore being forced on an entire platform, and in the feeling programmers, pro users and everyday people alike get when something they use has all of a sudden been “updated”. So often, the new does not maintain the core of what was useful and expedient. So often, the desire to simply progress shortchanges the priorities of the people of flesh and blood who rely on this tool every day.
By all means, introduce new things, and by all means get rid of stuff when it’s old and busted. But new iPhones don’t make it impossible to use new wired headphones - they make it impossible to use good, common, reasonably priced wired headphones without an adapter. Bluetooth headphones may be great, but they’re not for everyone, and we are left high and dry.
And iOS’s flat aesthetic gives it an airy and maybe subjectively less dated feel, but one which still has fairly basic usability issues even four versions in. In particular, its options may be confused for personalization at a distance, but amount to escape hatches unworthy of the world’s second most used mobile operating system. It should all be much clearer than this, and this kind of upheaval says, first and foremost, that respecting the tool’s ability to fit you like a glove is not as high a priority as, say, Apple’s desire to dip everything in new paint, some of which is vibrantly acrylic, or to ham-handedly attempt to copy the dynamic Material Design headers.
We could be excused for thinking that Micro.blog is like App.net — a Twitter alternative greeted with enthusiasm but that eventually closed.
It’s not the same thing, though, and I’ll explain why.
[reasonable explanation worth reading on its own merits elided]
You might think this is too difficult for normal people, that it’s all too nerdy, and that it won’t make headway against Twitter, so who cares.
My reply: it’s okay if this is a work in progress and isn’t ready for everybody yet. It’s okay if it takes time. We don’t know how it will all work in the end.
We’re discovering the future as we build it.
I don’t fundamentally have a problem with this line of reasoning. But I do have a problem with where we end up.
You don’t need to tell me that Twitter was created as a sidecar to other projects, and was intended as a way to “check in” so that the rest of the original bunch of people knew what you were up to. The problem with Twitter isn’t that. The problem with Twitter isn’t the role that it’s played in the past few years, it’s not in a set character limit.
The problem with Twitter is that it encourages conversations with people you don’t know in a medium that is almost uniquely poorly designed to follow a conversation. It is the software equivalent to setting up a conversationalist convention, and holding it in a dark, smelly, crowded room with loud music playing and having a blood alcohol level requirement before you can enter.
And furthermore, the reason Twitter was created in the first place is that time and time again, the simple solutions get reinvented.
Unwind the tape to the start of the century. “Everyone” had a weblog. (Since I still use that term, I was obviously present.) “Everyone” hosted it themselves and could duct tape Movable Type or debate the merits of Textile. In retrospect, it is easy to understand that this wasn’t some Sheikah-like ancient civilization where every interesting person on the face of the Earth knew these things - it’s just that it was chiefly open to them.
What’s happened since then? If you’re tempted to say Facebook and Twitter, I don’t disagree with that. Tumblr also happened. Tumblr was first and more brazen about taking the very simplest expressions and pulling them into individual posts, and then gradually turning it into a “social network” with reposts.
The problem isn’t that Tumblr handles the “Mac-and-cheese” of publishing. The problem is that it is so successful at doing so that it moves the spectrum. If people have Tumblr, they are more likely to feed it with found items and paraphernalia, and less likely to write engaging things. If people have Twitter, they are more likely to feed it with one-liners and retweets and the briefest of observations. And if people have a crafted Movable Type installation, they are more likely to publish posts.
Facebook and Twitter and dozens of other platforms hooked us on the idea that the web does not have to be funny shaped and unruly and different. Twitter’s Bootstrap is helping out with the normalization process, but we’re all drawn to use the simplest thing we can figure out to share our own thoughts and the most voluble thing we can figure out to ingest those of others.
RSS and its related technologies are in some way at the core of all this. RSS had to be refined into podcasting and brought into a semi-benevolently maintained directory (Apple’s) to reach the success and adoption that it has. It’s not that no one listened - it’s that for everyone to listen, it has to come down to pointing and clicking.
There have been precious few technologies like this for personal publishing. Those that have not had an agenda have been providing their value in a shrinking ecosystem. Even if ten useful, new and fully developed solutions popped up tomorrow, they would be subject to the same realities. And this is one half of the problem of the Do It With Open Technologies path.
The other half is that they are trying to pull together some aspect that the closed platforms handle. Most problems that the closed platforms handle are legitimate research efforts. Navigating the friend graph on Facebook, indexing tweets for hashtags, even deciding on a tweet’s ID are Hard Problems. And we’re trying to do it in a distributed way. Ask anyone doing distributed systems if this way is easier. We’re not literally trying to reconstruct and solve every problem, of course. But the result is that whatever we build, whatever our solution looks like, when viewed from a distance, is not going to be as coherent as the closed platforms.
If you’re yelling at your screen that this is the point, and that we’re not trying to remake Facebook, you’re correct. But someone moved the goal posts on us.
I am not going to touch on actual apps. But being relegated to a 3.5” prison meant that clarity had to come to the fore. Things had to be easy. Getting around had to be obvious. (And it was, before someone started making hamburgers.) We were all taught to think in terms of the unit, and to deliver the unit, to concentrate on the morsel, which possibly could be expanded into the short scroll view.
The app for Twitter is obvious. The app for Facebook is obvious. The apps for Tumblr, Reddit, Medium and Instagram are all obvious. We could all close our eyes and imagine how they would work. The app for weblog-shaped objects is stuck at being the RSS reader. There’s nothing wrong with the RSS reader, but the focus is entirely on the technology. Maybe you could argue that it has to be to some extent because in a decentralized system you have to be able to discover things and add in other things.
Consider the web browser - if people still relied on typing in web addresses, we would all browse much, much less every day. But we don’t. We have our favorite search engine and we have links. They all depend on, but abstract away, the technology below. The RSS reader is best described, when masking away particular technologies, as a “feed” reader. People understand what the web is. Most people don’t like having to tend to, curate and maintain a list of feeds. Maybe they wisely consider that the friction of adding a feed, not only in finding the subscribe button but in thenceforth reading more stuff, leads them to listen to fewer or more palatable opinions.
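To make the "depend on, but abstract away" point concrete, here is a minimal feed-reader sketch using only Python's standard library. The feed itself is invented for illustration; real-world feeds (Atom, namespaces, enclosures) are much messier, and this handles none of that.

```python
import xml.etree.ElementTree as ET

# A toy RSS 2.0 document standing in for a real feed.
RSS = """<rss version="2.0"><channel>
<title>Example Weblog</title>
<item><title>First post</title><link>https://example.com/1</link></item>
<item><title>Second post</title><link>https://example.com/2</link></item>
</channel></rss>"""

def read_feed(xml_text):
    # The reader's job, stripped to the bone: hide the XML
    # and hand back "posts" a person can recognize.
    root = ET.fromstring(xml_text)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in read_feed(RSS):
    print(title, "-", link)
```

The technology is twenty lines away from being invisible; what's hard, as the paragraph above says, is everything around it - discovery, subscription, curation.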
The apps that exist and are somewhat successful are the platform apps of individual actors, like Tumblr, Medium and Reddit, and news aggregation apps like Apple News. (These all tend to be local, so it’s hard to exemplify them.) They all have the podcasting problem too. But being decentralized, like the web, means that everyone has to run a little piece of infrastructure wherever they host what they write. And it means that there will have to be a few open vendors to collect or refine the result of this infrastructure (Technorati then, Disqus now).
This is where I’m stuck as I think about this. I am less inclined than ever to imagine that people will start demanding sharper crayons from their hosting providers - or to consider whichever way they publish their thinking a “hosting provider”. I am less inclined than ever to imagine that individual vendors will pop up to solve the hard problems or to provide the bird’s eye view that would bring more clarity to the process.
Medium is interesting because it proves that people haven’t stopped wanting to think and to publish. They’ll do so given the tools available to them. But Medium itself is still just another closed platform that would rather you read more from its network. It’s not in Medium’s best interest to point readers to things hosted outside of Medium, and it’s not even obvious to me how Medium will survive the next few years.
Decentralization tends to lead to parts that survive on their lonesome no matter what. It would be great if this resiliency, this freedom could form an open platform. But like asking for seven red lines, all of them strictly perpendicular, some with green ink and some with transparent, there are boulders in our path that I don’t know how to work around, that make me think that what I remember wasn’t so much a golden age to which we can return but a fortuitous pocket in history, a reality that was inevitable to lose and that we would have to work very hard to bring back.
Look for something else, something big, something that changes the equation, something that makes it easier to write and less frustrating to follow and discover. For lack of anything new, we may as well all be wearing red hats.
This is an attempt to conceptualize the previously hypothesized pins in a more approachable and concrete way. That post was almost like a manifesto; this will hopefully be more down-to-earth reasoning that sells you on the benefits.
As always, there’s a risk in taking a bunch of ideas and putting them together in a concrete way: someone might say “I don’t like that, therefore the whole thing is crap”, and that’s fine. But there’s also a chance it gets people thinking “why don’t we have anything like this now?” or “why is it that no one is thinking in these terms?”, and that’s what I want, more than Apple or Google or Samsung or HTC or Huawei taking this thing and literally implementing it.
So let’s start describing Up Top as if it existed and was already implemented. (This is not a rumored feature. This is just me dreaming.)
What is Up Top?
Up Top is a new part of using your phone. Up Top is where your ongoing things end up. Just like Siri is a person you can ask, Up Top is a place you can look.
It’s called Up Top because it’s above the status bar. Pull down on the status bar (or the left ear of the iPhone X) and you pull down Up Top. Pull even further to pull down the lock screen and see your notifications.
What’s in Up Top?
Anything you put there. We’ll call the things you put in Up Top items for now. There should be a snappier name for them, I just haven’t been able to think of one yet.
In a transit app, you can put the countdown to a bus or train there. In your calendar, you can put an event there, for quick access. Turn-based games like Words with Friends or Chess can put an ongoing match there.
Each of these items is the size of two app icons - you can fit two of them side by side, and if there’s a lot of information you’d like to see, you can let an item be twice as wide and span the screen.
Sounds like notifications to me.
Notifications are a bit messy - they show up out of nowhere, go away, and all pile up in the same list from which they have to be cleared out. Items in Up Top are more focused. They are each associated with a thing, so they are persistent, and they’re like a little part of the app you add them from. Each item is like a widget dedicated to the thing you added.
So if you added a bus or train ride, you see the countdown in minutes, and you see how far away you are, and you can tap the item to hop straight into the app and go directly to the ride to see the rest of the info. And once you’re done with it, you can toss it.
So this is just for things that are about to happen?
No - you can also add places and people to Up Top and associate/tag other items or other things with them. You can add a store across town, and tag it with a review or shopping list. You can add a friend, and tag them with something you want to show them the next time you see them, like a link or a photo album.
That must get very crowded.
Yes and no. You can put a lot of items there, but it won’t be as crowded as you might think.
By default, what you see is the current items (like the bus countdown) and the items that are relevant to you right now.
If you have a conversation with your friend open in an app and pull down Up Top and you added them to it, their items will show up there. Same with places - if you’re physically close to a place and pull down Up Top, the items you linked with the place will show up. The friend and the place will be like folders of items - you tap them to show everything.
And of course, you can choose to see everything you put Up Top, and search through it.
So I have to pull down Up Top to look at things?
Yes and no.
For one thing, Up Top shows up on the lock screen too, for the same reason your recent notifications do. They’re relevant to you right now. But Up Top items also integrate with notifications.
Now, if you have a countdown or a stock quote there, it would be very distracting if they sent notifications every time it changed, every time there’s a new minute, every time there’s a new stock market tick, so that’s not what happens. But for example, the bus countdown could send you a notification when it’s about time to leave. A game match could send you a notification when it’s your turn. And you could have a stop or limit or alert for the stock quote, so that it would tell you when the market price has reached a certain point.
And when you get a notification from an item, both the item itself and the notification message will slide down and give you much more information and context than if just the notification message had appeared - which means that the notification message itself can be shorter. You can pull down Up Top and take a look at your items as they are, and when something notable happens, like your ride is here or your order has shipped, the item slides down along with the notification.
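A sketch of how a quote item's alert logic might behave, in Python. All names and thresholds here are hypothetical (this feature doesn't exist); the point is just the behavior described above: silence on routine ticks, one notification at the crossing.

```python
class QuoteAlert:
    # Hypothetical Up Top stock-quote item: ignore routine ticks,
    # notify once when the user's chosen level is crossed, then
    # stay quiet until re-armed.
    def __init__(self, limit, direction="above"):
        self.limit = limit
        self.direction = direction
        self.fired = False

    def on_tick(self, price):
        crossed = (price >= self.limit if self.direction == "above"
                   else price <= self.limit)
        if crossed and not self.fired:
            self.fired = True
            return f"Alert: price hit {price}"
        return None  # ordinary tick, no notification

alert = QuoteAlert(limit=185.0)
for p in [183.9, 184.6, 185.2, 185.4]:
    msg = alert.on_tick(p)
    if msg:
        print(msg)  # fires exactly once, at the first crossing
```

The same shape fits the bus countdown ("notify when it's time to leave") and the game match ("notify when it's your turn"): the item holds state, and the notification is a single meaningful event rather than a stream.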
Okay, so this sounds fine, but what about the more fleeting items? I’ll have to go into apps to re-add these things all the time.
You could. But when the app lets you add an item to Up Top, it can also recommend a pattern or attach to the people and places you already have there.
For example: If you have your home as a place in Up Top, you could have a template of sorts for keeping track of the next ride with the bus you take to work. It wouldn’t send you any notifications or be visually conspicuous, but if you have to head off to the bus, you slide down Up Top, tap Home and tap the template item for your bus. It turns into the real item, you see the countdown, you start getting any related notifications, and so on - all without you having to set it up again from scratch.
So what’s the difference between this and a widget then?
A widget is one thing to summarize the state of an entire app. It’s an overview (like the next few upcoming calendar events), or it’s a mirror of the most important function in an app (like a mini calculator, or the lookup result for the number in the clipboard in a directory app). Widgets are still useful for what they are. Up Top items are like what would happen if the particular thing you were interested in in one app got to have its own widget.
This message is brought to you by the new Federal Motors 2019 Hoodwinker Saloon. Experience unparalleled joy with industry leading mileage* and up to eight** cupholders! Pay your local dealer a visit for a test drive and receive two tickets to Star Wars®: The Last Jedi™***! Starting at just $12,345****. Click here to sk–
* Mileage may vary.
** Available with sports package.
*** While supplies last. Limited to two per household. Star Wars is a registered trademark of the Walt Disney Corporation.
**** Suggested retail price.
Today’s edition of Waffle is brought to you by Netmix! The video content you enjoy, now on every screen. Season 10 of House of Fjords now available.
Dear senior executive,
You may also enjoy: Fictional letters to the editor throughout history
I sat down to read an article shared to me by my grandmother on FACEBOOK, INC. [FB; 184.33; -0.34 (-0.18%); at close: 4:00PM EST; stock prices provided by Varyzen for informational uses only] and I was simply put, floored – for quality flooring, enjoy Lowe’s new line of walnut parquet – by the clarity and efficacy of your original content – create your own weblog now at WordPress.com.
In related news: Net Neutrality: worst idea in American history or World history? We survey ISP marketing departments
Not only was it relevant and insightful, I was unable to locate any promotional consideration. As a concerned customer, I must insist that you include more advertisements, so as to ensure your continued survival – stock your go bag and bunker with quality MREs at The End is Nigh.
Around the web – provided by Hoogleboot Content Services
Remember these celebrities from the 70’s? Here’s what they’re up to today
This one weird trick solves hunger forever
Russia is planning to attack the US – read the secret plans
Once again, my warmest applause for your friction-free – stop slipping this winter with Gary’s Gravel – reading experience. Truly, we are living in the golden age of journalism.
Autoplaying in two seconds: Why there is not ever a reason for anyone who still maintains ownership of a soul to engage a content blocker
The Average Internet Content Consumer
There’s nothing wrong with Apple today that can’t be solved with a bit of Howard Moskowitz.
If you are one of the five now living people who have not yet seen one of the first TED talks to break when they started being published, allow me to summarize the Malcolm Gladwell talk on spaghetti sauce:
Howard Moskowitz is a psychophysicist - an expert in what satisfies us. One of his clients is Pepsi, which is developing Diet Pepsi and asks him to find the right level of sweetness within a small span of allowed sweetness. He does the obvious thing of making small batches of Pepsi at each interval of sweetness, but there’s no dominant, obvious perfect match. This bedevils him long past the end of the contract, until he figures out that there’s not necessarily one perfect, platonic Diet Pepsi.
He sets out to broadcast this notion, at significant cost to his career. Eventually he is hired by Prego spaghetti sauce, where he tries out his theories by varying many constituent parts, uncovering the need (and untapped market demand) for chunky spaghetti sauce, which is launched to great commercial success. With products that cater better to many different tastes, customers will be happier, because they get something closer to what they desire.
The arrow of Apple, over time, is to strip down products to bare essentials. Fewer parts. Design that does not hit you over the head with how someone sweated over something. (You may think that Apple products look ostentatious, but go pick up a “gaming”-oriented product and count the number of discrete surfaces, or angles, or glowing LEDs, or trackpads placed above the keyboard, and consider how much worse anodizing the whole thing rose gold really would have been.)
This is fine and good, but it also means that they want to remove as much as they can possibly remove. Make thinner as much as they possibly can make thinner. Recently, we’re seeing this desire fight actual purposefulness.
Apple has to keep around last generation’s MacBook Pro and MacBook Air for the increasing number of people who do not want to live the dongle life, or who worry that keys will be put out of commission by pieces of dust. And Apple had to publicly admit defeat in having bet on the wrong, minimalistic, two-GPUs-bolted-to-the-side-of-a-cooling-core horse and remake the Mac Pro from scratch.
What I’m proposing is that it doesn’t have to be a tug of war. There is a way to solve this. It’s called introducing another product.
Slide the current Touch Bar MacBook Pro down and call it MacBook Pro Mini. (This or Pro Air. It’s not a perfect name, but they currently sell both the iMac Pro and the Mac Pro and they’re both pricey Pro desktops that differ by four years and one letter, so get off my back.) Leave it as is - allow it to get even thinner, even.
Introduce a new MacBook Pro, that is as “thick” as the previous MacBook Pro. Put MagSafe 2 on it, put Thunderbolt 3 on it, put USB-A 3.1 on it, put an SD card slot on it, put a headphone jack on it. Put its existing keyboard on it, function keys and all. Upgrade it to some variation of the butterfly mechanism if the improvements involve figuring out a way for it not to be disabled by individual specks of dust.
What about the Touch Bar? Make it a customizable option and put it above the keyboard (between it and the hinge) where it rests outside of the keyboard area.
Stuff it full of batteries. Stuff it full of discrete graphics capable of running two 5K displays. Stuff it full of 2 TB worth of SSD, 32 GB full of RAM.
Slice off some of it for the 13” version, to make it all fit. For extra credit, make a 17” version again and go nuts.
There’s nothing controversial about this. (Maybe the Touch Bar.) I know the traumas and the war stories, but this is not a return to Steve Jobs coming to Apple and finding 15 lines of Macs whose strengths he couldn’t figure out. This is the same number of Mac laptops Apple already sells today, and with significantly improved messaging. This is a single spectrum: thinner means less capable but also less power hungry, and, well, thinner. Thicker means more capable. And even the thickest model isn’t thicker than what Apple is comfortable selling today.
If you are comfortable living the dongle life and not having function keys, pick the Pro Mini. If you need the power, pick the Pro. If your needs are the computational equivalent of subsisting on water and plankton and plug nothing in ever, pick the MacBook. No labyrinthine flowcharts required. No need to axe products that already exist. Just to offer more beside them, so that those of us who care can still do our job.
Apple isn’t uncomfortable making tradeoffs, it’s uncomfortable admitting to them. Don’t keep around last year’s model - simply make more models with clear and meaningful distinctions that appeal to more people, and put the desire for optimization to good use in making fewer debilitating compromises for each of them. Don’t pretend that you can only make one.
There is no perfect time to review a console. When it’s just on its way out to the store shelves, you’re judging based on what few games have slithered their way into the launch lineup, and the future is occluded by fog of war. When everything had shaken out for a year or two, it used to be safe, but nowadays fractional generation bumps like Xbox One X or PlayStation 4 Pro have started to take over. So I think, as these things go, right now is not a bad time to take a look at the Nintendo Switch.
I have always been a Nintendo fan. It used to be a whole lot less contentious. The NES and Super NES ruled the roost during my childhood, and they were the best of their kind. I have a soft spot in my heart for the Sega Mega Drive (known to philistines as the “Genesis”), but the Super NES, with its game library, may be the best console of its generation there has ever been.
Ever since the Nintendo 64 started dropping this ball, we have been waiting patiently for Nintendo to pick it up and resume its position as a leader, scoffing at then-newcomers Sony and Microsoft, who ended up edging out first Sega and then largely Nintendo itself from the market.
Since then, in trying to find its place, Nintendo may not have compromised too much on its values and its priorities, but it did fall into the trap of playing to its niche. Nintendo’s role used to be to provide the best platform for all games; now it is to be Nintendo, because no one else is. As a fan of theirs, I am glad that they are, but I still always long for something better.
Starting with the original Xbox, the landscape turned into Xbox vs PlayStation in various iterations. 3D games meant more visually stunning games, more capacity for storytelling and atmosphere, and the games that followed tended to be more “grown-up”: Call of Duty, Metal Gear Solid, Battlefield and so on. I don’t know that Nintendo wanted nothing to do with it, but with Nintendo ever following Gunpei Yokoi’s watchword of Lateral Thinking with Withered Technology, once instrumental in birthing Game & Watch and Game Boy, the consoles’ relative lack of strength meant that games designed to impress on other platforms always looked worse and could not flex as far and as much on the Nintendo alternative.
True to form, Nintendo’s only hit platform since (aside from handhelds, where they have always been the only game in town) was the Wii, which had a gimmick that put it in the hands of casual gamers everywhere. Its GPU was barely stronger than the GameCube’s. And the Wii U, designed especially to up the brawn, ended up being so much of a disappointment that people started speculating that it would be the last Nintendo console.
Leading us at last to the Switch.
You could not swing a Wii Remote around you by this time last year without finding someone who would tell you that the Switch was the wrong move, both too late and too early, and definitely doomed to failure. Not everyone said it, but it was a completely reasonable conclusion to draw.
A Nintendo console would be based on the brains of the Nvidia Tegra platform, known chiefly for its appearance in the Nvidia Shield mobile console, which is remembered mostly for going absolutely nowhere. It would attempt to cater to everyone by going both mobile and stationary. It would have small, undersized controls and motion controls, but with no sensor bar. It would have a screen and could attach to the TV, but you couldn’t use both at once, as you could with the Wii U. And it would still run hot enough to necessitate a vent.
And Nintendo expected people to develop for this? Legend of Zelda: Breath of the Wild would be available, sure, but it would also be on the Wii U as originally promised. Fast forward to the January announcement, and the only other launch system seller was 1-2 Switch, a game where you sometimes weren’t even supposed to look at the screen. At the same event, a warmed-over update of Mario Kart 8 was given top billing, and Nintendo managed to produce an EA representative proud to announce that FIFA (only) would be coming to the system. I don’t think I’m the only one to have had a FIFA 64 flashback. And when Reggie assured us that games wouldn’t be a problem, the Wii U’s (lack of a) software catalogue loomed over his words.
With the Switch in hand, a few things immediately come to the fore. For the first time in recent memory, the interface of a Nintendo platform is actually snappy. It is responsive, pared-down, focused. From the moment you turn it on, you understand that it is there for one thing: games. Things load quickly, the shop can be opened from everywhere without closing the game and the interface is refreshingly non-“bubbly”. The style is more “modern iOS with a different font” than it is a refinement of the Wii U model. If not for the Miis and the Nintendo characters, it would be easy to miss that this is a Nintendo console going purely by the menus.
With everything down to games, and no hopes for vague services like the ill-fated TVii to pad out the offering, there’s really not much to say about the actual console. There’s rather more to say about the games. Having reached the tail end of what was forecast at January’s announcement, I can say this: this is the best year in a long while for Nintendo. Nintendo managed it in part with cheap ports, sure, but without full-on Virtual Console support and without even a fully developed online package.
Instead, when I wake up my Switch, I see games that aren’t simply fluff. There’s the inimitable Breath of the Wild, a genre-defining game that’s the best Zelda ever. There’s the aforementioned FIFA, which may not have all features, but which is a fully competent, recognizable FIFA version, way unlike the FIFA 64 debacle, and many times better than a mobile port. There’s the radical Sonic Mania, an entry in the alternate Sonic history where they didn’t completely lose themselves in the character development and 3D wave and stop making actual Sonic games, worthy of an article in itself due to its storied background. There’s the fresh Super Mario Odyssey, a return to form for the 3D Mario platformer which, in depth and scope, makes me feel like I did playing Super Mario Bros. 3 for the first time. And there are games that may not be groundbreaking, but are simply now available to me on a Nintendo console, like Worms W.M.D, which never would have come to the Wii U. (More normal people may be interested in noticing Skyrim in this roster too.)
Looking back on the first year, I get a good feeling seeing what the year has brought. It still bugs me that for every port, Nintendo wants there to be a Nintendo-ish stamp on it, like Zelda clothes in Skyrim and a whole Mario entry in the Rabbids series. But compared to the GameCube, Wii and Wii U, the Nintendo-ness has been dialed down and focused. What’s left is a competent console that has enough power to play decent and recent games, with a good conventional controller. The gimmicks turn into extras that enhance the experience, but don’t sell out the prospect of porting to it for those game makers unwilling to use them.
Breath of the Wild is a ridiculously beautiful game that’s just as ridiculously universally acclaimed. It put a lot of consoles in a lot of hands or docks, and you didn’t have to do some sort of Nintendo-specific handstand to take advantage of the feature du jour. It was just a good game that you could play on the go if you liked. Despite also being available for the Wii U, Nintendo managed to sell more copies of it than they did of the actual Switch console for the first quarter, and it set the tone for the entire console in the public mind. Good-looking enough games that were great to play. All the rest of the Nintendo-ness is there for those of us who know where to look, but for the first time since the Super NES, it doesn’t stand as an obstacle for everyone else. A solid, focused console with performance that is good enough is what lets you establish yourself as a viable alternative to owning an Xbox One S or a PlayStation 4 when half your launch lineup is literally NEO•GEO ports.
Looking into 2018, Nintendo’s own pipeline is looking a bit less bountiful. (I’ll probably get the upcoming Yoshi game, but I’m not into Metroid.) 2017 was a fantastic year whose shadow will fall heavy on the future; any year with significant Zelda and Mario outings will be hard to match, never mind top. Nintendo’s next challenge is to make sure we don’t all fall into a collective depression over this. But the flow of third-party games is both steady and increasing, for the first time in recent memory. If you want to play the latest Battlefield, it’s still not the system for you, but it may be a viable alternative even for someone who isn’t remotely interested in Nintendo games, and that is a first.
iPhone X is the beginning of the second iPhone decade.
It’s easy to divide the iPhone’s eras up by technical criteria. Screen size, pixel density, camera megapixel count, cellular radio technology. The fault line between iOS 6 and 7 is also very enticing, as is counting home screen icon rows. But the reason most people who use iPhones use iPhones is that we like the way it feels to use. In this, the iPhone X brings something new.
You can often sniff out details of upcoming iPhones by noticing small changes in iOS. iOS 10 put time and energy into refining the audio source selection - because that’s much more important when you rely on wireless methods, as in the iPhone 7 and forward. And most changes in iOS 11 seem to make much more sense in iPhone X.
To a much larger degree than before, you can “flick” out of things, like viewing photos. You tap to view a photo, then grab it and throw it a little bit, and it zooms back to its place on the previous screen. It’s hard to come up with a physical analogue for it, but it just feels right. Apps moving to and from their “cards” and being flung back to their icons both naturally limits unexpected motion and is motion you’re personally in control of, and the potential for motion sickness goes down by putting you in the figurative (but metaphorically appropriate) driver’s seat.
When the physical home button was made into a fake button in iPhone 7 I was skeptical. The force touch trackpad on newer MacBooks works because the Taptic Engine (linear actuator) is directly below the trackpad and smaller than it, and because of the phenomenon where that feedback to your finger feels like the button is being depressed. But the Taptic Engine in the iPhone 7 is not underneath the home button. It feels like the entire phone, more or less, is giving nondescript feedback - much like the clumsy attempt at haptic feedback for some early software keyboards on BlackBerry phones. It’s not an unpleasant experience, but it’s also not a button press. You can’t fake a button press like that.
The home button strip owns this problem - it doesn’t try to. When Bret Victor describes the limits of touch screens, he is correct in observing that all you can basically do is tap and swipe. As it turns out, swiping is great for app management. Swiping left/right to speedily switch apps, or pulling up to get the carousel, is easy. The combination of the OLED display’s color, contrast and fast updates with a CPU that iOS 14, three years from now, will have to work hard to bog down means that everything is really fast, really fluid, really responsive and never drops any frames. This can only get even better when iPhones get “ProMotion displays” with variable and higher frame rates, but everyone responsible for this part of the experience can confidently ask for a bonus this year.
(A side note on the aforementioned Bret Victor: he’s also one of the guys behind the much-maligned Touch Bar. The reason the Touch Bar is a bad idea is that it ruins expectations without playing to the strengths of the replacement. Sure, you get a dynamic repertoire of commands, but most people are used to being able to use muscle memory to achieve the same dynamism, and the lack of key-like haptic feedback all but destroys it. And although it’s small enough that swiping can be used to provide fine-grained input, halfway down the palm rest sits a trackpad, which is also now pressure-sensitive. As long as the Touch Bar has to live in that environment, for the people it doesn’t completely charm, it can’t help but disappoint.)
The iPhone X feels like a new kind of iPhone in a way that iOS 11 on an iPhone 8 doesn’t. I’m not sold on iOS 11’s visuals or typography as a whole, but it is a cohesive package, where motion, interaction and “cards” cross the entire operating system and repeat across apps. iOS 7’s introduction of physics is reaching full bloom and things react predictably and consistently. Face ID so far works so well as to assume transparency, which is what you would wish the camera/sensor notch would also assume, but which is not a noticeable problem in use in portrait. If the role of iPhone X was to set up the second decade of the iPhone as a new generation: mission accomplished.
It’s also easy to come off or be painted as a curmudgeon if you say anything about a certain 3.5 mm headphone jack.
There is great utility in being able to listen to something just by plugging in a cable. There is great freedom in being able to listen to something without worrying about battery life. There is great convenience in being able to push a button and hang up or go to the next song or adjust volume.
When Apple pulled away the jack from under our feet and said “it’s okay, we’ve got the AirPods”, they may well have introduced the smoothest functioning pair of wireless headphones ever, but they do nothing to solve those problems. A true advancement makes the previous limits irrelevant, and what Apple did is pull the Apple Maps gambit, saying “we’re close enough, this is gonna fly”, out-and-out dropping whole swaths of functionality while painting it as an improvement.
I hate the fundamental interaction model of wireless headphones, but I would maybe tolerate it if the battery lasted for a week. It doesn’t. Not even for the ones that actually do have remote control buttons. There are now wireless adapters, little things you’d plug your wired headphones into to turn them into wireless headphones, and I’d get one if they made one that didn’t simply match the pathetic battery life of most wireless headphones.
The sad thing about Apple being Apple is that when they’re wrong, they’re likely to stay wrong. They go all out in total war against every opposite viewpoint and make it next to impossible for themselves to back up and rethink something. If their rallying cry for the AirPods had been “these are the nicest-to-use wireless headphones ever”, they could have gone back on this decision, or offered a model with a headphone jack and a lower IP rating and let customers decide which is more important.
By letting all of us swallow the good with the bad, they can claim victory in any argument simply by taking advantage of most customers not fleeing to Android. Between this and the insistence on all USB-C, all the time, in the MacBooks, I think many of us are looking for a more pragmatic Apple, the sort that already rears its head every now and then, deciding to take the Mac Pro back from the brink, to put more ports in the second-generation MacBook Air, to drop the original iMac puck mouse or to start making the Mac mini. Instead we get the Apple that proclaims its stores are now “town squares”. Barf.
So why’d I get one then? Because with inductive charging, at least I can still perform the mind-blowing, high tech maneuver of both charging the damn thing and listening to music on it at the same time.
And so far, it seems to work just fine. There are small alignment issues - put it significantly off-center and the coils won’t align. I’m using the Mophie charger because of the promise that it will be updated to support faster charging. It is simple and unobtrusive enough, and the combined form factors also make it easy to tell at a glance whether there will be alignment. The phone hasn’t slipped off the base, but the cord is also laughably short.
None of which is to say that this is impressive. Adequacy is the name of the game, and I’m to understand that the charge speed is slightly crippled compared to an ordinary charger, which may be a real impairment. How the upcoming AirPower charging mat will change this is anyone’s guess. Unless it ups charging capacity and eliminates alignment issues, it will likely not be worth the… I’m guessing $99–$169 it will go for. The wider format is a plus, but I own nothing else that will be charged by it.
It is more expensive than it should be. A lot. And the decision to skip a size and demand more for a 256 GB model is a nickel-and-dime move - one that may also be explainable by wanting to limit demand in the face of poor production capacity, but a nickel-and-dime move nonetheless.
Just like the original iPhone’s initial hefty price tag, it is hard to justify. Like the original iPhone, the price will go down fast if you’re willing to wait. But also like the original iPhone, it is a place where the future of the smartphone, incomplete and still fledgling as it may be, is showcased first. And, far from any sort of desire to “show off”, I guess that is what tickles me more than anything else: finding the new not in the application of facial scanning, world-class processor design and high-quality screens bleeding almost as far out as they ever could, but in ostensibly small tweaks to a user experience ten years in the making, breathing new life into it and establishing an interaction model and vocabulary that can go so many places over the next few years. Notches, dongles and misplaced priorities aside, how can I resist that?
I was here before and I’m here again.
The current climate is confrontational. It’s more or less people convinced that they are correct, running into each other head first. Sometimes not every one of these people will be correct.
It’s easy to point to Twitter or Facebook or whatever and call it the problem. It’s easy to form the opinion that the only thing that changed is that people got fewer characters to type with. It’s easy to take that as far as to think that you can, to take an example at random, swing an election that way.
It is true that we all, mostly, are information junkies with shorter attention spans than before. But that’s not a change over 20 months - try 20 years. Those of us, most of us, prone to lap this all up have had it in us from birth. And as more and more of the world pulls apart from bound books, from neat, nestled entities into the big weave of opinions and memes and eddies in the space-time continuum, it would be against our nature not to find this fascinating, or interesting, or both. Certainly, it dispenses with a lot of messy ceremony and context.
Waffle was here before and ran on fumes, continuing on the foundation it had once established. It had a formula, and I wrote to it. The reason I wrote to it is simple: I like thinking about things. I like ruminating. I am not super smart, I am not super insightful, I do not in any way live my life without a box, so that I may think outside it, and if I did there’s no guarantee that those thoughts would be of particular import or brilliance.
But as you might do, being the pattern-recognizing biped that you are, you might write more or less random crap (with some thought put into it, but still, random crap) for over 12 years and occasionally strike upon something that resonates. You may even time some of those with being the first to say something in particular, or to point something out in particular. And you may fall in love with “having been right”, or being in some way hard to replace.
We’re all hard to replace, and we’re all easy to replace. Every cemetery is full of indispensable people.
So, past a certain point, it may be that I was just playing the role of writing Waffle, each time expecting what I wrote to take off at least in some minor way. Since almost nothing ever did, this was a harmful delusion.
Thinking about the climate and the tempo, it’s not hard to understand what’s going on. At most, what I write is going to be a link passed around from place to place. Everyone sees hundreds if not thousands of them each day. The chance of winning that lottery is minuscule, and of course relies on having something that is either truly good or that strikes a tone beyond just being lucky enough to get noticed.
And from my perspective, not having a Twitter account especially, and not tooting my own horn beyond publishing new items in my Atom feed (a minute, please, to mentally commiserate with the ghosts of Echo, nEcho and Pie - and if that means anything to you, know that you have a good memory, and that you understand what I was comfortable with as the “good old days”), the whole process is both opaque and hopeless, like putting things in the window of a shop on a long since forgotten dank alley, and wondering why there are never any customers.
I did a lot of it this year. Out of the combination of chaos, stagnation and feeling of helplessness, there had to be some destruction. If nothing matters anymore, you eventually punch a wall to see what happens - to see if there will be a mark, to see if your hand will hurt. There will be pain and anguish.
To me, there was the horrible sense that I defined myself as being someone who tried to think about things and tell people about them, and now I was living in a world where all everyone cared about was sharing their opinions on everything to the point where it overshadowed what was important. Even knowing how silly this sounds, it caused me no end of deep existential misery.
Once again, it’s not that I’m brilliant or exceptional, or really think I am. It’s that I’d been acting a bit as if I was, and the tide of evidence telling me how wrong I was crashed down hard on me.
I threw in the towel with everything. I licked my wounds for months. I turned inward. I held in a lot of things, I clenched my fist in the dark, and I thought all of this meant I wasn’t allowed to have or communicate ideas anymore, as if there was a privilege I’d lost.
Only very recently did it dawn on me that I should just keep doing what I was doing, but for the sake of doing it for myself. You can call this a five cent, one-hour-mark Pixar epiphany if you’d like, but that’s how it happened. If reasoning and thinking and feeling conviction is so important, for fuck’s sake, just do that, then. Write it down. Put it up. Don’t sit on it. Get it out.
So I am.
I’m still holding out on joining Twitter. I think I’m hyper enough already, and I still think it’s a horrible medium for both conversation and publication. I still think it feels like talking loudly at people. But even if I liked it, I wouldn’t want to care too much. This is now a strategy. To not care whether something gets linked, read, noticed, those things. If they do, fine. But no writing hoping it will. No writing unless something moves me.
There are two more things. The first is “subheads”, and maybe some of you are saying, well, thank god, what took you since 2003? Looking back, not using them was a stylistic choice, but it also reinforced the idea that I don’t even have to try - that I can just stream-of-consciousness something and it’ll be brilliant. That’s not it. I avoided them because I knew I’d hate the awful puns I’d make up - and knowing myself, I will, and already have - but that’s just going to be the lesser evil.
The second is that I’m going to try to be honest and humble, and not self-deprecating. I am not a happy person and I am struggling with many things. But that is no more a shame to be covered up than it is a violin to be played for pity.
The first real entry has some energy to it, and I wanted it to have energy to it, because I wanted it to push its own idea. It proposes something new that from a distance can be pigeonholed as other things. But if there’s anything I’m tired of, something that existed even in the happy, tightly-knit, turn-of-the-century era of people writing well-considered weblog posts at each other, it’s that so many of them were written with an angry superiority, as if establishing absolute dominance over their subject matter, proclaiming the absolute best way to peel that particular onion. I don’t want to do that in general, but I feel that a new idea is fragile enough to deserve some pompous presentation as armor and shield from the environment.
I also dropped WordPress and its consequences. Right now, this is generated by a static site generator, and I am using a default theme, shame of shames, albeit heavily modified to cut a lot of crap out. I type it locally, I generate it manually and I upload everything myself like a rube. But the file transfer connection is SFTP and not FTP, so you can tell it’s 2017. Having to do all this hopefully helps me not want to do it unless something really moves me enough to write.
It is time for mobile operating systems to recognize the way we work and the tasks we actually perform. It is time for a people centric view. It’s time for pins. Not widgets, events, notifications or alarms - pins. It’s not a perfect name, but let’s call it that for now, because ambient, semi-ubiquitous, transient widget/notification mixture, aimed at letting people be people is a bit of a mouthful.
Consider the following:
You go look up when your train or bus will leave, and you see the time of departure. Today, you can look at that time all you want, but this is where it ends, and everything else you can do you have to arrange yourself.
But this is not reality, this is a fictional pocket universe, made for the purpose of illustration. So instead of setting a manual timer, creating an event in your calendar or attempting to remember it on your own, you can tap it and pin it.
Now, the time shows up on your home screen, and on your watch, and likely with your notifications in the appropriate form. It is about the same size as a widget, but it shows just this one thing. The app that created the pin is responsible for how it looks and feels - it is like you “tore off” a piece of UI from the app and get to take it to go. It counts down live, and you can set an alarm, which the OS handles but the pin fires at the appropriate time. And you can tap it to go back into the app, to where it’s shown in context.
The pin can include some smarts, like warning you if you’re not within n feet of the station a few minutes before. If you’re getting on a train, the pin can contain a ticket or know to bring up the right card in Wallet. It can also guide you to the destination with your favorite map app.
When the time has passed, you can dismiss the pin at any time, and see the past pins in an archive (which you can choose not to keep).
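The lifecycle sketched above - pin, count down, fire, dismiss, archive - can be made concrete with a small data-model sketch. This is purely illustrative: no such API exists, and every name here (`Pin`, `PinBoard` and so on) is invented for the purpose.

```typescript
// Hypothetical sketch of a "pin": one torn-off piece of app UI
// with exactly one job, a lifecycle, and an archive of past pins.

type PinState = "active" | "fired" | "dismissed";

interface Pin {
  id: string;
  sourceApp: string; // the app the pin was "torn off" from
  label: string;     // the one thing this pin is about
  firesAt?: Date;    // optional alarm; the OS clock drives it
  state: PinState;
}

class PinBoard {
  private pins = new Map<string, Pin>();
  private archive: Pin[] = [];

  // An app creates a pin; the OS displays it on home screen and watch.
  pin(p: Omit<Pin, "state">): Pin {
    const created: Pin = { ...p, state: "active" };
    this.pins.set(created.id, created);
    return created;
  }

  // The OS clock calls this; pins whose time has come are fired.
  tick(now: Date): Pin[] {
    const fired: Pin[] = [];
    for (const p of this.pins.values()) {
      if (p.state === "active" && p.firesAt && p.firesAt <= now) {
        p.state = "fired";
        fired.push(p);
      }
    }
    return fired;
  }

  // Dismissing moves the pin to an archive (which you can choose not to keep).
  dismiss(id: string): void {
    const p = this.pins.get(id);
    if (!p) return;
    p.state = "dismissed";
    this.pins.delete(id);
    this.archive.push(p);
  }

  activeCount(): number {
    return this.pins.size;
  }

  archived(): readonly Pin[] {
    return this.archive;
  }
}
```

The point of the sketch is how little state a pin needs: the app supplies the look and the content, the OS supplies the clock, the surfaces and the archive.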
Imagine if there was an app, like contacts, but just for places. Let’s call it Places for now. You can set up the places you frequent, with help from your phone, which already knows this from keeping tabs on you. A place can be associated with some part of your life, like work or home, or some standard activities, like grocery shopping, hobbies or exercise.
Now, apps can create pins connected to a place, or to a person (people already exist in the form of contacts). An app can bring up a pin when you get to a certain place, or when you’re talking to someone. Buying milk when you’re near the grocery store is a good example, but how about getting a kitchen scale or a new pillow or sandpaper when you’re just passing by the speciality shop on the other side of town? And what about showing contextual information when you’re talking to or texting with a person?
Almost nothing about this is new. If you’ve got a case of the well-actuallies, you may have been itching to point out that some of this is possible. And it is. But it’s not dirt simple. It’s available in cumbersome corners of unrelated applications, or hidden behind comparatively tax-form-like incantations. You have to jump through hoops to do it, and that doesn’t encourage doing it. If you are determined, you may do it with one or two things, but not with everything.
We are limited by the shallowness of the current metaphors. Events and reminders are meant to belong at one point in time or in a list, and it only occurs to us to use them when planning or making lists. Widgets are meant to show only the most prominent, most important or most recent information and are limited in scope, but the idea of one taking up space even when there’s no train to wait for is ridiculous.
Notifications are somewhat close to this, but meant mostly to deliver one message in an instant. But their meager capability and lack of customizability chops most of these things off at their knees. Apple TV’s nudge to use the keyboard and Wallet’s prompt to bring up the relevant ticket are both useful, but are annoying because they look and feel like notifications which are meant to be acted on or discarded. They feel off because they are off.
A pin is a widget for something you care about at that moment, or is associated with a place or a person, that you choose to put in there and that will be relevant to you. It is super focused on exactly one thing, and it will help you do it, achieve it, be on time for it or whatever else it’s for.
Any app can offer them, but you choose to activate them. They are transient, but they give you the feeling that the phone is actually helping you. The late Windows Phone platform was very keen on “saving you from your phone”, and its live tiles went part of the way, choosing to focus on showing more photos and cycling more information, and on letting you pin direct portals to friends instead of apps on the start screen.
iOS in particular lives and dies by people hopping in and out of apps. There’s no effort to further evolve this, except through talking to a personal assistant. Typing to the assistant is now an option, and many assistants show you things they guess you want, which is fine and even worthwhile in itself.
But items like this should live within the OS, across apps and be under your direct control. They should be graphical and manipulable by touch, and they should flow from the interaction with your app. Just like tapping a date, phone number or email address offers to create a calendar event, call the number or send an email, so should nuggets of information associated with future actions be able to live on and be acted on.
- Pinning a stock quote, exchange rate or auction, seeing a live result and receiving an alert/notification when something’s happened.
- Pinning a tracked delivery, seeing the current estimate, and getting options to pre-sign when it’s time.
- Pinning a note to ask your boss or co-worker about something, and having it pop up when they call you.
- Pinning a note to that restaurant you go to twice a year to remind you what was so wrong with what you had last time, so that you can try something else this time, and seeing it when you go there.
- Pinning a reminder to when you leave work to quickly note what you did today, so it will be easier to log your time.
- Pinning an ongoing sports game, so that you can see the score and see when it changes.
- Pinning a reminder to a store you pass by every now and then to go in and get some item you never think about and which is not urgent but that you nevertheless may need.
- Pinning a worrisome heart rate reading, and having it warn you if it rises even more, or go away when it gets back in line.
- Pinning the delivery estimate from an order-out app and seeing the countdown up front.
- Pinning the upcoming ride from a transit, taxi or ride-finding app, seeing when it arrives, and seeing topical information like expected travel time or fare when you’re onboard.
- Pinning the current weather event so that you can be alerted when it shifts, so you can get inside when it’s about to start raining or stay inside until it clears up.
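All the examples above boil down to a handful of trigger kinds: a point in time, a place, a person, or a watched value crossing a threshold. A hypothetical discriminated union makes that shared shape visible - again, every name here is invented, and the real geofencing and metric plumbing is waved away:

```typescript
// Hypothetical taxonomy of pin triggers covering the examples above.
// A pin pairs one piece of content with one of these conditions.

type Trigger =
  | { kind: "time"; at: Date }                               // transit countdown, work log
  | { kind: "place"; placeId: string; radiusMeters: number } // store, restaurant
  | { kind: "person"; contactId: string }                    // boss, co-worker
  | { kind: "threshold"; metric: string; above: number };    // heart rate, score

interface Context {
  now: Date;
  placeId?: string;                // set by the OS geofence when you arrive somewhere
  contactId?: string;              // set when you're talking or texting with someone
  metrics: Record<string, number>; // live values apps report (heart rate, fare, score)
}

// Decide whether a trigger matches the current context.
function matches(t: Trigger, ctx: Context): boolean {
  switch (t.kind) {
    case "time":
      return t.at <= ctx.now;
    case "place":
      return ctx.placeId === t.placeId; // radius handling left to the OS geofence
    case "person":
      return ctx.contactId === t.contactId;
    case "threshold":
      return (ctx.metrics[t.metric] ?? -Infinity) > t.above;
  }
}
```

Four trigger kinds, one `matches` function: the variety in the list comes from the content of the pins, not from the machinery underneath, which is part of why this feels buildable today.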
A notification is too small. It is one message, one status update, one annoying “your friends miss you!” emotional marketing turd from that pool game you haven’t touched in two years. You drown in them, they all look the same, you just want to get rid of them and they can’t do much.
A widget is static. Its content is dynamic, its shape is static. There’s only one of it, and it only shows so much. Good for avoiding information overload, but bad if you want to know everything there is to know about a few things important to you now.
Make it a mini app, make it widget-like, and make it okay for you to have as many or as few as you want. That will be okay, because each one acts on just this one thing, and it’ll be over quickly - and if it won’t be over quickly, because it’s a latent thing related to people or places, you have geofences or apps triggering its use.
The watch face is maybe the best use for this. Being able to look down and see the thing you’re doing right now at a glance makes many tasks suddenly have a dedicated device and screen, just like time and date. But at least on Apple Watch, the staticness gets in the way. You pick a few things and they’d better be the only things you ever care about. Or you’d better love manually swapping between watch faces, and not require a mix of things. Where you’d need the most context sensitivity, you get the least - unless you give it all up and trust Siri completely to know everything, with the Siri watch face.
The final step is to have apps that you trust to maintain pins for you, because you asked, and based on your instructions. A transit app that knows where you work and which lines you like to take, and can show you the best option for your next stop, popping it up when you leave home on a weekday morning. That reminder being a live countdown to the next connecting bus, updating as you finish your journey. The same reminder being live on your home screen and your watch, using all this technology to make the deduction that your bus will be leaving two minutes too soon, but that another one is coming at another nearby stop, and you can still make that one.
We have the technology to do all of this. We don’t, apparently, have the will to design this, or we are so hypnotized by apps and notifications being the only things that matter that we can’t even picture it. Personal assistants and machine learning almost attack this, but turn it into a rebus, a game of personal convenience Battleship, where you gotta ask for it just right, or hope that the continuous game of mind reading overlaps with what you had in mind.
Let’s try this instead. Let’s try empowering the concepts that already live within all of us, in how we go about our days, live our lives and where we already use our pocket rectangle to do some fact-finding or researching.
When everything’s an app, everything becomes about navigating within the app. Nearly everything that’s happened for the past few years has been about continuously refining the graphical experience. And there’s nothing wrong with that, but it shouldn’t be all there is. What things like the Siri API show is that being able to capture activities and intent is powerful.
Let’s say that you don’t feel any sort of attraction to anything mentioned here - a small evolution in this model, and suddenly, instead of popping these things up on your screen, you could queue them up in a rich list of things you want to check out, like Safari’s Reading List or Instapaper, but for actions. This may not be a perfect fit for all of them (especially not time-sensitive ones), but for those it fits, it sure beats manually making to-do lists with instructions. It’s an idea that’s been almost impossible to think of before, when actions have just been buttons you push and not things in their own right.
It’s been many years and we’ve got the app down pat. It’s time to start looking for other tools that can complete the picture and put us back in control. And it’s time to stop putting all our wishes in the form of a question to a personal assistant.