2020: The past decade of ambient computing

It’s Jan 1, 2020.

Ten years ago we were at the end of the “computer on every desk and in every home” era. The iPhone had begun moving computing off the desk and into the pocket. The most important application was the web browser, used mainly for content, but operating systems still mattered for running applications.

Ten years later, we’re simply reaching the conclusions of some of those trends–the web browser is where nearly all applications live. An explosion of devices and user interface types running over invisible operating systems has emerged to make those applications accessible in different ways, more transparently integrated into our lives.

We’ve learned to live with the risks and benefits of keeping our data in the cloud, so any display anywhere–with touchscreen, keyboard/mouse, or voice for input–is our own personal computer.

The average family has a display in its kitchen. Continuous voice recognition remains a challenge for the next decade, but discrete voice recognition is ubiquitous. An idle mention of “weather” in a kitchen conversation triggers the display on the wall to show the forecast for the coming days.

The phone in our pocket is also a primary computing device. We dock it at our desk to gain a large display, keyboard, mouse, speakers, etc. But all of that is stateless. When we unplug and go, we still have everything with us; most of it lives in the cloud anyway.

A picture is worth a thousand words, and we have access to a lot of pictures.

Displays are everywhere, especially in our urban settings, perhaps overwhelmingly so. Whether signage for a particular store, a billboard, or a public terminal, they’re showing active, context-aware content.

At work, multiple displays are the norm for information workers. Any remaining conference rooms without multiple displays, multiple webcams, and smart integration between them feel frustratingly crippled.

And all these displays and terminals are being driven by far fewer “computers”–often just one per house or room, or increasingly one off in a server farm. Perhaps the greatest innovation of the past decade has been a subtle one–we spend less time keeping our computers working, as we have fewer of them. We’ve pushed more of the complexity to fixed-function devices, online services, and computers that are remotely maintained.

Even as our computers have become ever more essential to driving great strides in genetics, materials, collaborative creativity, and more, computing has begun to disappear into the background of our consciousness. More ubiquitous, but explicitly on our minds less often.

And still, the more things change, the more they stay the same. The singularity is always a few decades away, and so it is again for the decade ahead.