It’s often hard to remember that the personal computing era is still quite young. It only dates from 1977, with the arrival of the first mass-market PCs. These were the first computing devices meant for individuals that weren’t kits, had screens and keyboards, and could actually do something useful, like word processing, gameplay, or, eventually, spreadsheets.
The best known of these, the Apple II, which remained on the market in various forms until 1993, will have its 40th anniversary in June. That makes mass-market personal computing younger than the Super Bowl, Starbucks, and Disneyland.
You can trace a pretty straight line from these first machines to the laptop, tablet or smartphone you’re using right now. They are all personal computers, with the smartphone being the most used today. Those 1977 models (which included entries from Commodore and Tandy) are also the forebears of the web, Facebook, Google, and pretty much whatever else you’re using on whatever digital device is at hand.
The information “appliance”
But almost right from the start, and right through the 1990s and even the early 2000s, the quest was on, at least among some, for computing devices to become “appliances” or “information appliances” — that is, dead-simple to use, without training or the need for a manual. I know: the push for greater ease of use was a defining principle of my own tech column in The Wall Street Journal, which launched in 1991 and continued for 22 years.
Of course, the standard for this “appliance” level of ease of use changed over time. The bar was pretty low in 1977, when, according to Wikipedia, the then-influential Byte magazine said the Apple II “may be the first product to fully qualify as the ‘appliance computer’ ... a completed system which is purchased off the retail shelf, taken home, plugged in and used.” This, despite the fact that it used a command-line interface, needed an audio cassette recorder to store its apps and data, and took plenty of effort to master.
For many years, even as users became more sophisticated, personal computers took too much effort to use without problem-solving, keeping alive the yearning for greater simplicity. Microsoft’s dominant Windows platform, in particular, was a home for all manner of bugs and problems that required IT people to straighten out. Of course, a significant minority of people enjoyed solving these problems and loved to hack and modify PCs. But not mainstream users.
Even Apple’s Macs, which were generally much simpler and more secure, grew buggier and fell behind as the company faltered in the 1990s, and didn’t hit their current stride for years thereafter.
During that era, I tested many digital devices that claimed to be appliances, a few of which were even mildly successful. I reviewed one of them as late as 2005, a device called the Pepper Pad, which was kind of what an iPad might have been with worse designers and an even higher base price. I wasn’t crazy about it.
Only the Palm Pilot, and its descendant, the Treo, came close to the dream, but they were limited and lacked large platform ecosystems. (Despite its popularity, the BlackBerry for too long was largely an enterprise-focused email terminal with few apps.)
The iPhone changed everything
In 2007, everything changed with the iPhone. As crippled as that first model now seems, with its lack of apps and glacial cellular connectivity, the iPhone was a practical, useful, self-contained computer a child could understand. It was an information appliance. And it became even more so with the advent of the App Store the next year. Today, it seems boring to many tech fans, but the iPhone broke through the last barrier, finally allowing everybody to do a wide range of tasks on a speedy computer.
And that was followed up in 2008 by Android. It took years to sand off the rough edges and get the hardware right, but Android devices added qualities the iPhone lacked: they sold for less and allowed for lots of customization.
Then came the iPad in 2010, to add a large screen and more PC-like apps to the iPhone template.
By the 2010s, almost everybody in the developed world, it seemed, had a powerful digital device that took little or no special skills or training to use.
So, are we done with the quest for the perfect information appliance? Nope.
The next stage
We’ve already begun the next stage. You might call it the quest for the Starship Enterprise computer — the one you can simply talk to when you need information, want a task done, want to be entertained, or want to record a moment. It will require perfect conversational ability, sensors of every kind, limitless storage, prodigious, always-learning artificial intelligence, access to almost infinite personal data, and knowledge about nearly everything. These are all huge steps to take, and they will all require massive investment and more than a few failed first attempts.
But every major computer company is pursuing this goal, in one form or another, as are researchers and smaller companies as yet unknown to us. So far, we see only glimmers of this future in things like Siri, Alexa, Cortana, and the Google Assistant.
These voice-activated “assistants” often live on existing laptops, smartphones, and tablets. They also now come in their own hardware, notably the Amazon Echo and the Google Home device.
But, even from today’s vantage point, we all know they are very crude. When I asked Amazon CEO Jeff Bezos last year if we were in the first inning of the quest for these artificially intelligent, conversational devices, he said it was even earlier than that: “It might even be the first guy’s up at bat.”
I suspect that today’s mass-market AI software and hardware will, in only 10 years or so, look as primitive and quaint as the Apple II does now.
And when they get all of that figured out? Well, some other perfect information appliance will still be waiting, just out of reach.