As much as it pains me to say it, Steve Jobs was right: smartphones are better without physical keyboards. You get more screen and more customizable buttons, and the brutal truth is that a well-done virtual keyboard will let you type faster and more accurately than you can with even the best physical keyboard. It's better.
But even so, in losing all those tiny buttons we lost something else: the ability to just start typing on our phones and have something magical happen. Sure, you can swipe or tap to bring up a search — but more often than not you won't bother. It's just easier, or at least more habitual, to hunt down the app you want and use its interface.
The "just type" mentality won't die
But that "just type" mentality won't die among power users — and I'm one of them. I use Alfred on my Mac to connect up all sorts of services so I can just type in what I want done and watch it magically happen. I'm also a big fan of Fantastical, which lets you punch out your appointment in natural language and lets the calendar app figure out where, when, and with whom you have an appointment. Input is free if you only hook up two services, and it costs $1.99 for every three you add on top of that.
And on the iPhone, I've been trying out a new app called Input. The idea is simple: instead of icons, you get a command prompt. You plug in the apps that you use and then you're able to shoot commands to them from a single interface, just by typing. I've plugged Todoist, Evernote, and Calendar into Input, and now I can just type what I need done (or, more accurately, what I need to get done) and those apps will magically create the relevant entries.
Does the command line really make sense on a smartphone?
But the question is whether this command line model really makes sense on a smartphone — especially one that has both Siri and Google Now available on it. Both of those virtual assistants promise similar functionality — but they're both locked down to their respective ecosystems (though Google, I have to admit, is slightly better at integrating other apps for some features). So Input's reason for being is that it's able to talk to the services that you're actually using, the ones that your digital assistant obstinately refuses to call: Evernote, Wunderlist, Asana, Slack, Venmo, Twitter, and a few more.
The list of supported apps isn't as long as I'd like, but it hits enough of the ones that I use that I could see myself going to Input when I don't want the cognitive overhead of hunting down an app when I just need to rattle off something quick. Input works for me mainly because I use a bunch of those apps instead of living just inside Apple Land or Google Land.
It's a great idea, but it also has me wishing I could make it a core part of the interface, the way that Siri is. And in fact, what it really does is make me realize how much I wish that Siri and Google Now were more extensible. I need an assistant that can talk to all of those services and — most importantly — treat them as equals. Input does that, but it doesn't do all of the other magical things that Siri and Google Now do. What I want is one thing that does all of those things, built deeply into an operating system.
Why can't Siri or Google Now do this?
Just before it went up in smoke, Palm created an extensible, universal search service called "Just Type" that let any developer plug any app into it with simple commands. And BlackBerry 10 has a similar feature — though of course it's not supported by many developers. Apparently what I want isn't really a priority, given the fates of both of those companies.
I suppose investing in a command line interface doesn't make sense without a persistent, physical keyboard. Voice makes much more sense for these quick actions. Sadly, it also makes sense that Siri and Google Now aren't as friendly to all apps as I'd like them to be — a cold, hard, ecosystem lock-in kind of sense. It takes all of two minutes to get Alfred to talk to my preferred tasks app on the Mac and all of 30 seconds to get Input to do the same thing on my iPhone. Somehow, I don't think it would be that hard for Apple and Google to build these kinds of choices into their assistants. Until they do, I'll be keeping Input on my home screen.