The 'Context-Aware' Device

Apps that talk, data that matches.

I like Siri, but I’m not a fan of talking to her. It’s not that I’m embarrassed to talk to a gadget in public; it just doesn’t feel all that natural. What I really want is my gadget to speak to me.

A few days ago Gartner published a paper titled ‘Hype Cycle for Emerging Technologies’, and its core theme was the relationship between humans and machines. It’s not a paper I’ve purchased (unfortunately it isn’t freely available to the public), but the brief outline has some really interesting tidbits, one of which is the idea of machines becoming more ‘context aware’.

"Machines and systems can only benefit from a better understanding of human context, humans and human emotion. This understanding leads to simple context-aware interactions, such as displaying an operational report for the location closest to the user; to better understanding customers, such as gauging consumer sentiment for a new product line by analyzing Facebook postings"

Google Now is a great example of this; it houses ‘cards’ that change proactively based on where you are and what time of day it is, providing you with valuable information such as what the traffic will be like on your journey home. This feels like a great first step towards having a truly ‘smart’ phone.
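If you boiled that down to code, it would be something like this tiny sketch: a single function mapping context (time of day, rough location) to a card. The card names and rules here are invented, not how Google Now actually works; the point is just how little input the decision needs.

```swift
// A toy version of the "cards" idea: pick what to surface purely from
// context. The cards and rules are made up for illustration.

enum Card {
    case commuteTraffic, nearbyLunch, weatherTomorrow
}

func card(hour: Int, atWork: Bool) -> Card {
    switch (hour, atWork) {
    case (11...14, true): return .nearbyLunch     // lunchtime at the office
    case (16...19, true): return .commuteTraffic  // heading home soon
    default:              return .weatherTomorrow // quiet fallback
    }
}

print(card(hour: 17, atWork: true)) // commuteTraffic
```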

One of my bugbears with the iPhone is its current inability to let apps communicate with each other in a fashion that makes sense. We’re all busy people; leaping between apps to piece together information doesn’t feel like the most efficient process, and in a world full of multi-taskers, why haven’t we got this right on mobile yet?

What if we opened up that experience? What would that feel like? Imagine getting an alert from your iPhone reminding you of a ‘haircut’ entry in your calendar. Now imagine jumping into Facebook and seeing that same reminder, but not only that: Facebook is pulling in suggestions from Foursquare based on your location, filtering them by your friends’ reviews, and on top of that it’s pulling in the latest men’s haircuts from Google Images.

[Mockup: the Facebook reminder pulling in location-based suggestions]

Suddenly you’re getting a clearer picture of how to manage the task of getting your hair cut, and for once, it’s not you doing the legwork.

This is you being guided by a machine; it actually understands what your calendar entry means and what information you need to make the key decisions.
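To make the idea concrete, here’s a rough sketch of what that shared context might look like in code. Everything in it is hypothetical: the types, the stand-ins for the Foursquare and Google Images lookups, and the friend-rating threshold are all invented for illustration, since no cross-app API like this actually exists on iOS.

```swift
import Foundation

// A minimal sketch: one shared "context" object that any app could read
// from and enrich. All types and data sources here are hypothetical.

struct CalendarEvent {
    let title: String   // e.g. "Haircut"
    let date: Date
}

struct VenueSuggestion {
    let name: String
    let friendRating: Double  // 0...5, distilled from friends' reviews
}

struct SharedContext {
    let event: CalendarEvent
    var suggestions: [VenueSuggestion] = []
    var inspirationImageURLs: [URL] = []
}

// Hypothetical stand-ins for the Foursquare and Google Images lookups.
func nearbyVenues(matching query: String) -> [VenueSuggestion] {
    [VenueSuggestion(name: "UX Hair", friendRating: 4.5),
     VenueSuggestion(name: "Clip Joint", friendRating: 3.2)]
}

func imageResults(for query: String) -> [URL] {
    [URL(string: "https://example.com/latest-mens-haircuts")!]
}

// The "machine" assembles everything you need around one calendar entry,
// keeping only the venues your friends actually rate well.
func enrich(_ event: CalendarEvent) -> SharedContext {
    var context = SharedContext(event: event)
    context.suggestions = nearbyVenues(matching: event.title)
        .filter { $0.friendRating >= 4.0 }
        .sorted { $0.friendRating > $1.friendRating }
    context.inspirationImageURLs = imageResults(for: "latest men's haircuts")
    return context
}

let haircut = CalendarEvent(title: "Haircut", date: Date())
print(enrich(haircut).suggestions.map(\.name))  // ["UX Hair"]
```

The interesting bit isn’t any single lookup; it’s that one shared object ties them together, so each app only has to contribute the piece it’s good at.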

Now, let’s pretend you’ve made it to ‘UX Hair’ and you’re having your hair cut. Your device can see you have a Shopping List app installed and that the list is currently unchecked; meanwhile, Foursquare understands where you are and knows what food places are nearby. It’s a good job it can talk to the Groupon app too, because now your device is surfacing the best offers based not only on the items in your shopping list, but also on the shops Foursquare has picked out for you.
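Under the hood that’s really just an intersection of three data sets: your unchecked items, the deals on offer, and the shops nearby. Here’s a hypothetical sketch, again with invented types and data:

```swift
// Cross-referencing an unchecked shopping list with nearby deals.
// The types and data sources are invented; apps can't share state
// like this today.

struct ShoppingItem {
    let name: String
    let checked: Bool
}

struct Deal {
    let shop: String
    let item: String
    let discount: Int  // percent off
}

// Keep only deals that match something still on the list *and* a shop
// the location service has already picked out as nearby.
func relevantDeals(list: [ShoppingItem], deals: [Deal], nearbyShops: Set<String>) -> [Deal] {
    let wanted = Set(list.filter { !$0.checked }.map { $0.name.lowercased() })
    return deals.filter { wanted.contains($0.item.lowercased()) && nearbyShops.contains($0.shop) }
}

let list = [ShoppingItem(name: "Coffee", checked: false),
            ShoppingItem(name: "Shampoo", checked: true)]
let deals = [Deal(shop: "Bean There", item: "coffee", discount: 20),
             Deal(shop: "Far Away Mart", item: "coffee", discount: 50)]
let nearby: Set<String> = ["Bean There"]

print(relevantDeals(list: list, deals: deals, nearbyShops: nearby))
// [Deal(shop: "Bean There", item: "coffee", discount: 20)]
```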

What’s strange is that all of this information, all of this wonderful data, is already there.

Let apps be friendly and let them truly understand where they are, together. This, I feel, is what will really make the difference.