By Ultan O'Broin-Oracle on Nov 09, 2011
Yes, voice-based user experience has been around for a while. HCI freshmen grappling with Nielsen and Molich's seminal 1990 CHI paper on usability heuristics over the past two decades would have come across such user interfaces, twice. Even on mobile phones, voice assistance is not new. I've used voice-based Google search on my iPhone and Google Translate Conversation Mode on my Nexus S for a long while now, for example. But now the inclusion of Siri as a native feature on the iPhone 4S has really caught the attention of the consumer market and UX professionals alike.
We've had discussion on whether Siri is or isn't a Google search killer, jokes about its inability to deal with Scottish accents, the unfortunate meaning of the word Siri in Japan and Georgia, outrage over the outages, and all the rest.
Two questions interest me:
- What are the enterprise applications user experience (UX) implications of Siri?
- What are the global UX aspects to the Siri potential?
As a UX professional I can see Siri use cases for mobile workers, sure, for simple input and creation tasks, but also for finding and manipulating more complex business transactions by taking direct action on data, contacts, locations, analytics, you name it, from one small device. Richard Bingham has some great points about Siri's potential in the enterprise customer service space.
Siri offers a logical means of interacting with devices that are essentially phones while on the move (your voice) and takes the natural user interface experience, currently dominated by gestures, to a new level. Obviously, personalization and alternative interaction options will still be needed, as not everyone will want to use voice-based assistance all the time. Fine for telling Siri to approve a purchase requisition in your worklist, or to map a route to the next service request within a 5-mile radius while you're driving (using a headset, mind), but nobody is going to intone, Stephen Hawking-fashion, into their iPhone "Tell me who won't make quota in my sales territory this quarter" while waiting in line at Starbucks. For enterprise use, a more scalable service will also be required, as will the ability for Siri to handle the domain-specific terms and jargon that a comprehensive range of enterprise applications user profiles use in their conversations. Given the mass uptake of iPhones, and the fact that Siri learns from input, that shouldn't be a huge problem.
As far as I can tell, in terms of international language support, Siri supports English (especially well if you like to speak like a real android), as well as French and German. Additional languages will be needed to penetrate lucrative Asian and South American markets. It will need to handle the more, shall we say, nuanced accents of non-native English speakers too. All this is very doable. Siri uses Nuance Communications technology acquired from the infamous Lernout & Hauspie, so global capability is in the DNA. As for usage in the field worldwide, will mobile workers in every culture take to Siri the same way, or at all? Looks like a fine ethnographic study on mobile voice assistance use in the making.
Can we expect Google and Android to react? You bet. With all that mobile Google Translate and search expertise, expect something spectacular before the iPhone 5 appears. Of course, Siri is currently in beta anyway, so by then Apple will have moved it along significantly too.
Note: Apple says it has no plans to backport Siri to previous iPhone models, though Steve Troughton-Smith and others have a solution to that.