There is a New Hammer, and Her Name is Siri

[Also posted on Protelp]

“I suppose it is tempting, if the only tool you have is a hammer, to treat everything as if it were a nail.” – Abraham Maslow, the American professor of psychology who created Maslow’s Hierarchy of Needs

It’s no secret that technology is advancing at a rate seemingly faster than we can adjust. Even so, we often dive headfirst into the next big thing, whether it is a device, an app, or the technology behind the app, and we focus on it as Maslow’s hammer. Everything else becomes a shiny nail.

This concept is exemplified by Apple’s integration of Siri in iOS 5. Virtually overnight there was a demand for voice recognition across products and applications: from phones to TVs, to cars, to homes. We experienced a similar phenomenon and demand for gesture-based controls after the late 2010 launch of Kinect.

There is certainly a real place and benefit for each technology, and input methods are no different. Be it typing, touch, voice, gesture, visual recognition, or – someday – brainwaves, the best application of each will always need to consider the target user, the location and the social context.

User behavior in general has been changing with technology. Take mobile phones, for example. Nielsen reports that average monthly voice calls per subscriber [1][2] peaked in the first half of 2007 and have declined since. Average monthly text messages, on the other hand, have increased continually.

And even though texting has always had the lead within certain age groups, the decline in voice calls has been consistent across all user groups. And we used to believe that phones were made for calls.

But it wasn’t user behavior with phones alone that drove that change. “Mobile” implies that our location changes constantly. Sometimes we are in a private place, but very often we are in public, and our behavior shifts accordingly, based on our own perception of the need for privacy.

Our perception of how private an interaction with a device is correlates directly with how physically close we hold that device while interacting with it.

People feel more comfortable letting someone use their tablet or iPad than their phone, and even more so if it’s a desktop or TV. The closer a device is held, the more we think of it as part of us.

You may have noticed that Siri’s advertising portrays very personal interactions with it. And it truly feels that way, even though in public places we won’t use it for private matters. That is not a problem for the iPhone, though. There is ‘touch’ for that.

The comfort level we have in using different input methods is a direct result of the comfort level we have in sharing – or not sharing – those interactions with others. Touch and typing are private interactions, while voice and gesture are more public in nature. It’s text vs. voice call. Each has its optimal – and not so optimal – applications.

This takes us to the last consideration: social context. Looking only at the target user and the location may not be enough.

I recently heard about a project involving voice-activated faucets in public restrooms. It was an exciting use of the technology: it let you turn the faucet on and off and change the water temperature via voice commands.

It looked like the perfect hammer and a welcome advancement in an environment where avoiding contact as much as possible is a plus.

What wasn’t welcome, however, was having another patron enter the restroom just in time to hear you say: “Hot! Hot! Hot!”
