I will begin this year with a subject that came up in a conversation with some good friends. I just hope that if they are reading this post they will find it interesting.
I was looking into the ideas of natural and intuitive user interfaces, trying to find out what the actual definition of each is, how they relate, and so on. After some searching I expected to find concrete definitions in papers, but I was unlucky. I finally realized that definitions in such new domains are not that clear, so I will attempt to describe them in my own words.
Natural (as I understand the meaning of the word) describes an interface modality that originates from nature, which means that people are used to interacting through that modality by nature. So by definition, a voice interface is natural because it is based on something we learn as kids through our interaction with the world around us. Body language is another such modality, and so are gestures.
Intuitive describes an interface that doesn’t need to be learned. You know it by heart. There is a very good quote about intuitive interfaces: “The only intuitive interface in the world is the nipple. Everything else is learned.”
However, this quote doesn’t really tell the whole story. Is it intuition that leads a baby to start sucking on his or her mother’s nipple? Or is it instinct? Well… if we rely on what dictionaries say about intuition, it is something based on our experiences, knowledge, instincts and, generally, everything each of us carries in our personality. So an intuitive interface is an interface that uses our previous knowledge and experiences so that it doesn’t have to be learned. This, by definition, brings the user into the picture. I have different experiences and knowledge from my brother, my mother and my friends. So an interface that is intuitive to me might not be considered intuitive at all by my father. However, the common experience of using a TV remote for many years makes both me and my father quite familiar with that interface, so that we can use a similar device (at least for basic TV control) with almost any TV without having to learn anything. We can change channels, adjust the volume and so on with almost any given TV remote. This makes the TV remote a quite intuitive interface for me and my father.
Going back to natural interaction modalities and the fact that we learn to use them early in our lives, we can see that natural interaction modalities provide a large amount of shared experience on which to build an intuitive user interface. That’s why natural user interfaces are often considered intuitive too. But… are they? How intuitive is a phone banking system that uses voice as its interaction modality when it expects you to use a specific set of words and commands? Not much, I guess… On the other hand, a similar system that answers with “How can I help you?” when you pick up the phone, and lets you reply “I want to transfer 1000€ to my brother,” is a lot more intuitive, because that’s the conversation you are used to having when you go to a bank. So natural could be the beginning of intuitive UI design… but not the end.
Going back once again to that quote… I left you with another question. Is it intuition that leads a baby to start sucking on his or her mother’s nipple? Or is it instinct? That was my friend’s response, and it took the discussion to a whole new level. OK… if we know a specific person, we can guess what is intuitive for him. But how much of a person’s intuition, intuitive responses and interactions relies on experience, and how much relies on instinct? If we could design a UI for an infant… a baby just born… what would be best? Could we design a truly usable UI for an infant who relies mostly on instinct? How transferable would that UI be to other babies? How easy and intuitive would it be for an older person? Going a bit further… how much of our instinct do we use in our everyday natural interactions? How much of it do we lose over the course of life? Could instinct become a new basis for UI design? And since in this blog I am particularly interested in accessibility… how easy would this kind of instinctive interface be for people with disabilities?
At one point in our conversation I felt like we were peeking into a whole new domain of human-machine interaction. Who knows… maybe in the future instinctive UIs will become the new holy grail of UI design.
Thinking about it… I just started the new year with a post about babies and new research domains. Could it get any better?