Jun 18
I've worked in the IT industry for almost three years. In dealing with clients who are far from technically savvy, I've noticed a couple of interesting things about how "regular" people attack user interfaces.

Strangely, one of the most frustrating things about how these clients interact with a user interface is that they do so very logically. A logical strategy for learning how a user interface works sounds like it should be the best one. Many user interfaces today, however, are designed with intuition in mind rather than a logical, procedural layout. That may sound counter-intuitive, but consider the controls of an automobile. If the layout of a car were more "logical", all controls would be labeled buttons: a button to steer left, a button to steer right. Just about anyone could figure out how to use this interface simply by reading the labels on the buttons, but it lacks a certain self-evidence. This matters because, in my observation of user interaction, people touch objects before they read labels.

This is why the concept of the steering wheel is so brilliant; it's completely axiomatic. No one in their right mind would misinterpret a steering wheel as an accelerator: it only turns left and right, instilling the concept of horizontal control. As soon as one grabs a steering wheel, it becomes clear how the automobile is steered left and right; there's no reading of labels or menus, no sounding out instructions in one's head. It is an innate control. What makes it innate? My thought is that it all boils down to how humans manipulate objects in space.

Moving things typically involves picking the object up first, then dropping it somewhere else (hence the brilliance of drag-and-drop in the traditional mouse-driven user interface). It is interesting how this contrasts with a touch screen interface, however. Most people can't fathom picking up objects on a two-dimensional plane, so how would drag-and-drop work in that case? An interesting example is how Apple handles moving icons around on iPhone OS devices. For those unfamiliar: to move an application icon on the home screen, the user presses and holds the icon for about a second; the icon "lifts" up, and the surrounding icons shuffle out of the way as the user slides it around the screen. This is incredibly intuitive because it's similar to how humans interact with flat objects on a two-dimensional plane, like playing cards on a desk. Where this contrasts with mouse-driven drag-and-drop is that the mouse is a device usually outside the user's line of sight, so the interaction is abstracted. That abstraction allows desktop user interfaces to be more a product of invention than a simulation of objects in the real world. This is also why I don't think virtual reality gaming will ever take off on touch screen devices.
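
As a rough illustration of that press-and-hold "lift and slide" interaction, here's how one might sketch it with UIKit's long-press gesture recognizer. This is not Apple's actual implementation; the DraggableIconView class, the 0.5-second threshold, and the scale factor are assumptions made purely for the example.

    import UIKit

    // A minimal sketch of the press-and-hold "lift and slide" interaction
    // described above -- not Apple's actual implementation.
    final class DraggableIconView: UIImageView {

        override init(image: UIImage?) {
            super.init(image: image)
            isUserInteractionEnabled = true
            let press = UILongPressGestureRecognizer(target: self,
                                                     action: #selector(handlePress(_:)))
            press.minimumPressDuration = 0.5   // roughly the "hold for about a second" delay
            addGestureRecognizer(press)
        }

        required init?(coder: NSCoder) {
            fatalError("init(coder:) has not been implemented")
        }

        @objc private func handlePress(_ gesture: UILongPressGestureRecognizer) {
            switch gesture.state {
            case .began:
                // "Lift" the icon by scaling it up slightly.
                UIView.animate(withDuration: 0.15) {
                    self.transform = CGAffineTransform(scaleX: 1.15, y: 1.15)
                }
            case .changed:
                // Slide the icon under the finger; a real home screen would
                // also reflow the neighboring icons here.
                center = gesture.location(in: superview)
            case .ended, .cancelled:
                // "Drop" the icon back to its resting state.
                UIView.animate(withDuration: 0.15) {
                    self.transform = .identity
                }
            default:
                break
            }
        }
    }

What I like about framing it this way is that the gesture's three states map directly onto the physical metaphor: begin is the pick-up, change is the slide, and end is the drop.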

Does this mean there's more room for innovation with the old, outdated mouse-driven hardware, since there's no tangible, real-world metaphor to emulate? I think the birth of the multitouch user interface is not only an innovation in the mobile space but also a harbinger of a better desktop UI experience. It will be interesting, however, to see what happens to the mouse five years from now.
