mid-1980s, when the third UI generation made a commercial breakthrough, led by the Apple Macintosh and later by Microsoft Windows. This generation is called WIMP GUIs (Graphical User Interfaces based on Windows, Icons, Menus, and Pointing devices). The majority of us are still using it, though its deployment started over 25 years ago (van Dam 1997). Work on the next generation of (non-WIMP) UIs, which aim at natural ways of interaction via, for example, speech, gaze, and gestures, has been ongoing since the 1990s, but these have not yet made a commercial breakthrough (van Dam 1997).

Even though we focus on the mobile descendants of third-generation user interfaces, which rely on the graphical presentation of information and the visual sense for acquiring it, we do not wish to underestimate the importance of other types of interfaces, such as speech user interfaces (SUI). It is tempting to consider genuine eyes- and hands-free use of mobile devices with the aid of speech recognition and synthesis.

6.1 Human in the Loop

Every person is unique, but since we belong to the same species, we share common physiological and physical characteristics, regardless of our ethnic origin, education, background, culture, and so forth. Because the systems that we design are targeted at humans, we briefly introduce some of the general characteristics that affect user interface design, regardless of the application and the user segment. For more information, see Carroll (2003); Dix et al. (2003); and Norman (1990). Understanding the characteristics of a human is important from the design point of view, since they can be translated into general UI design principles (section 6.3).

When studying a human as part of an interactive system, we must first consider how we work together with the interface and what our main channels for input and output are. With humans, input relates to perceiving the world with the senses of sight, touch, and hearing, in order to receive information and make decisions about our next actions. In most interfaces, vision and hearing play central roles in receiving information; vision has the highest information bandwidth of the senses, followed by hearing.

Vision has a number of physiological and cognitive aspects that affect perception. For instance, there are limits on how small objects can be or how many colours a human can perceive. Furthermore, due to cognitive visual processing, there are certain ways (Gestalt laws) that
