There are many possible reasons for poor performance. In the past, documentation or training was the only solution to these problems, as phrases like "It's a training problem" or "We'll put it in the manual" were catch-all responses to poor processes. Thomas Gilbert's Behavior Engineering Model analyzes performance deficits from six standpoints, shown in the table below. Interventions to overcome performance barriers have the highest leverage (cheapest to implement for the highest return) in cell 1, decreasing through cell 6. In other words, if the problem can be solved through better communication of expectations, that is more effective, easier, and cheaper for the organization than a training program to teach performers a task they don't understand.
| | Information | Instrumentation | Motivation |
| --- | --- | --- | --- |
| Environment (organizational factors) | 1. Data, information | 2. Resources, tools, environmental support | 3. Consequences, rewards, incentives |
| Performer characteristics (personal factors) | 4. Knowledge, skills | 5. Capacity | 6. Motivation |

1. Data, information – Do performers know what is expected? Interventions: communication, clear statements of purpose and expectations.
2. Resources, tools, environmental support – Do performers have what they need to perform? Interventions: open supervisor support, appropriate tools and applications.
3. Consequences, rewards, incentives – Do performers get appropriate feedback? Interventions: consistent and immediate feedback of results; consequences must be linked to performance.
4. Knowledge, skills – Do performers have the knowledge or skills to perform? Interventions: training, job aids.
5. Capacity – Are performers capable of performing? Interventions: selection process.
6. Motivation – Do the performers care about the job or their performance? Interventions: selection process.
In analyzing the root causes of a performance issue, we often identify issues and solutions that have nothing to do with documentation or training. Because of this, we are no longer limited to those solutions, but can design performance-centered systems leveraging all of the tools at our disposal.
When you change things, people will resist. When you change things for the better, people will still resist.
Often, just being better isn't as important to acceptance of a new design as conformity with the conventions in place. Take, for example, the control panel in Windows Vista. Arguably, the configurations and applications are better organized, better labeled and easier to find. Easier, that is, if you haven't already been trained to find them in different areas and under different labels in earlier versions of Windows. There are twice as many controls in the Vista control panel, giving the user more granular control over the settings and configurations. That should be a good thing, right? Not if you are used to finding configurations buried under other objects and have just accepted the fact and moved on. Now you are being asked to relearn where things are and will spend time fumbling around, especially if you have some legacy computers running different versions of Windows. While this reconfiguration will benefit new users, experienced users don't see enough marginal benefit to accept the change gracefully.
This concept should be considered in all design projects. While you might have a better way of doing things, decide whether it is enough of an improvement to justify forcing all of your users to change their mental maps. It's amazing what people can get used to (imagine some of the legacy systems in use today), but unless you can provide compelling reasons to change, people won't. Your choices are to follow conventions, thereby further cementing that design in the minds of the users; shift them radically and force the transition; or introduce the changes gradually through evolution.
One thing is certain. No matter how you make changes, some users will complain. The key is to provide enough benefit that they get over it quickly.
For another perspective, see Gery's Law.
"If the person responsible for using the system does not benefit, the system is doomed to failure." - Grudin's Law, Jonathan Grudin
We need to recognize that pushing our pain, that is, our needs, onto someone else to implement will at best give us grudging and resentful compliance. If we can ensure that the person receiving the benefit is the one doing the work, we will get a more motivated group of performers.

For example, the marketing group for a retailer decides they want zip codes from all of their customers. Collecting them means more work for the cashier and is resisted by customers. If there is no clear incentive for the cashier, or no consequence for failure, cashiers will enter false data in an attempt to make their lives easier.

What we need to do in cases like this is show the clear benefits of the task to the user and get them "on our side". The other option is to monitor and correct issues, but the carrot is preferable to the stick.
"Designs fail when the effort required is greater than the time available at the moment of need or the perceived benefit." - Gery's Law, Gloria Gery
effort required > (time available + perceived benefit) → system failure
Since in most cases we cannot increase the time available (we only wish we could), we must either
1) reduce the effort required (make the system more intuitive, fewer steps, etc.)
or
2) increase the perceived benefit (communicating benefits, motivating the user)
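To make the tradeoff concrete, here is a minimal sketch of the pseudo-formula above expressed as a decision rule. The function name and the numeric values are illustrative assumptions of mine, not anything from Gery:

```python
# A toy expression of Gery's Law as a decision rule. The values passed in
# below are made up for illustration, not measurements of a real system.

def design_fails(effort_required: float,
                 time_available: float,
                 perceived_benefit: float) -> bool:
    """Mirrors the pseudo-formula above: the design fails when the effort it
    demands outweighs the time available plus the perceived benefit."""
    return effort_required > time_available + perceived_benefit

# The designer's two levers: reduce effort_required, or raise perceived_benefit.
print(design_fails(effort_required=10, time_available=3, perceived_benefit=4))  # True: too much effort
print(design_fails(effort_required=5, time_available=3, perceived_benefit=4))   # False: the effort is justified
```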
This is an article/presentation I developed with Matt Hummel for a usability conference back in 2003 that I never got to give, for various reasons. We never developed the obligatory PowerPoint, since our trip was cancelled before that was called for and we didn't believe in doing things before deadlines :).
I still believe in the content of the article and have used much of the information on mental models in my upcoming article on planning ahead.
Abstract:
The success of a system depends on the mental model of the user accurately matching the tasks represented. Users whose existing mental model is based on superstitions, hearsay, or faulty assumptions can defeat the best-laid plans of even experienced usability professionals. System usability professionals can support success by identifying and preventing mismatches.
Introduction:
Superstitions have been employed throughout history to explain the unexplained and to assign reason and rationale to events. Superstition spans the spectrum from the mystical (spilling salt is bad luck unless you throw some spilled grains over your shoulder), to the comical (a winning baseball pitcher who won't change his socks), to everyday methods and mannerisms we don't even recognize as superstitions. Unfortunately, computer users also develop superstitions to deal with the seemingly irrational behavior of some computer systems. These superstitions can decrease productivity, increase frustration, or even prevent the user from performing the task. How do superstitions form and, more importantly, what can we do about them?
What is a superstition?
Dictionary.com defines a superstition as follows:
su·per·sti·tion
An irrational belief that an object, action, or circumstance not logically related to a course of events influences its outcome.
In other words, believing an action or set of actions will result in a specific outcome, even though there are no facts supporting that belief or the belief is just plain false.
Superstitions in everyday systems
In addition to affecting how human beings behave on their own, superstitions can also influence how humans interact with mechanical and technological systems.
One notable example illustrates how we unknowingly apply superstition to the systems and technologies we use in our everyday lives. In his book, The Psychology of Everyday Things (1988), author and usability guru, Donald Norman, references how most people interact with heating and cooling systems. Whether using an electric range, a refrigerator, or the thermostat in a hotel room, when a rapid change in temperature is desired, users crank the temperature control to one extreme. While the user does this thinking it will make the device heat or cool faster, the truth is the device will heat or cool at the same rate regardless of the temperature setting on the control. The control merely tells the device when to stop heating or cooling.
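A minimal simulation of an on/off (bang-bang) thermostat makes the point. The heating rate and temperatures below are made-up illustration values, not figures from Norman's book:

```python
# A toy bang-bang thermostat. The heating rate is an assumed constant; the
# point is that it does not depend on the setpoint at all.

def minutes_to_heat(current_temp: float, setpoint: float,
                    degrees_per_minute: float = 0.5) -> float:
    """The furnace runs at full power until the setpoint is reached, so a
    higher setpoint never makes the room warm up any faster."""
    if current_temp >= setpoint:
        return 0.0
    return (setpoint - current_temp) / degrees_per_minute

# Cranking the dial to 90 does not reach 72 any sooner than setting it to 72:
print(minutes_to_heat(65, 72))  # 14.0 minutes to reach 72
print(minutes_to_heat(65, 90))  # 50.0 minutes to reach 90 -- it still passes 72 at the 14-minute mark
```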
The disconnection in this example is the user’s “mental model” of the heating or cooling device. Because the user has been rewarded in the past with a cooler room or a hotter range, he or she has formed the belief that turning the control further will speed the process. Those who adjust their mental model remove the superstition and simply set the control to the desired setting. Users who maintain the superstition continue to take this incorrect action and can receive unintended results. (This is demonstrated in conference rooms across the country, where someone is constantly turning the climate control up or down.)
What is a mental model?
Mental models are the representations people have of themselves, others, the environment, and the things with which they interact. People form mental models through experience and accretion (when new information fits into an existing schema, and thus is quickly comprehended). A mental model can be developed, added to, or modified any time a person’s behavior and the subsequent results are connected in the person’s mind. In reality, users interpret results and sometimes form an accurate model of what is being experienced, sometimes the results are correlated but without a true cause and effect relationship, and sometimes the results are a coincidence. Regardless of the how’s and why’s, if the actions and outcomes are understood by the person as linked, a mental model can be formed or strengthened.
How do mental models apply to software systems?
Because there is more than one role involved in the design, development, and use of software, more than one mental model is imposed upon the system. Norman identified this and began to label the different models associated with software. Those germane to this discussion include:
- The Design Model is the designer’s idea of how the system works and represents the true nature of the tasks.
- The System Model is how the system actually behaves and represents the nature of the tasks.
- The User’s Model is the mental model developed through interaction with the software (which incorporates aspects of both the Design Model and the System Model).
The designer expects the user’s model to be identical to the designer’s model, but if the system model is not clear and consistent, the user will end up with an incorrect mental model.
Because much of today’s software is complex and designed to aid in a wide array of tasks, it is becoming increasingly difficult for users to generate “neat and tidy” models (DiSessa 1986). In most cases system training is limited and knowledge of the work domain is disassociated from the system itself. Users, left to identify, structure, and associate the entire knowledge set (including business rules, best practice decisions and procedures, exception handling, and system know-how) necessary to perform their work tasks, are increasingly challenged to create sophisticated mental models. Without guidance, users can quickly develop narrow, shallow, misconstrued, or inaccurate models. With an inaccurate model and no mechanism for remediation, users’ performance suffers and the business pays the consequences of lost productivity, poor decisions, mistakes, and loss of customer satisfaction.
How superstitions are formed
Human beings constantly seek meaning in their world. When an accurate and complete model is not provided, superstitions are a strategy to construct meaning where there is none, or none is perceived. Frequently, this assemblage occurs with whatever information is available. Most users will build the model with the information at hand and without regard for their understanding of the information, the information’s accuracy, or the other contexts not currently in use. These model constructs are not necessarily accurate or complete. To accommodate the delta between what is understood in the model and what is not, users develop superstitions.
Most superstitions are rooted in the disconnection between what someone perceives to be true and the actual truth. Several factors contribute to superstition development and preservation:
Coincidence
If a good or bad result happens to coincide with another event, the user may associate that event with the result, even if there is no causal link for the correlation.
For example, a user sends several files to the printer at one time and the printer jams (due to too much paper in the paper tray). The user may clear the jam and resend the documents one at a time. Because the user reduced the amount of paper in the feeder when he or she cleared the jam, the documents now print fine. The user may make the coincidence of sending multiple documents and an overloaded feeder a superstition and decide to only send one document at a time in the future.
Lack of domain knowledge
Without the complete knowledge and know-how associated with a job, users will develop superstitions around processes, procedures, and decision making. For example, if a customer service representative has a large FedEx package returned because it was sent to a PO Box, he or she may refrain from sending any large packages (regardless of the carrier) to a PO Box.
Tribal learning
When a person who has a superstition teaches another person how to perform the work, a superstition can be passed on without question. Workaround solutions, once required by limitations of the software, hardware, or management policies, may continue long after the limitations are resolved, simply because that is how the user was trained.
Poor usability
Failure in certain user interface heuristics (such as lack of goal establishment, lack of task structuring, or not answering descriptive and functional questions) can contribute to superstitions being formed. These failures force users to “invent” a process, regardless of whether it is the right one.
Mission-critical tasks
When the work is critical or the consequences are significant, users are less inclined to vary from a process, even if the process seems redundant, illogical, or inconsistent.
Infrequent tasks
If a task is rarely performed, the user may not perceive benefit from trying alternative methods or questioning an approach.
Problems associated with superstitions
Learning today’s software systems can place a significant burden on a user. Often a user must change how he or she works just to match how the system was designed. This forces users to construct their mental models while under considerable and varying pressures. When a user’s mental model is inaccurate, errors are more likely to occur. The more abstract or mismatched the model, the more likely problems will be frequent or have significant consequences.
When a superstition enters a work process, several negative results can occur for the business:
- Increased time/loss of productivity and efficiency
- Inaccurate results or responses
- Redundant processes
- Inappropriate actions based on the circumstances
- Inappropriate or poor decisions
- Increased learning curves and time to competency due to incorrect assumptions and associations
- Incorrect procedures and processes followed
- Frustration for the user of the system
“Sometimes the result is a minor frustration or inconvenience, such as changes not being saved to a file. Inaccurate mental models of more complex systems, such as an airplane or nuclear reactor, can lead to disastrous accidents.” (Reason, 1990) Superstitions, when ingrained into the model of the system, are very difficult to eradicate and persist even after the user has been given concrete evidence to disprove them.
Once these superstitions have led to bad habits, remediation becomes even more difficult.
How to identify superstitions
In addition to the negative impact superstitions can have on businesses, they can also be troubling for usability professionals trying to conduct analysis or assessment. Unless the usability professional is also a subject matter expert, it can be difficult to separate fact from superstition. Knowing which user behaviors may indicate a superstition’s presence can improve the likelihood of identification. Being mindful of the following behaviors when working with users can aid superstition identification.
Lack of rationale for behavior
Statements like “I always do it this way” or “The person who trained me taught me this method” are frequent signs of a false model, particularly if the behavior seems odd or overly complex. Such behavior should be noted and investigated further with other users and subject matter experts.
Rationale that is overly complex or simple
If the reasoning appears illogical, overly complex, redundant, or unconnected to the task, it could be a signal. Likewise, if there is no rationale for a suspicious behavior, or the reasoning seems much too simple for the situation, the behavior might be influenced by a superstition. In either case, the observation should be validated with other users and subject matter experts.
Overly complex tasks and processes
If the work the user is performing appears excessively complicated or complex, users may have false work, task, or system models. This dimension can be difficult to identify. Because so many work environments today incorporate “band-aid” solutions, cobbled-together legacy systems, and non-integrated support resources, even best practice methods can be elaborate. Probing questions can be helpful to assess the situation. If responses to how and why questions are confident, explanatory, and logical, a superstition is less likely to be at play than if the answers exhibit the behaviors previously mentioned.
Arcane support resources
When the only way to accomplish a task is through the use of a checklist, notes or procedures, the system model is not understood. While not technically a superstition, this points users to mindless repetition of steps that may or may not be required by the task at hand. Their belief in and reliance on the support resources replaces any independent understanding of the system.
Unnecessary or repetitive actions
Compulsive, repetitive, or unexpected actions (such as pressing the save button multiple times) often indicate a superstitious user. Question the user regarding their behavior to determine whether there is a logical explanation.
Avoidance
If a user ignores system functionality, features, or alternatives it may be due to a superstition. Find out why the user is avoiding the function. It may be due to a logical reason such as an actual problem with the function, too little training/support on the function, or the function is unnecessary. However, it may also indicate a superstition where the user is fearful something bad will happen.
Usability professionals must be mindful of these behaviors and thoroughly investigate situations where superstitions are suspected. Inquiry involving multiple users across demographics and validation with subject matter experts is prudent if superstitions are to be identified.
How to prevent superstitions
Averting superstition can be accomplished through proper analysis and design techniques. Usability professionals must go beyond traditional human-computer interaction and focus on the cognitive aspects of the work. It is as important to identify how users think about the work as it is to identify what the work is or how it should be performed. Structuring the work for the user, providing a clear mental model the user can embrace, and embedding knowledge and support into the context of the work can dramatically reduce the cognitive burden and prerequisite knowledge that often result in superstitions.
Specifically, the usability professional must:
- Perform the correct types of analysis to identify any existing superstitions and gather enough research to make the work process structured and evident. Areas where users need extensive training or job aids, or where performance barriers exist, need to be targeted for integrated support. Due diligence and blended approaches must incorporate the discovery and validation of requirements as well as the conceptualization of reengineered processes and models. A sampling of techniques includes:
- Contextual Inquiry – Observe actual users, doing the actual work, in the actual work environment, and interview them to detail the observation data.
- Focus Groups – Meet with user groups and subject matter experts to validate observation and interview research and flesh out processes and models.
- Collaborative Design – Work with actual users to extract their existing mental models. Sketch ideas to represent the system in a manner consistent with how the users “think” about the work. (Ariel PSE 2003)
- Design interfaces using solid design heuristics. A few superstition-busting heuristics include:
- Metaphor – An appropriate metaphor can provide the user with a mental model that relates to their work and is extensible for new ideas and functions. Be careful the metaphor doesn’t promote superstitions by implying behaviors or functionality that are not present.
- Establish goals – Recognize the intents the user will have while doing the work and provide clear matches to their goals.
- Structure the work – Clearly orient users to the best practice tasks and process steps. Mark their progression through the work and keep them aware of the work completed as well as the remaining steps.
- Answer descriptive, functional, and procedural questions – On each display clearly communicate information that clarifies the work, explains what must be accomplished, and describes the correct approach.
- Provide advance warning and feedback – Proactively “coach” the user about their decisions and the consequences. Be sure to provide clear indication of the task status.
- Consistency – Rely on consistent approaches, concepts, and controls. This allows users to more easily develop and extend their mental models.
- Interpret – Ensure the information provided to the user requires no translation. System displays and responses should present information in a clear and natural manner for the user. (Ariel PSE 2003)
- Provide a model for the user. Work with users to identify any existing models, or work with them to develop one that is obvious and intuitive. Providing a clear, straightforward model for the user is better than leaving them to develop superstitious ones.
- Embed knowledge, tools, and other support resources. Follow design approaches such as performance-centered design that go beyond system usability to develop fully supported work environments. The goal is to support the user “intrinsically” with expert advice, best practice methods, and embedded knowledge and know-how. (Gery 1991) Because users can perform the work with less experience or training and receive task related information within their work context, superstitions are less likely to be constructed.
- Perform user assessments and iterate the design. Let the user be the design’s judge. Construct tests based on real-world task scenarios. Observe when the user struggles and investigate the cause. Iterate the design until its use is obvious and intuitive to both new and experienced users. Care must be taken to ensure existing superstitions don’t influence the test. Awareness of the behaviors listed in the “How to identify superstitions” section of this paper can help bring attention to mismatched models.
Conclusion
Superstitions can introduce issues and results inconsistent with intended system models. Users experiencing superstition are likely to be confused or misguided in their understanding of methods and procedures. Superstitions can lead to a breakdown in the user’s understanding and trust, further alienating them from an accurate mental model.
Designers who practice diligent, thoughtful analysis and design, incorporate proven techniques and heuristics, and assess their design with users can limit superstition development. When a clear model is provided for a user, the system will be more readily adopted and used productively.
If the user interface is designed to successfully communicate an appropriate model, users interacting with the system will be less likely to construct superstitions and subsequently perform their work with the system more successfully.
One of the podcasts I listen to - Boagworld by Paul Boag - had an interesting feature this week about context: how where we are influences what we want from a website. Paul listed five contextual considerations for developing for the web: Environment, Device, Comfort, Mood, and Time. He makes good points about them, especially the lack of traditional input tools on most mobile devices, but doesn't go far enough into motivations. This tied into many things I've been thinking about with the release of the 3G iPhone.
Like most people, I do 99% of my browsing from a laptop or desktop sitting on a desk. For these times, traditional websites are fine, and even optimized for the experience. I expect to browse, dig deeper, and see layouts as the designer intended, on a computer with high bandwidth, a full browser with plugins, and a large, high-resolution screen. From design to implementation to interaction, everything about the web browsing experience has been optimized for my viewing pleasure in a relaxed setting.
My mobile surfing experience has been on a Palm Treo. The screen is much smaller, the input is more awkward (no mouse) and the speeds are much, much slower. When I surf on my phone, I generally need specific information such as an address, geocache description or a sports score. The experience is much different in many ways, and not just that the hardware is less capable and I may be outside. I also want to use the mobile for different things.
When you are at a desk, you have the opportunity to browse. The idea of "surfing the web" came from the idea of skimming the top of the ocean of information, going where the wave takes you and finding things through serendipity. When you are mobile, you are much more targeted on a specific need. I don't want to look at a company's brochures or product demos on a phone. I don't need all of your pretty branding and themes if I am browsing over a 3G or Edge network. There is a greater sense of urgency and focus.
The solution used to be a WAP (Wireless Application Protocol) site optimized for the mobile browsing platform - it has scaled-back images, flows into a narrow column that can be read on a 320x240 screen, and has large links and buttons for tapping. A good example is ESPN for getting game scores and play-by-play: http://wap.espn.com/. Within three clicks, you can find any college football game on a fall Saturday and see short play-by-play descriptions on an automatically refreshing page. On a regular computer, the interface is hideous compared to what we've been led to expect, but on a phone, the load times are lightning quick, and the interactions and decision points are crisp, getting you where you want to go quickly and with a minimum of fuss. Compare this with their regular doorway at http://www.espn.com/. The mobile site features next to none of the graphics, advertisements, and animations of the main website. It allows you to dig deeper into the stories you want to see, but leaves out many of the feature stories and options that wouldn't translate to the mobile web. ESPN knows what their mobile users want, how that differs from what they want when they sit down at a computer, and delivers.
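As a rough sketch of how a site might choose what to serve, here is simple server-side content negotiation. The user-agent tokens and template names are my own illustrative assumptions, not how ESPN or any particular site actually implements it:

```python
# A simplified sketch of choosing between a full site and a stripped-down
# mobile page based on the browser's User-Agent string. Tokens and template
# names are illustrative assumptions only.

MOBILE_TOKENS = ("Windows CE", "Palm", "BlackBerry", "iPhone", "Opera Mini")

def choose_template(user_agent: str) -> str:
    """Serve a lightweight, text-first page to mobile browsers and the
    full-featured page to desktop browsers."""
    if any(token.lower() in user_agent.lower() for token in MOBILE_TOKENS):
        return "scores_mobile.html"   # few images, narrow column, big tap targets
    return "scores_full.html"         # graphics, ads, feature stories

print(choose_template("Mozilla/5.0 (iPhone; ...) AppleWebKit/525.18"))  # scores_mobile.html
print(choose_template("Mozilla/5.0 (Windows NT 5.1) Firefox/3.0"))      # scores_full.html
```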
The iPhone and the new 3G iPhone started to change the thinking around mobile browsing. Having the Safari browser and its zoom features installed leads developers to think there is no longer a need for a WAP site. Users can browse the entire traditional site on their phones, so we don't have to change a thing. The missing factor in this equation is still intent.
In my work in the insurance industry, I have often wanted to develop a WAP application, to see how it's done and to experiment. The problem is, no one wants to work on their insurance while they're mobile. Reviewing insurance is a more reflective task, requiring paperwork, comparisons, and details that cannot be easily supported in 320x240. I will find a way to do this, since our sales force is mobile, but other than the basic communications covered by email, Twitter, SMS, and actual phone calls (do phones still make phone calls?), I haven't yet found the killer application that is targeted, short in duration, requires minimal entry, and can be done without a lot of reference materials at hand.
Give me time.
In late 2001, after starting my own consultancy, I decided I needed my name out there, so I wrote an article for Performance Improvement magazine. It was about portals, which were somewhat new at the time. While the technology had been around, portals were still abused and poorly designed for users in many cases. Though I didn't know much about the technology, I knew how to design for performance, and wrote the article.
I went on to present this topic with Matt Hummel at ForUse 2001 in Portsmouth NH.
If I were updating this article today, I would do many things differently, but then again, I have been working within the IBM WebSphere Portal infrastructure for four years now and know how many of my ideas were naive. :)
Here it is, warts and all.