Clippy was ahead of his time.
I'll let that sink in.
Clippy, the infamous Microsoft Office assistant, was introduced in November 1996. He was refined three years later, in Microsoft Office 2000. He went into semi-retirement two years after that, when he was turned off by default. And he finally shuffled off this digital coil in 2007, when Microsoft Office dismissed him altogether.
User experience will happen. Whether it's designed up front, or a product of users interacting with your product after the fact, the human and product will interact. Good UX happens when we make decisions in a way that understands and fulfills the needs of both our users and our business.
It's important in this definition to recognize both sides of the equation: the user and the business. UX design strives to produce positive emotions in the user, whether through delight or simply satisfaction in getting a task done efficiently. On the other hand, anyone working for an organization has to ensure the organization's goals are met as well. Negotiating between the two stakeholders can be tricky when their needs are in conflict.
So why do we need UX? To ensure someone is looking out for both sides equally.
If you ask ten people where user experience belongs in an organization, you will likely get eleven answers, but first, you might get asked what you mean by user experience (UX).
- Client (or Customer) Experience (CX) refers to the impression you leave with your client, shaping how they think of your brand across every stage of the customer journey. This involves every step from advertising, brochures and public websites to forms, call centers and correspondence.
- User Experience, the way I am defining it, is focused on the time the client is interacting with the website or web application to accomplish a task.
In his book, Performance-based Learning Objectives, Robert Mager wrote about behavioral objectives from an instructional standpoint. Before designing a training program, designers must know in very concrete terms what learners are expected to do to demonstrate their mastery of the material. The same factors that go into developing good performance and learning objectives can be used to develop thorough scenarios that are easily evaluated.
Mager’s Theory of Behavioral Objectives
In the design of instructional materials, training needs are first analyzed and the learning goals of the program are determined. Mager’s central concept is that a learning goal should be broken into a subset of smaller tasks or learning objectives. By his definition, a behavioral objective should have three components:
• Behavior. The behavior should be specific and observable.
• Condition. The conditions under which the behavior is to be completed should be stated, including what tools or assistance is to be provided.
• Standard or Degree. The level of performance that is desirable should be stated, including an acceptable range of answers that are allowable as correct.
A fourth component, the Actor, can be inferred: who is required to perform the task.
Consider the following behavioral objective:
Given a stethoscope and normal clinical environment, the medical student will be able to diagnose a heart arrhythmia in 90% of affected patients.
This example describes the actor (the medical student), observable behavior (identifying the arrhythmia), the conditions (given a stethoscope and a normal clinical environment), and the degree (90% accuracy).
Applying ABCD to Scenarios
A good scenario also covers ABCD:
• Actor (who is using the system, what access and motivation does he/she have),
• Behavior (what action is required? What result is expected?),
• Condition (are there elements in the work context that influence the design, such as noise, lack of time, etc.), and
• Degree (to what extent must the task be done accurately and/or quickly?).
This construct allows reviewers to identify with the situation and clearly determine whether the scenario is realistic and appropriate.
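For teams that track scenarios in a test plan or tooling, the ABCD construct above can be captured as a simple record. This is an illustrative sketch only — the class and field names are my own, not part of Mager's framework — restating the medical-student objective from earlier:

```python
from dataclasses import dataclass


@dataclass
class Scenario:
    """A usage scenario structured by the ABCD components."""
    actor: str      # who performs the task, with their access and motivation
    behavior: str   # the observable action and the result expected
    condition: str  # the work context: tools provided, noise, time pressure
    degree: str     # how accurately and/or quickly the task must be done


# The behavioral objective from the text, restated as a scenario record:
arrhythmia = Scenario(
    actor="medical student",
    behavior="diagnose a heart arrhythmia",
    condition="given a stethoscope and a normal clinical environment",
    degree="correct in 90% of affected patients",
)

# A quick completeness check: a reviewable scenario fills in all four parts.
assert all(vars(arrhythmia).values())
```

Writing scenarios this way makes gaps obvious — an empty Condition or Degree field is a scenario a reviewer cannot evaluate.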
Gloria Gery once told me that all systems training is compensatory for poor design. In other words, if the design is perfect, no training is needed.
"Just remember: you're not a 'dummy,' no matter what those computer books claim. The real dummies are the people who–though technically expert–couldn't design hardware and software that's usable by normal consumers if their lives depended upon it." - (Walter Mossberg)
This thought has stuck with me as I've designed ever more complex systems. I've decided I agree with her, and that it's not a bad thing.
Before you start sending emails, this doesn't say:
- All training is unnecessary in every case. Training is necessary to incorporate new employees into a business. Domain knowledge about the business you are in, corporate attitudes, principles and guidelines must be learned, along with key procedures in case of emergencies. What this does imply is that training should focus on value-added information that is best transferred person-to-person, rather than rote procedures and facts about systems.
- People should be able to walk up and use a system to its fullest. The system should be built so that, for a user who knows WHAT he or she is supposed to do, HOW to do it is evident. Shortcuts and other productivity enhancements may be learned over time, but their absence should not inhibit or prevent performance of the task.
- Designing systems that require no training is possible or even desirable in all situations. Such explicit systems require a great deal of knowledge of the performance environments, users, motivations, business needs and capabilities. The time spent to develop a training-free system may not be justified by the benefits. If a short tutorial, or instructions and help within the application, can convey the essentials, that may be a better use of resources.

Additionally, ease of learning does not always equate with ease of use. A system that is easy for anyone off the street to use is rarely one that "expert" users can use productively. Consider most "wizard" interfaces. An interview-style interface walks the user through decision points and gathers information, presenting each decision along with an explanation of its implications. These systems can be used by nearly anyone because the domain knowledge and how it is represented in the system are explicit. An expert user who does not require such handholding may be better served by a single screen with just the data fields, performing the task far more efficiently. Familiarity with the business and the system allows experts to rapidly perform tasks such as data entry at a speed that would be impossible in a wizard-like interface.
As a designer, I strive to design the most intuitive systems and applications I can. I go into a design knowing some training will probably be necessary, but I don't give in to the temptation to say, when faced with a design compromise, "Oh, that will be covered in systems training."
My approach has been to design a system as intuitively as possible, and then, based on user feedback during testing, add instructions to the screens and information to reference and training. The key is frequent and iterative testing. Get a design out in front of as many people as possible. When faced with an issue, either integrate a solution into the application, add on-screen instructions, add online reference or training, or add it to a training class curriculum. This "Frequently Asked Questions" method of systems documentation and training allows the trainers to focus on issues that realistically come up instead of spending time and effort documenting functions and behaviors that are well accepted by everyone.
Now you may let the emails fly. :)
Elsbernd's Hierarchy of System Needs, described in my blog a few months ago, has been hanging on my wall at work. I've given it some thought and while there are aspects of it that I really like, such as the concept of satisfiers and delighters (functionality and performance are satisfiers, usability and aesthetics are delighters), I was forced to admit the model doesn't hold true for all applications.
The original idea came from the traditional commercial enterprise software development model, in which, at first, relevant functionality was enough for a new application to be successful. As long as the software accomplished something users were previously unable to do, it didn't matter if the system was slow, difficult to learn or use, or hideously ugly; it accomplished a task. Eventually, users become complacent and start demanding more: more functionality, faster or more dependable performance, systems that are easier to use and more pleasing to the eye. In time, functionality and performance are taken as a given, and only their absence is noted. Usability and aesthetics, on the other hand, are delighters, in that their presence can increase satisfaction with a system. This is a gross generalization, of course, and doesn't adequately defend against some very legitimate criticisms.
Systems aren't built in a hierarchy - all systems have some level of all four attributes. They serve some purpose, have some level of performance, can be used at some level, and have a presentation. It's not accurate to say systems are built one layer of the hierarchy at a time or to imply that the concerns are always approached in the same order.
The dividing line between satisfiers and delighters is not black and white. Some features and functionality are delighters: a system that does something for you that you didn't expect, or didn't even know you wanted, can delight. Conversely, many aesthetic choices are definitely dissatisfiers.
Not all systems will become "self actualized" at the top of the hierarchy. There are reasons a system may never be developed into an attractive, easy-to-use application. Sometimes the benefits don't justify the cost; if a system is going to be retired, no new development will be funded. The model does not address the downward pressure of cost or limited resources against the desire to climb the hierarchy.
After weighing these criticisms, my thoughts lately have changed. I no longer see the factors in a hierarchy, but as competing priorities for a limited pool of resources.
All systems development teams must prioritize their time and resources among the four factors. To increase the resources for usability, attention is taken away from the others, unless additional budget is found. There is no ideal allocation of resources to each of the factors. Just as no two applications have the same requirements, the needs of the organization or users may be different from one project to the next. Depending on the functionality needed, performance environment, age and training of the users and the visibility of the application in the marketplace, any one of these factors may be weighted more heavily during development.
The allocation of resources may change over time as a system matures. Once the application is established or as competition enters the market, balancing the factors may become a higher priority. The main point is the factors must fight for space within a given budget of resources and energy.
Traditional systems may follow the hierarchy early in their evolution and favor functionality over the other factors. Systems like mainframes and legacy applications may have interfaces considered unusable by modern standards. They may be difficult to learn and use, involve obscure and arcane codes, or even paper punchcards. Younger users may not believe it, but there was a time before graphical user interfaces. Many of these systems still exist in organizations because they work. They do something relevant for the company and they haven't caused enough pain to be retired or upgraded. Because these systems do what they do well, they no longer receive funding, and the resources they do get are focused on maintaining the system, not upgrading functionality or improving the user experience.
On the other end of the spectrum are the trendy novelty applications that take advantage of trends or new developments and are seen as the "gee whiz" stuff for a limited amount of time. Consider new developments such as animated gifs, scrolling marquees or flash intro pages. These toys don't add much functionality to a website, and are mostly eye candy. For the most recent examples, turn to the iPhone applications. Developers saw a rich media device with the novelty hooks of a touch-screen and accelerometer and rushed out the technological equivalent of cotton candy such as the virtual lighter (moves as you move the phone), koi pond, virtual beer and light saber. These applications have next to no functionality other than momentary amusement. The performance of the application is not a focus as it doesn't do that much to start with. Likewise, if it doesn't do much, it won't be hard to use. These applications have a limited shelf life as the novelty dies, but for the brief period of time, they can be successful.
I have to thank several people for challenging my thinking and discussing the concepts with me, including Matt Sanning and Greg Moore. These thought exercises, like the iPhone apps, may not be the most productive things, but they keep me entertained.