Part of our job when performing UX research is to ask questions.
Whether it’s a contextual enquiry, survey, focus group, or the old faithful “What are you thinking?” during a usability test, a carefully crafted question is fundamental in uncovering the root of a design problem.
Except when the responses are biased.
Biased responses can compromise our findings—both within a participant’s answers and in our handling of them. In this article I’ll explain how to identify, allow for, and better understand bias when performing UX research.
Meet Bob. He’s an enthusiastic (fictitious) participant in the contextual enquiry session you and I are just about to run.
Unknown to him—but luckily for you—he’s about to give us some examples of cognitive bias as he participates in our user research. A cognitive bias is a shift in thought processes that can result in the interviewee’s answers being distorted, inaccurate, or just plain irrational.
There are many types of cognitive bias that can affect your research. Knowing what these biases look like can help you recognise when they happen, and allow for them as you conduct sessions or interpret findings. Here are my top ten:
1. Chronological Snobbery (Recency Bias)
Recency bias describes our tendency to give events from the recent past a greater weight than those from longer ago. When we ask people about their past behaviour, their answers are more likely to be influenced by their most recent experiences, as they’re easier to recall.
It’s a bit like evaluating the success of your favourite sports team—when we asked Bob to describe the performance of the news ticker on our website, he focussed primarily on what had happened recently, without considering the ticker across a long period of time.
Tip: Word your questions carefully, paying attention to time periods mentioned. Some level of this bias is inevitable, however, so factor this into your analysis.
2. Hot-Cold Empathy Gap
We’re not very good at gauging the importance of a person’s fundamental urges—for example, being hungry, tired, hot, or cold. This is true whether we’re attempting to predict the future or evaluate the past, and it’s also true whether we’re assessing ourselves or someone else. This lack of perception, referred to as a hot-cold empathy gap, reflects the fact that it’s difficult to empathise with someone if we’re not currently experiencing those same urges.
You may have heard the advice that it’s a bad idea to go shopping on an empty stomach, as it can cause us to buy too much or be more susceptible to impulse purchases. In our research, we’ve asked Bob to pretend he’s shopping online for a new microwave oven. Bob has told us that he probably wouldn’t spend more than $300, and he may genuinely believe that’s the case—even though in the past he may have been swept up by the heat of the moment and done just that, and may be tempted to do so again in the future.
Tip: To fill the gap, it’s important to provide a realistic context to the questions you ask. This will help Bob put himself in the situation more easily.
3. Confirmation Bias
Confirmation bias refers to a type of selective thinking where we tend to notice something that confirms our beliefs, and to ignore or undervalue the relevance of something that contradicts our beliefs. The effect is stronger when we feel challenged about emotionally charged issues or deeply entrenched beliefs.
For example, Bob has told us that he believes that during a full moon there is an increase in admissions to a hospital’s emergency department. If he was to try and validate this theory by observing the ER bay, he’d be likely to pay close attention to admissions during a full moon, but be inattentive to the moon when admissions occur during other nights of the month. A tendency to do this over time would unjustifiably strengthen his belief in the relationship.
Tip: This is also a risk during analysis, when personal goals, internal politics, or lack of experience can make it tempting to cherry-pick from the research findings. Observers might fixate on a response from a participant that confirms their position on a given issue.
4. Social Desirability
Social desirability is the tendency of some respondents to report an answer in a way they deem to be more socially acceptable than would be their “true” answer. People behave this way to project a favourable image of themselves, and to avoid receiving a negative evaluation. The outcome of the strategy is an over-reporting of socially desirable behaviours or attitudes and an under-reporting of socially undesirable behaviours or attitudes. This is one of those cases that reinforces the fact that we should pay attention to what users do rather than what they say they do.
By way of example, when Bob participated in a survey on feminist attitudes, he responded in support of feminism when a female researcher conducted the survey, but was neutral to the idea when a male conducted the survey.
Tip: Try to neutralise or allow for any influences, and ask questions in a way that assures Bob that he won’t be judged by the answers he gives.
5. Anchoring (Focalism)
Anchoring is a form of bias that reflects our tendency to rely too heavily on a past reference, trait, or piece of information.
For example, we asked Bob whether he thought $50, $20, $10 or $5 was a fair amount to pay for a particular product. Because the numbers we listed first were the higher amounts, he's more likely to choose a higher number in his response. If we'd listed the prices in the opposite order, Bob would probably indicate that a lower amount was a fair price.
Tip: Don’t ask questions containing too many variables, or containing variables that are unnecessarily complex. You may also consider randomising the order that items are listed in, when interviewing multiple participants.
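To make the randomisation tip concrete, here is a minimal Python sketch of how you might shuffle the order of the price options for each participant. The participant IDs and the idea of seeding with them are illustrative assumptions, not from the article; seeding simply makes each participant’s order reproducible so your session notes can be reconciled later.

```python
import random

# The dollar amounts from the interview question above.
PRICE_OPTIONS = [50, 20, 10, 5]

def options_for_participant(participant_id: str) -> list[int]:
    """Return the price options in a randomised order.

    Seeding the generator with the (hypothetical) participant ID
    makes the order reproducible for that participant.
    """
    rng = random.Random(participant_id)
    options = PRICE_OPTIONS.copy()
    rng.shuffle(options)
    return options

# Each participant sees the same options, but in a potentially different
# order, which spreads any anchoring effect across the sample.
print(options_for_participant("bob"))
print(options_for_participant("alice"))
```

The same idea extends to ordering survey questions or task lists; the point is that no single participant’s answers should all be anchored by the same first value.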
6. The Framing Effect
Related to the anchoring effect, the framing effect is a bias that refers to the tendency for our responses to be influenced by the way a problem or question is presented. Different conclusions might be drawn from the same information, depending on how our awareness is directed and whether it is presented as a loss or as a gain.
For example, while we were discussing the power of positive thinking, I asked Bob whether the glass of water in a product shot on the home page was half-full or half-empty. The fact that he answered “half-full” may be the result of how this question was framed by our current conversation topic.
Tip: Framing is very difficult to avoid. Consider keeping a record of topics you’ve discussed with a participant, and try and dig around any recent influences that might be affecting their responses to key questions.
7. Curse of Knowledge
The curse of knowledge bias describes the tendency for one’s knowledge of a topic to limit one’s ability to think about it from a less-informed perspective. We know what we know—in fact we know it so well that we assume everyone else knows it as well as we do. Our language changes, and we may use shortcuts for concepts rather than full descriptions. Because of our bias, it’s difficult for us to imagine these concepts being hard to understand.
Because I’m very familiar with the type of business for whom I’m conducting user research, I didn’t pick up on some indicators that the current website wasn’t explaining the business offering clearly to Bob. It’s so obvious to me what this business does that I forgot to consider that Bob might not have the same full understanding.
Tip: Independent research is more effective at identifying problems that the creator or provider doesn’t experience, but others do.
8. Context Effect
The context effect occurs because our cognition and memory are dependent upon context. This means that word recognition, learning abilities, memory, and object recognition are all more difficult to perform when attempted out of context. This lack of context can have a substantial impact on marketing and consumer decisions.
For example, the contextual enquiry we’re conducting with Bob is at his home. If I were to ask him a question about his workplace, his recall of work-related memories would be slower and less accurate than if I’d asked him at work.
Tip: When possible, conduct your interviews in the context of the problem you’re investigating. If this is not possible, do what you can to simulate such an environment during your interview. For example, if the context is “at a beach”, then putting a poster on the wall of a beach scene, and bringing an icy cold drink and a beach ball into the office will go some way to simulating that situation.
9. Suggestibility
The term suggestibility refers to a form of misattribution where ideas suggested by a questioner are mistaken for memory. A person experiencing intense emotions tends to be more receptive to ideas, and therefore more suggestible. Generally, suggestibility decreases as age increases.
Suggestibility can materialise in subtle ways, such as yawning when we notice someone else yawning, or it can be as blatant as Bob claiming as his own an achievement that we had just cited as an example in the previous question.
Tip: Avoid leading questions or giving away any more information than is strictly necessary. Writing and checking questions in advance can help you to avoid introducing this bias.
10. The I-knew-it-all-along Effect (Hindsight Bias)
A hindsight bias is the inclination to see past events that have already occurred as being more predictable than they were before they took place. After an event, people often believe that they knew the outcome of the event before it actually happened.
If, after the fact, something seems to you like “common sense,” chances are your hindsight bias is kicking in.
Tip: Document whatever knowledge exists in the organisation about your area of research, and use it to unveil hindsight bias should it enter the conversation while discussing the results.
I’ve only listed 10 of the most common types of bias, but perhaps I’m suffering from my own bias blind spot? Almost a hundred different biases have been identified by psychology research and behavioural economics, so there’s plenty of fun to be had!
Many of these biases emerge while talking with a participant, and can often be mitigated by observing the participant’s behaviour rather than relying solely on what they say. Becoming familiar with cognitive biases—whether you’re drafting interview scripts, responding to participant questions, or analysing findings—will make your research more robust, leading to increased confidence in your design decisions.
And all of us want that, right? Or perhaps I’m biased …
Further reading:
- Wikipedia has a whopping great list of biases that can be useful to consult.
- Stephen P. Anderson's Mental Notes are a brilliant set of cards that outline some psychological principles that can be used in design and user research.
- Mind Hacks: Tips & Tools for Using Your Brain, by Tom Stafford & Matt Webb (O’Reilly Media)
- The Information, by James Gleick (Vintage)
- The Shallows, by Nicholas Carr, is an inspiring and reactionary argument that the Internet is changing how we think.
- Gestalt Psychology by Michael Tuck (Six Revisions)
- Exploiting Cognitive Bias: Creating UX For The Irrational Human Mind by Jay Vidyarthi (SlideShare)
- Persuasion Triggers In Web Design by David Travis (Smashing Magazine)