Surveys play a massive part in every organization’s growth. How? By getting data that generate insights worth taking action on. These actions, taken correctly, ultimately bring sustained growth to a company.
So to make sure these surveys yield the needed data, your team spends countless hours preparing a template, questions, and flow. They also pick a solid survey software tool for smoother functioning.
What could go wrong now, right? Wrong! Different types of response biases can still ruin your survey.
The what, the how, the major types of response bias, and the ways to avoid them: we're ready to discuss it all. Get ready with a hot cuppa and a comfortable spot. You don't wanna miss a word here!
What Is Response Bias?
The Encyclopedia of Survey Research Methods defines it like this:
“Response bias is a general term that refers to conditions or factors that occur during the process of responding to surveys, affecting how responses are provided. Such circumstances lead to a nonrandom deviation of the answers from their true value.”
A straight and simple response bias definition, isn't it? Response bias is especially common in surveys that focus on human behaviors or opinions. Since perception plays a huge role in people's lives, respondents tend to answer in ways that paint them in a positive light.
For example, if respondents are asked how often they smoke in a week, and the options are 'frequently', 'sometimes', and 'infrequently', most will choose 'sometimes' or 'infrequently' to be perceived positively, creating a bias.
Biases like this (and we'll see more examples while discussing the types) affect a business in four different ways, which we'll cover toward the end of this post.
The 7 Types Of Response Bias And How To Avoid Them
You now know what response bias is and how it can affect your surveys and, ultimately, your business's growth. But what are the different respondent biases you'll come across, and how best can you avoid them?
That's what we cover next, with these 7 biases, starting with:
1. Desirability Bias
Desirability bias, or social desirability bias, creeps into surveys when respondents sense that their honest opinions on sensitive questions won't necessarily be appreciated or accepted. So, rather than choosing the option that aligns with their actual views, they go with the socially desirable response, over-reporting 'good behavior'.
Here are two examples: "On a scale of 1 to 10, how much do you support the concept of net zero carbon?" and "Do you find everyday alcohol consumption acceptable?"
For both questions, respondents will mostly side with the socially responsible, accepted responses, even when they think differently. And for a question like "Should you donate a part of your income to charity every month?", you'll see positive responses even from people who don't donate or haven't started yet.
The ways to avoid desirability bias?
- Start by making the survey anonymous for all respondents. You’ll straightaway notice more honesty in answers.
- Ask indirect questions so respondents don't feel they're being judged. Example: Instead of asking "How much time do you waste at the workplace every day?", ask, "How much time do you think people waste in the office every day?".
- Try framing questions with neutral wording. Example: Instead of asking “What do you dislike about the office culture?”, ask, “What changes would you love to see in the office culture?”.
2. Demand Bias
This type of response bias originates when respondents are influenced simply by knowing they're part of a study or survey, and they change their behavior and opinions as a result.
There are several reasons why people do this. The most common is when respondents can see the live poll status: they'll change their answers to push the overall result in one direction or the other. Another is brand loyalty: if people really like a brand, they'll support its ideas in the survey without giving much thought to their own. Lastly, when respondents personally know the survey creator or researcher, they'll show support by choosing favorable options.
The ways to avoid demand bias?
- Choose respondent groups that are appropriate for the particular survey, not the ones where you know most of the people.
- To get absolutely honest opinions, write emotion-free questions. Example: Instead of starting questions like “As a long-term customer…”, begin directly. Talk about how grateful you are to your long-term customers in the introduction, not in the questions.
- Ask respondents for their ideas on a new product, website design, and so on. Don't share yours first, or they won't think much and will simply side with whatever you suggested.
3. Acquiescence & Dissent Bias
Acquiescence and dissent bias is about survey takers being either extremely positive or extremely negative. In this state, they’re not thinking about giving an honest, thoughtful answer, but just choosing to be positive or negative with their replies.
Acquiescence bias is an extreme form of social desirability, but here, respondents simply agree with the research and question statements, rather than responding in a ‘socially acceptable’ way. So, whatever the question, the reply will be a ‘yes’ if that’s an option.
Acquiescence bias stems from respondents' perception of how they think the survey organizer wants them to respond. Some respondents also choose only positive options because they worry about their image. And while it feels good to get a lot of 'yes' answers in a survey, they don't help much when you're looking to collect and analyze genuine data.
The exact opposite of acquiescence bias is dissent bias. Here, as the name suggests, respondents are always disagreeing with the statements they’re presented with, rather than giving true opinions. The most likely reason for that is to make a mockery of the survey and finish it in no time!
The ways to avoid acquiescence and dissent bias?
- Completely avoid leading and loaded questions that push survey takers toward a predetermined response. Example: don't use a question like "How great do you think our new product update is?"
- Use question types beyond 'yes' or 'no', such as open-ended and multiple-choice questions.
- Try not to use extremely positive or negative answer options. Let people think when they see your options.
4. Non-response Bias
A non-response bias occurs when respondents skip certain questions or the entire survey. A lack of anonymity and a long survey are the prime reasons behind it.
For example, employees will always be reluctant to answer a non-anonymous employee engagement survey. The same goes for customer surveys: if you collect feedback long after a customer's purchase, they can't give a reliable response and often skip the survey entirely.
The ways to avoid non-response bias?
- Firstly, keep your survey short and simple. Don’t ask all the questions in just a single survey.
- Send an anonymity-enabled feedback survey at every customer and employee touchpoint to get reliable data right away.
- Go omnichannel with your survey distribution. Don’t just send it over an email. Use all the social media platforms and also embed the survey on your website.
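Before applying these fixes, it helps to confirm where non-response is actually hitting you. Here's a minimal Python sketch of a per-question skip-rate check, assuming you can export responses as a list of dictionaries where a skipped question is stored as None (the field names and data shape are illustrative assumptions, not any particular tool's export format):

```python
# Count how often each question was skipped (stored as None in this sketch).
from collections import Counter

responses = [
    {"q1_nps": 9, "q2_comment": "Love it", "q3_renewal": "Yes"},
    {"q1_nps": 7, "q2_comment": None, "q3_renewal": None},
    {"q1_nps": None, "q2_comment": None, "q3_renewal": "No"},
]

skip_counts = Counter()
for response in responses:
    for question, answer in response.items():
        if answer is None:
            skip_counts[question] += 1

total = len(responses)
for question in sorted(skip_counts, key=skip_counts.get, reverse=True):
    rate = skip_counts[question] / total
    print(f"{question}: skipped by {rate:.0%} of respondents")
```

Questions with unusually high skip rates are the first candidates for rewording, shortening, or anonymizing.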
5. Extreme & Neutral Response Bias
When you use Likert scale questions in a survey, extreme and neutral response bias creeps in easily if you're not careful. Here, respondents either choose the extreme options ('least likely' or 'most likely') or stick to the neutral middle one.
Extreme response bias happens mainly when strong words are used in a question. Example: "How justified is it to give someone accused of rape a chance at trial?" Here, respondents will mostly side with the 'completely justified' or 'completely unjustified' options. Certain topics also tend to draw extreme answers, like questions about people's religious beliefs.
Neutral response bias is a result of respondents not understanding a survey or wanting to finish it quickly to get some sort of reward or coupon. Whatever question you ask, these respondents will always choose the neutral or middle option. And that is a sheer waste of time and resources, as this data will lead nowhere.
The ways to avoid extreme and neutral response bias?
- Frame questions with positive or neutral wording. This immediately prompts more thinking among survey takers.
- Keep the entire survey simple and short, with jargon-free, one-liner questions and options.
- Double-barreled Likert scale questions are confusing and often lead to neutral response bias. Try not to use them; instead, learn how to use Likert scale questions the right way.
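It's also worth screening collected Likert data for straight-lining, i.e. respondents who pick the same middle or extreme value on every item. A minimal Python sketch, assuming each respondent's answers are a list of ratings on a 1-to-5 scale with 3 as the neutral point (the scale, labels, and respondent IDs are illustrative assumptions):

```python
# Flag respondents who give the same answer to every Likert item,
# which often signals neutral or extreme response bias.

def straight_lining_flag(ratings, neutral=3, scale_max=5):
    """Return a label for a list of 1..scale_max Likert ratings."""
    if len(set(ratings)) > 1:
        return "varied"  # answers differ, probably genuine
    value = ratings[0]
    if value == neutral:
        return "neutral straight-liner"
    if value in (1, scale_max):
        return "extreme straight-liner"
    return "straight-liner"

respondents = {
    "r1": [3, 3, 3, 3, 3],
    "r2": [5, 5, 5, 5, 5],
    "r3": [4, 2, 3, 5, 1],
}

for rid, ratings in respondents.items():
    print(rid, "->", straight_lining_flag(ratings))
```

Flagged respondents don't have to be thrown out, but it's worth knowing how much of your data they account for before you act on it.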
6. Question & Answer Order Bias
The next type of response bias is question and answer order bias. Question order bias, as the name suggests, occurs when respondents give biased answers because of the questions they answered just before. Essentially, the preceding questions set a pattern, and survey takers keep responding in line with it.
Here's an example to make this clearer:
Q1: Did you like the [product feature] of [product name]?
Q2: Are you comfortable with [product name]’s user interface?
Q3: Overall, how would you rate our [product name] on a 1 to 5 scale?
Notice the sequence of the questions: it moves from a specific product feature to overall performance. If respondents gave high ratings to the first two questions, they'd be reluctant to give the product a lower overall rating. And that's how question order bias is created.
Answer order bias comes from respondents' tendency to select either the first or the last option in a multiple-choice question. Choosing the first option is the primacy effect: it's the first option respondents read, so they anchor on it as the right answer. Choosing the last option is the recency effect: people remember the most recent option most clearly, so that's the one they pick.
The ways to avoid question-and-answer order bias?
- Always start with the overall rating question before moving to ones on specific product/service feedback.
- Group questions from the same category into a single question, or keep them entirely separate in the survey.
- Try to minimize the options in multiple-choice questions, but provide a "Prefer not to say" option to keep the survey inclusive.
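Alongside the tips above, a common way to blunt both the primacy and recency effects is to randomize the order of answer options for each respondent, something many survey tools can do for you. If you're rendering questions yourself, the idea is as simple as this Python sketch (the options and respondent IDs are made up for illustration):

```python
# Shuffle answer options per respondent so primacy/recency effects
# don't consistently favor the same option.
import random

options = ["Price", "Ease of use", "Customer support", "Integrations"]

def options_for_respondent(options, seed):
    """Return a per-respondent ordering; seeding keeps it reproducible."""
    rng = random.Random(seed)
    shuffled = options[:]  # copy so the master list stays intact
    rng.shuffle(shuffled)
    return shuffled

for respondent_id in ("resp-001", "resp-002", "resp-003"):
    print(respondent_id, options_for_respondent(options, respondent_id))
```

If you include a "Prefer not to say" option as suggested above, keep it pinned at the end rather than shuffling it in with the rest.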
7. Voluntary Response Bias
Lastly, voluntary response bias occurs when your sample is made up of people who volunteered to take the survey. While this isn't always bad for your survey or data collection, it often over-represents one viewpoint, because a self-selected sample is more likely to share similar opinions, and that undermines the quality of the data.
The ways to avoid voluntary response bias?
- Target a specific audience for your survey; don't let the audience select itself (a small sampling sketch follows this list).
- If you do rely on volunteers, make sure they don't all share the same opinions. The quality of the data will be much better.
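One practical way to target an audience instead of relying on self-selected volunteers is to draw a random sample from your full customer or employee list and invite only those people. A minimal Python sketch, with a made-up contact list purely for illustration:

```python
# Invite a random sample of contacts instead of waiting for volunteers,
# so the sample isn't dominated by people with similar opinions.
import random

all_contacts = [f"customer{i}@example.com" for i in range(1, 501)]

sample_size = 50
rng = random.Random(42)  # fixed seed for reproducibility
invited = rng.sample(all_contacts, sample_size)

print(f"Inviting {len(invited)} of {len(all_contacts)} contacts")
print(invited[:5])
```

Drawing the invite list this way keeps the sample tied to the audience you chose, rather than to whoever felt strongly enough to volunteer.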
Why Response Bias Is Taken Seriously
Here are the four ways response bias hurts a business:
- Data inaccuracy: When respondents choose options to shape how they're perceived, the gathered data is highly inaccurate and doesn't represent the true opinions of the target market. That survey data is not going to help a company achieve its goals.
- Poor strategies and dissatisfaction: Companies often ignore or miss these biases and make crucial business decisions based on the collected survey data. The results are poor strategies and investments, along with sheer dissatisfaction. Decision-makers then cut down on survey campaigns and market research, which eventually hurts their growth prospects even more.
- Low ROI: Poor survey insights lead to poor product decisions, resulting in a low return on investment for promising ideas and innovations. Not many organizations handle such setbacks well.
- More time, money, and resources: Surveys full of response bias have to be repeated to re-test the data, which takes more time, more money, and more resources. Not an ideal scenario for any company.
Bias Is Not Always Bad!
Yes, just because a survey has some bias doesn't mean it's bad. If you have a clear idea of how these biases are impacting the overall data quality, it's totally fine. Because, let's be honest, eliminating every last bias from a survey isn't something anyone can do.
Understanding these biases can also help you segment your target market based on demographics, income level, interests, and more. So, handled correctly, biases can ultimately lead to better data collection.
Just don't let them run all over your survey, and everything will be fine. To learn more about tackling survey response biases and conducting winning surveys, our team at SurveySparrow is here 24/7. Hit us up and let's start talking. We're eagerly waiting!