Creating a Good Survey #05: Top 5 Bias-Inducing Question Types and Solutions

The purpose of a survey is to hear respondents' honest, objective opinions. During question writing, however, the researcher's intentions or biases can subtly creep in and steer respondents toward particular answers. Such biased questions distort respondents' true thoughts, undermine data reliability, and can ultimately lead to wrong business decisions. For example, in a public opinion poll on the Iraq War, adding the condition that "thousands of US casualties could occur" caused support for military action to drop sharply from 68% to 43%, clearly demonstrating how a small difference in wording can significantly change responses. If marketers use biased questions to elicit only positive responses about a new product concept, they may pay dearly by launching a product disconnected from actual market reactions.

The following are common types of bias-inducing questions that marketers often write in surveys, along with solutions for each. Study the good and bad examples below to raise the quality of your survey questions.

 

1. Leading Question

  • Bad Question Example: "How satisfied were you with our innovative and outstanding new product design?"
    • Problem Analysis: Positive modifiers like 'innovative and outstanding' presuppose a positive evaluation of the product, making it difficult for respondents to give an objective assessment. Respondents may feel burdened to express negative opinions.
  • Good Question Example: "What are your thoughts on our new product's design?" or "Please rate your satisfaction with our new product design."
    • Key Improvement Point: Use neutral and objective words to allow respondents to freely express their opinions. Avoid expressions that evoke emotions or suggest a particular direction.

  • Bad Question Example: "Most experts evaluate the OOO policy as effective. What are your thoughts?"
    • Problem Analysis: Presenting expert opinions subtly guides towards a specific answer (the policy is effective). Respondents may feel pressured to conform to the majority opinion.
  • Good Question Example: "What are your thoughts on the effectiveness of the OOO policy?"
    • Key Improvement Point: Do not imply a specific answer by borrowing external opinions or authority; ask for the respondent's own genuine thoughts.

 

2. Assumptive/Loaded Question

  • Bad Question Example: "What is your child's favorite toy from our company?"
    • Problem Analysis: Assumes the respondent has a child and that the child likes toys from our company. Respondents without children or whose children do not like our company's toys will find it difficult to answer.
  • Good Question Example: (Step 1) "Do you have children? (Yes/No)" (Step 2, for 'Yes' respondents) "Does your child play with toys from our company? (Yes/No)" (Step 3, for 'Yes' respondents) "If your child has a favorite toy from our company, what is it?"
    • Key Improvement Point: Do not make assumptions about the respondent's situation or experience. If necessary, confirm with preliminary questions before proceeding. Providing a 'Not Applicable' option is also a solution.

  • Bad Question Example: "Which OTT service do you primarily use to watch movies on weekends?"
    • Problem Analysis: Assumes the respondent uses OTT services on weekends and watches movies.
  • Good Question Example: "If you watch movies on weekends, what method do you primarily use? (Multiple answers possible)" (Options: OTT service, Cable TV, IPTV, Cinema, Other, etc.)
    • Key Improvement Point: Do not assume the respondent's behavior or status. Start with a general question or provide various options so all respondents can answer.
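The screening pattern above — confirm the respondent's situation first, then branch — is what survey tools call skip logic. As a minimal sketch (not any particular survey platform's API; the question texts and flow are taken from the toy example above):

```python
def run_survey(answers):
    """Walk a tiny branching survey.

    `answers` maps question text to the respondent's reply,
    standing in for real user input.
    """
    result = {}
    # Step 1: screening question -- never assume the respondent's situation.
    has_children = answers["Do you have children?"]
    result["has_children"] = has_children
    if has_children != "Yes":
        return result  # skip the follow-ups entirely

    # Step 2: confirm the assumed behavior before asking about it.
    plays = answers["Does your child play with toys from our company?"]
    result["plays_with_our_toys"] = plays
    if plays != "Yes":
        return result

    # Step 3: only now is the original question safe to ask.
    result["favorite_toy"] = answers[
        "If your child has a favorite toy from our company, what is it?"
    ]
    return result

# A respondent without children never sees the loaded question.
print(run_survey({"Do you have children?": "No"}))
```

The point of the structure is that a respondent who fails a screening step never even sees the loaded question, so no one is forced into an answer that does not apply to them.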

 

3. Double-Barreled Question

  • Bad Question Example: "How satisfied are you with the usability of our website's information search function and design?"
    • Problem Analysis: Asks about two different aspects—information search function and design usability—in one question, leaving respondents unsure which part to answer. Whether the answer is positive or negative, it is unclear which aspect it evaluates.
  • Good Question Example: (Question 1) "How satisfied are you with the usability of our website's information search function?" (Question 2) "How satisfied are you with the usability of our website's design?"
    • Key Improvement Point: Ensure each question contains only one idea. If you want to ask about two or more things, separate the questions. Check for conjunctions like 'and' or 'or'.

  • Bad Question Example: "How was the product's price-to-performance ratio and A/S satisfaction?"
    • Problem Analysis: Asks about two separate topics—price-to-performance (value for money) and A/S (after-sales service) satisfaction—simultaneously.
  • Good Question Example: (Question 1) "How do you evaluate our product's price-to-performance ratio?" (Question 2) "How satisfied were you with our product's A/S (after-sales service)?"
    • Key Improvement Point: Items requiring different evaluation criteria must be separated into different questions.

 

4. Questions with Absolutes

  • Bad Question Example: "Do you always wear a mask when using public transportation?"
    • Problem Analysis: Absolute terms like 'always' force respondents into extreme answers (Yes/No). Most people find it hard to answer 'always,' leading them to choose 'No,' which can distort actual behavior patterns.
  • Good Question Example: "How often do you wear a mask when using public transportation?" (Options: Always wear it, Mostly wear it, Sometimes wear it, Rarely wear it, Never wear it)
    • Key Improvement Point: Avoid extreme words like 'always,' 'never,' 'all,' 'none.' Provide various options indicating frequency or degree so respondents can answer according to their situation.

  • Bad Question Example: "Do you have no complaints at all about our product?"
    • Problem Analysis: The phrasing is also a double negative, so it is ambiguous what 'Yes' and 'No' mean. The absolute term 'at all' can lead respondents to answer 'No' (I have complaints) even for minor issues, potentially producing more negative results than reality.
  • Good Question Example: "Overall, how satisfied are you with your experience using our product?" (After presenting a satisfaction scale) "If you experienced any inconvenience or have suggestions for improvement while using our product, please let us know."
    • Key Improvement Point: Instead of extreme expressions, use a combination of neutral questions and open-ended questions to gather specific opinions.

 

5. Biased Answer Choices

  • Bad Question Example: "Please select your level of anticipation for our new service. (① Very excited ② Excited ③ Slightly excited)"
    • Problem Analysis: All answer choices are skewed towards a positive direction, preventing respondents with negative or neutral opinions from accurately expressing their thoughts. This results in positively biased data.
  • Good Question Example: "Please select your level of anticipation for our new service. (① Very excited ② Somewhat excited ③ Neutral ④ Not very excited ⑤ Not at all excited)"
    • Key Improvement Point: Provide balanced answer choices that include positive, neutral, and negative opinions. Consider including options like 'Other' or 'No opinion.'

  • Bad Question Example: "What is your average daily smartphone usage time? (A. Less than 1 hour B. 1-2 hours C. 2-3 hours)"
    • Problem Analysis: The boundary '2 hours' appears in both options B and C, so a respondent who uses their phone for exactly 2 hours could choose either (the options are not mutually exclusive). Also, those who use it for more than 3 hours have no option to choose (the options are not exhaustive).
  • Good Question Example: "What is your average daily smartphone usage time? (A. Less than 1 hour B. 1 hour to less than 2 hours C. 2 hours to less than 3 hours D. 3 hours to less than 4 hours E. 4 hours or more)"
    • Key Improvement Point: Answer choices must be mutually exclusive and collectively exhaustive. That is, one response should correspond to only one option, and all possible responses should be included.
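The "mutually exclusive and collectively exhaustive" rule can be checked mechanically: use half-open ranges so every value falls into exactly one bucket, with a final open-ended option catching everything above the last boundary. A minimal sketch, using the boundaries from the corrected example above:

```python
def bucket(hours):
    """Map a daily usage time (in hours) to exactly one answer choice.

    Half-open ranges [low, high) guarantee mutual exclusivity at the
    boundaries; the final open-ended option guarantees exhaustiveness.
    """
    choices = [
        (0, 1, "A. Less than 1 hour"),
        (1, 2, "B. 1 hour to less than 2 hours"),
        (2, 3, "C. 2 hours to less than 3 hours"),
        (3, 4, "D. 3 hours to less than 4 hours"),
    ]
    for low, high, label in choices:
        if low <= hours < high:
            return label
    return "E. 4 hours or more"

# Exactly 2 hours now matches one and only one option.
print(bucket(2.0))  # C. 2 hours to less than 3 hours
print(bucket(7.5))  # E. 4 hours or more
```

The same half-open convention ("1 hour to less than 2 hours") should appear verbatim in the questionnaire's option labels, so respondents and analysts interpret the boundaries identically.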

 

Biased questions are the biggest enemy of surveys. Researchers must always carefully review how their questions might affect respondents and strive to create the most neutral and objective questions possible. Showing the questionnaire to colleagues or acquaintances for feedback beforehand is also a good way to reduce bias.