This is the final article in a four-part series explaining how to implement a great online survey.
Now that you’ve picked the right survey tool and gone through our 13 critical steps to building an online survey, double-check your questions for these 6 common mistakes.
Surveys are crucial for every business. They give business owners insight into clients' minds and reveal what clients expect from your company. If you are creating a survey on your own, you need to be aware of common mistakes and learn how to correct them, because your business depends on the results.
Building a successful business means learning to eliminate mistakes, and a bad survey can do you more harm than good.
Example of a bad survey question
Unintentional Bias – The way you phrase a question can change the answer you receive. Consider your wording carefully so the question doesn’t appear to favor one answer. For example, if you are comparing different products, give each one a neutral name or reference. Do not call one “A” and the other “B”; this can imply that Product A is superior to Product B.
Ambiguous Wording – Unclear questions can confuse respondents. For instance: “Do you watch TV regularly?” Well, maybe; what counts as regularly? Every day? Four hours a day? Vague questions can also throw off your results, because respondents may read them in different ways based on their own biases and experiences. For example: “What suggestions do you have for improving tomato juice?” You may be asking about taste, but responses may address texture, bottle shape, or mixers instead. Watch out as well for double-barreled questions, which ask about two things at once (for example, “Is our staff fast and friendly?”); they can easily skew your results and confuse your respondents.
Influential Answer Choices – It is very easy to create bias in a questionnaire. Re-read your questions to ensure they are not steering your readers toward a certain answer or opinion. You will get different answers from asking “What do you think of the XYZ proposal?” than from “What do you think of the Republican XYZ proposal?” The word “Republican” in the second question would cause some people to favor or oppose the proposal based on their feelings about Republicans rather than about the proposal itself. Here’s an example of bias:
Most Americans prefer to own American made cars. Do you?
Here’s how to fix it:
When purchasing a car, do you prefer one manufacturer over another?
It’s True What They Say About Assumptions – Technical terms and acronyms can be confusing, so spell them out. Even if you’re absolutely sure that respondents know what they mean, be as clear as possible. Your respondents will most likely not be able to ask you for clarification, so play it safe and spell out each acronym at least once.
Poor Answer Choices – Make sure your answer choices are mutually exclusive and don’t overlap. Answer choices that are not mutually exclusive can confuse your target audience. Here’s an example:
What brand of car do you own?
Mazda
Toyota
(Problems: What if the respondent doesn’t own a car? What about all the other brands out there? What if the respondent owns both of these? What if the respondent previously owned one of these? The question is not specific enough, and the answer choices are not all-encompassing.)
Here’s how to fix it:
You could make it a dichotomous yes-or-no question by asking:
Do you currently own a Mazda? Do you currently own a Toyota?
Or you could make the answer choices more encompassing:
Which brand of car do you currently own?
Mazda
Toyota
I don’t own a car
I own a car, but not one of the above
Size Does Matter – When deciding which types of questions to use, it’s important to consider your sample size. While short-answer and essay questions can provide insight into why someone responded a certain way, that data can be very arduous to go through when you are dealing with a large sample size. Moreover, you cannot run data analysis on this kind of free-text information with statistical software.
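To see why closed-ended answers scale better than free text, here is a minimal Python sketch using hypothetical response data (the brand names echo the car example above). Closed-ended responses tally directly into frequencies you can chart or test; open-ended responses have no fixed categories and must be read and coded by hand first.

```python
from collections import Counter

# Hypothetical responses to a closed-ended question:
# "Which brand of car do you currently own?"
closed_responses = [
    "Mazda", "Toyota", "Toyota", "I don't own a car",
    "Mazda", "I own a car, but not one of the above",
]

# Closed-ended data aggregates immediately into frequency counts
# suitable for charts or statistical tests.
counts = Counter(closed_responses)
print(counts.most_common())

# Hypothetical responses to an open-ended question:
# "What suggestions do you have for improving the taste of tomato juice?"
open_responses = [
    "Less salty, and maybe a squeeze of lime.",
    "The bottle is hard to pour from.",
    "It would be great mixed with celery juice.",
]

# Open-ended data has no predefined answer set; each response must be
# read and manually coded before any quantitative analysis is possible.
for answer in open_responses:
    print(answer)
```

With a handful of essay answers the manual coding step is trivial; with thousands, it becomes the bottleneck, which is the trade-off this mistake is about.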
Tiffany Henderson is currently an I/O psychologist for the Government and the Lead Organizational Consultant for her agency. Having worked in three different labs and research organizations, she has led teams through implementing multiple online systems and software, e-learning solutions, and management information systems.