How To Make A Survey?


A detailed description of how to make a survey

When we have a business question—which new product to offer, which new features and services to add, where to open our next location—there is one obvious way to find out: a survey. It seems almost like magic: we put together a series of questions, send them to our customers and followers, and presto, we know what to do next.

Surveys appear straightforward, but they rarely are. Simply changing the types of questions and answer options we use can have a significant influence on the quality and usefulness of the results.

Flawed findings lead to bad decisions, which is exactly what we set out to prevent by creating a survey in the first place. If we ask the wrong questions, or ask them in the wrong way, we'll end up with products and services that no one wants.

That is why thoughtful survey design is so important: it helps us gather better, more trustworthy results.

Where To Begin?

It's tempting to start the survey-writing process by coming up with a list of questions to ask. We already have a list of questions we want to ask our clients, and it would be easy to drop them into a survey app and call it a day.

But that's not the best way to get started. Instead, begin the survey creation process by thinking through the answers we want. We want actionable feedback, and the only way to get it is to work out the specific answer we're after. That might be a straightforward answer (for example, "Our customers want us to sell THIS type of drink") or a more complex hypothesis we want to test (for example, "Concern about social status is/is not connected with social media usage").

So sit back and consider what we hope to learn from our survey. Write down the information we want to come away with—the type of drink to offer, the feature people are missing, or whether a hypothesis holds. Once we've finished this exercise, we'll use the list to create our survey questions.

Beginning with a list of answers and converting them into survey questions ensures that we include all of the questions we need and write them in a way that will elicit effective responses. It will also keep us from filling our survey with irrelevant questions.

Just as we begin a building project with blueprints—rather than pouring concrete whenever we decide we want a new structure—our survey should begin with the answers we need, and then we’ll be better equipped to create the questions that will offer those answers.


Obtaining Responses: Survey Question Types

As we turn our answers into questions, we'll need to consider what kinds of questions to ask. Surveys aren't just yes/no questions; most survey apps offer many different question formats, and it's easy to get confused about which type to use for each answer.

The type of question we ask will shape the responses we receive and the kinds of analysis we can perform. Here are the most common types of survey questions, along with examples of the data each one collects.

Categorical Questions

If we want a basic count, such as "35 percent of respondents said ABC" or "20 percent of men and 24 percent of women…", we can use a number of question types: yes/no, checkbox, and multiple choice. These are often known as "nominal" questions.

Analysis of categorical questions involves counts and percentages ("22 respondents" or "18% of consumers," for example), and the results work well with bar graphs and pie charts. With nominal-level data, we cannot compute averages or test relationships between variables.
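
To make this concrete, here is a minimal sketch of tallying counts and percentages for a nominal question. It assumes the responses are collected into a pandas Series, which is purely an illustration and not something any particular survey tool requires:

```python
import pandas as pd

# Hypothetical responses to a nominal question: "Are you a vegetarian?"
responses = pd.Series(["Yes", "No", "No", "Yes", "No", "No", "No", "Yes"])

counts = responses.value_counts()                        # raw counts per answer
percentages = responses.value_counts(normalize=True) * 100  # share of respondents

print(counts)                   # e.g. No: 5, Yes: 3
print(percentages.round(1))     # e.g. "62.5% said No, 37.5% said Yes"
```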

Yes/No

A Yes/No question is the simplest survey question, and often the only question type used in a quick poll. We ask a question and give respondents two options: yes or no. If our survey app doesn't have a dedicated Yes/No question, use a multiple choice question and add Yes and No as the answers.

For instance, are you a vegetarian? Yes/No

Multiple Choice Questions

Need more nuance than a yes/no answer can provide? Multiple choice questions are the way to go. We can include as many answer options as we like, but respondents can select only one.

For instance, what is your favorite food? Pizza/Pasta/Salad/Steak/Soup/Other

Checkbox

Have a multiple-choice question where some people may want to select more than one option? Checkbox questions provide that flexibility. We can add as many answer options as we want, and respondents can choose as many as they like.

For instance, what kinds of meat do you prefer? Beef/Pork/Chicken/Fish/Duck/Other

Ordinal Questions

"Ordinal" questions are those whose answers have a definite order (for example, "Income of $0-$25K, $26K-$40K, $40K+"). We can use multiple choice questions to collect ordinal data, or we can use drop-down or ranking questions.

Analysis of ordinal questions is similar to that of nominal questions: we can obtain counts and percentages. With ordinal-level data, we still can't compute averages or analyze relationships.
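
The one practical difference is that ordinal answers have a natural order worth preserving when tabulating. As a sketch (again assuming pandas purely for illustration), an ordered categorical keeps income brackets in bracket order rather than frequency order:

```python
import pandas as pd

# Answer options and hypothetical responses from the drop-down example below.
brackets = ["$0-10k", "$10-35k", "$35-60k", "$60k+"]
answers = ["$10-35k", "$60k+", "$0-10k", "$10-35k", "$35-60k", "$10-35k"]

# An ordered categorical keeps the brackets in their natural order,
# so counts come out lowest-to-highest instead of most-to-least common.
ordered = pd.Categorical(answers, categories=brackets, ordered=True)
print(pd.Series(ordered).value_counts(sort=False))
```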

Drop-down

Drop-down questions work much like multiple-choice questions: we offer several distinct answer options and respondents can select only one. For ordinal data, though, we should display the answers in a logical order, say from smallest to largest. We can also use this question type to collect demographic information, such as a respondent's country or state of residence.

For instance, what is your household income? $0-10k/$10-35k/$35-60k/$60k+

Ranking

Ranking questions are a less common question type that not every survey app offers. They let us provide a list of answers, and respondents rearrange them into their preferred order. This way, respondents give feedback on every answer we provide. It's a great way to see which items people like the most and the least.

For instance, what are your favorite beverages? Sort in the order of choice. Milk/Water/Juice/Coffee/Soda/Wine/Beer

Interval/Ratio Questions

Use interval or ratio question types for the most precise data and the most detailed analysis. These questions let us perform sophisticated analysis such as calculating averages, checking correlations, and running regression models. To ask these kinds of questions, we'll use a rating scale, matrix, or text field in our survey app.

Interval questions are usually posed on a scale of 1-5 or 1-7, such as "Strongly Disagree" to "Strongly Agree" or "Never" to "Always." Ratio questions have a true zero and usually ask respondents to enter an actual amount (for example, "How many cups of coffee do you drink per day?"). For most surveys, we don't need to worry too much about the distinction between the two.

Rating Scale

Rating scale questions, the default choice for interval questions, look like multiple choice questions with the answers laid out in a horizontal line rather than a list. There will typically be 3 to 10 answer options, using a numerical scale, a like/love scale, a never/always scale, or another interval. It's a great way to get a more nuanced picture of people's thinking than a yes/no question could provide.

For instance, how would you rank the cleanliness of our store on a scale of 1-5? 1/2/3/4/5

Textbox

We'll need the textbox question for ratio questions, open-ended comments, or personal information like names. There is normally a small and a large textbox option, so select the size appropriate for the information we're gathering. We add the question, and respondents get a blank field to fill in their own answer.

For instance, how many applications do you have on your phone? Please enter a number_

Matrix

Do we have a lot of interval questions that share the same scale? If our survey app offers a matrix question, use it. We can list several questions and apply the same scale to all of them, which makes it easier to collect feedback on many related items at once.

For instance, how much do you enjoy the following: oranges, apples, grapes? Hate/Dislike/Ok/Like/Love
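
As a hypothetical illustration of why the shared scale is useful: a matrix question typically exports as one column per item, so the items can be summarized side by side. The column names and the 1-5 coding of the Hate-to-Love scale below are assumptions made for this sketch:

```python
import pandas as pd

# Hypothetical export of the matrix question above: one column per item,
# all on the same 5-point scale (Hate=1 ... Love=5).
matrix = pd.DataFrame({
    "oranges": [4, 5, 3, 2, 4],
    "apples":  [5, 4, 4, 3, 5],
    "grapes":  [3, 3, 2, 4, 3],
})

# Because every item shares the same scale, the items can be compared directly.
print(matrix.mean().sort_values(ascending=False))
```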

Best Practices To Make A Survey

  • Make use of simple, direct language

Avoid big words, complex terms, and words with multiple meanings as much as possible. Each question should be brief, straightforward, and to the point.

  • Be Specific

Some terms mean different things to different people. When we ask questions, we should be as precise as possible. Instead of asking, "Do you exercise regularly?" we can ask, "How many days per week do you exercise on average?" This gives us a more accurate and objective answer.

  • Subdivide large ideas into several questions

Another way to handle broad topics that mean different things to different people is to break them into smaller, more concrete questions.

"Customer satisfaction" is a popular topic for companies to investigate, and it's a big question with many sub-questions. Instead of asking, "How satisfied are you with this product?" we can ask people to weigh in on three separate statements (on a scale of "Strongly Disagree" to "Strongly Agree"):

  • I like using this product.
  • This product meets my needs.
  • I would buy from this company again.

Each statement provides insight into a different aspect of our business, and the average of the scores gives an overall measure of satisfaction that we can track over time and try to improve. Combined, the three questions give a specific, practical answer to the broad question of customer satisfaction.
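
A minimal sketch of that composite score, assuming the three statements are rated on a 1-5 agreement scale; the column names and responses are hypothetical:

```python
import pandas as pd

# Agreement labels mapped to numbers.
scale = {"Strongly Disagree": 1, "Disagree": 2, "Neither Agree nor Disagree": 3,
         "Agree": 4, "Strongly Agree": 5}

# Hypothetical responses to the three satisfaction statements.
answers = pd.DataFrame({
    "enjoy_using":     ["Agree", "Strongly Agree", "Neither Agree nor Disagree"],
    "meets_needs":     ["Agree", "Agree", "Disagree"],
    "would_buy_again": ["Strongly Agree", "Agree", "Disagree"],
})

numeric = answers.apply(lambda col: col.map(scale))  # labels -> 1..5
satisfaction = numeric.mean(axis=1)                  # per-respondent composite score

print(satisfaction)           # individual composite scores
print(satisfaction.mean())    # overall satisfaction to track over time
```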

Marketing researchers have developed many series of questions (called "scales") to measure big ideas (known as "constructs" in the research field) such as customer satisfaction. The Marketing Scales Handbook by Dr. Gordon C. Bruner is one of the best collections of these marketing scales, and it's a valuable reference for any serious market researcher studying constructs like customer satisfaction, brand affinity, and more.

  • Avoid Asking Leading Questions

Researchers' own attitudes can sometimes creep into survey questions, subtly nudging respondents to answer in a particular way and skewing the results.

For example, asking "Do you believe the school should cut the gym budget to pay crossing guards?" is likely to get a different response than asking "Should the school hire crossing guards to safeguard our children?" even though both questions are about the same subject.

To avoid asking leading questions, have a friend or colleague review our survey and look for any questions that seem to have a right or wrong answer. If our reviewer can guess what kind of answer we're looking for, consider rewording the question. The fix may even lie in splitting the question into multiple questions, which would work well for the example above.

  • Only ask one question at a time

Each of our survey questions should ask only one thing. It sounds obvious, but many survey authors fall into the "double-barreled" question trap. "Do you consume fruits and vegetables on a daily basis?", for example, can be a difficult question to answer. What if someone eats fruit every day but rarely eats vegetables? They have no clear way to respond. It's better to split the question in two.

We can screen our survey for double-barreled questions by looking for words like "and" or "or" in the question text, as the sketch below shows.
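
A quick sketch of that screening pass; the question list is hypothetical, and anything it flags still needs human judgment:

```python
import re

# Hypothetical draft questions to screen.
questions = [
    "Do you consume fruits and vegetables on a daily basis?",
    "How many days per week do you exercise on average?",
    "Was the staff friendly or helpful?",
]

# Flag questions containing the standalone words "and" / "or", which often
# signal a double-barreled question worth splitting in two.
pattern = re.compile(r"\b(and|or)\b", re.IGNORECASE)
for question in questions:
    if pattern.search(question):
        print(f"Review (possible double-barreled question): {question}")
```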

  • Incorporate More Interval Questions

Converting Yes/No and multiple choice questions into interval questions is an easy way to take our survey from good to great. Make a statement and ask respondents to rate it on a scale of 1-5 or 1-7, such as "Strongly Disagree, Disagree, Neither Agree nor Disagree, Agree, Strongly Agree." We'll immediately expand the kinds of analysis we can do with our results.

Researchers use scales of 1-5 or 1-7 because they capture variation in responses without overloading the respondent. It may seem that a 1-100 scale would collect more detailed responses, but in practice it pushes respondents toward 0, 50, or 100—answers tend to cluster at the extremes or the center. A 1-5 or 1-7 scale yields more accurate, nuanced responses.

Then, rather than examining each question individually as most people do, we can add another layer of analysis by examining how questions relate to one another. Interval questions open the door to looking for relationships, which lets us say things like "People who are more likely to ABC are less likely to think DEF." If we're comfortable with statistics, we can run a linear regression and conclude, "Factors G, H, and I have the greatest influence on J." More simply, we can use averages and say something like "Junior employees exercise more frequently than senior employees on average."
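
As a rough sketch of what this extra layer of analysis can look like, assuming the interval and ratio answers have already been converted to numbers and using made-up column names:

```python
import pandas as pd

# Hypothetical interval/ratio data, already converted to numbers.
df = pd.DataFrame({
    "seniority":     ["junior", "junior", "senior", "senior", "junior", "senior"],
    "exercise_days": [4, 3, 2, 1, 5, 2],   # ratio: days per week
    "stress_score":  [2, 3, 4, 5, 1, 4],   # interval: 1-5 scale
})

# Relationship between two interval/ratio questions (correlation).
print(df["exercise_days"].corr(df["stress_score"]))

# Group averages: "Junior employees exercise more than senior employees on average."
print(df.groupby("seniority")["exercise_days"].mean())
```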


Survey Mistakes To Avoid

After we've prepared our survey questions and answer options, we should double-check that we haven't made any of the following mistakes.

Bias

Bias in survey responses is an unfortunate but crucial thing to keep in mind while creating surveys. Asking about gender, race, or income at the start of a survey can affect how respondents answer the rest of the questions. (This is sometimes referred to as stereotype threat.) For example, studies have shown that when women are asked to identify their gender before taking a math test, they score lower than when they are not asked to do so.

Most survey authors avoid bias and stereotype threat by placing sensitive questions, such as those about gender, ethnicity, and income, at the end of the survey.

Bias can occur on a smaller scale as well. When we ask someone, "How essential do you believe content marketing is?" followed by, "How much do you expect to invest in content marketing next year?" we may introduce bias. If someone says they feel content marketing is highly essential, they may overestimate how much they intend to spend in the next question. An easy way to avoid this form of bias is to randomize the order of the questions.
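
Most survey tools offer question randomization as a built-in setting; when assembling a question list by hand, the idea can be sketched like this (the questions are simply the examples above plus one hypothetical extra):

```python
import random

questions = [
    "How essential do you believe content marketing is?",
    "How much do you expect to invest in content marketing next year?",
    "How large is your marketing team?",
]

# Present the questions in a random order to reduce order effects.
random.shuffle(questions)
for number, question in enumerate(questions, start=1):
    print(number, question)
```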

Bias can also creep in when the survey is interpreted. Without realizing it, we may weigh one person's perspective differently simply because of their demographic answers. In some cases, we may choose not to collect any demographic data at all and run a fully anonymous survey, which is common in academic research.

Framing 

The language we use when explaining why a survey is being conducted can influence how respondents answer. For example, framing a customer service follow-up survey as a review of a team member may elicit more favorable responses than framing it as a tool for improving our operations.

Asking "Did John address your problem well?" may elicit more favorable responses because respondents are being asked about a real person. Since we want honest, high-quality responses, asking "Did we fix your problem today?" is a more neutral way of asking the same question.

People have a natural desire to help. If we tell them the survey has a particular goal, they may answer in a way that helps us reach that goal rather than answering completely honestly. To avoid this, stay neutral when describing the survey and writing instructions.

Incomplete Options

When questions are asked on a scale, such as "Strongly Disagree" to "Strongly Agree," respondents can become frustrated if there is no neutral option. Neutral options are usually handled in two ways: a "Neither Agree nor Disagree" option in the middle of the scale, and an "N/A" option at the end for respondents to whom the question does not apply.

Respondents can also get frustrated with our survey if it forces them to answer questions inaccurately. For example, if we ask "What's your favorite food?" and only offer pizza, cheeseburgers, and burritos as options, people who prefer chicken nuggets have no obvious answer to choose. A simple way around this is to add an "Other" option and make the question optional.

An even better solution is to make sure we're offering strong answer options in the first place by pre-testing our survey and letting test respondents suggest answer options we may have overlooked. (We could also rephrase the question to make it less specific.)

Tips On Survey Format

By limiting the number of questions we ask, we can keep our survey as short as possible. Long surveys cause "survey fatigue": respondents either abandon the survey or stop paying attention and tick answers at random until it's finished. Either way, our data is compromised.

Creating our ideal "list of answers" before writing the survey helps ensure that we include only the questions that need to be asked. Compare the questions we've written against that list and remove any unneeded or redundant questions.

Here are a few additional suggestions for designing our survey in order to minimize survey fatigue and obtain relevant results:

Divide the survey into several pages

If our survey becomes too long, try dividing it into several pages. Respondents will feel less overwhelmed when they look at it. Be careful, though: too many pages can itself lead to survey fatigue, so we'll need to strike a balance between page length and page count.
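
As a hypothetical sketch of that balance, a question list can be chunked into pages of a fixed size; the page size of four below is arbitrary and would be tuned to the survey:

```python
# Hypothetical list of question identifiers.
questions = [f"Q{i}" for i in range(1, 13)]

def paginate(items, per_page):
    """Split a list of questions into pages of at most `per_page` items."""
    return [items[i:i + per_page] for i in range(0, len(items), per_page)]

for page_number, page in enumerate(paginate(questions, per_page=4), start=1):
    print(f"Page {page_number}: {page}")
```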

Display a Progress Bar

Displaying a progress indicator and providing a time estimate is one of the simplest ways to keep people engaged while they complete our survey. Most survey apps make it easy to enable a progress bar.

Check that our survey works on a variety of devices

If respondents will be taking our survey on a range of platforms, choose a responsive survey app or one that offers both desktop and mobile versions. Think about when and where respondents will take the survey (at work, at home, etc.) when deciding which devices to support.

Most survey apps now look great on mobile, but be sure to preview the survey on both a phone and a computer to confirm it works well for all respondents.
