Doing Survey Research | A Step-by-Step Guide & Examples

Published on 6 May 2022 by Shona McCombes. Revised on 10 October 2022.

Survey research means collecting information about a group of people by asking them questions and analysing the results. To conduct an effective survey, follow these six steps:

  • Determine who will participate in the survey
  • Decide the type of survey (mail, online, or in-person)
  • Design the survey questions and layout
  • Distribute the survey
  • Analyse the responses
  • Write up the results

Surveys are a flexible method of data collection that can be used in many different types of research.

Table of contents

  • What are surveys used for?
  • Step 1: Define the population and sample
  • Step 2: Decide on the type of survey
  • Step 3: Design the survey questions
  • Step 4: Distribute the survey and collect responses
  • Step 5: Analyse the survey results
  • Step 6: Write up the survey results
  • Frequently asked questions about surveys

Surveys are used as a method of gathering data in many different fields. They are a good choice when you want to find out about the characteristics, preferences, opinions, or beliefs of a group of people.

Common uses of survey research include:

  • Social research: Investigating the experiences and characteristics of different social groups
  • Market research: Finding out what customers think about products, services, and companies
  • Health research: Collecting data from patients about symptoms and treatments
  • Politics: Measuring public opinion about parties and policies
  • Psychology: Researching personality traits, preferences, and behaviours

Surveys can be used in both cross-sectional studies, where you collect data just once, and longitudinal studies, where you survey the same sample several times over an extended period.


Before you start conducting survey research, you should already have a clear research question that defines what you want to find out. Based on this question, you need to determine exactly who you will target to participate in the survey.

Populations

The target population is the specific group of people that you want to find out about. This group can be very broad or relatively narrow. For example:

  • The population of Brazil
  • University students in the UK
  • Second-generation immigrants in the Netherlands
  • Customers of a specific company aged 18 to 24
  • British transgender women over the age of 50

Your survey should aim to produce results that can be generalised to the whole population. That means you need to carefully define exactly who you want to draw conclusions about.

It’s rarely possible to survey the entire population of your research – it would be very difficult to get a response from every person in Brazil or every university student in the UK. Instead, you will usually survey a sample from the population.

The sample size depends on how big the population is. You can use an online sample calculator to work out how many responses you need.
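For a rough sense of what such a calculator does, here is a minimal Python sketch using Cochran's formula with a finite population correction. The 95% confidence level, 5% margin of error, and population figures below are illustrative assumptions, not recommendations for your study.

```python
import math

def sample_size(population, z=1.96, margin_of_error=0.05, p=0.5):
    """Estimate the required survey sample size.

    Cochran's formula for a large population, followed by a finite
    population correction. p=0.5 is the most conservative assumption
    about variability, giving the largest sample size.
    """
    n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    n = n0 / (1 + (n0 - 1) / population)  # finite population correction
    return math.ceil(n)

# Illustrative (hypothetical) population sizes
print(sample_size(2_000_000))  # very large population -> about 385
print(sample_size(5_000))      # smaller population    -> about 357
```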

There are many sampling methods that allow you to generalise to broad populations. In general, though, the sample should aim to be representative of the population as a whole. The larger and more representative your sample, the more valid your conclusions.

There are two main types of survey:

  • A questionnaire, where a list of questions is distributed by post, online, or in person, and respondents fill it out themselves
  • An interview, where the researcher asks a set of questions by phone or in person and records the responses

Which type you choose depends on the sample size and location, as well as the focus of the research.

Questionnaires

Sending out a paper survey by post is a common method of gathering demographic information (for example, in a government census of the population).

  • You can easily access a large sample.
  • You have some control over who is included in the sample (e.g., residents of a specific region).
  • The response rate is often low.

Online surveys are a popular choice for students doing dissertation research, due to the low cost and flexibility of this method. There are many online tools available for constructing surveys, such as SurveyMonkey and Google Forms.

  • You can quickly access a large sample without constraints on time or location.
  • The data is easy to process and analyse.
  • The anonymity and accessibility of online surveys mean you have less control over who responds.

If your research focuses on a specific location, you can distribute a written questionnaire to be completed by respondents on the spot. For example, you could approach the customers of a shopping centre or ask all students to complete a questionnaire at the end of a class.

  • You can screen respondents to make sure only people in the target population are included in the sample.
  • You can collect time- and location-specific data (e.g., the opinions of a shop’s weekday customers).
  • The sample size will be smaller, so this method is less suitable for collecting data on broad populations.

Oral interviews are a useful method for smaller sample sizes. They allow you to gather more in-depth information on people’s opinions and preferences. You can conduct interviews by phone or in person.

  • You have personal contact with respondents, so you know exactly who will be included in the sample in advance.
  • You can clarify questions and ask for follow-up information when necessary.
  • The lack of anonymity may cause respondents to answer less honestly, and there is more risk of researcher bias.

Like questionnaires, interviews can be used to collect quantitative data: the researcher records each response as a category or rating and statistically analyses the results. But they are more commonly used to collect qualitative data: the interviewees’ full responses are transcribed and analysed individually to gain a richer understanding of their opinions and feelings.

Next, you need to decide which questions you will ask and how you will ask them. It’s important to consider:

  • The type of questions
  • The content of the questions
  • The phrasing of the questions
  • The ordering and layout of the survey

Open-ended vs closed-ended questions

There are two main forms of survey questions: open-ended and closed-ended. Many surveys use a combination of both.

Closed-ended questions give the respondent a predetermined set of answers to choose from. A closed-ended question can include:

  • A binary answer (e.g., yes/no or agree/disagree)
  • A scale (e.g., a Likert scale with five points ranging from strongly agree to strongly disagree)
  • A list of options with a single answer possible (e.g., age categories)
  • A list of options with multiple answers possible (e.g., leisure interests)

Closed-ended questions are best for quantitative research. They provide you with numerical data that can be statistically analysed to find patterns, trends, and correlations.

Open-ended questions are best for qualitative research. This type of question has no predetermined answers to choose from. Instead, the respondent answers in their own words.

Open questions are most common in interviews, but you can also use them in questionnaires. They are often useful as follow-up questions to ask for more detailed explanations of responses to the closed questions.

The content of the survey questions

To ensure the validity and reliability of your results, you need to carefully consider each question in the survey. All questions should be narrowly focused with enough context for the respondent to answer accurately. Avoid questions that are not directly relevant to the survey’s purpose.

When constructing closed-ended questions, ensure that the options cover all possibilities. If you include a list of options that isn’t exhaustive, you can add an ‘other’ field.

Phrasing the survey questions

In terms of language, the survey questions should be as clear and precise as possible. Tailor the questions to your target population, keeping in mind their level of knowledge of the topic.

Use language that respondents will easily understand, and avoid words with vague or ambiguous meanings. Make sure your questions are phrased neutrally, with no bias towards one answer or another.

Ordering the survey questions

The questions should be arranged in a logical order. Start with easy, non-sensitive, closed-ended questions that will encourage the respondent to continue.

If the survey covers several different topics or themes, group together related questions. You can divide a questionnaire into sections to help respondents understand what is being asked in each part.

If a question refers back to or depends on the answer to a previous question, they should be placed directly next to one another.

Before you start, create a clear plan for where, when, how, and with whom you will conduct the survey. Determine in advance how many responses you require and how you will gain access to the sample.

When you are satisfied that you have created a strong research design suitable for answering your research questions, you can conduct the survey through your method of choice – by post, online, or in person.

There are many methods of analysing the results of your survey. First you have to process the data, usually with the help of a computer program to sort all the responses. You should also cleanse the data by removing incomplete or incorrectly completed responses.

If you asked open-ended questions, you will have to code the responses by assigning labels to each response and organising them into categories or themes. You can also use more qualitative methods, such as thematic analysis, which is especially suitable for analysing interviews.

Statistical analysis is usually conducted using programs like SPSS or Stata. The same set of survey data can be subject to many analyses.
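If you do not have access to SPSS or Stata, the same kinds of analyses can be run with open-source tools. The Python sketch below is only an illustration: it assumes a hypothetical CSV export with `age_group` and `satisfaction` columns, removes incomplete responses, summarises a rating question, and runs a chi-square test of independence.

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical survey export; the column names are assumptions for illustration.
responses = pd.read_csv("survey_responses.csv")

# Clean the data by dropping incomplete responses.
responses = responses.dropna(subset=["age_group", "satisfaction"])

# Descriptive statistics for a rating question.
print(responses["satisfaction"].describe())

# Does satisfaction vary by age group? A chi-square test of independence.
table = pd.crosstab(responses["age_group"], responses["satisfaction"])
chi2, p_value, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}, dof = {dof}")
```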

Finally, when you have collected and analysed all the necessary data, you will write it up as part of your thesis, dissertation, or research paper.

In the methodology section, you describe exactly how you conducted the survey. You should explain the types of questions you used, the sampling method, when and where the survey took place, and the response rate. You can include the full questionnaire as an appendix and refer to it in the text if relevant.

Then introduce the analysis by describing how you prepared the data and the statistical methods you used to analyse it. In the results section, you summarise the key results from your analysis.

A Likert scale is a rating scale that quantitatively assesses opinions, attitudes, or behaviours. It is made up of four or more questions that measure a single attitude or trait when response scores are combined.

To use a Likert scale in a survey, you present participants with Likert-type questions or statements, and a continuum of items, usually with five or seven possible responses, to capture their degree of agreement.

Individual Likert-type questions are generally considered ordinal data, because the items have clear rank order, but don’t have an even distribution.

Overall Likert scale scores are sometimes treated as interval data. These scores are considered to have directionality and even spacing between them.

The type of data determines what statistical tests you should use to analyse your data.
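To make that distinction concrete, here is a hedged Python sketch with made-up item names (`q1` to `q4`) and a made-up grouping column. It compares two groups on a single ordinal Likert item with a non-parametric Mann-Whitney U test, and on the summed scale score, treated as interval data, with a t-test.

```python
import pandas as pd
from scipy.stats import mannwhitneyu, ttest_ind

df = pd.read_csv("likert_responses.csv")   # assumed file of 1-5 ratings
items = ["q1", "q2", "q3", "q4"]           # assumed Likert items

# A single item is ordinal data, so use a non-parametric test.
a = df.loc[df["group"] == "A", "q1"]
b = df.loc[df["group"] == "B", "q1"]
print(mannwhitneyu(a, b))

# The summed scale score is often treated as interval data, so a t-test is common.
df["scale_score"] = df[items].sum(axis=1)
print(ttest_ind(df.loc[df["group"] == "A", "scale_score"],
                df.loc[df["group"] == "B", "scale_score"]))
```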

A questionnaire is a data collection tool or instrument, while a survey is an overarching research method that involves collecting and analysing data from people using questionnaires.


7 steps when conducting survey research: A beginner-friendly guide

May 17, 2022

Steps of survey method: Things to know before conducting survey research

  • Pay attention to questions
  • Step 2: Define the population and sample (who will participate in the survey)
  • Are interviews or in-person surveys better than written ones?
  • Online surveys are the easiest way to reach a broad audience
  • Mail surveys: control who participates
  • Types of questions: what are the most common questions used in survey research?
  • Content, phrasing, and the order of questions
  • Step 5: Distribute the survey and gather responses
  • Step 6: Analyze the collected data
  • Step 7: Create a report based on survey results
  • Last but not least: frequently asked questions
  • Follow the seven steps of survey research with SurveyPlanet

Conducting survey research encompasses gaining insight from a diverse group of people by asking questions and analyzing answers. It is the best way to collect information about people’s preferences, beliefs, characteristics, and related information.

The key to a good survey is asking relevant questions that will provide the needed information. Surveys can be used once or repeatedly.

Wondering how to conduct survey research correctly?

This article will lay out—even if you are a beginner—the seven steps of conducting survey research, with guidance on how to carry it out successfully.

How to conduct survey research in 7 steps

Conducting survey research typically involves several key steps. Here are the seven most common steps in conducting survey research:

  • Step 1: Identify research goals and objectives
  • Step 2: Define the population and sample
  • Step 3: Decide on the type of survey method to use
  • Step 4: Design and write questions
  • Step 5: Distribute the survey and gather responses
  • Step 6: Analyze the collected data
  • Step 7: Create a report based on survey results

These survey method steps provide a general framework for conducting research. But keep in mind that specific details and requirements may vary based on research context and objectives.

To understand the process of conducting a survey, start at the beginning. Conducting a survey consists of several steps, each equally important to the outcome.

Before conducting survey research, here are some resources you might find helpful regarding different methods, such as focus group interviews, survey sampling, and qualitative research methods. Learn why a market research survey is important and how to utilize it for your business research goals.

Finally, it is always a good idea to understand the difference between a survey and a questionnaire.

The first of seven steps in conducting survey research is to identify the goal of the research.

This will help with subsequent steps, like finding the right audience and designing appropriate questions. In addition, it will provide insight into what data is most important.

By identifying goals, several questions will be answered: What type of information am I collecting? Is it general or specific? Is it for a particular or broad audience? Research goals will define the answers to these questions and help focus the purpose of the survey.

An objective is a specific action that helps achieve research goals. Usually, for every goal, there are several objectives.

The answers collected from a survey are only helpful if used properly. Determining goals will provide a better idea of what it is you want to learn and make it easier to design questions. However, setting goals and objectives can be confusing. Ask the following questions:

  • What is the subject or topic of the research? This will clarify feedback that is needed and subjects requiring further input.
  • What do I want to learn? The first step is knowing what precisely needs to be learned about a particular subject.
  • What am I looking to achieve with the collected data? This will help define how the survey will be used to improve, adjudicate, and understand a specific subject.

Uncertain about how to write a good survey question? We got you covered.

Who is the target audience from which information is being gathered? This is the demographic group that will participate in the survey. To successfully define this group, narrow down a specific population segment that will provide accurate and unbiased information.

Depending on the kind of information required, this group can be broad—for example the population of Florida—or it can be relatively narrow, like consumers of a specific product who are between the ages of 18 and 24.

It is rarely possible to survey the entire population being researched. Instead, a sample population is surveyed. This should represent the subject population as a whole. The number required depends on various factors, mainly the size of the subject population. Therefore, the larger and more representative the sample is, the more valid the survey results.

Precisely determine what mode of collecting data will be used. The ways to conduct a survey depend on sample size, location, types of questions, and the costs of conducting the research. Not sure how many people you need to survey for statistically significant results? Use our survey sample size calculator to determine the survey size you need.

Based on the purpose of the research, there are various methods of conducting a survey:

Interviews are useful for smaller sample sizes, since they allow for the gathering of more detailed information on the survey’s subject. They can be conducted either by phone or in person.

The advantage of interviews is that the interviewer can clarify questions and seek additional information. The main risk with this method is researcher bias or respondent equivocation, though a skilled interviewer is usually able to eliminate these issues.

If the correct steps are followed, conducting an online survey has many advantages, such as cost efficiency and flexibility. In addition, online surveys can reach either a vast audience or a very focused one, depending on your needs.

Online tools are the most effective method of conducting a survey. They can be used by anyone and easily customized for any target group. There are many kinds of online surveys that can be sent via email, hosted on a website, or advertised through social media.

To follow the correct steps for conducting a survey, get help from SurveyPlanet. All you need to do is sign up for an account. Creating perfect surveys will be at your fingertips.

Delivered to respondents’ email addresses, mail surveys reach a large sample group and provide control over who is included in the sample. Although this is increasingly a common survey research method, response rates are now relatively low.

To get the best response rate results, read our blogs How to write eye-catching survey emails and What’s the best time to send survey emails?

Survey questions play a significant role in successful research. Therefore, when deciding what questions to ask—and how to ask them—it is crucial to consider various factors.

Choose between closed-ended and open-ended questions. Closed-ended questions have predefined answer options, while open-ended ones enable respondents to shape an answer in their own words.

Before deciding which to use, get acquainted with the options available. Some common types of research questions include:

  • Demographic questions
  • Multiple-choice questions
  • Rating scale questions
  • Likert scale questions
  • Yes or no questions
  • Ranking questions
  • Image choice questions

To make sure results are reliable, each question in a survey needs to be formulated carefully. Each should be directly relevant to the survey’s purpose and include enough information to be answered accurately.

If using closed-ended questions, make sure the available answers cover all possibilities. In addition, questions should be clear and precise without any vagueness and in the language idiom respondents will understand.

When organizing questions, make sure the order is logical. For example, easy and closed-ended questions encourage respondents to continue—they should be at the beginning of the survey. More difficult and complex questions should come later. If the survey covers several topics, group related questions together.

Surveys can be distributed in person, over the phone, via email, or with an online form.

When creating a survey, first determine the number of responses required and how to access the survey sample. It is essential to monitor the response rate. This is calculated by dividing the number of respondents who answered the survey by the number of people in the sample.
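As a quick illustration of that calculation (the figures below are invented):

```python
def response_rate(num_responses: int, sample_size: int) -> float:
    """Response rate = people who answered / people invited to take the survey."""
    return num_responses / sample_size

# Hypothetical example: 320 completed surveys out of 1,200 invitations.
print(f"{response_rate(320, 1200):.1%}")  # -> 26.7%
```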

There are various methods of conducting a survey and also different methods of analyzing the data collected. After processing and sorting responses (usually with the help of a computer), clean the data by removing incomplete or inaccurate responses.

Different data analysis methods should be used depending on the type of questions utilized. For example, open-ended questions require a bucketing approach in which labels are added to each response and grouped into categories.

Closed-ended questions need statistical analysis. For interviews, use a qualitative method (like thematic analysis), and for Likert scale questions use descriptive statistics (mean, median, and mode).

Other practical analyzing methods are cross-tabulation and filtering. Filtering can help in understanding the respondent pool better and be used to organize results so that data analysis is quicker and more accessible.
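For example, cross-tabulation and filtering might look like this in Python with pandas. The file and column names are hypothetical stand-ins for your own data.

```python
import pandas as pd

responses = pd.read_csv("survey_responses.csv")  # assumed survey export

# Cross-tabulation: how do answers break down across a demographic variable?
print(pd.crosstab(responses["age_group"], responses["would_recommend"],
                  normalize="index"))            # row percentages

# Filtering: organize results around one slice of the respondent pool.
weekday_customers = responses[responses["visit_day"].isin(
    ["Mon", "Tue", "Wed", "Thu", "Fri"])]
print(weekday_customers["would_recommend"].value_counts())
```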

If using an online survey tool, data will be compiled automatically, so the only thing needed is identifying patterns and trends.

The last of the seven steps in conducting survey research is creating a report. Analyzed data should be translated into units of information that directly correspond to the aims and goals identified before creating the survey.

Depending on the formality of the report, include different kinds of information:

  • Initial aims and goals
  • Methods of creation and distribution
  • How the target audience or sample was selected
  • Methods of analysis
  • The results of the survey
  • Problems encountered and whether they influenced results
  • Conclusion and recommendations

Last but not least: frequently asked questions

  • What’s the best way to select my survey sample size? One must carefully consider the survey sample size to ensure accurate results. Please read our complete guide to survey sample size and find all the answers.
  • How do I design an effective survey instrument? Try out SurveyPlanet PRO features including compelling survey theme templates.
  • How do I analyze and interpret survey data? Glad you asked! We got you covered. Learn how to analyze survey data and what to do with survey responses by reading our blog.
  • What should I consider in terms of ethical practices in survey research? Exploring ethical considerations related to obtaining informed consent, ensuring privacy, and handling sensitive data might be helpful. Start with learning how to write more inclusive surveys .
  • How do I address common survey challenges and errors? Explore strategies to overcome common issues, such as response bias or question-wording problems .
  • How can I maximize survey response rates? Seeking advice on strategies to encourage higher response rates and minimize non-response bias is a first step. Start by finding out what is a good survey response rate .
  • How can I ensure the validity and reliability of my survey results? Learn about methods to enhance the trustworthiness of survey data .

Now that we’ve gone through the seven steps in survey research and understand how to conduct survey research, why not create your own survey and conduct research that will drive better choices and decisions?

Were these seven steps helpful? Then check out Seven tips for creating an exceptional survey design (with examples) and How to conduct online surveys in seven simple steps as well.

Sign up for a SurveyPlanet account to access pre-made questions and survey themes. And, if you upgrade to a SurveyPlanet Pro account, gain access to many unique tools that will enhance your survey creation and analysis experience.


How to Conduct Surveys – Guide with Examples

Published by Alvin Nicolas on August 16th, 2021. Revised on August 29, 2023.

Surveys are a popular primary data collection method and can be used in various types of research. A researcher formulates a survey that includes questions relevant to the research topic. The participants are selected, and the questionnaire is distributed among them, either online or offline. It consists of either open-ended or closed-ended questions.

Objectives and Uses of Surveys

  • Surveys are conducted for the planning of national, regional, or local programs.
  • They help to study the perceptions of the community related to the topic.
  • Surveys are used in market research, social sciences, and commercial settings.
  • They can also be used for various other disciplines, from business to anthropology.
  • Surveys are frequently used in quantitative research.

Guidelines for Conducting a Survey

Before conducting a survey, you should follow these steps:

  • Construct a clear and concise research problem statement focusing on what is being investigated and why the research is carried out.
  • Formulate clear and unbiased questions for the survey.
  • Test the questions randomly on volunteer groups and make necessary changes if required.
  • Determine the mode of survey distribution.
  • Schedule the timing of the survey.
  • Use a professional tone, a scholarly approach, and an academic format for your survey.
  • Ensure the privacy and anonymity of the participants.
  • Avoid offensive language or biased questions.
  • Take the opinions of the participants into account.
  • Inform the participants about the survey.
  • Calculate the time required for gathering, analysing, and reporting the data.

How to Conduct a Survey?

The following are the steps for conducting a survey:

  • Set the aims of your research
  • Select the type of survey
  • Prepare a list of questions
  • Invite the participants
  • Record the responses of the participants
  • Distribute the survey questions
  • Analyse the results
  • Write your report

Step 1: Set the Aims of your Research

Before conducting research, you need to form a clear picture of the outcomes of your study. Create a research question and devise the goals of your research. Based on the requirements of your research, you need to select the participants. It would help if you decided whether your survey would be online or offline.

You need to select a specific group of participants for your research. The participants can be:

  • A group of college students
  • Hospital staff
  • A group of people in public places
  • Customers or employees of a specific company
  • A group of people based on their age, gender, and profession, etc.

Sometimes it’s impossible to survey the entire population individually, especially if it’s a large population, as this requires a lot of time and effort. In such cases, you can select a group of people from the chosen community; this group is called the sample. For example:

  • 50 customers of a company
  • 40 students of class 12
  • 30 boys and 30 girls of age 14-15

You can also use an online survey if your target population is large. It helps in getting the maximum number of responses within a short time.

Useful reading: What is correlational research, a comprehensive guide for researchers.


Step 2: Select the Type of Survey

Examples of rating scale questions:

  • I enjoy reading paper books more than reading e-books.
  • How do you feel about your ability to find a career option according to your goals?

Step 3: Prepare a List of Questions

You can use various types of questions in your survey, such as open-ended, closed-ended, and multiple-choice questions. Most participants prefer short multiple-choice questions. Use simple and clear language to avoid misunderstanding, and avoid offensive language.

If you are using checklists in your survey to get feedback on a specific feature, service, or product, then write the statements based on your evaluation aims.

Closed-ended Questions

  • Questions with answers such as yes/no, agree/disagree, or true/false
  • Rating scales with points or stars to measure respondents’ satisfaction
  • Lists of options allowing either a single answer or multiple answers

Open-ended Questions

Open-ended questions require the participants’ individual answers according to their opinion, experience, and choice. The answers can be a single word or full sentences.

  • Tell me about your relationship with your boss.
  • Why did you choose this answer?
  • What’s your opinion on women’s education?
  • How do you see the future?
  • What is a success, according to you?

Step 4: Invite the Participants

You can try out many ways to invite participants to your survey. You can inform them through emails or texts, post your survey on social media, or design a banner to display on websites to grab the respondents’ attention.

Step 5: Record the Responses of the Participants

One of the essential steps is to gather responses from the participants. Often, people don’t pay attention to the survey questions or leave them incomplete. You can offer some rewards to increase the response rates of your participants. You can also promise to share the outcomes with your participants to improve their response rate.

Step 6: Distribute the Survey Questions

You need to decide the sample size (the number of participants and responses required) according to your research requirements. It will also help if you determine whether you are going to conduct the survey online or offline.

Step 7: Analyse the Results

You can store the data in tables, charts, or graphs, or print it out in the form of a spreadsheet. You can use text analysis to analyse the findings of your questionnaire survey, and you can perform a thematic analysis for interview surveys. The responses to online surveys are stored automatically, so you can analyse them directly.
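For instance, a short Python sketch like the one below (the file and column names are assumptions) can tabulate a closed-ended question and save the table as a spreadsheet for printing:

```python
import pandas as pd

responses = pd.read_csv("survey_responses.csv")        # assumed export

# Tabulate a closed-ended question as counts and percentages.
counts = responses["preferred_format"].value_counts()  # assumed column
summary = pd.DataFrame({
    "count": counts,
    "percent": (counts / counts.sum() * 100).round(1),
})

print(summary)
summary.to_csv("preferred_format_summary.csv")  # spreadsheet you can print
```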

Step 8: Write your Report

The final step is to write a report for your survey. You need to assess whether you have met the objectives of your research.

In the introduction, you need to explain the whole survey procedure, mentioning when and where the survey was conducted. Also mention the methods of analysis you used in your survey.

A successful survey provides reliable feedback on the survey questions as evidence for your research. If you conduct online surveys, the responses will help you measure the participants’ satisfaction and their positive or negative opinions.

In the discussion and conclusion sections, you can explain your findings using supporting evidence and conclude by answering your research questions.

Frequently Asked Questions

What are the basic steps to conduct a survey?

Basic steps to conduct a survey:

  • Define objectives and target audience.
  • Develop clear and concise questions.
  • Choose survey method (online, phone, etc.).
  • Pilot test to refine questions.
  • Distribute to participants.
  • Collect and analyze responses.
  • Draw conclusions and share findings.



How to create an effective survey in 15 simple tips

Updated August 15, 2023

You don’t have to be an expert to create a survey, but by following a few survey best practices you can make sure you’re collecting the best data possible.


From working out what you want to achieve to providing incentives for respondents, survey design can take time.

But when you don’t have hours to devote to becoming a survey-creation guru, a quick guide to the essentials is a great way to get started.

In this article, we’re going to reveal how to create a survey that’s easy to complete, encourages collecting feedback, hits the research questions you’re interested in, and produces data that’s easy to work with at the analysis stage.

15 Tips when creating surveys

1. Define the purpose of the survey

Before you even think about your survey questions, you need to define their purpose.

The survey’s purpose should be a clear, attainable, and relevant goal. For example, you might want to understand why customer engagement is dropping off during the middle of the sales process.

Your goal could then be something like: “I want to understand the key factors that cause engagement to dip at the middle of the sales process, including both internal and external elements.”

Or maybe you want to understand customer satisfaction post-sale. If so, the goal of your survey could be: “I want to understand how customer satisfaction is influenced by customer service and support post-sale, including through online and offline channels.”

The idea is to come up with a specific, measurable, and relevant goal for your survey. This way you ensure that your questions are tailored to what you want to achieve and that the data captured can be compared against your goal.

2. Make every question count

You’re building your survey questionnaire to obtain important insights, so every question should play a direct role in hitting that target.

Make sure each question adds value and drives survey responses that relate directly to your research goals. For example, if your participant’s precise age or home state is relevant to your results, go ahead and ask. If not, save yourself and your respondents some time and skip it.

It’s best to plan your survey by first identifying the data you need to collect and then writing your questions.

You can also incorporate multiple-choice questions to get a range of responses that provide more detail than a solid yes or no. It’s not always black and white.

For a deeper dive into the art and science of question-writing and survey best practices, check out Survey questions 101.

3. Keep it short and simple

Although you may be deeply committed to your survey, the chances are that your respondents... aren’t.

As a survey designer, a big part of your job is keeping their attention and making sure they stay focused until the end of the survey.

Respondents are less likely to complete long surveys or surveys that bounce around haphazardly from topic to topic. Make sure your survey follows a logical order and takes a reasonable amount of time to complete.

Although they don’t need to know everything about your research project, it can help to let respondents know why you’re asking about a certain topic. Knowing the basics about who you are and what you’re researching means they’re more likely to keep their responses focused and in scope.


4. Ask direct questions

Vaguely worded survey questions confuse respondents and make your resulting data less useful. Be as specific as possible, and strive for clear and precise language that will make your survey questions easy to answer.

It can be helpful to mention a specific situation or behavior rather than a general tendency. That way you focus the respondent on the facts of their life rather than asking them to consider abstract beliefs or ideas.


Good survey design isn’t just about getting the information you need, but also encouraging respondents to think in different ways.


5. Ask one question at a time

Although it’s important to keep your survey as short and sweet as possible, that doesn’t mean doubling up on questions. Trying to pack too much into a single question can lead to confusion and inaccuracies in the responses.

Take a closer look at questions in your survey that contain the word “and” – it can be a red flag that your question has two parts. For example: “Which of these cell phone service providers has the best customer support and reliability?” This is problematic because a respondent may feel that one service is more reliable, but another has better customer support.

Also, if you want to go beyond surveys and develop a multi-faceted listening approach to drive meaningful change and glean actionable insights, make sure to download our guide .

6. Avoid leading and biased questions

Although you don’t intend them to, certain words and phrases can introduce bias into your questions or point the respondent in the direction of a particular answer.

As a rule of thumb, when you conduct a survey it’s best to provide only as much wording as a respondent needs to give an informed answer. Keep your question wording focused on the respondent and their opinions, rather than introducing anything that could be construed as a point of view of your own.

In particular, scrutinize adjectives and adverbs in your questions. If they’re not needed, take them out.

7. Speak your respondent's language

This tip goes hand in hand with many others in this guide – it’s about making language only as complex or as detailed as it needs to be when conducting great surveys.

Create surveys that use language and terminology that your respondents will understand. Keep the language as plain as possible, avoid technical jargon and keep sentences short. However, beware of oversimplifying a question to the point that its meaning changes.

8. Use response scales whenever possible

Response scales capture the direction and intensity of attitudes, providing rich data. In contrast, categorical or binary response options, such as true/false or yes/no response options, generally produce less informative data.

If you’re in the position of choosing between the two, the response scale is likely to be the better option.

Avoid using scales that ask your target audience to agree or disagree with statements, however. Some people are biased toward agreeing with statements, and this can result in invalid and unreliable data.

9. Avoid using grids or matrices for responses

Grids or matrices of answers demand a lot more thinking from your respondent than a scale or multiple choice question. They need to understand and weigh up multiple items at once, and oftentimes they don’t fill in grids accurately or according to their true feelings.

Another pitfall to be aware of is that grid question types aren’t mobile-friendly. It’s better to separate questions with grid responses into multiple questions in your survey with a different structure such as a response scale.


10. Rephrase yes/no questions if possible in online surveys

As we’ve described, yes/no questions provide less detailed data than a response scale or multiple-choice, since they only yield one of two possible answers.

Many yes/no questions can be reworked by including phrases such as “How much,” “How often,” or “How likely.” Make this change whenever possible and include a response scale for richer data.

By rephrasing your questions in this way, your survey results will be far more comprehensive and representative of how your respondents feel.

Next? Find out how to write great questions.

11. Start with the straightforward stuff

Ease your respondent into the survey by asking easy questions at the start of your questionnaire, then moving on to more complex or thought-provoking elements once they’re engaged in the process.

This is especially valuable if you need to cover any potentially sensitive topics in your survey. Never put sensitive questions at the start of the questionnaire where they’re more likely to feel off-putting.

Your respondent will probably become more prone to fatigue and distraction towards the end of the survey, so keep your most complex or contentious questions in the middle of the survey flow rather than saving them until last.

12. Use unbalanced scales with care

Unbalanced response scales and poorly worded questions can mislead respondents.

For example, if you’ve asked them to rate a product or service and you provide a scale that includes “poor”, “satisfactory”, “good” and “excellent”, they could be swayed towards the “excellent” end of the scale because there are more positive options available.

Make sure your response scales have a definitive, neutral midpoint (aim for odd numbers of possible responses) and that they cover the whole range of possible reactions to the question.

13. Consider adding incentives

To increase the number of responses, incentives — discounts, offers, gift cards, or sweepstakes — can prove helpful.

Of course, while the benefits of offering incentives sound appealing (more respondents), there’s the possibility of attracting the opinions of the wrong audiences, such as those who are only in it for the incentive.

With this in mind, make sure you limit your surveys to your target population and carefully assess which incentives would be most valuable to them.

14. Take your survey for a test drive

Want to know how to make a survey a potential disaster? Send it out before you pre-test.

However short or straightforward your questionnaire is, it’s always a good idea to pre-test your survey before you roll it out fully so that you can catch any possible errors before they have a chance to mess up your survey results.

Share your survey with at least five people, so that they can test your survey to help you catch and correct problems before you distribute it.

15. Let us help you

Survey design doesn’t have to be difficult — even less so with the right expertise, digital solutions, and survey templates.

At Qualtrics, we provide survey software that’s used by more than 11,000 of the top brands and 99 of the top business schools worldwide.

Furthermore, we have a library of high-quality, ready-to-use, and easy-to-configure survey templates that can improve your surveys significantly.

You can check out our template marketplace here . As a free or existing customer, you have access to the complete collection and can filter by the core experiences you want to drive.

As for our survey software , it’s completely free to use and powers more than 1 billion surveys a year. Using it, you can get answers to your most important brand, market, customer, and product questions, build your own surveys, get insights from your audience wherever they are, and much, much more.

If you want to learn more about how to use our survey tool to create a survey, as well as what else it can do — check out our blog on how to create a free online survey using Qualtrics .


How to Conduct a Research Survey: A Step-by-Step Guide

Mastering the Art of Research Surveys: A Step-by-Step Guide to Conduct Research Survey

Dr. Somasundaram R

Research surveys are valuable tools for gathering data and insights from a targeted audience. Whether you’re a student conducting academic research, a professional seeking feedback from customers, or an organization looking to understand public opinion, conducting a well-designed survey can provide you with valuable information. In this article, iLovePhD will guide you through the process of conducting a research survey effectively.

Learn how to conduct research surveys like a pro with our comprehensive step-by-step guide. From defining your objectives to analyzing data, this article provides valuable insights and practical tips for designing, distributing, and interpreting research surveys effectively. Start gathering accurate and actionable information today!

Mastering the Art of Research Surveys: A Step-by-Step Guide

  • Define your research objectives: Before diving into the survey creation process, clearly define your research objectives. What specific information do you aim to gather? Clearly identifying your goals will help shape the questions and structure of your survey.
  • Identify your target audience: Determine the specific group of people you want to survey. Understanding your target audience is crucial for tailoring your questions and ensuring the data you collect is relevant and representative.
  • Choose the appropriate survey method: Select a survey method that suits your research objectives and target audience. Common methods include online surveys, phone interviews, in-person interviews, or mail-in questionnaires. Online surveys are popular due to their ease of use, wide reach, and cost-effectiveness.
  • Design your survey questions: Craft clear and concise questions that are easy for respondents to understand. Start with demographic questions (age, gender, location) to gather basic information. Use a mix of closed-ended questions (multiple choice, rating scales) and open-ended questions (allowing respondents to provide detailed answers) to capture a range of data.
  • Structure your survey: Organize your survey in a logical and coherent manner. Begin with an introduction that explains the purpose and confidentiality of the survey, followed by instructions on how to complete it. Group related questions together and consider the flow of the survey to keep respondents engaged.
  • Test your survey: Before launching your survey, test it with a small sample of respondents to identify any potential issues or areas for improvement. This step ensures that the survey is error-free, user-friendly, and effectively captures the desired information.
  • Choose a survey distribution method: If conducting an online survey, you can use various platforms like Google Forms, SurveyMonkey, or Qualtrics to distribute your survey link. If using other methods, determine the best way to reach your target audience, such as through phone calls, in-person interactions, or mail.
  • Monitor survey responses: Regularly monitor the responses you receive to ensure data collection is progressing smoothly. Consider setting reminders or notifications to keep track of new submissions. This step is particularly crucial for time-sensitive surveys.
  • Analyze and interpret the data: Once you’ve collected a sufficient number of responses, analyze the data using appropriate statistical techniques. Look for trends, patterns, and key insights that address your research objectives. Visualize the data through charts or graphs to enhance understanding and communication (see the sketch after this list).
  • Report and share your findings: Compile your research findings into a comprehensive report, including an introduction, methodology, results, analysis, and conclusions. Ensure that your report is well-structured and provides clear and actionable insights. Share the report with relevant stakeholders or audiences who can benefit from the information.
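As a small, hedged illustration of the analysis and visualization step referenced above, the Python sketch below uses an assumed CSV export and an assumed `overall_satisfaction` column to plot the distribution of a rating question:

```python
import pandas as pd
import matplotlib.pyplot as plt

responses = pd.read_csv("survey_responses.csv")      # assumed export

# Count how many respondents chose each rating and keep the ratings in order.
distribution = (responses["overall_satisfaction"]    # assumed 1-5 rating column
                .value_counts()
                .sort_index())

# Visualize the distribution as a bar chart for the report.
ax = distribution.plot(kind="bar", title="Overall satisfaction")
ax.set_xlabel("Rating (1-5)")
ax.set_ylabel("Number of respondents")
plt.tight_layout()
plt.savefig("satisfaction_distribution.png")
```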

Conducting a research survey can be a valuable means of gathering data and insights. By following the step-by-step guide outlined in this article, you’ll be well-equipped to design, distribute, and analyze your survey effectively.

Remember to focus on your research objectives, tailor your questions to your target audience, and ensure the survey process is user-friendly. With a well-executed research survey, you can unlock valuable insights that drive informed decision-making.


How to conduct your own market research survey (with example)


After watching a few of Saturday Night Live’s focus group sketches, you can imagine why real-life focus groups tend to be pretty small. Even without any over-the-top personalities involved, it's easy for these groups to go off the rails.

So what happens when you want to collect market research at a larger scale? That's where the market research survey comes in. Market surveys allow you to get just as much valuable information as an in-person interview, without the burden of herding hundreds of rowdy Eagles fans through a product test.


What is a market research survey?

Market surveys are what's known as "primary research"—that is, information that the researching company gathers firsthand. Secondary research consists of data that another organization gathered and published, which other researchers can then use for their own reports. Primary research is more expensive and time-intensive than secondary research, which is why you should only use market research surveys to obtain information that you can't get anywhere else. 

A market research survey can collect information on your target customers':

  • Experiences
  • Preferences, desires, and needs
  • Values and motivations

The types of information that can usually be found in a secondary source, and therefore aren't good candidates for a market survey, include your target customers':

  • Demographic data
  • Consumer spending data
  • Household size

Why conduct market research?

Here are some examples of how market research surveys can be used to fill a wide range of knowledge gaps for companies:

  • A B2B software company asks real users in its industry about Kanban board usage to help prioritize their project view change rollout.
  • A B2C software company asks its target demographic about their mobile browsing habits to help them find features to incorporate into their forthcoming mobile app.
  • A printing company asks its target demographic about fabric preferences to gauge interest in a premium material option for their apparel lines.
  • A wholesale food vendor surveys regional restaurant owners to find ideas for seasonal products to offer.

Primary vs. secondary market research


If you've exhausted your secondary research options and still have unanswered questions, it's time to start thinking about conducting a market research survey.

6 types of market research survey

Depending on your goal, you'll need different types of market research. Here are six types of market research surveys.

1. Buyer persona research

A buyer persona research survey will help you learn more about things like demographics, household makeup, income and education levels, and lifestyle markers. The more you learn about your existing customers, the more specific you can get in targeting potential customers. You may find that there are more buyer personas within your user base than the ones that you've been targeting.

2. Sales funnel research

With a sales funnel research survey, you can learn about potential customers' main drivers at different stages of the sales funnel. You can also get feedback on how effective different sales strategies are. Use this survey to find out:

  • How close potential buyers are to making a purchase
  • What tools and experiences have been most effective in moving prospective customers closer to conversion

3. Customer loyalty research

Use this type of survey to find out:

The demographics of your most loyal customers

What tools are most effective in turning customers into advocates

What you can do to encourage more brand loyalty

4. Branding and marketing research

The Charmin focus group featured in that SNL sketch is an example of branding and marketing research, in which a company looks for feedback on a particular advertising angle to get a sense of whether it will be effective before the company spends money on running the ad at scale. Use this type of survey to find out:

Whether a new advertising angle will do well with existing customers

Whether a campaign will do well with a new customer segment you haven't targeted yet

What types of campaign angles do well with a particular demographic

5. New products or features research

Use this type of survey to ask your target customers:

What features they wish your product currently had

What they think of a particular product or feature idea

6. Competitor research

Use this type of survey to find out:

Whether your competitors have found success with a buyer persona you're not targeting

Information about buyers for a product that's similar to one you're thinking about launching

Feedback on what features your competitors' customers wish their version of a product had

How to write and conduct a market research survey

Once you've narrowed down your survey's objectives, you can move forward with designing and running your survey.

Step 1: Write your survey questions

A poorly worded survey, or a survey that uses the wrong question format, can render all of your data moot. If you write a question that results in most respondents answering "none of the above," you haven't learned much. 

Categorical questions

Also known as nominal questions, categorical questions provide numbers and percentages that are easy to visualize, like "35% said ABC." They work great for bar graphs and pie charts, but you can't take averages or test correlations with nominal-level data.

Multiple choice: Use this type of question if you need more nuance than a Yes/No answer gives. You can add as many answers as you want, and your respondents can pick only one answer to the question. 

Checkbox: Checkbox questions add the flexibility to select all the answers that apply. Add as many answers as you want, and respondents aren't limited to just one. 


Ordinal questions

This type of question requires survey-takers to pick from options presented in a specific order, like "income of $0-$25K, $26K-$40K, $41K+." Like nominal questions, ordinal questions elicit responses that allow you to analyze counts and percentages, though you can't calculate averages or assess correlations with ordinal-level data.

Dropdown: Responses to ordinal questions can be presented as a dropdown, from which survey-takers can only make one selection. You could use this question type to gather demographic data, like the respondent's country or state of residence. 

Ranking: This is a unique question type that allows respondents to arrange a list of answers in their preferred order, providing feedback on each option in the process. 

Interval/ratio questions

For precise data and advanced analysis, use interval or ratio questions. These let you calculate averages, test correlations, and run regression models. Interval questions commonly use scales of 1-5 or 1-7, like "Strongly disagree" to "Strongly agree." Ratio questions have a true zero and often ask for numerical inputs (like "How many cups of coffee do you drink per day? ____").

Ranking scale: A ranking scale presents answer choices along an ordered, value-based sequence, using numbers, a like/love scale, a never/always scale, or some other ordered set of options. It gives more insight into people's thoughts than a Yes/No question. 

Matrix: Have a lot of interval questions to ask? You can put a number of questions in a list and use the same scale for all of them. It simplifies gathering data about a lot of similar items at once. 

Example : How much do you like the following: oranges, apples, grapes? Hate/Dislike/Ok/Like/Love

Textbox: A textbox question is needed for collecting direct feedback or personal data like names. There will be a blank space where the respondent can enter their answer to your question on their own. 

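To make the differences among these question types concrete, here is a minimal sketch (in Python with pandas, not from the original article) of how the level of measurement limits the statistics you can report: counts and percentages for categorical and ordinal answers, averages and correlations for interval and ratio answers. The column names and values are hypothetical.

```python
import pandas as pd

# Hypothetical responses; each column illustrates one level of measurement.
responses = pd.DataFrame({
    "commute_mode": ["car", "bus", "bike", "car", "walk", "car"],        # categorical (nominal)
    "income_bracket": ["$0-$25K", "$26K-$40K", "$41K+",
                       "$26K-$40K", "$0-$25K", "$41K+"],                 # ordinal
    "enjoy_commute": [2, 4, 5, 1, 4, 3],                                 # interval (1-5 scale)
    "cups_of_coffee": [0, 2, 3, 1, 2, 4],                                # ratio (true zero)
})

# Nominal and ordinal data support counts and percentages only.
print(responses["commute_mode"].value_counts(normalize=True).round(2))

# Interval and ratio data also support averages and correlations.
print(responses["enjoy_commute"].mean())
print(responses[["enjoy_commute", "cups_of_coffee"]].corr())
```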

Step 2: Choose a survey platform

Most survey apps today look great on mobile, but be sure to preview your survey on your phone and computer, at least, to make sure it'll look good for all of your users.


If you have the budget, you can also purchase survey services from a larger research agency. 

Step 3: Run a test survey

Before you run your full survey, conduct a smaller test on 5%-10% of your target respondent pool size. This will allow you to work out any confusing wording or questions that result in unhelpful responses without spending the full cost of the survey. Look out for:

Survey rejection from the platform for prohibited topics

Joke or nonsense textbox answers that indicate the respondent didn't answer the survey in earnest

Multiple choice questions with an outsized percentage of "none of the above" or "N/A" responses

Step 4: Launch your survey

If your test survey comes back looking good, you're ready to launch the full thing! Make sure that you leave ample time for the survey to run—you'd be surprised at how long it takes to get a few thousand respondents. 

Even if you've run similar surveys in the past, leave more time than you need. Some surveys take longer than others for no clear reason, and you also want to build in time to conduct a comprehensive data analysis.

Step 5: Organize and interpret the data

Once the responses come in, clean up the data, organize it, and look for the patterns that answer the questions you set out to explore.

Tips for running a market research survey

You know the basics of how to conduct a market research survey, but here are some tips to enhance the quality of your data and the reliability of your findings.

Find the right audience: You could have meticulously crafted survey questions, but if you don't target the appropriate demographic or customer segment, it doesn't really matter. You need to collect responses from the people you're trying to understand. Targeted audiences you can send surveys to include your existing customers, current social media followers, newsletter subscribers, attendees at relevant industry events, and community members from online forums, discussion boards, or other online communities that cater to your target audience. 

Focus questions on a desired data type: As you conceptualize your survey, consider whether a qualitative or quantitative approach will better suit your research goals. Qualitative methods are best for exploring in-depth insights and underlying motivations, while quantitative methods are better for obtaining statistical data and measurable trends. For an outcome like "optimize our ice cream shop's menu offerings," you may want to find out which flavors of ice cream are most popular with teens. This would require a quantitative approach, for which you would use categorical questions that can help you rank potential flavors numerically.

Establish a timeline: Set a realistic timeline for your survey, from creation to distribution to data collection and analysis. You'll want to balance having your survey out long enough to generate a significant amount of responses but not so long that it loses relevance. That length can vary widely based on factors like type of survey, number of questions, audience size, time sensitivity, question format, and question length.

Market research survey campaign example

Let's say you own a market research company, and you want to use a survey to gain critical insights into your market. You prompt users to fill out your survey before they can access gated premium content.

Survey questions: 

1. What size is your business? 

<10 employees

11-50 employees

51-100 employees

101-200 employees

>200 employees

2. What industry type best describes your role?

3. On a scale of 1-4, how important would you say access to market data is?

1 - Not important

2 - Somewhat important

3 - Very important

4 - Critically important

4. On a scale of 1 (least important) to 5 (most important), rank how important these market data access factors are.

Accuracy of data

Attractive presentation of data

Cost of data access

Range of data presentation formats

Timeliness of data

5. True or false: your job relies on access to accurate, up-to-date market data.

Survey findings: 

63% of respondents represent businesses with over 100 employees, while only 8% represent businesses with under 10.

71% of respondents work in sales, marketing, or operations.

80% of respondents consider access to market data to be either very important or critically important.

"Timeliness of data" (38%) and "Accuracy of data" (32%) were most commonly ranked as the most important market data access factor.

86% of respondents claimed that their jobs rely on accessing accurate, up-to-date market data.

Insights and recommendations: Independent analysis of the survey indicates that a large percentage of users work in the sales, marketing, or operations fields of large companies, and these customers value timeliness and accuracy most. These findings can help you position future report offerings more effectively by highlighting key benefits that are important to customers that fit into related customer profiles. 
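To see how raw answers turn into headline findings like the ones above, here is a small sketch that tallies hypothetical responses to question 3. The answers below are invented for illustration and are not the survey data described in this example.

```python
from collections import Counter

# Invented answers to "How important would you say access to market data is?"
answers_q3 = [
    "Very important", "Critically important", "Somewhat important",
    "Very important", "Critically important", "Not important",
    "Very important", "Critically important", "Very important",
    "Critically important",
]

counts = Counter(answers_q3)
total = len(answers_q3)

# Share of respondents who rated access "very" or "critically" important.
share = (counts["Very important"] + counts["Critically important"]) / total
print(f"{share:.0%} of respondents consider access to market data very or critically important")
```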

Market research survey example questions

Your individual questions will vary by your industry, market, and research goals, so don't expect a cut-and-paste survey to suit your needs. To help you get started, here are market research survey example questions to give you a sense of the format.

Yes/No: Have you purchased our product before?

Multiple choice: How many employees work at your company?

<10 / 10-20 / 21-50 / 51-100 / 101-250 / 250+

Checkbox: Which of the following features do you use in our app?

Push notifications / Dashboard / Profile customization / In-app chat

Dropdown: What's your household income? 

$0-$10K / $11K-$35K / $36K-$60K / $61K+

Ranking: Which social media platforms do you use the most? Rank in order, from most to least.

Facebook / Instagram / Twitter / LinkedIn / Reddit

Ranking scale: On a scale of 1-5, how would you rate our customer service? 

1 / 2 / 3 / 4 / 5

Textbox: How many apps are installed on your phone? Enter a number: 

Market research survey question types

Good survey apps typically offer pre-designed templates as a starting point. But to give you a more visual sense of what these questions might look like, we've put together a document showcasing common market research survey question types.




How To Conduct A Survey That You Can Trust In 8 Steps 

So you want to conduct a survey. Not just any run-of-the-mill survey, but one that you can trust: one that quickly gathers the total number of survey respondents you selected, with the correct demographic and psychographic traits.

To do so, you’ll need to be able to preset these requirements in an online survey platform.

You’ll first need to find a potent online survey platform and understand how to conduct a survey that provides accurate and reliable data on your target market.

While building a strong survey campaign can appear to be difficult, if not downright intimidating, it is much simpler than it looks. This simplicity will depend on the survey platform you choose, as they are not all the same.

Nonetheless, there’s a process to conduct a survey that you can use across multiple campaigns, whether you need to conduct local or global surveys , study customer behavior , or even increase your customer retention rate .

Luckily, we’ve prepared an easy-to-follow, 8-step guide to help you design, conduct, and organize an effective survey in no time. Let’s dive in.

Table of Contents: How To Conduct A Survey That You Can Trust In 8 Steps

  • The importance of conducting surveys
  • Who should I survey and who is in my target market?
  • How many survey respondents do I need?
  • Steps to Conduct a Survey
  • Step 1: Identify your research goal
  • Step 2: Define your survey audience
  • Step 3: Come up with preliminary questions
  • Step 4: Design your questionnaire
  • Step 5: Distribute your survey
  • Step 6: Organize survey responses
  • Step 7: Analyze and present survey results
  • Step 8: Take action
  • Making every survey count

The importance of conducting surveys

First off, let’s uncover why you should conduct a survey in the first place. After all, there are a variety of other market research techniques you can use, including both primary and secondary research methods.

One of the most important reasons to conduct survey research is due to the prowess of surveys; they grant you original hard data and facts . You can use surveys to study virtually any subject and gain both quantitative and qualitative insights . 

Data, especially customer data, is becoming more and more sought after, as 40% of organizations aim to increase data-driven marketing budgets , and 64% of marketing leaders believe that data-driven strategies are vital in today’s economy. Surveys act as a convenient conduit to gain access to any sort of data , whether it is consumer-related or otherwise. 

Conducting surveys on your customers is one of the most effective ways to collect invaluable data and gain answers to concerns that are important to you. This is core to market research, as it allows you to better understand those most likely to buy from you , aka, your target market.


So how can you use surveys as a source of data for decision-making? There are numerous campaigns and insights that surveys can inform and unlock.

Surveys let you uncover hidden growth opportunities, reveal public sentiment, gain deep insights into customer buying behavior , and even get extra media coverage when prominent publications cite the findings of your research.  

They also prevent you from making the wrong business decisions , whether it deals with releasing a new product, creating an ad campaign that won’t resonate, appealing to the wrong persona and much more. Thus, surveys allow you to discover your risks, decide on whether they are worth taking and avoid mistakes.  

As an added bonus, simply the act of conducting a survey affects customer behavior, along with customers’ opinions of a company. Specifically, the satisfaction of writing a positive survey response creates a desire to buy more of a product. With this in mind, brands that include their name in the survey can increase sales simply by conducting one.

As such, surveys don’t merely provide you with an understanding of your customers’ needs, wants and sentiments; they also allow you to affect their perception of your brand and their willingness to buy from you. 

In this way, surveys, especially those that provide positive experiences, contribute to your revenue, which keeps your business afloat. Aside from helping you win new customers, you can also use surveys to learn from your existing customers.

By offering them a good experience and presenting your company in the best possible light, surveys also help you boost consumer loyalty , which is an absolute must. Loyalty is the core of customer retention, which is often cited as more important than customer acquisition. 

For example, did you know that 80% of profits come from just 20% of your existing customers ? In addition, retaining customers is far less costly than acquiring new ones. There are plenty of statistics that back up the claim that customer retention is both more profitable and less expensive to achieve than customer acquisition. 

For example, consider the following: 

  • Increasing customer retention by just 5% can increase profits from 25-95% .
  • The success rate of selling to an existing customer is between 60-70%, while the success rate of selling to a new customer is only 5-20%.

All in all, conducting a survey is crucial to the well-being of your customers and your business. Surveys help you unearth virtually any insights which you can then use to guide your next or ongoing business move.  

What You Need to Conduct a Survey

If you want to get meaningful results that you can act on, there are certain things you’ll need to have and certain actions you’ll need to take. These will steer your survey campaign in the right direction, give you the most accurate and useful results and ward off survey bias . 

Before we dive into the steps to conduct a survey, let’s glance over the things you’ll need (not all of which are tangible) to conduct your survey. These must be present within the online survey platform (or market research agency) that you use.

The following list lays out everything you need to conduct a successful survey:

  • Survey the correct population
  • Use the correct survey distribution method (see Step 5)
    • This includes ALL digital properties where your survey will live, such as websites, mobile sites, apps and more.
  • Have the ability to customize your surveys to your liking, with features such as:
    • Survey templates
    • Advanced skip logic
    • A filtering system
    • Screening questions
    • Matrix questions
    • Open-ended questions
    • Likert scale questions
    • Rating scale questions
  • Access to ready-made survey types, such as:
    • The NPS survey
    • The Customer Satisfaction Score survey, aka the CSAT survey
    • The Customer Effort Score survey (CES)
    • Visual rating questions (ones that use hearts or other visual ratings as scaled questions)
  • Both B2C and B2B surveys
  • Access to granular insights via a post-survey results dashboard and a survey builder
  • A platform that offers 24-hour technical support
  • The ability to conduct global market research
  • Quality checks that catch low-quality responses, such as:
    • Gibberish answers
    • Respondents who aren’t paying attention
    • Flatliners (those who keep answering with the same choice in multiple-choice questions)
    • Those hiding their location via VPN

As you can gather, there are various elements to a successful survey. You’ll need to therefore carefully select your market research platform — or agency, if you’re taking the syndicated research path. 

You don’t want to settle for a low-tier platform, otherwise, you risk collecting unneeded biases and a whole host of low-quality data.


Steps to Conduct a Survey

Follow the 8 steps in this guide to conduct meaningful survey research. 

  • Step 1: Identify your research goals
  • Step 2: Define your target audience
  • Step 3: Come up with preliminary questions
  • Step 4: Design your questionnaire
  • Step 5: Distribute your survey
  • Step 6: Organize survey responses
  • Step 7: Analyze and present survey results
  • Step 8: Take action

Step 1: Identify your research goal

Every successful survey has a purpose. You’ll need to identify yours to get started. This will serve as the basis of the entire survey campaign. 

In order to identify your research goal, you’ll need to consider the insights your business needs most. Consider the following questions to ask yourself and your team:

  • Do you need to steer an advertising campaign?
  • Do you need to form a marketing strategy?
  • Are you trying to find out why you are losing customers? 
  • Do you want to know if your policies are effective? 
  • Are you figuring out what to do in the current market?
  • Do you need to discover your own employees’ sentiment about your workplace?
  • Would you like to cut back your customer attrition rate ? 

The kinds of questions you need to ask yourself and your company’s different departments are limitless.  

We suggest forming a survey that relates to your most pressing needs, or setting up a proactive survey study , that is, a survey campaign designed before you go through with something, such as designing a new feature or ad.

Understanding your survey’s main goal both improves its quality and reduces the time you’ll spend on executing your research.

In case you struggle to pinpoint your exact goal, write down a list of all the questions and issues your market research campaign needs and prioritize the most important ones. 

In addition, ask yourself and your team questions to better understand your own standing in regards to market research, your existing tools, campaigns and more. These are your peripheral questions, which will help you determine your key research goal.  

The following questions will help you understand your survey goal better :

  • Do you understand who comprises your target market?
  • Do you need to segment your target market further?
  • Do you already have any existing data that you can use? 
  • Do you need data to improve an existing product or launch another?
  • What resources do you have to perform the survey?
  • What actions are you going to take after the survey is complete?

After you have figured out the main goal of your research, you will need to define your survey audience . 

Step 2: Define your survey audience

Identifying your survey target audience is key to any successful market research campaign. After all, it is the audience that you seek to study, to learn how its members tick, their habits, sentiments, etc. 

The wrong survey audience will invalidate your study, as it will be irrelevant to your business or study. 

There are two main concerns when it comes to surveying participants: who should I survey and how many participants do I need?

Let’s clarify both. 

Who should I survey and who is in my target market?

Surveying the right people makes all the difference. That’s why before determining your survey audience, you’ll need to first identify the makeup of your target market. To do so, you’ll need to conduct secondary research, along with consolidating what you already know about your target market.

In addition, you’ll need to conduct market segmentation , which will allow you to break your wider target market into various segments. These can exist on the basis of various factors, such as age, ethnicity and other demographic factors, along with behavioral aspects, such as buying habits, frequency of purchase, brand trust and more. 

You can do this by conducting an RFM analysis , which is an abbreviation of Recency, Frequency and Monetary Value. In this analysis, researchers estimate the value of a customer based on the three data points in its abbreviated title. This is one of the models for customer behavior segmentation.
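As a rough illustration of RFM scoring (not a prescribed method), the sketch below computes recency, frequency, and monetary value from hypothetical order data and bins each into a simple 1-3 score. Real analyses usually score each dimension into quintiles across the full customer base.

```python
import pandas as pd

# Hypothetical order-level data.
orders = pd.DataFrame({
    "customer": ["A", "A", "B", "C", "C", "C"],
    "days_since_order": [5, 40, 90, 10, 20, 3],
    "order_value": [120.0, 80.0, 35.0, 60.0, 45.0, 200.0],
})

rfm = orders.groupby("customer").agg(
    recency=("days_since_order", "min"),   # days since most recent order
    frequency=("customer", "size"),        # number of orders
    monetary=("order_value", "sum"),       # total spend
)

# Illustrative thresholds; 3 is the best score on each dimension.
rfm["r_score"] = pd.cut(rfm["recency"], bins=[-1, 14, 60, float("inf")], labels=[3, 2, 1])
rfm["f_score"] = pd.cut(rfm["frequency"], bins=[0, 1, 2, float("inf")], labels=[1, 2, 3])
rfm["m_score"] = pd.cut(rfm["monetary"], bins=[0, 50, 150, float("inf")], labels=[1, 2, 3])
print(rfm)
```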

Targeting a specific audience is important for many reasons. For example, suppose you want to learn if iPhone users are happy with the recent product updates. 

By surveying random iPhone users, you may notice that the majority of responses are somewhat neutral. But if you specifically target Gen Z, you might learn that the younger demographic is worried about having to buy extra accessories.  

The more defined your target audience criteria are, the more accurate and deep your survey insights will be. Thus, make sure to brainstorm, segment and fully identify your target market and your own customer personas before setting up your survey questionnaire. 

Identifying them first will show you which target audience you’ll need for your survey to gain the most accurate insights and help you fulfill your survey goal.

How many survey respondents do I need? 

When doctors want to examine your blood, they don’t drain all of it – they just need to take a small sample. The same principle stands with surveys: a small sample of survey respondents can accurately represent the opinions of a larger group. 

For example, if there are 5,000 people in your company and you want to know how well the latest HR policy was received, you don’t need to survey all 5,000. In fact, surveying just 146 employees will be enough. 

Thus, if you want to learn what all American high schoolers think about the recent TikTok ban, you don’t need to ask every single one of them. Surveying between 200 and 600 respondents will give you a sufficient amount of opinions to draw from.

For the majority of studies, 200 to 800 respondents will be enough to represent the thoughts and opinions of a particular population. However, not all studies are built the same, nor are they geared towards the same kind of longevity; think longitudinal surveys versus cross-sectional surveys. 

As such, you’ll need to calculate your survey sampling size , which is also referred to as a sampling pool.

If you want to calculate how many respondents you’ll need to get scientifically accurate survey results, feel free to use our sample size calculator . 
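The article does not state which formula its calculator uses, but a common approach is Cochran's formula with a finite population correction. Here is a minimal sketch; with a 95% confidence level and roughly an 8% margin of error it lands near the 146-employee figure mentioned above.

```python
import math

def sample_size(population, margin_of_error=0.05, confidence_z=1.96, p=0.5):
    """Respondents needed for a given population size and margin of error."""
    n0 = (confidence_z ** 2) * p * (1 - p) / margin_of_error ** 2
    # Finite population correction: small populations need fewer respondents.
    return math.ceil(n0 / (1 + (n0 - 1) / population))

print(sample_size(5000, margin_of_error=0.08))  # -> 146
print(sample_size(5000, margin_of_error=0.05))  # -> 357
```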


Step 3: Come up with preliminary questions

Now that you’ve carefully selected a main survey goal and theme, along with having identified who to survey and how many to include in your sampling size, it is time to get to the heart of your survey: the questionnaire — or at least the beginning of it.

To do this, you’ll need to consider the main goal and subgoals of your survey campaign. As such, write down the most pressing questions you have. We suggest coming up with a list of 10 questions.

Note that not ALL of them need to be in your survey, in fact, we suggest keeping your questionnaire short. Even users of a game who’ve come across your survey and decided to take it for in-game survey incentives will hesitate to take a lengthy survey.

As such, your preliminary 10 (or more) questions are just that: preliminary. Not all of them will make it to your questionnaire, as they are meant for brainstorming ideas.

As you create these questions, heed the following tips:

  • Keep your study type in mind. For example, if you’re going to run a longitudinal study, you’ll need to create questions that span various time periods. Or, for a cross-sectional study, you’ll need to create questions for just one survey and thus have one primary focus of the study.
  • Check whether any of your questions appear too similar to one another. If so, consider merging them or removing a few.
  • Plan for follow-up questions. For example, if you ask a question in which a certain answer requires more information, consider using follow-up questions. That’s where advanced skip logic becomes handy, as it routes respondents to relevant follow-up questions (see the sketch after this list). This creates paths in your survey and allows you to understand your respondents and the subject of the original question at a deeper level. This also ties back to the point about similar questions, as you can use those as potential follow-up questions.
  • Consider whether your survey applies to more than one customer segment. If so, refer to your customer segmentation and personas list. You may need to break your survey into two or more, depending on how many customer segments it can be used for. You can also add multiple audiences in one survey.
  • Leave room for secondary questions. Here, you can get into the nitty-gritty of what is most important for your study by way of other relevant opinions that will help shape it.
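To illustrate the branching that skip logic creates, here is a toy sketch. The question IDs and wording are hypothetical, and real survey platforms configure this routing in their builders rather than in code.

```python
# Hypothetical questions; q2 and q3 are follow-ups on different branches.
questions = {
    "q1": {"text": "Have you used our mobile app?", "options": ["Yes", "No"]},
    "q2": {"text": "Which feature do you use most?", "options": ["Chat", "Dashboard", "Profile"]},
    "q3": {"text": "What has kept you from trying the app?", "options": ["Didn't know about it", "No need", "Other"]},
}

def next_question(current_id, answer):
    """Return the next question ID based on the current answer (the skip logic)."""
    if current_id == "q1":
        return "q2" if answer == "Yes" else "q3"
    return None  # end of this branch

print(next_question("q1", "Yes"))  # -> q2
print(next_question("q1", "No"))   # -> q3
```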

Step 4: Design your questionnaire

Next, we’re going to move on to designing the questionnaire itself. This will largely depend on the survey platform you use. As aforementioned, you’ll need to use a strong market research SaaS platform that offers a variety of features and services to form a robust survey campaign. 


Make sure your survey platform allows you to build the questions you need with ease and speed. 

It’s key to note that the quality of a questionnaire is where the majority of surveys fall short. Experiments suggest that sensitive or vague opinion questions increase the potential for error by up to 30%. Put simply, your survey is only as good as your questionnaire. 

Make sure your questions are clear and don’t contain jargon or uncommon abbreviations. This is key to shaping the survey experience.

A poor example of a survey question: Do you think VR is going to take off in the next 5 years?

A better example of a survey question: Do you think virtual reality (VR) is going to take off in the next 5 years?

In some instances, a poor question is one that yields scant information. In that case, it is key to follow it up with another question, or to write it so that it doesn’t require additional questions to begin with. Here, a yes or no question constitutes the poor example, whereas an open-ended question is the better one. 

A poor example: Do you agree that this is a great movie?

A better example: What do you think of this movie?

Take some time to learn how to write clear, unbiased, and effective survey questions to get the best results out of your research. 

Step 5: Distribute your survey

There are several ways to distribute a survey. These include legacy distribution methods and modern ones. While it may not appear to be very important, choosing how your survey is distributed is as important as choosing who you want to survey. 

This is because survey distribution accounts for many aspects of your study, including the following:

  • Where your survey lives: in the digital vs. analog world, on websites or apps, or as part of a focus group
  • Whether your target market sees your survey, based on its distribution channel(s)
  • When your target market will see your survey
  • How quickly you’ll gain respondents and completed surveys (for example, an online survey platform that continues iterating until it receives all required responses works faster than interviews conducted by a market research firm)
  • When you can access your post-survey dashboard
  • When you can carry out a survey data analysis
  • Associations with your brand (if you mention it in the survey)

The environment in which your survey appears is critical to whether your target market is exposed to it, as the second point above notes. This is because different demographics spend time online (and in the real world) differently.

Let’s continue with the example of surveying Gen Z iPhone users. Suppose you moderate a local school Facebook group and decide to post your survey there. Even if you get a large number of responses, the results may not accurately reflect this demographic. 

This is because in this case, you don’t pick survey participants randomly, instead, you survey only those who joined the local school Facebook group that you conveniently happen to moderate. 

This is called convenience sampling, since the majority of survey participants happen to live in one area. The survey didn’t account for Gen Z users from other areas with different average household incomes. 

To ensure you get the most accurate survey results , use a survey platform that can help you reach your targeted demographics more precisely and at speed. In short, avoid convenience sampling.

Instead, opt for organic sampling , which gathers survey respondents by distributing your survey to the places they spend their time organically. On the Pollfish online survey platform, we use organic sampling in the form of RDE sampling , or Random Device Engagement sampling .


This is the opposite of a research panel , which is a research method that pre-recruits and prescreens a group of research participants who have opted in to take part as the studied subjects of a market research campaign.

Lastly, before we provide a few examples of survey distribution methods, it is also critical to be strategic about when to send your survey . For this, we recommend reading our quick guide on the best time to send a survey .

Here are a few common ways to distribute your surveys: 

  • Email. You can distribute your survey by email, especially if you have access to an established email list. The two main drawbacks of email surveys are that it’s harder to set specific target audience parameters and email response rates are generally low.
  • Social media: if you survey people via social media channels, beware that sometimes social media groups attract people with shared interests that may not represent the opinion of your target audience or the general public.
  • Online survey platforms: survey platforms such as Pollfish allow you to hyper-target specific audiences, control the number of participants, distribute the survey in different ways, reach all quotas, easily organize your survey results and more. 
  • Survey panels: A survey panel is a consistent group of survey participants who have been pre-recruited and pre-screened and who opt into a survey study. Researchers return to the same people to run surveys or host interviews repeatedly over time.
  • Syndicated research : Syndicated research refers to research conducted by a market research firm, oftentimes independently. It is published and sold by a market research firm, which is usually industry-specific and funded by several companies within a particular industry. The firm and its partner companies own the data that the firm collects. Other companies in their particular industry may purchase the data.

Besides these prominent survey channels, there are other survey solutions you can use; make sure to select the one most pertinent to your market research needs.

Step 6: Organize survey responses

After you’ve gathered your responses, you’ll need to organize the data before starting your analysis. As with the prior steps, this will largely depend on your survey tool, which also dictates your survey distribution, audience targeting and creation.

Here are the steps to prepare your data for analysis :

  • Clean. Sometimes people fill out the survey twice by mistake. Although Pollfish survey technology prevents duplicate responses altogether, if you’re conducting a survey on your own, or via syndicated research, make sure to clean duplicates and “funny” answers before you proceed to organize your data.
  • Organize. Group survey answers that are similar to each other and try finding patterns that allow you to structure your data.
  • Visualize. Try finding ways of visualizing survey responses using graphs, charts and images. Visualized survey data is easier to analyze and refer to, especially if you want to share survey results with other people.
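As a minimal sketch of the clean-and-organize step, assuming responses land in a pandas DataFrame: the column names, the duplicate check, the gibberish filter, and the flatliner rule below are crude illustrations, not Pollfish's actual quality checks.

```python
import pandas as pd

# Hypothetical raw responses (note the duplicate submission and the gibberish comment).
raw = pd.DataFrame({
    "respondent_id": [1, 1, 2, 3, 4],
    "q1": [4, 4, 5, 3, 3],
    "q2": [3, 3, 5, 3, 4],
    "q3": [4, 4, 5, 3, 2],
    "comment": ["Great", "Great", "asdfgh", "Useful", "Too long"],
})

# Clean: drop duplicate submissions and obvious gibberish comments.
clean = raw.drop_duplicates(subset="respondent_id")
clean = clean[~clean["comment"].str.fullmatch(r"[a-z]{5,}")]

# Drop flatliners: respondents who gave the same rating to every question.
rating_cols = ["q1", "q2", "q3"]
clean = clean[clean[rating_cols].nunique(axis=1) > 1]

# Organize: summarize each question so it is ready for charts and analysis.
print(clean[rating_cols].describe())
```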

Step 7: Analyze and present survey results

The data you collected during your survey can be presented and analyzed in many different ways, so make sure to go back to the survey goal that we covered in Step 1. 

Analyzing survey results and writing a report often go hand in hand, so it’s a good practice to go back and forth between the two until you fully narrow down your findings. 

Here are some questions that will help you write a better report: 

  • Did you achieve your survey goals?
  • How can you organize your findings into cohesive narratives?
  • What are the main insights that you gathered?
  • How can you use the collected data in the future?
  • Are there other ways this data can be interpreted? 

Keep your margin of error in mind during your survey analysis. This measurement points to the degree of error in the results of a survey, specifically one that relies on the random sampling method.

It is imperative to keep the margin of error low, as a high margin of error means the survey results are less likely to reflect the true views of your survey target audience. As such, a higher margin of error renders your survey less reliable and its findings inconclusive.
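For a simple random sample, the margin of error can be approximated with a back-of-the-envelope formula. The sketch below assumes a 95% confidence level and the worst-case proportion of 0.5, and ignores design effects that panel or quota samples may require.

```python
import math

def margin_of_error(sample_size, p=0.5, z=1.96):
    """Approximate margin of error for a simple random sample."""
    return z * math.sqrt(p * (1 - p) / sample_size)

print(f"{margin_of_error(400):.1%}")  # ~4.9% with 400 respondents
print(f"{margin_of_error(146):.1%}")  # ~8.1% with 146 respondents
```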

If you are presenting a report to others, remember that different audiences may be interested in different aspects of your survey.

If your audience is primarily business stakeholders, the main focus should be concrete customer preferences or aversions, along with actionable suggestions.

If you are presenting a survey to other researchers, they will be more interested in the technical aspects of your survey such as target audience, sample size, and data analysis method.   

Make sure to consolidate your survey data analysis into one document. The document should be divided into the themes, patterns and other central areas of focus for which you’ve collected and analyzed data to draw different conclusions.

It will be this data — not the raw data in your dashboard — that will guide your business decisions, changes and all other courses of action. 

Step 8: Take action

In this step, you’re going to consult the information you’ve gathered and analyzed in Steps 6 and 7. You’ll need to create a document of your findings, one that exists outside your dashboard and is central to your survey analysis.

This document should cover central findings, along with key granular ones. It should also answer some of the key concerns you had in Step 1, along with the questions designed for your respondents themselves. Do your results and analysis answer all the inquiries and curiosities you had about the topic at hand? If so, it is time to take action . If not, then you should create another survey , one that focuses on the things that are left unanswered, or anything you need more information on.  


We recommend using an online survey provider that offers the random device engagement method, which, as aforementioned, is a kind of organic sampling that uses digital properties to query respondents where they visit organically.   

If, however, you have all the insights you need, it is time to take data-informed action. There are many ways to take action on any given topic. The following list enumerates various ways to act on your survey data and analysis:

  • The establishment of something (ad campaigns, marketing strategy, pricing, a slogan, etc)
  • Changing something already in existence (ads, videos, promotions, pricing, etc)
  • Scrapping aspects of an ad, marketing, sales or any other business campaign or activity
  • Terminating an action or campaign entirely
  • The formation of slightly different approaches based on different market segments

All in all, after you’ve followed these steps, you will be much closer to your original goal, whether it is solely to have invaluable customer/subject data, or to use that data to make immediate or long-term decisions. 

Therein lies the power of surveys: they grant you knowledge you can use for a host of decision-making.

Making every survey count

Every business has a slew of questions about its industry, competitors and customers. As such, businesses must use market research to crack these challenges and properly serve their target markets. 

Conducting a survey is at the forefront of conducting this kind of research, as it grants you firsthand insights, tailored specifically to your target market, with your most requisite questions.

The challenge in conducting a survey is manifold: finding a survey solution to easily distribute your questions to the right audience, creating a survey with the proper questions, distributing the survey in the right channels, consolidating your data and more.

Following our eight steps will help you conduct meaningful and unbiased surveys to answer your most demanding questions. However, adhering to this process is not enough . 

You’ll need to find a potent online survey platform to facilitate your entire survey process , from targeting, to questionnaire building, filtering data and more. 

Ideally, it should provide various quality and technical checks to ward off survey fraud, offer a mobile-first survey environment and allow you to survey anyone, not just respondents reached through its own network via the RDE method (although this method is incredibly effective).  

It should allow you to survey specific people, such as via email, or whichever digital channel you seek to use. Luckily, there’s the Distribution Link feature , which enables you to do just that. 

Frequently asked questions

What is the first step in planning a successful survey?

Before writing questions or recruiting participants, you should establish the goals of your survey. By understanding goals, you can ensure your survey stays focused and will answer your most important questions.

Why are surveys used?

Surveys are one of the best ways to gather information about your customers or target audience. As opposed to simply researching an industry or trend, surveys let you ask specific questions to the people who matter most to your business.

Why is it important to define the target audience for your survey?

A more defined audience will lead to deeper, more relevant insights. A carefully defined audience provides more accurate results and ensures the goals of your survey are met.

How can online surveys be distributed?

Online surveys can be distributed via email, social media, or a professional survey platform.

How many people should take an online survey?

The number of respondents needed will vary from one survey to the next. The important part is that the sample size accurately represents the target audience. For most studies, a sample size of 200 - 400 is a good goal.



Writing Survey Questions

Perhaps the most important part of the survey process is the creation of questions that accurately measure the opinions, experiences and behaviors of the public. Accurate random sampling will be wasted if the information gathered is built on a shaky foundation of ambiguous or biased questions. Creating good measures involves both writing good questions and organizing them to form the questionnaire.

Questionnaire design is a multistage process that requires attention to many details at once. Designing the questionnaire is complicated because surveys can ask about topics in varying degrees of detail, questions can be asked in different ways, and questions asked earlier in a survey may influence how people respond to later questions. Researchers are also often interested in measuring change over time and therefore must be attentive to how opinions or behaviors have been measured in prior surveys.

Surveyors may conduct pilot tests or focus groups in the early stages of questionnaire development in order to better understand how people think about an issue or comprehend a question. Pretesting a survey is an essential step in the questionnaire design process to evaluate how people respond to the overall questionnaire and specific questions, especially when questions are being introduced for the first time.

For many years, surveyors approached questionnaire design as an art, but substantial research over the past forty years has demonstrated that there is a lot of science involved in crafting a good survey questionnaire. Here, we discuss the pitfalls and best practices of designing questionnaires.

Question development

There are several steps involved in developing a survey questionnaire. The first is identifying what topics will be covered in the survey. For Pew Research Center surveys, this involves thinking about what is happening in our nation and the world and what will be relevant to the public, policymakers and the media. We also track opinion on a variety of issues over time so we often ensure that we update these trends on a regular basis to better understand whether people’s opinions are changing.

At Pew Research Center, questionnaire development is a collaborative and iterative process where staff meet to discuss drafts of the questionnaire several times over the course of its development. We frequently test new survey questions ahead of time through qualitative research methods such as  focus groups , cognitive interviews, pretesting (often using an  online, opt-in sample ), or a combination of these approaches. Researchers use insights from this testing to refine questions before they are asked in a production survey, such as on the ATP.

Measuring change over time

Many surveyors want to track changes over time in people’s attitudes, opinions and behaviors. To measure change, questions are asked at two or more points in time. A cross-sectional design surveys different people in the same population at multiple points in time. A panel, such as the ATP, surveys the same people over time. However, it is common for the set of people in survey panels to change over time as new panelists are added and some prior panelists drop out. Many of the questions in Pew Research Center surveys have been asked in prior polls. Asking the same questions at different points in time allows us to report on changes in the overall views of the general public (or a subset of the public, such as registered voters, men or Black Americans), or what we call “trending the data”.

When measuring change over time, it is important to use the same question wording and to be sensitive to where the question is asked in the questionnaire to maintain a similar context as when the question was asked previously (see  question wording  and  question order  for further information). All of our survey reports include a topline questionnaire that provides the exact question wording and sequencing, along with results from the current survey and previous surveys in which we asked the question.

The Center’s transition from conducting U.S. surveys by live telephone interviewing to an online panel (around 2014 to 2020) complicated some opinion trends, but not others. Opinion trends that ask about sensitive topics (e.g., personal finances or attending religious services ) or that elicited volunteered answers (e.g., “neither” or “don’t know”) over the phone tended to show larger differences than other trends when shifting from phone polls to the online ATP. The Center adopted several strategies for coping with changes to data trends that may be related to this change in methodology. If there is evidence suggesting that a change in a trend stems from switching from phone to online measurement, Center reports flag that possibility for readers to try to head off confusion or erroneous conclusions.

Open- and closed-ended questions

One of the most significant decisions that can affect how people answer questions is whether the question is posed as an open-ended question, where respondents provide a response in their own words, or a closed-ended question, where they are asked to choose from a list of answer choices.

For example, in a poll conducted after the 2008 presidential election, people responded very differently to two versions of the question: “What one issue mattered most to you in deciding how you voted for president?” One was closed-ended and the other open-ended. In the closed-ended version, respondents were provided five options and could volunteer an option not on the list.

When explicitly offered the economy as a response, more than half of respondents (58%) chose this answer; only 35% of those who responded to the open-ended version volunteered the economy. Moreover, among those asked the closed-ended version, fewer than one-in-ten (8%) provided a response other than the five they were read. By contrast, fully 43% of those asked the open-ended version provided a response not listed in the closed-ended version of the question. All of the other issues were chosen at least slightly more often when explicitly offered in the closed-ended version than in the open-ended version. (Also see  “High Marks for the Campaign, a High Bar for Obama”  for more information.)


Researchers will sometimes conduct a pilot study using open-ended questions to discover which answers are most common. They will then develop closed-ended questions based off that pilot study that include the most common responses as answer choices. In this way, the questions may better reflect what the public is thinking, how they view a particular issue, or bring certain issues to light that the researchers may not have been aware of.

When asking closed-ended questions, the choice of options provided, how each option is described, the number of response options offered, and the order in which options are read can all influence how people respond. One example of the impact of how categories are defined can be found in a Pew Research Center poll conducted in January 2002. When half of the sample was asked whether it was “more important for President Bush to focus on domestic policy or foreign policy,” 52% chose domestic policy while only 34% said foreign policy. When the category “foreign policy” was narrowed to a specific aspect – “the war on terrorism” – far more people chose it; only 33% chose domestic policy while 52% chose the war on terrorism.

In most circumstances, the number of answer choices should be kept to a relatively small number – just four or perhaps five at most – especially in telephone surveys. Psychological research indicates that people have a hard time keeping more than this number of choices in mind at one time. When the question is asking about an objective fact and/or demographics, such as the religious affiliation of the respondent, more categories can be used. In fact, they are encouraged to ensure inclusivity. For example, Pew Research Center’s standard religion questions include more than 12 different categories, beginning with the most common affiliations (Protestant and Catholic). Most respondents have no trouble with this question because they can expect to see their religious group within that list in a self-administered survey.

In addition to the number and choice of response options offered, the order of answer categories can influence how people respond to closed-ended questions. Research suggests that in telephone surveys respondents more frequently choose items heard later in a list (a “recency effect”), and in self-administered surveys, they tend to choose items at the top of the list (a “primacy” effect).

Because of concerns about the effects of category order on responses to closed-ended questions, many sets of response options in Pew Research Center’s surveys are programmed to be randomized to ensure that the options are not asked in the same order for each respondent. Rotating or randomizing means that questions or items in a list are not asked in the same order to each respondent. Answers to questions are sometimes affected by questions that precede them. By presenting questions in a different order to each respondent, we ensure that each question gets asked in the same context as every other question the same number of times (e.g., first, last or any position in between). This does not eliminate the potential impact of previous questions on the current question, but it does ensure that this bias is spread randomly across all of the questions or items in the list. For instance, in the example discussed above about what issue mattered most in people’s vote, the order of the five issues in the closed-ended version of the question was randomized so that no one issue appeared early or late in the list for all respondents. Randomization of response items does not eliminate order effects, but it does ensure that this type of bias is spread randomly.

Questions with ordinal response categories – those with an underlying order (e.g., excellent, good, only fair, poor OR very favorable, mostly favorable, mostly unfavorable, very unfavorable) – are generally not randomized because the order of the categories conveys important information to help respondents answer the question. Generally, these types of scales should be presented in order so respondents can easily place their responses along the continuum, but the order can be reversed for some respondents. For example, in one of Pew Research Center’s questions about abortion, half of the sample is asked whether abortion should be “legal in all cases, legal in most cases, illegal in most cases, illegal in all cases,” while the other half of the sample is asked the same question with the response categories read in reverse order, starting with “illegal in all cases.” Again, reversing the order does not eliminate the recency effect but distributes it randomly across the population.
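Here is a small sketch of the two practices described above: shuffling unordered answer options separately for each respondent, and reversing (rather than shuffling) an ordinal scale for half the sample. This is an illustration only, not Pew Research Center's production tooling, and the option lists are illustrative stand-ins.

```python
import random

issue_options = ["The economy", "The war in Iraq", "Health care",
                 "Terrorism", "Energy policy"]  # illustrative, unordered options: safe to shuffle
abortion_scale = ["Legal in all cases", "Legal in most cases",
                  "Illegal in most cases", "Illegal in all cases"]  # ordinal: keep the order

def options_for_respondent(respondent_index):
    shuffled = random.sample(issue_options, k=len(issue_options))  # fresh order per respondent
    # Reverse the ordinal scale for half the sample instead of shuffling it.
    scale = abortion_scale if respondent_index % 2 == 0 else list(reversed(abortion_scale))
    return shuffled, scale

print(options_for_respondent(0))
print(options_for_respondent(1))
```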

Question wording

The choice of words and phrases in a question is critical in expressing the meaning and intent of the question to the respondent and ensuring that all respondents interpret the question the same way. Even small wording differences can substantially affect the answers people provide.


An example of a wording difference that had a significant impact on responses comes from a January 2003 Pew Research Center survey. When people were asked whether they would “favor or oppose taking military action in Iraq to end Saddam Hussein’s rule,” 68% said they favored military action while 25% said they opposed military action. However, when asked whether they would “favor or oppose taking military action in Iraq to end Saddam Hussein’s rule  even if it meant that U.S. forces might suffer thousands of casualties, ” responses were dramatically different; only 43% said they favored military action, while 48% said they opposed it. The introduction of U.S. casualties altered the context of the question and influenced whether people favored or opposed military action in Iraq.

There has been a substantial amount of research to gauge the impact of different ways of asking questions and how to minimize differences in the way respondents interpret what is being asked. The issues related to question wording are more numerous than can be treated adequately in this short space, but below are a few of the important things to consider:

First, it is important to ask questions that are clear and specific and that each respondent will be able to answer. If a question is open-ended, it should be evident to respondents that they can answer in their own words and what type of response they should provide (an issue or problem, a month, number of days, etc.). Closed-ended questions should include all reasonable responses (i.e., the list of options is exhaustive) and the response categories should not overlap (i.e., response options should be mutually exclusive). Further, it is important to discern when it is best to use forced-choice close-ended questions (often denoted with a radio button in online surveys) versus “select-all-that-apply” lists (or check-all boxes). A 2019 Center study found that forced-choice questions tend to yield more accurate responses, especially for sensitive questions.  Based on that research, the Center generally avoids using select-all-that-apply questions.

It is also important to ask only one question at a time. Questions that ask respondents to evaluate more than one concept (known as double-barreled questions) – such as “How much confidence do you have in President Obama to handle domestic and foreign policy?” – are difficult for respondents to answer and often lead to responses that are difficult to interpret. In this example, it would be more effective to ask two separate questions, one about domestic policy and another about foreign policy.

In general, questions that use simple and concrete language are more easily understood by respondents. It is especially important to consider the education level of the survey population when thinking about how easy it will be for respondents to interpret and answer a question. Double negatives (e.g., do you favor or oppose  not  allowing gays and lesbians to legally marry) or unfamiliar abbreviations or jargon (e.g., ANWR instead of Arctic National Wildlife Refuge) can result in respondent confusion and should be avoided.

Similarly, it is important to consider whether certain words may be viewed as biased or potentially offensive to some respondents, as well as the emotional reaction that some words may provoke. For example, in a 2005 Pew Research Center survey, 51% of respondents said they favored “making it legal for doctors to give terminally ill patients the means to end their lives,” but only 44% said they favored “making it legal for doctors to assist terminally ill patients in committing suicide.” Although both versions of the question are asking about the same thing, the reaction of respondents was different. In another example, respondents have reacted differently to questions using the word “welfare” as opposed to the more generic “assistance to the poor.” Several experiments have shown that there is much greater public support for expanding “assistance to the poor” than for expanding “welfare.”

We often write two versions of a question and ask half of the survey sample one version of the question and the other half the second version. Thus, we say we have two  forms  of the questionnaire. Respondents are assigned randomly to receive either form, so we can assume that the two groups of respondents are essentially identical. On questions where two versions are used, significant differences in the answers between the two forms tell us that the difference is a result of the way we worded the two versions.
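
To make the split-form idea concrete, here is a minimal Python sketch (an illustration under assumed numbers, not Pew Research Center's actual procedure): respondents are randomly assigned to receive form A or form B, and after fieldwork a chi-square test checks whether the distribution of answers differs between the two forms. The respondent IDs and answer counts are invented.

```python
import random
from scipy.stats import chi2_contingency

# Hypothetical respondent IDs; in practice these come from the sample file.
respondents = list(range(1000))

random.seed(0)
random.shuffle(respondents)
form_a = set(respondents[:500])   # half the sample receives form A
form_b = set(respondents[500:])   # the other half receives form B

# After fieldwork, compare the answer distributions between forms.
# The counts below are invented purely to illustrate the test.
#                 favor  oppose
answer_counts = [[340,   160],    # form A wording
                 [280,   220]]    # form B wording
chi2, p_value, dof, expected = chi2_contingency(answer_counts)
print(f"chi-square = {chi2:.1f}, p = {p_value:.4f}")
```

A small p-value would suggest that the two wordings produce genuinely different answers, because random assignment makes the two groups comparable in every other respect.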


One of the most common formats used in survey questions is the “agree-disagree” format. In this type of question, respondents are asked whether they agree or disagree with a particular statement. Research has shown that, compared with the better educated and better informed, less educated and less informed respondents have a greater tendency to agree with such statements. This is sometimes called an “acquiescence bias” (since some kinds of respondents are more likely to acquiesce to the assertion than are others). This behavior is even more pronounced when an interviewer is present than when the survey is self-administered. A better practice is to offer respondents a choice between alternative statements. A Pew Research Center experiment with one of its routinely asked values questions illustrates the difference that question format can make. Not only does the forced choice format yield a very different result overall from the agree-disagree format, but the pattern of answers between respondents with more or less formal education also tends to be very different.

One other challenge in developing questionnaires is what is called “social desirability bias.” People have a natural tendency to want to be accepted and liked, and this may lead people to provide inaccurate answers to questions that deal with sensitive subjects. Research has shown that respondents understate alcohol and drug use, tax evasion and racial bias. They also may overstate church attendance, charitable contributions and the likelihood that they will vote in an election. Researchers attempt to account for this potential bias in crafting questions about these topics. For instance, when Pew Research Center surveys ask about past voting behavior, it is important to note that circumstances may have prevented the respondent from voting: “In the 2012 presidential election between Barack Obama and Mitt Romney, did things come up that kept you from voting, or did you happen to vote?” The choice of response options can also make it easier for people to be honest. For example, a question about church attendance might include three of six response options that indicate infrequent attendance. Research has also shown that social desirability bias can be greater when an interviewer is present (e.g., telephone and face-to-face surveys) than when respondents complete the survey themselves (e.g., paper and web surveys).

Lastly, because slight modifications in question wording can affect responses, identical question wording should be used when the intention is to compare results to those from earlier surveys. Similarly, because question wording and responses can vary based on the mode used to survey respondents, researchers should carefully evaluate the likely effects on trend measurements if a different survey mode will be used to assess change in opinion over time.

Question order

Once the survey questions are developed, particular attention should be paid to how they are ordered in the questionnaire. Surveyors must be attentive to how questions early in a questionnaire may have unintended effects on how respondents answer subsequent questions. Researchers have demonstrated that the order in which questions are asked can influence how people respond; earlier questions can unintentionally provide context for the questions that follow (these effects are called “order effects”).

One kind of order effect can be seen in responses to open-ended questions. Pew Research Center surveys generally ask open-ended questions about national problems, opinions about leaders and similar topics near the beginning of the questionnaire. If closed-ended questions that relate to the topic are placed before the open-ended question, respondents are much more likely to mention concepts or considerations raised in those earlier questions when responding to the open-ended question.

For closed-ended opinion questions, there are two main types of order effects: contrast effects (where the order results in greater differences in responses) and assimilation effects (where responses are more similar as a result of their order).


An example of a contrast effect can be seen in a Pew Research Center poll conducted in October 2003, a dozen years before same-sex marriage was legalized in the U.S. That poll found that people were more likely to favor allowing gays and lesbians to enter into legal agreements that give them the same rights as married couples when this question was asked after one about whether they favored or opposed allowing gays and lesbians to marry (45% favored legal agreements when asked after the marriage question, but 37% favored legal agreements without the immediate preceding context of a question about same-sex marriage). Responses to the question about same-sex marriage, meanwhile, were not significantly affected by its placement before or after the legal agreements question.


Another experiment embedded in a December 2008 Pew Research Center poll also resulted in a contrast effect. When people were asked “All in all, are you satisfied or dissatisfied with the way things are going in this country today?” immediately after having been asked “Do you approve or disapprove of the way George W. Bush is handling his job as president?”, 88% said they were dissatisfied, compared with only 78% without the context of the prior question.

Responses to presidential approval remained relatively unchanged whether national satisfaction was asked before or after it. A similar finding occurred in December 2004 when both satisfaction and presidential approval were much higher (57% were dissatisfied when Bush approval was asked first vs. 51% when general satisfaction was asked first).

Several studies also have shown that asking a more specific question before a more general question (e.g., asking about happiness with one’s marriage before asking about one’s overall happiness) can result in a contrast effect. Although some exceptions have been found, people tend to avoid redundancy by excluding the more specific question from the general rating.

Assimilation effects occur when responses to two questions are more consistent or closer together because of their placement in the questionnaire. We found an example of an assimilation effect in a Pew Research Center poll conducted in November 2008 when we asked whether Republican leaders should work with Obama or stand up to him on important issues and whether Democratic leaders should work with Republican leaders or stand up to them on important issues. People were more likely to say that Republican leaders should work with Obama when the question was preceded by the one asking what Democratic leaders should do in working with Republican leaders (81% vs. 66%). However, when people were first asked about Republican leaders working with Obama, fewer said that Democratic leaders should work with Republican leaders (71% vs. 82%).

The order in which questions are asked is of particular importance when tracking trends over time. As a result, care should be taken to ensure that the context is similar each time a question is asked. Modifying the context of the question could call into question any observed changes over time (see  measuring change over time  for more information).

A questionnaire, like a conversation, should be grouped by topic and unfold in a logical order. It is often helpful to begin the survey with simple questions that respondents will find interesting and engaging. Throughout the survey, an effort should be made to keep the survey interesting and not overburden respondents with several difficult questions right after one another. Demographic questions such as income, education or age should not be asked near the beginning of a survey unless they are needed to determine eligibility for the survey or for routing respondents through particular sections of the questionnaire. Even then, it is best to precede such items with more interesting and engaging questions. One virtue of survey panels like the ATP is that demographic questions usually only need to be asked once a year, not in each survey.


Survey Research – Types, Methods, Examples


Definition:

Survey Research is a quantitative research method that involves collecting standardized data from a sample of individuals or groups through the use of structured questionnaires or interviews. The data collected is then analyzed statistically to identify patterns and relationships between variables, and to draw conclusions about the population being studied.

Survey research can be used to answer a variety of questions, including:

  • What are people’s opinions about a certain topic?
  • What are people’s experiences with a certain product or service?
  • What are people’s beliefs about a certain issue?

Survey Research Methods

Survey Research Methods are as follows:

  • Telephone surveys: A survey research method where questions are administered to respondents over the phone, often used in market research or political polling.
  • Face-to-face surveys: A survey research method where questions are administered to respondents in person, often used in social or health research.
  • Mail surveys: A survey research method where questionnaires are sent to respondents through mail, often used in customer satisfaction or opinion surveys.
  • Online surveys: A survey research method where questions are administered to respondents through online platforms, often used in market research or customer feedback.
  • Email surveys: A survey research method where questionnaires are sent to respondents through email, often used in customer satisfaction or opinion surveys.
  • Mixed-mode surveys: A survey research method that combines two or more survey modes, often used to increase response rates or reach diverse populations.
  • Computer-assisted surveys: A survey research method that uses computer technology to administer or collect survey data, often used in large-scale surveys or data collection.
  • Interactive voice response surveys: A survey research method where respondents answer questions through a touch-tone telephone system, often used in automated customer satisfaction or opinion surveys.
  • Mobile surveys: A survey research method where questions are administered to respondents through mobile devices, often used in market research or customer feedback.
  • Group-administered surveys: A survey research method where questions are administered to a group of respondents simultaneously, often used in education or training evaluation.
  • Web-intercept surveys: A survey research method where questions are administered to website visitors, often used in website or user experience research.
  • In-app surveys: A survey research method where questions are administered to users of a mobile application, often used in mobile app or user experience research.
  • Social media surveys: A survey research method where questions are administered to respondents through social media platforms, often used in social media or brand awareness research.
  • SMS surveys: A survey research method where questions are administered to respondents through text messaging, often used in customer feedback or opinion surveys.
  • IVR surveys: A survey research method where questions are administered to respondents through an interactive voice response system, often used in automated customer feedback or opinion surveys.
  • Mixed-method surveys: A survey research method that combines both qualitative and quantitative data collection methods, often used in exploratory or mixed-method research.
  • Drop-off surveys: A survey research method where respondents are provided with a survey questionnaire and asked to return it at a later time or through a designated drop-off location.
  • Intercept surveys: A survey research method where respondents are approached in public places and asked to participate in a survey, often used in market research or customer feedback.
  • Hybrid surveys: A survey research method that combines two or more survey modes, data sources, or research methods, often used in complex or multi-dimensional research questions.

Types of Survey Research

There are several types of survey research that can be used to collect data from a sample of individuals or groups. The following are common types of survey research:

  • Cross-sectional survey: A type of survey research that gathers data from a sample of individuals at a specific point in time, providing a snapshot of the population being studied.
  • Longitudinal survey: A type of survey research that gathers data from the same sample of individuals over an extended period of time, allowing researchers to track changes or trends in the population being studied.
  • Panel survey: A type of longitudinal survey research that tracks the same sample of individuals over time, typically collecting data at multiple points in time.
  • Epidemiological survey: A type of survey research that studies the distribution and determinants of health and disease in a population, often used to identify risk factors and inform public health interventions.
  • Observational survey: A type of survey research that collects data through direct observation of individuals or groups, often used in behavioral or social research.
  • Correlational survey: A type of survey research that measures the degree of association or relationship between two or more variables, often used to identify patterns or trends in data.
  • Experimental survey: A type of survey research that involves manipulating one or more variables to observe the effect on an outcome, often used to test causal hypotheses.
  • Descriptive survey: A type of survey research that describes the characteristics or attributes of a population or phenomenon, often used in exploratory research or to summarize existing data.
  • Diagnostic survey: A type of survey research that assesses the current state or condition of an individual or system, often used in health or organizational research.
  • Explanatory survey: A type of survey research that seeks to explain or understand the causes or mechanisms behind a phenomenon, often used in social or psychological research.
  • Process evaluation survey: A type of survey research that measures the implementation and outcomes of a program or intervention, often used in program evaluation or quality improvement.
  • Impact evaluation survey: A type of survey research that assesses the effectiveness or impact of a program or intervention, often used to inform policy or decision-making.
  • Customer satisfaction survey: A type of survey research that measures the satisfaction or dissatisfaction of customers with a product, service, or experience, often used in marketing or customer service research.
  • Market research survey: A type of survey research that collects data on consumer preferences, behaviors, or attitudes, often used in market research or product development.
  • Public opinion survey: A type of survey research that measures the attitudes, beliefs, or opinions of a population on a specific issue or topic, often used in political or social research.
  • Behavioral survey: A type of survey research that measures actual behavior or actions of individuals, often used in health or social research.
  • Attitude survey: A type of survey research that measures the attitudes, beliefs, or opinions of individuals, often used in social or psychological research.
  • Opinion poll: A type of survey research that measures the opinions or preferences of a population on a specific issue or topic, often used in political or media research.
  • Ad hoc survey: A type of survey research that is conducted for a specific purpose or research question, often used in exploratory research or to answer a specific research question.

Types Based on Methodology

Based on methodology, surveys are divided into two types:

Quantitative Survey Research

Qualitative Survey Research

Quantitative survey research is a method of collecting numerical data from a sample of participants through the use of standardized surveys or questionnaires. The purpose of quantitative survey research is to gather empirical evidence that can be analyzed statistically to draw conclusions about a particular population or phenomenon.

In quantitative survey research, the questions are structured and pre-determined, often utilizing closed-ended questions, where participants are given a limited set of response options to choose from. This approach allows for efficient data collection and analysis, as well as the ability to generalize the findings to a larger population.

Quantitative survey research is often used in market research, social sciences, public health, and other fields where numerical data is needed to make informed decisions and recommendations.

Qualitative survey research is a method of collecting non-numerical data from a sample of participants through the use of open-ended questions or semi-structured interviews. The purpose of qualitative survey research is to gain a deeper understanding of the experiences, perceptions, and attitudes of participants towards a particular phenomenon or topic.

In qualitative survey research, the questions are open-ended, allowing participants to share their thoughts and experiences in their own words. This approach allows for a rich and nuanced understanding of the topic being studied, and can provide insights that are difficult to capture through quantitative methods alone.

Qualitative survey research is often used in social sciences, education, psychology, and other fields where a deeper understanding of human experiences and perceptions is needed to inform policy, practice, or theory.

Data Analysis Methods

There are several Survey Research Data Analysis Methods that researchers may use, including:

  • Descriptive statistics: This method is used to summarize and describe the basic features of the survey data, such as the mean, median, mode, and standard deviation. These statistics can help researchers understand the distribution of responses and identify any trends or patterns (a brief illustrative sketch follows this list).
  • Inferential statistics: This method is used to make inferences about the larger population based on the data collected in the survey. Common inferential statistical methods include hypothesis testing, regression analysis, and correlation analysis.
  • Factor analysis: This method is used to identify underlying factors or dimensions in the survey data. This can help researchers simplify the data and identify patterns and relationships that may not be immediately apparent.
  • Cluster analysis: This method is used to group similar respondents together based on their survey responses. This can help researchers identify subgroups within the larger population and understand how different groups may differ in their attitudes, behaviors, or preferences.
  • Structural equation modeling: This method is used to test complex relationships between variables in the survey data. It can help researchers understand how different variables may be related to one another and how they may influence one another.
  • Content analysis: This method is used to analyze open-ended responses in the survey data. Researchers may use software to identify themes or categories in the responses, or they may manually review and code the responses.
  • Text mining: This method is used to analyze text-based survey data, such as responses to open-ended questions. Researchers may use software to identify patterns and themes in the text, or they may manually review and code the text.
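
As a minimal illustration of the first two methods in the list above, the following Python sketch summarises a rating item and runs a simple between-group test. The file name responses.csv and the columns satisfaction and region are invented for this example.

```python
import pandas as pd
from scipy import stats

# Load survey responses; the file and column names are hypothetical.
df = pd.read_csv("responses.csv")

# Descriptive statistics: distribution of a 1-5 satisfaction rating.
print(df["satisfaction"].describe())                     # mean, std, quartiles
print(df["satisfaction"].value_counts(normalize=True))   # frequency distribution

# Inferential statistics: does mean satisfaction differ between two regions?
north = df.loc[df["region"] == "north", "satisfaction"].dropna()
south = df.loc[df["region"] == "south", "satisfaction"].dropna()
t_stat, p_value = stats.ttest_ind(north, south, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```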

Applications of Survey Research

Here are some common applications of survey research:

  • Market Research: Companies use survey research to gather insights about customer needs, preferences, and behavior. These insights are used to create marketing strategies and develop new products.
  • Public Opinion Research: Governments and political parties use survey research to understand public opinion on various issues. This information is used to develop policies and make decisions.
  • Social Research: Survey research is used in social research to study social trends, attitudes, and behavior. Researchers use survey data to explore topics such as education, health, and social inequality.
  • Academic Research: Survey research is used in academic research to study various phenomena. Researchers use survey data to test theories, explore relationships between variables, and draw conclusions.
  • Customer Satisfaction Research: Companies use survey research to gather information about customer satisfaction with their products and services. This information is used to improve customer experience and retention.
  • Employee Surveys: Employers use survey research to gather feedback from employees about their job satisfaction, working conditions, and organizational culture. This information is used to improve employee retention and productivity.
  • Health Research: Survey research is used in health research to study topics such as disease prevalence, health behaviors, and healthcare access. Researchers use survey data to develop interventions and improve healthcare outcomes.

Examples of Survey Research

Here are some real-time examples of survey research:

  • COVID-19 Pandemic Surveys: Since the outbreak of the COVID-19 pandemic, surveys have been conducted to gather information about public attitudes, behaviors, and perceptions related to the pandemic. Governments and healthcare organizations have used this data to develop public health strategies and messaging.
  • Political Polls During Elections: During election seasons, surveys are used to measure public opinion on political candidates, policies, and issues in real-time. This information is used by political parties to develop campaign strategies and make decisions.
  • Customer Feedback Surveys: Companies often use real-time customer feedback surveys to gather insights about customer experience and satisfaction. This information is used to improve products and services quickly.
  • Event Surveys: Organizers of events such as conferences and trade shows often use surveys to gather feedback from attendees in real-time. This information can be used to improve future events and make adjustments during the current event.
  • Website and App Surveys: Website and app owners use surveys to gather real-time feedback from users about the functionality, user experience, and overall satisfaction with their platforms. This feedback can be used to improve the user experience and retain customers.
  • Employee Pulse Surveys: Employers use real-time pulse surveys to gather feedback from employees about their work experience and overall job satisfaction. This feedback is used to make changes in real-time to improve employee retention and productivity.

Purpose of Survey Research

The purpose of survey research is to gather data and insights from a representative sample of individuals. Survey research allows researchers to collect data quickly and efficiently from a large number of people, making it a valuable tool for understanding attitudes, behaviors, and preferences.

Here are some common purposes of survey research:

  • Descriptive Research: Survey research is often used to describe characteristics of a population or a phenomenon. For example, a survey could be used to describe the characteristics of a particular demographic group, such as age, gender, or income.
  • Exploratory Research: Survey research can be used to explore new topics or areas of research. Exploratory surveys are often used to generate hypotheses or identify potential relationships between variables.
  • Explanatory Research: Survey research can be used to explain relationships between variables. For example, a survey could be used to determine whether there is a relationship between educational attainment and income.
  • Evaluation Research: Survey research can be used to evaluate the effectiveness of a program or intervention. For example, a survey could be used to evaluate the impact of a health education program on behavior change.
  • Monitoring Research: Survey research can be used to monitor trends or changes over time. For example, a survey could be used to monitor changes in attitudes towards climate change or political candidates over time.

When to use Survey Research

There are certain circumstances where survey research is particularly appropriate. Here are some situations where it may be useful:

  • When the research question involves attitudes, beliefs, or opinions: Survey research is particularly useful for understanding attitudes, beliefs, and opinions on a particular topic. For example, a survey could be used to understand public opinion on a political issue.
  • When the research question involves behaviors or experiences: Survey research can also be useful for understanding behaviors and experiences. For example, a survey could be used to understand the prevalence of a particular health behavior.
  • When a large sample size is needed: Survey research allows researchers to collect data from a large number of people quickly and efficiently. This makes it a useful method when a large sample size is needed to ensure statistical validity.
  • When the research question is time-sensitive: Survey research can be conducted quickly, which makes it a useful method when the research question is time-sensitive. For example, a survey could be used to understand public opinion on a breaking news story.
  • When the research question involves a geographically dispersed population: Survey research can be conducted online, which makes it a useful method when the population of interest is geographically dispersed.

How to Conduct Survey Research

Conducting survey research involves several steps that need to be carefully planned and executed. Here is a general overview of the process:

  • Define the research question: The first step in conducting survey research is to clearly define the research question. The research question should be specific, measurable, and relevant to the population of interest.
  • Develop a survey instrument : The next step is to develop a survey instrument. This can be done using various methods, such as online survey tools or paper surveys. The survey instrument should be designed to elicit the information needed to answer the research question, and should be pre-tested with a small sample of individuals.
  • Select a sample: The sample is the group of individuals who will be invited to participate in the survey. The sample should be representative of the population of interest, and the size of the sample should be sufficient to ensure statistical validity (a brief sampling sketch follows this list).
  • Administer the survey: The survey can be administered in various ways, such as online, by mail, or in person. The method of administration should be chosen based on the population of interest and the research question.
  • Analyze the data: Once the survey data is collected, it needs to be analyzed. This involves summarizing the data using statistical methods, such as frequency distributions or regression analysis.
  • Draw conclusions: The final step is to draw conclusions based on the data analysis. This involves interpreting the results and answering the research question.
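
To make the sampling step above concrete, here is a minimal Python sketch of drawing a simple random sample from a sampling frame. The frame, the random seed, and the sample size of 400 are invented; in practice the sample size would come from a sample size or margin-of-error calculation.

```python
import random

# Hypothetical sampling frame: contact records for the population of interest.
sampling_frame = [f"respondent_{i}@example.org" for i in range(10_000)]

random.seed(42)     # make the draw reproducible
sample_size = 400   # illustrative; normally from a sample size calculation
sample = random.sample(sampling_frame, sample_size)

print(f"Invited {len(sample)} of {len(sampling_frame)} population members")
```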

Advantages of Survey Research

There are several advantages to using survey research, including:

  • Efficient data collection: Survey research allows researchers to collect data quickly and efficiently from a large number of people. This makes it a useful method for gathering information on a wide range of topics.
  • Standardized data collection: Surveys are typically standardized, which means that all participants receive the same questions in the same order. This ensures that the data collected is consistent and reliable.
  • Cost-effective: Surveys can be conducted online, by mail, or in person, which makes them a cost-effective method of data collection.
  • Anonymity: Participants can remain anonymous when responding to a survey. This can encourage participants to be more honest and open in their responses.
  • Easy comparison: Surveys allow for easy comparison of data between different groups or over time. This makes it possible to identify trends and patterns in the data.
  • Versatility: Surveys can be used to collect data on a wide range of topics, including attitudes, beliefs, behaviors, and preferences.

Limitations of Survey Research

Here are some of the main limitations of survey research:

  • Limited depth: Surveys are typically designed to collect quantitative data, which means that they do not provide much depth or detail about people’s experiences or opinions. This can limit the insights that can be gained from the data.
  • Potential for bias: Surveys can be affected by various biases, including selection bias, response bias, and social desirability bias. These biases can distort the results and make them less accurate.
  • Limited validity: Surveys are only as valid as the questions they ask. If the questions are poorly designed or ambiguous, the results may not accurately reflect the respondents’ attitudes or behaviors.
  • Limited generalizability: Survey results are only generalizable to the population from which the sample was drawn. If the sample is not representative of the population, the results may not be generalizable to the larger population.
  • Limited ability to capture context: Surveys typically do not capture the context in which attitudes or behaviors occur. This can make it difficult to understand the reasons behind the responses.
  • Limited ability to capture complex phenomena: Surveys are not well-suited to capture complex phenomena, such as emotions or the dynamics of interpersonal relationships.

Following is an example of a Survey Sample:

Welcome to our Survey Research Page! We value your opinions and appreciate your participation in this survey. Please answer the questions below as honestly and thoroughly as possible.

1. What is your age?

  • A) Under 18
  • G) 65 or older

2. What is your highest level of education completed?

  • A) Less than high school
  • B) High school or equivalent
  • C) Some college or technical school
  • D) Bachelor’s degree
  • E) Graduate or professional degree

3. What is your current employment status?

  • A) Employed full-time
  • B) Employed part-time
  • C) Self-employed
  • D) Unemployed

4. How often do you use the internet per day?

  •  A) Less than 1 hour
  • B) 1-3 hours
  • C) 3-5 hours
  • D) 5-7 hours
  • E) More than 7 hours

5. How often do you engage in social media per day?

6. Have you ever participated in a survey research study before?

7. If you have participated in a survey research study before, how was your experience?

  • A) Excellent
  • E) Very poor

8. What are some of the topics that you would be interested in participating in a survey research study about?

……………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………….

9. How often would you be willing to participate in survey research studies?

  • A) Once a week
  • B) Once a month
  • C) Once every 6 months
  • D) Once a year

10. Any additional comments or suggestions?

Thank you for taking the time to complete this survey. Your feedback is important to us and will help us improve our survey research efforts.


Survey research

Survey research is a research method involving the use of standardised questionnaires or interviews to collect data about people and their preferences, thoughts, and behaviours in a systematic manner. Although census surveys were conducted as early as Ancient Egypt, survey as a formal research method was pioneered in the 1930–40s by sociologist Paul Lazarsfeld to examine the effects of radio on political opinion formation in the United States. This method has since become a very popular method for quantitative research in the social sciences.

The survey method can be used for descriptive, exploratory, or explanatory research. This method is best suited for studies that have individual people as the unit of analysis. Although other units of analysis, such as groups, organisations or dyads—pairs of organisations, such as buyers and sellers—are also studied using surveys, such studies often use a specific person from each unit as a ‘key informant’ or a ‘proxy’ for that unit. Consequently, such surveys may be subject to respondent bias if the chosen informant does not have adequate knowledge or has a biased opinion about the phenomenon of interest. For instance, Chief Executive Officers may not adequately know employees’ perceptions or teamwork in their own companies, and may therefore be the wrong informant for studies of team dynamics or employee self-esteem.

Survey research has several inherent strengths compared to other research methods. First, surveys are an excellent vehicle for measuring a wide variety of unobservable data, such as people’s preferences (e.g., political orientation), traits (e.g., self-esteem), attitudes (e.g., toward immigrants), beliefs (e.g., about a new law), behaviours (e.g., smoking or drinking habits), or factual information (e.g., income). Second, survey research is also ideally suited for remotely collecting data about a population that is too large to observe directly. A large area—such as an entire country—can be covered by postal, email, or telephone surveys using meticulous sampling to ensure that the population is adequately represented in a small sample. Third, due to their unobtrusive nature and the ability to respond at one’s convenience, questionnaire surveys are preferred by some respondents. Fourth, interviews may be the only way of reaching certain population groups such as the homeless or illegal immigrants for which there is no sampling frame available. Fifth, large sample surveys may allow detection of small effects even while analysing multiple variables, and depending on the survey design, may also allow comparative analysis of population subgroups (i.e., within-group and between-group analysis). Sixth, survey research is more economical in terms of researcher time, effort and cost than other methods such as experimental research and case research. At the same time, survey research also has some unique disadvantages. It is subject to a large number of biases such as non-response bias, sampling bias, social desirability bias, and recall bias, as discussed at the end of this chapter.

Depending on how the data is collected, survey research can be divided into two broad categories: questionnaire surveys (which may be postal, group-administered, or online surveys), and interview surveys (which may be personal, telephone, or focus group interviews). Questionnaires are instruments that are completed in writing by respondents, while interviews are completed by the interviewer based on verbal responses provided by respondents. As discussed below, each type has its own strengths and weaknesses in terms of their costs, coverage of the target population, and researcher’s flexibility in asking questions.

Questionnaire surveys

Invented by Sir Francis Galton, a questionnaire is a research instrument consisting of a set of questions (items) intended to capture responses from respondents in a standardised manner. Questions may be unstructured or structured. Unstructured questions ask respondents to provide a response in their own words, while structured questions ask respondents to select an answer from a given set of choices. Subjects’ responses to individual questions (items) on a structured questionnaire may be aggregated into a composite scale or index for statistical analysis. Questions should be designed in such a way that respondents are able to read, understand, and respond to them in a meaningful way, and hence the survey method may not be appropriate or practical for certain demographic groups such as children or the illiterate.
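
As a brief sketch of aggregating item responses into a composite scale (an assumed example, not from the original chapter), suppose three 5-point Likert items measure a single construct such as self-esteem; the item names and values below are invented.

```python
import pandas as pd

# Invented responses to three 5-point Likert items measuring one construct.
items = pd.DataFrame({
    "se_1": [4, 2, 5],
    "se_2": [5, 3, 4],
    "se_3": [4, 2, 5],
})

# A simple composite index: the mean of the items for each respondent.
items["self_esteem_index"] = items[["se_1", "se_2", "se_3"]].mean(axis=1)
print(items)
```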

Most questionnaire surveys tend to be self-administered postal surveys , where the same questionnaire is posted to a large number of people, and willing respondents can complete the survey at their convenience and return it in prepaid envelopes. Postal surveys are advantageous in that they are unobtrusive and inexpensive to administer, since bulk postage is cheap in most countries. However, response rates from postal surveys tend to be quite low since most people ignore survey requests. There may also be long delays (several months) in respondents’ completing and returning the survey, or they may even simply lose it. Hence, the researcher must continuously monitor responses as they are being returned, track and send non-respondents repeated reminders (two or three reminders at intervals of one to one and a half months is ideal). Questionnaire surveys are also not well-suited for issues that require clarification on the part of the respondent or those that require detailed written responses. Longitudinal designs can be used to survey the same set of respondents at different times, but response rates tend to fall precipitously from one survey to the next.

A second type of survey is a group-administered questionnaire . A sample of respondents is brought together at a common place and time, and each respondent is asked to complete the survey questionnaire while in that room. Respondents enter their responses independently without interacting with one another. This format is convenient for the researcher, and a high response rate is assured. If respondents do not understand any specific question, they can ask for clarification. In many organisations, it is relatively easy to assemble a group of employees in a conference room or lunch room, especially if the survey is approved by corporate executives.

A more recent type of questionnaire survey is an online or web survey. These surveys are administered over the Internet using interactive forms. Respondents may receive an email request for participation in the survey with a link to a website where the survey may be completed. Alternatively, the survey may be embedded into an email, and can be completed and returned via email. These surveys are very inexpensive to administer, results are instantly recorded in an online database, and the survey can be easily modified if needed. However, if the survey website is not password-protected or designed to prevent multiple submissions, the responses can be easily compromised. Furthermore, sampling bias may be a significant issue since the survey cannot reach people who do not have computer or Internet access, such as many of the poor, senior, and minority groups, and the respondent sample is skewed toward a younger demographic who are online much of the time and have the time and ability to complete such surveys. Computing the response rate may be problematic if the survey link is posted on LISTSERVs or bulletin boards instead of being emailed directly to targeted respondents. For these reasons, many researchers prefer dual-media surveys (e.g., postal survey and online survey), allowing respondents to select their preferred method of response.
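
One common safeguard against multiple submissions is to issue each invited respondent a single-use token embedded in their survey link. The following is a minimal Python sketch under that assumption; the in-memory sets stand in for what a real survey platform would keep in a database.

```python
import secrets

# Issue one single-use token per invited respondent (e.g., appended to the
# survey URL that is emailed to them). Purely illustrative storage.
issued_tokens = {secrets.token_urlsafe(16) for _ in range(500)}
used_tokens: set[str] = set()

def accept_submission(token: str) -> bool:
    """Accept a response only if its token was issued and has not been used."""
    if token in issued_tokens and token not in used_tokens:
        used_tokens.add(token)
        return True
    return False
```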

Constructing a survey questionnaire is an art. Numerous decisions must be made about the content of questions, their wording, format, and sequencing, all of which can have important consequences for the survey responses.

Response formats. Survey questions may be structured or unstructured. Responses to structured questions are captured using one of the following response formats:

Dichotomous response , where respondents are asked to select one of two possible choices, such as true/false, yes/no, or agree/disagree. An example of such a question is: Do you think that the death penalty is justified under some circumstances? (circle one): yes / no.

Nominal response , where respondents are presented with more than two unordered options, such as: What is your industry of employment?: manufacturing / consumer services / retail / education / healthcare / tourism and hospitality / other.

Ordinal response , where respondents have more than two ordered options, such as: What is your highest level of education?: high school / bachelor’s degree / postgraduate degree.

Interval-level response , where respondents are presented with a 5-point or 7-point Likert scale, semantic differential scale, or Guttman scale. Each of these scale types were discussed in a previous chapter.

Continuous response , where respondents enter a continuous (ratio-scaled) value with a meaningful zero point, such as their age or tenure in a firm. These responses generally tend to be of the fill-in-the blanks type.
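
The response format determines how answers are stored and analysed later. As a minimal pandas sketch (an added illustration, with invented column names and values), dichotomous and nominal responses can be kept as plain categories, ordinal responses as ordered categoricals, and interval or continuous responses as numbers.

```python
import pandas as pd

# Invented example responses illustrating the formats described above.
df = pd.DataFrame({
    "death_penalty_justified": ["yes", "no", "yes"],             # dichotomous
    "industry": ["education", "retail", "healthcare"],           # nominal
    "education_level": ["high school", "bachelor", "postgrad"],  # ordinal
    "agreement": [5, 3, 4],                                      # interval (Likert)
    "age": [34, 52, 27],                                         # continuous (ratio)
})

# Declaring the ordinal item as an ordered categorical preserves the
# underlying order, so sorting and comparisons respect the scale.
df["education_level"] = pd.Categorical(
    df["education_level"],
    categories=["high school", "bachelor", "postgrad"],
    ordered=True,
)
print(df["education_level"] >= "bachelor")
```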

Question content and wording. Responses obtained in survey research are very sensitive to the types of questions asked. Poorly framed or ambiguous questions will likely result in meaningless responses with very little value. Dillman (1978) [1] recommends several rules for creating good survey questions. Every single question in a survey should be carefully scrutinised for the following issues:

Is the question clear and understandable?: Survey questions should be stated in very simple language, preferably in active voice, and without complicated words or jargon that may not be understood by a typical respondent. All questions in the questionnaire should be worded in a similar manner to make it easy for respondents to read and understand them. The only exception is if your survey is targeted at a specialised group of respondents, such as doctors, lawyers and researchers, who use such jargon in their everyday environment.

Is the question worded in a negative manner?: Negatively worded questions such as ‘Should your local government not raise taxes?’ tend to confuse many respondents and lead to inaccurate responses. Double-negatives should be avoided when designing survey questions.

Is the question ambiguous ?: Survey questions should not use words or expressions that may be interpreted differently by different respondents (e.g., words like ‘any’ or ‘just’). For instance, if you ask a respondent, ‘What is your annual income?’, it is unclear whether you are referring to salary/wages, or also dividend, rental, and other income, whether you are referring to personal income, family income (including spouse’s wages), or personal and business income. Different interpretation by different respondents will lead to incomparable responses that cannot be interpreted correctly.

Does the question have biased or value-laden words?: Bias refers to any property of a question that encourages subjects to answer in a certain way. Kenneth Rasinski (1989) [2] examined several studies on people’s attitude toward government spending, and observed that respondents tend to indicate stronger support for ‘assistance to the poor’ and less for ‘welfare’, even though both terms had the same meaning. In this study, more support was also observed for ‘halting rising crime rate’ and less for ‘law enforcement’, more for ‘solving problems of big cities’ and less for ‘assistance to big cities’, and more for ‘dealing with drug addiction’ and less for ‘drug rehabilitation’. Biased language or tone tends to skew observed responses. It is often difficult to anticipate biased wording in advance, but to the greatest extent possible, survey questions should be carefully scrutinised to avoid biased language.

Is the question double-barrelled ?: Double-barrelled questions are those that can have multiple answers. For example, ‘Are you satisfied with the hardware and software provided for your work?’. In this example, how should a respondent answer if they are satisfied with the hardware, but not with the software, or vice versa? It is always advisable to separate double-barrelled questions into separate questions: ‘Are you satisfied with the hardware provided for your work?’, and ’Are you satisfied with the software provided for your work?’. Another example: ‘Does your family favour public television?’. Some people may favour public TV for themselves, but favour certain cable TV programs such as Sesame Street for their children.

Is the question too general ?: Sometimes, questions that are too general may not accurately convey respondents’ perceptions. If you asked someone how they liked a certain book and provided a response scale ranging from ‘not at all’ to ‘extremely well’, if that person selected ‘extremely well’, what do they mean? Instead, ask more specific behavioural questions, such as, ‘Will you recommend this book to others, or do you plan to read other books by the same author?’. Likewise, instead of asking, ‘How big is your firm?’ (which may be interpreted differently by respondents), ask, ‘How many people work for your firm?’, and/or ‘What is the annual revenue of your firm?’, which are both measures of firm size.

Is the question too detailed ?: Avoid unnecessarily detailed questions that serve no specific research purpose. For instance, do you need the age of each child in a household, or is just the number of children in the household acceptable? However, if unsure, it is better to err on the side of details than generality.

Is the question presumptuous ?: If you ask, ‘What do you see as the benefits of a tax cut?’, you are presuming that the respondent sees the tax cut as beneficial. Many people may not view tax cuts as being beneficial, because tax cuts generally lead to lesser funding for public schools, larger class sizes, and fewer public services such as police, ambulance, and fire services. Avoid questions with built-in presumptions.

Is the question imaginary ?: A popular question in many television game shows is, ‘If you win a million dollars on this show, how will you spend it?’. Most respondents have never been faced with such an amount of money before and have never thought about it—they may not even know that after taxes, they will get only about $640,000 or so in the United States, and in many cases, that amount is spread over a 20-year period—and so their answers tend to be quite random, such as take a tour around the world, buy a restaurant or bar, spend on education, save for retirement, help parents or children, or have a lavish wedding. Imaginary questions have imaginary answers, which cannot be used for making scientific inferences.

Do respondents have the information needed to correctly answer the question ?: Oftentimes, we assume that subjects have the necessary information to answer a question, when in reality, they do not. Even if a response is obtained, these responses tend to be inaccurate given the subjects’ lack of knowledge about the question being asked. For instance, we should not ask the CEO of a company about day-to-day operational details that they may not be aware of, or ask teachers about how much their students are learning, or ask high-schoolers, ‘Do you think the US Government acted appropriately in the Bay of Pigs crisis?’.

Question sequencing. In general, questions should flow logically from one to the next. To achieve the best response rates, questions should flow from the least sensitive to the most sensitive, from the factual and behavioural to the attitudinal, and from the more general to the more specific. Some general rules for question sequencing:

Start with easy non-threatening questions that can be easily recalled. Good options are demographics (age, gender, education level) for individual-level surveys and firmographics (employee count, annual revenues, industry) for firm-level surveys.

Never start with an open ended question.

If following a historical sequence of events, follow a chronological order from earliest to latest.

Ask about one topic at a time. When switching topics, use a transition, such as, ‘The next section examines your opinions about…’

Use filter or contingency questions as needed, such as, ‘If you answered “yes” to question 5, please proceed to Section 2. If you answered “no”, go to Section 3.’
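
As a small sketch of the filter logic described in the last rule above (the question ID and section names are invented), routing can be expressed as a simple function from recorded answers to the next section:

```python
def next_section(answers: dict) -> str:
    """Illustrative skip logic: route respondents based on their answer to q5."""
    if answers.get("q5") == "yes":
        return "Section 2"
    return "Section 3"

print(next_section({"q5": "yes"}))   # -> Section 2
print(next_section({"q5": "no"}))    # -> Section 3
```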

Other golden rules . Do unto your respondents what you would have them do unto you. Be attentive and appreciative of respondents’ time, attention, trust, and confidentiality of personal information. Always practice the following strategies for all survey research:

People’s time is valuable. Be respectful of their time. Keep your survey as short as possible and limit it to what is absolutely necessary. Respondents do not like spending more than 10-15 minutes on any survey, no matter how important it is. Longer surveys tend to dramatically lower response rates.

Always assure respondents about the confidentiality of their responses, and how you will use their data (e.g., for academic research) and how the results will be reported (usually, in the aggregate).

For organisational surveys, assure respondents that you will send them a copy of the final results, and make sure that you follow up with your promise.

Thank your respondents for their participation in your study.

Finally, always pretest your questionnaire, at least using a convenience sample, before administering it to respondents in a field setting. Such pretesting may uncover ambiguity, lack of clarity, or biases in question wording, which should be eliminated before administering to the intended sample.

Interview survey

Interviews are a more personalised data collection method than questionnaires, and are conducted by trained interviewers using the same research protocol as questionnaire surveys (i.e., a standardised set of questions). However, unlike a questionnaire, the interview script may contain special instructions for the interviewer that are not seen by respondents, and may include space for the interviewer to record personal observations and comments. In addition, unlike postal surveys, the interviewer has the opportunity to clarify any issues raised by the respondent or ask probing or follow-up questions. However, interviews are time-consuming and resource-intensive. Interviewers need special interviewing skills as they are considered to be part of the measurement instrument, and must proactively strive not to artificially bias the observed responses.

The most typical form of interview is a personal or face-to-face interview , where the interviewer works directly with the respondent to ask questions and record their responses. Personal interviews may be conducted at the respondent’s home or office location. This approach may even be favoured by some respondents, while others may feel uncomfortable allowing a stranger into their homes. However, skilled interviewers can persuade respondents to co-operate, dramatically improving response rates.

A variation of the personal interview is a group interview, also called a focus group . In this technique, a small group of respondents (usually 6–10 respondents) are interviewed together in a common location. The interviewer is essentially a facilitator whose job is to lead the discussion, and ensure that every person has an opportunity to respond. Focus groups allow deeper examination of complex issues than other forms of survey research, because when people hear others talk, it often triggers responses or ideas that they did not think about before. However, focus group discussion may be dominated by a dominant personality, and some individuals may be reluctant to voice their opinions in front of their peers or superiors, especially while dealing with a sensitive issue such as employee underperformance or office politics. Because of their small sample size, focus groups are usually used for exploratory research rather than descriptive or explanatory research.

A third type of interview survey is a telephone interview. In this technique, interviewers contact potential respondents over the phone, typically based on a random selection of people from a telephone directory, to ask a standard set of survey questions. A more recent and technologically advanced approach is computer-assisted telephone interviewing (CATI). It is increasingly being used by academic, government, and commercial survey researchers. Here the interviewer is a telephone operator who is guided through the interview process by a computer program displaying instructions and questions to be asked. The system also selects respondents randomly using a random digit dialling technique, and records responses using voice capture technology. Once respondents are on the phone, higher response rates can be obtained. This technique is not ideal for rural areas where telephone density is low, and also cannot be used for communicating non-audio information such as graphics or product demonstrations.
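
As a rough sketch of the random digit dialling idea (the area code and exchange prefixes below are invented, and real implementations draw from validated banks of working numbers):

```python
import random

# Append random final digits to known area code + exchange prefixes.
prefixes = ["202555", "202556", "301555"]   # invented six-digit prefixes

random.seed(1)

def random_number() -> str:
    """Generate one candidate telephone number for dialling."""
    return random.choice(prefixes) + f"{random.randint(0, 9999):04d}"

candidates = [random_number() for _ in range(5)]
print(candidates)
```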

Role of interviewer. The interviewer has a complex and multi-faceted role in the interview process, which includes the following tasks:

Prepare for the interview: Since the interviewer is in the forefront of the data collection effort, the quality of data collected depends heavily on how well the interviewer is trained to do the job. The interviewer must be trained in the interview process and the survey method, and also be familiar with the purpose of the study, how responses will be stored and used, and sources of interviewer bias. They should also rehearse and time the interview prior to the formal study.

Locate and enlist the co-operation of respondents: Particularly in personal, in-home surveys, the interviewer must locate specific addresses, and work around respondents’ schedules at sometimes undesirable times such as during weekends. They should also be like a salesperson, selling the idea of participating in the study.

Motivate respondents: Respondents often feed off the motivation of the interviewer. If the interviewer is disinterested or inattentive, respondents will not be motivated to provide useful or informative responses either. The interviewer must demonstrate enthusiasm about the study, communicate the importance of the research to respondents, and be attentive to respondents’ needs throughout the interview.

Clarify any confusion or concerns: Interviewers must be able to think on their feet and address unanticipated concerns or objections raised by respondents to the respondents’ satisfaction. Additionally, they should ask probing questions as necessary even if such questions are not in the script.

Observe quality of response: The interviewer is in the best position to judge the quality of information collected, and may supplement responses obtained using personal observations of gestures or body language as appropriate.

Conducting the interview. Before the interview, the interviewer should prepare a kit to carry to the interview session, consisting of a cover letter from the principal investigator or sponsor, adequate copies of the survey instrument, photo identification, and a telephone number for respondents to call to verify the interviewer's authenticity. The interviewer should also try to call respondents ahead of time to set up an appointment if possible. To start the interview, they should speak in an imperative and confident tone, such as, 'I'd like to take a few minutes of your time to interview you for a very important study', instead of, 'May I come in to do an interview?'. They should introduce themselves, present personal credentials, explain the purpose of the study in one to two sentences, and assure respondents that their participation is voluntary and their comments are confidential, all in less than a minute. No big words or jargon should be used, and no details should be provided unless specifically requested. If the interviewer wishes to record the interview, they should ask for respondents' explicit permission before doing so. Even if the interview is recorded, the interviewer must take notes on key issues, probes, or verbatim phrases.

During the interview, the interviewer should follow the questionnaire script and ask questions exactly as written, and not change the words to make the question sound friendlier. They should also not change the order of questions or skip any question that may have been answered earlier. Any issues with the questions should be discussed during rehearsal prior to the actual interview sessions. The interviewer should not finish the respondent’s sentences. If the respondent gives a brief cursory answer, the interviewer should probe the respondent to elicit a more thoughtful, thorough response. Some useful probing techniques are:

The silent probe: Just pausing and waiting without moving on to the next question may suggest to respondents that the interviewer is waiting for a more detailed response.

Overt encouragement: An occasional 'uh-huh' or 'okay' may encourage the respondent to go into greater detail. However, the interviewer must not express approval or disapproval of what the respondent says.

Ask for elaboration: Such as, ‘Can you elaborate on that?’ or ‘A minute ago, you were talking about an experience you had in high school. Can you tell me more about that?’.

Reflection: The interviewer can try the psychotherapist’s trick of repeating what the respondent said. For instance, ‘What I’m hearing is that you found that experience very traumatic’ and then pause and wait for the respondent to elaborate.

After the interview is completed, the interviewer should thank respondents for their time, tell them when to expect the results, and not leave hastily. Immediately after leaving, they should write down any notes or key observations that may help interpret the respondent’s comments better.

Biases in survey research

Despite all of its strengths and advantages, survey research is often tainted with systematic biases that may invalidate some of the inferences derived from such surveys. Five such biases are the non-response bias, sampling bias, social desirability bias, recall bias, and common method bias.

Non-response bias. Survey research is generally notorious for its low response rates. A response rate of 15-20 per cent is typical in a postal survey, even after two or three reminders. If the majority of the targeted respondents fail to respond to a survey, this may indicate a systematic reason for the low response rate, which may in turn raise questions about the validity of the study’s results. For instance, dissatisfied customers tend to be more vocal about their experience than satisfied customers, and are therefore more likely to respond to questionnaire surveys or interview requests than satisfied customers. Hence, any respondent sample is likely to have a higher proportion of dissatisfied customers than the underlying population from which it is drawn. In this instance, not only will the results lack generalisability, but the observed outcomes may also be an artefact of the biased sample. Several strategies may be employed to improve response rates:

Advance notification: Sending a short letter to the targeted respondents soliciting their participation in an upcoming survey can prepare them in advance and improve their propensity to respond. The letter should state the purpose and importance of the study, mode of data collection (e.g., via a phone call, a survey form in the mail, etc.), and appreciation for their co-operation. A variation of this technique may be to ask the respondent to return a prepaid postcard indicating whether or not they are willing to participate in the study.

Relevance of content: People are more likely to respond to surveys examining issues of relevance or importance to them.

Respondent-friendly questionnaire: Shorter survey questionnaires tend to elicit higher response rates than longer questionnaires. Furthermore, questions that are clear, non-offensive, and easy to respond to tend to attract higher response rates.

Endorsement: For organisational surveys, it helps to gain endorsement from a senior executive attesting to the importance of the study to the organisation. Such endorsement can be in the form of a cover letter or a letter of introduction, which can improve the researcher’s credibility in the eyes of the respondents.

Follow-up requests: Multiple follow-up requests may coax some non-respondents to respond, even if their responses are late.

Interviewer training: Response rates for interviews can be improved with skilled interviewers trained in how to request interviews, use computerised dialling techniques to identify potential respondents, and schedule call-backs for respondents who could not be reached.

Incentives : Incentives in the form of cash or gift cards, giveaways such as pens or stress balls, entry into a lottery, draw or contest, discount coupons, promise of contribution to charity, and so forth may increase response rates.

Non-monetary incentives: Businesses, in particular, are more prone to respond to non-monetary incentives than financial incentives. An example of such a non-monetary incentive is a benchmarking report comparing the business’s individual response against the aggregate of all responses to a survey.

Confidentiality and privacy: Finally, assurances that respondents' private data or responses will not fall into the hands of any third party may help improve response rates.

Sampling bias. Telephone surveys conducted by calling a random sample of publicly available telephone numbers will systematically exclude people with unlisted telephone numbers, mobile phone numbers, and people who are unable to answer the phone when the survey is being conducted—for instance, if they are at work—and will include a disproportionate number of respondents who have landline telephone services with listed phone numbers and people who are home during the day, such as the unemployed, the disabled, and the elderly. Likewise, online surveys tend to include a disproportionate number of students and younger people who are constantly on the Internet, and systematically exclude people with limited or no access to computers or the Internet, such as the poor and the elderly. Similarly, questionnaire surveys tend to exclude children and the illiterate, who are unable to read, understand, or meaningfully respond to the questionnaire. A different kind of sampling bias relates to sampling the wrong population, such as asking teachers (or parents) about their students’ (or children’s) academic learning, or asking CEOs about operational details in their company. Such biases make the respondent sample unrepresentative of the intended population and hurt generalisability claims about inferences drawn from the biased sample.

Social desirability bias. Many respondents tend to avoid negative opinions or embarrassing comments about themselves, their employers, family, or friends. With negative questions such as, 'Do you think that your project team is dysfunctional?', 'Is there a lot of office politics in your workplace?', or 'Have you ever illegally downloaded music files from the Internet?', the researcher may not get truthful responses. This tendency among respondents to 'spin the truth' in order to portray themselves in a socially desirable manner is called the 'social desirability bias', which hurts the validity of responses obtained from survey research. There is practically no way of overcoming social desirability bias in a questionnaire survey, but in an interview setting, an astute interviewer may be able to spot inconsistent answers and ask probing questions or use personal observations to supplement respondents' comments.

Recall bias. Responses to survey questions often depend on subjects' motivation, memory, and ability to respond. Particularly when dealing with events that happened in the distant past, respondents may not adequately remember their own motivations or behaviours, or their memory of such events may have evolved with time and no longer be retrievable. For instance, if a respondent is asked to describe their utilisation of computer technology one year ago, or even memorable childhood events like birthdays, their response may not be accurate due to difficulties with recall. One possible way of overcoming recall bias is to anchor the respondent's memory in specific events as they happened, rather than asking them to recall their perceptions and motivations from memory.

Common method bias. Common method bias refers to the amount of spurious covariance shared between independent and dependent variables that are measured at the same point in time, such as in a cross-sectional survey, using the same instrument, such as a questionnaire. In such cases, the phenomenon under investigation may not be adequately separated from measurement artefacts. Standard statistical tests are available to test for common method bias, such as Harman's single-factor test (Podsakoff, MacKenzie, Lee & Podsakoff, 2003), [3] Lindell and Whitney's (2001) [4] marker variable technique, and so forth. This bias can potentially be avoided if the independent and dependent variables are measured at different points in time using a longitudinal survey design, or if these variables are measured using different methods, such as computerised recording of the dependent variable versus questionnaire-based self-rating of the independent variables.
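To make this concrete, the gist of Harman's single-factor test can be sketched in a few lines on placeholder questionnaire data, using principal component analysis as a stand-in for the unrotated exploratory factor analysis; the simulated data and the roughly 50 per cent rule of thumb are illustrative assumptions, not prescriptions from the sources cited above.

```python
# Sketch of Harman's single-factor test on placeholder survey data.
# If one factor explains the bulk of the variance across all items,
# common method bias may be a concern.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
items = rng.normal(size=(200, 12))   # 200 respondents x 12 questionnaire items (simulated)

pca = PCA()
pca.fit(items)
first_factor_share = pca.explained_variance_ratio_[0]
print(f"Variance explained by the first factor: {first_factor_share:.1%}")
# A common (though debated) rule of thumb flags bias when this exceeds ~50%.
```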

  • Dillman, D. (1978). Mail and telephone surveys: The total design method. New York: Wiley.
  • Rasinski, K. (1989). The effect of question wording on public support for government spending. Public Opinion Quarterly, 53(3), 388–394.
  • Podsakoff, P. M., MacKenzie, S. B., Lee, J.-Y., & Podsakoff, N. P. (2003). Common method biases in behavioral research: A critical review of the literature and recommended remedies. Journal of Applied Psychology, 88(5), 879–903. http://dx.doi.org/10.1037/0021-9010.88.5.879
  • Lindell, M. K., & Whitney, D. J. (2001). Accounting for common method variance in cross-sectional research designs. Journal of Applied Psychology, 86(1), 114–121.



7 Steps to Conduct a Survey: Best Practices, Tools, & More

Want to conduct a survey but wondering where to start? We’ve got you covered. 

Here’s how to do it:

  • Identify your target audience
  • Decide on the survey questions
  • Add question branching
  • Set triggers to pick the right moment to conduct the survey
  • Choose the design and deploy the survey
  • Analyze the results & take action

Simple, right? The challenge is to execute these steps properly.

As a leading survey tool provider, we regularly see businesses like yours needing help with survey design and launch.

That’s why we have created this one-stop guide on how to conduct surveys. We’ll explore steps to plan the surveys, collect responses and analyze results to help you design campaigns that produce tangible results.

Let’s begin.

What is a Survey?

A survey is defined as a single or multi-page questionnaire that aims to collect information from the respondents. Surveys can be used to gather different types of data from the intended audience, like demographic, behavioral, psychographic, and more.

For businesses, a survey can be a source of first-hand customer data, making it an indispensable part of customer success strategy.

You can generate leads, pinpoint product or website issues, map customers’ journeys, and gauge users’ preferences.

For example, if you’ve just added a new feature to the mobile app, you can conduct a survey like CSAT to gauge users’ satisfaction and feature adoption.

As for how you can conduct a survey, there are several channels to do so:

  • Website and in-app surveys
  • Email surveys
  • In-store kiosks
  • Social media polls

Read More: 13 Ways to Collect Customer Feedback

Importance of Conducting Surveys

1. Measure the Customers' Pulse

Surveys can provide hundreds of nuanced data points about your customers and prospects. You can see how they perceive your brand and map their sentiments.

  • How happy are they with your products and services?
  • What do they think about your brand?
  • How effective are your support services?
  • Which touchpoints offer the best and worst experiences?

You can quantify these sentiments using surveys like NPS , CSAT, and CES. Just conduct pulse surveys on your website, app, or product to gauge their experiences.


2. Map Issues With Products & Services

The flexibility of survey deployment makes it an ideal means to collect data about product issues. Suppose you have a website and see many people bouncing off the pricing page. You can deploy surveys on the page to find the reason for this departure.


In the same way, you can add surveys at various stages of customers’ journeys to map their issues and problems. The best part is that you can keep it active to collect feedback data continuously and optimize the experience.

3. Validate New Ideas & Product Opportunities

Let's say you have three different ideas about new product features. How would you know which one to work on?

Simple! Conduct a survey and ask your users.

Surveys let you know what your audience wants and expects from you. You can then implement this feedback into your product development process, as Twilio does.

There are 18 teams at Twilio that depend on customer feedback to deliver optimal solutions quickly. They conduct targeted surveys to validate which ideas are worth developing. It helps them to prioritize testing and experimentation for those features that offer the most value to the customers.

The teams can go from ideation to development stages faster to deliver new functionalities weekly.

You can also do the same with your website or product. Just conduct a survey at critical touchpoints to find new growth opportunities.

4. Set a Baseline & Compare

You can leverage the redeployment ability of surveys to compare customer experiences over time and track the changes.

For example, a simple NPS survey on your product page can help you track customers’ loyalty and repurchase probability with time. You can rerun it every month to compare the scores and make improvements.

But it’s not only restricted to a single page; you can compare the feedback data from different channels to design a cohesive and seamless omnichannel experience.
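For reference, the arithmetic behind an NPS score is simple enough to sketch; the ratings below are made up purely for illustration.

```python
# Net Promoter Score = % promoters (ratings 9-10) minus % detractors (ratings 0-6).
ratings = [10, 9, 9, 8, 7, 6, 10, 3, 9, 8]   # illustrative 0-10 responses

promoters = sum(1 for r in ratings if r >= 9)
detractors = sum(1 for r in ratings if r <= 6)
nps = 100 * (promoters - detractors) / len(ratings)
print(f"NPS: {nps:+.0f}")   # rerun monthly and compare this figure over time
```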

What Do You Need to Conduct a Survey?

Before creating a survey, you need to prepare a strategy to maximize the response rate and collect accurate feedback data. You can’t survey randomly at any point. It would only result in skewed feedback and a waste of time.

Here are some pointers that will help you to develop a plan for your survey campaign:

1. Identify the Points of Deployment

The customers' interaction points are scattered across various stages of their journey. You can't conduct surveys at every point, as it'll frustrate the customer. Plus, the data would require more people and hours to analyze.

That’s why it’s crucial to pinpoint the precise interaction to show the survey to visitors, so they are more inclined to answer it.

You can use Google Analytics to find the most underperforming pages and conduct a survey there. Or you can choose those pages where conversions have steadily declined with time.

2. Set Measurable Goals

Each survey serves a different purpose, and associating goals with each one will help you frame the right questions.

For example:

  • If the goal is to reduce the bounce rate, you can choose an exit-intent survey and find out the reasons behind page abandonment. Then, optimize the page and track changes in the bounce rate.
  • If the goal is to measure customer satisfaction, you can collect CSAT scores using CSAT surveys.
  • If you want to evaluate newly added page content, choose the experience mapping survey to check how the content is performing.

3. Determine the Required Sample Size

One of the most important aspects of your survey campaign is achieving a sufficient response size. The more responses you collect, the higher the chances that the feedback data is reliable and accurate.

You can use an online sample size calculator for this.


For example, for a target audience (population size) of 13,000 and a confidence level of 99%, the required sample size is 3,145. That means you need to collect at least 3,145 responses for your survey results to be statistically significant.
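If you'd rather see the math behind such calculators, here is a small sketch using Cochran's formula with a finite-population correction; the 2% margin of error and 50% response distribution are assumptions chosen here because the example above does not state them, and together they reproduce the 3,145 figure.

```python
# Sample size via Cochran's formula plus a finite-population correction.
import math

def sample_size(population: int, z: float, margin: float, p: float = 0.5) -> int:
    n0 = (z ** 2) * p * (1 - p) / margin ** 2      # infinite-population estimate
    n = n0 / (1 + (n0 - 1) / population)           # correct for the finite population
    return math.ceil(n)

# 99% confidence (z ≈ 2.576), 2% margin of error, population of 13,000
print(sample_size(population=13_000, z=2.576, margin=0.02))   # -> 3145
```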

4. Choose the Right Tool

Half of your work is done when you have the right survey tool. It’ll let you conduct the survey and aid in analyzing feedback data.

Here’s What to Look for in a Survey Tool:

  • The number of deployment channels: Choose a tool that offers all the channels you want to conduct the survey on, like website, email, mobile app, product, link, etc. It’ll help to keep all the data in one place.


  • Sufficient audience targeting options: Tools like Qualaroo offer advanced triggering options to target visitors’ behavior, actions, and other attributes to show your survey to the right people.
  • Flexible survey creator: Look for a survey tool that offers features like skip logic, design options, template library, rebranding, etc., to create highly personalized and targeted surveys.
  • Data analysis techniques: If your tool offers AI-based analysis techniques like sentiment analysis, it can automatically categorize the responses based on user emotions and prioritize the negative feedback to take necessary actions.


7 Steps to Conduct a Survey That Brings Desired Results

You’ve done your research and come up with a workable strategy. You also have the right tool in hand. Now you’re ready to conduct the survey by following these simple steps:

1. Identify Your Target Audience

To make your surveys successful, you need a target audience. For example:

  • You can't collect satisfaction scores from first-time visitors. Technically, you can, but collecting them from people who have used or purchased the product is more valuable and meaningful.
  • In the same way, if you’ve added a new feature to the Chrome browser, you can’t show the survey to Firefox users.
  • To collect data from visitors aged 18 to 25, you must select the sample size from this visitor segment.

That’s why you need to choose your target audience carefully. It’ll ensure the reliability and accuracy of your survey responses.

It’ll also help you add screening questions to disqualify irrelevant respondents.

2. Decide on the Survey Questions

Make a list of all the possible questions you want to ask the respondents. Then, pick the ones that are important for your survey. You can choose between open-ended and closed-ended questions.

For example, if you’re surveying to create a user persona, the possible questions can be:

  • Describe yourself in one sentence.
  • What is your name?
  • What is your age?
  • Which device do you usually use to shop with us?
  • What did you come to this site to do today?
  • What were you hoping to find on this page?
  • Does this page meet your expectations?

You can add/remove the questions depending on your audience and the depth of information you want from them.

Now, select the order of the questions. It’s best to start with the rating scale question and then move toward the free-text questions to gather in-depth data.

Read More: Answer selection types

3. Add Question Branching to Your Survey

Add screening questions and skip logic to make your survey personalized and highly targeted. 

It’ll ensure two things:

  • Only people who qualify for the survey can answer it.
  • The respondents will only see relevant questions based on their previous answers.


For example, if your target audience is visitors in the 18-25 age group, you can open with an age-screening question whose answer options list (a) 18-25 followed by progressively older brackets.

Using question branching, you can then disqualify the respondents who choose (b), (c), (d), or (e) as their answers. For respondents who choose (a), you can add the relevant follow-up questions to gather feedback data.

It’ll help to keep the feedback clean without worrying about other groups compromising the data. So, list the questions, assign appropriate branching and check that there’s no broken or incomplete path for any branch.
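Under the hood, question branching is just a lookup from a (question, answer) pair to the next step. The sketch below is a hypothetical illustration with made-up question IDs and answer options, not any particular tool's configuration.

```python
# Hypothetical skip-logic table: each (question, answer) pair maps to the next
# question ID, or to None, which screens the respondent out.
branching = {
    ("Q1_age_group", "18-25"): "Q2_follow_up",   # qualified respondents continue
    ("Q1_age_group", "26-35"): None,
    ("Q1_age_group", "36-50"): None,
    ("Q1_age_group", "51+"): None,
}

def next_question(question_id: str, answer: str):
    """Return the next question ID, or None if the respondent is disqualified."""
    return branching.get((question_id, answer))

print(next_question("Q1_age_group", "18-25"))   # -> Q2_follow_up
print(next_question("Q1_age_group", "36-50"))   # -> None (screened out)
```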

4. Set Triggers to Pick the Right Moment to Conduct the Survey

Survey launch timing matters. It helps to collect contextual data from the customers. So set up appropriate triggers to show the survey at the right moment to the right people.

  • If you’re conducting a survey to gauge users’ perception of a newly added product feature, you can set it to appear when they interact with that feature.
  • Similarly, the ideal trigger for post-purchase surveys is when the order confirmation message appears.

Advanced survey tools provide many in-built targeting options to help you set the right triggers. You can add different conditions to deploy the survey at the precise moment.
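Conceptually, a trigger is a set of conditions checked against each visitor event. The sketch below is a simplified illustration with invented event fields and rules, not any specific tool's API.

```python
# Show a survey only when the incoming visitor event matches every condition in its rule.
rules = {
    "feature_feedback": {"event": "feature_used", "feature": "dark_mode"},
    "post_purchase":    {"event": "order_confirmed"},
}

def should_show(survey: str, event: dict) -> bool:
    rule = rules[survey]
    return all(event.get(key) == value for key, value in rule.items())

print(should_show("post_purchase", {"event": "order_confirmed", "order_id": 42}))   # True
print(should_show("feature_feedback", {"event": "page_view"}))                      # False
```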


5. Choose the Survey Design and Deploy the Survey

The last thing to do before deploying the survey is to set its theme and color. It’s best to align the survey theme with your website or app. You can also add the brand logo to imbue confidence in the respondents.


Once satisfied, activate the survey and start collecting the feedback data. Make sure you collect enough responses to meet the required sample size.

6. Analyze the Results

There are different ways to analyze the responses depending on the resources at your disposal. Here are a few steps to do it, with a small sketch after the list:

  • Restructure the data in a spreadsheet and add all the relevant information to each response, such as customer ID, metadata, feedback, point of interaction, customer type, and lifetime value.
  • Categorize the feedback by its types: issue, general feedback, bug, feature request, support grievances, and more.
  • Next, start assigning the appropriate action to resolve the problems. Make sure these are small and quick points. For example, if the feedback type is an issue marked as critical, summarize the issue and the required action in a few words, like:

Payment failure >> possible issue with Stripe API >> Check on priority.
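A minimal sketch of this restructuring step with pandas; the columns and rows are made up for illustration, and a spreadsheet works just as well.

```python
# Restructure raw responses into a table and group them by feedback type.
import pandas as pd

responses = pd.DataFrame([
    {"customer_id": 101, "touchpoint": "checkout", "type": "issue",
     "feedback": "Payment failed twice", "action": "Check payment integration on priority"},
    {"customer_id": 102, "touchpoint": "pricing page", "type": "feature request",
     "feedback": "Need an annual plan", "action": "Forward to product team"},
])

for feedback_type, group in responses.groupby("type"):
    print(feedback_type, "->", len(group), "response(s)")
```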

7. Take Actions

Share the sheet with other teams and create tasks for them to address the feedback.

  • Resolve the uncovered UI/UX issues on priority.
  • Fix the broken flow with the help of your dev team.
  • Reach out to those who shared positive feedback to collect product reviews and app store ratings.
  • Get in touch with frustrated customers to solve their problems and retain them.

6 Best Practices for Conducting an Online Survey

Whether you are new or experienced, there are a few basic rules you can add to your checklist to get the maximum return on your investment. Let's quickly go over some of these best strategies and practices for doing a survey properly.

1. Add Incentives to Improve the Response Rate

Incentives are one of the best ways to increase the response rate on your surveys. 

According to PeoplePulse, incentivized surveys receive at least 10% more responses than surveys without incentives.


Use different incentives to improve response rates.

  • You can add discount coupon codes to the CSAT and NPS surveys. It will also act as an encouragement for the customers to purchase again.
  • Embed a free consultation offer in the surveys on your pricing page or landing page. In return, you can collect visitors' contact information to add them to your prospect list.
  • Use other incentives, such as customized meal plans, exercise plans, personality profiles, and gift cards, to entice people into filling out the survey.

2. State the Purpose of Your Survey

It is always helpful to showcase the objective of your survey to your respondents. Make them understand that the survey feedback will help make their experience better. 


You can also mention the recent interaction related to the survey to help customers recall their experience while filling the survey. 

3. Always Follow up on Your Surveys

It is a good practice to send survey reminders to the customers or product users who haven’t submitted their responses.

A single survey reminder can increase the response rate by 14%. It is a substantial jump in the number of responses as the sample size increases.

  • If you are using mail campaigns, you can send a reminder mail to non-respondents after a few days. 
  • If you use the website or in-app surveys, set the survey to reappear to the visitor during their second visit.
  • If you use product surveys, add an unobtrusive survey reminder notification bar in the My Account section. Set the bar to auto-disappear after the user completes the survey.

4. Use the Funnel Technique

The funnel technique is a powerful way to direct respondents through the survey, starting with broader questions and working toward more specific questions. It helps you pose more in-depth questions to the respondents.

  • Do you shop online?
  • How often do you shop online?
  • What are your favorite shopping websites?
  • What products do you usually buy online?

In the above example, each question narrows down the line of inquiry to gauge respondents’ preferences and interests.

Using this technique, you can gradually ask more personal questions without making the respondents uncomfortable.

5. Keep the Surveys Short

Survey length is an important factor that can affect the response rate. According to the International Journal of Market Research, the ideal survey length should be between 10 and 20 minutes.

What’s more, the response rate may drop by over 15% if a survey takes more than 5 minutes to complete.

The reason for this drop is simple. The respondents won’t wait for so long to complete the survey. They are more likely to abandon it with the increase in the number of questions or completion time.

Another drawback of longer surveys is that the respondents may answer the questions randomly without much thought to complete them quickly. This behavior will pollute the data samples and may produce incorrect results.

That’s why it is vital to keep your surveys short and to the point. Share your survey with the internal teams to calculate the average completion rate before sending them out.

6. Use Randomization

It is observed that respondents have a natural tendency to be drawn to the first option in a survey question. This is called order bias. As a result, respondents are more likely to choose the responses that sit towards the top of the list.

Randomizing the order of response anchors can help mitigate this issue. Since each respondent sees a different sequence of the responses, the results are less likely to be affected by order bias.
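A minimal sketch of per-respondent shuffling for nominal options (ordered rating scales should generally keep their order); the option labels are placeholders.

```python
# Give each respondent an independently shuffled copy of the answer options,
# so no option systematically benefits from appearing first.
import random

OPTIONS = ["Brand A", "Brand B", "Brand C", "Brand D"]

def randomized_options(options: list[str]) -> list[str]:
    shuffled = options.copy()
    random.shuffle(shuffled)
    return shuffled

print(randomized_options(OPTIONS))
```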

50+ Survey Questions to Choose From

The questions are the crux of survey campaigns. Learning how to conduct a survey is largely about learning the right questions to ask the respondents. You can get everything else right, but it will all be for naught if you don't ask the right questions.

That’s why we have compiled a list of professional questions you can use in your surveys. Choose the questions depending on the feedback and survey type you wish to conduct on your website, app, or product.

Many survey tools also offer readymade templates to help you get started if you are new to this.

1. Market Research Surveys

  • Rate the factors that affect your buying decision for [product].
  • Would you purchase the product at [price]?
  • According to you, what is the ideal price range for the product?
  • Would you purchase this product if it were available today?
  • Based on its current features and attributes, would you recommend [your brand name] to others?
  • If yes, please tell us what you like the most about [your brand name]?
  • If no, please specify the reason.
  • According to you, in which area is this product/service lacking the most? Specify below.
  • Which product/service would you consider as an alternative to ours?
  • Rate our competitor based on the following:

2. Demographic Surveys

  • Tell us something about yourself?
  • What is your gender?
  • What is your age group?
  • What is your highest level of education?
  • Which best describes your family?
  • Do you use the [product name]?
  • How likely is it that you’d recommend our product to a friend or colleague?
  • What feature would you like to see in the website/product?
  • Which feature do you think will help improve the product experience for you?
  • Of these four options, what’s the next thing you think we should build?

4. Product Opportunity Surveys

  • What’s the one feature we can add that would make our product indispensable for you?
  • How often do you use this feature?
  • What’s the next feature we should build?
  • How disappointed would you be if you could no longer use [Product/feature name?]
  • How does the product run after the update?
  • Have you seen any website/product/app with a similar feature?
  • Would the implementation of [this feature] increase the usability of the [product name]?
  • How would you rate this new feature?

5. Experience Mapping Surveys

  • Rate our product based on the following aspects:
  • How long have you had the product?
  • How often do you use the product?
  • Have you faced any problems with the product? Specify below.
  • How satisfied are you with the product?
  • How likely are you to purchase a product from this company again?
  • Is there anything that can be improved? Please specify.
  • How well does the website meet your needs?
  • Was the information easy to find?
  • Was the information clearly presented?
  • What other information should we provide on our website?
  • How can we make the site easier to use?

6. Brand Awareness Surveys

  • Have you heard of [your brand name] before?
  • How do you feel about this brand?
  • Have you seen this brand’s advertisements?
  • If yes, where have you seen or heard about our brand recently? (Select all that apply)
  • Have you purchased from this brand before?
  • Do you currently use the product of this brand?
  • Of all the brands offering similar products, which do you feel is the best brand?
  • Please specify what makes it the best brand for you in the category.
  • If the answer is 0-6, please specify the reason for your answer.
  • If the answer is 9-10, what do you like the most about the brand/product?
  • How satisfied are you with the product/website/app?
  • If the answer is 1-5, how can we improve the product/website/app?
  • If the answer is 8-10, what 3 things do you like the most about the product/website/app?
  • How would you rate our service on a scale of 1 – 10?
  • Was this article helpful? (Yes/No)
  • How satisfied are you with our support?
  • How easy/hard was it for you to use the product/website/app?
  • Does this [website/ product/ tool/ software] have all the features and functionalities you expected?
  • How would you improve this [website/ product/ tool/ software]?
  • What is missing from the website/product/app?
  • What is the most important feature you think we should add?

6 Common Survey Challenges & How to Overcome Them

1. Keeping the Scales Uniform Across All Surveys

The interpretation of survey scales depends on their arrangement.

For example, if a Likert scale's response anchors run from negative to positive, a higher score is desirable. However, if the sequence is reversed, a lower score would be considered good.

The challenge is to keep the sequence uniform in all your surveys to track the scores correctly. If the scale gets reversed at any point, it can skew the results.

Always stick to one pattern for all the scales, either negative to positive or positive to negative, to avoid confusion and misinterpretation.
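If a reversed scale does slip into your data, the usual remedy is to reverse-code those items before comparing scores; a minimal sketch for a 1-5 scale:

```python
# Reverse-code responses recorded on a flipped 1-5 scale so that a higher
# number always means a more positive answer across all surveys.
def reverse_code(value: int, scale_min: int = 1, scale_max: int = 5) -> int:
    return scale_max + scale_min - value

print([reverse_code(v) for v in [1, 2, 3, 4, 5]])   # [5, 4, 3, 2, 1]
```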

2. Difficulty in Analyzing Free Responses

Analyzing free responses is one of the biggest challenges when you conduct an online survey. These are unregulated and depend solely on the understanding of the respondents. If they misinterpret the question, the open response will only skew your feedback data.

It’s one of the main reasons to keep your survey questions simple.

What's more, they are also affected by the respondent's grasp of the language. The sentences can be unstructured or hard to understand.

Free-text responses offer more in-depth feedback but pose a serious challenge when it comes to extracting valuable insights. They are time-consuming and tedious to analyze.

One of the ways to mitigate this issue is to use AI-based analytical tools like sentiment analysis, text analytics, and word cloud generators.

Advanced survey tools offer these techniques as inbuilt features to make data mining easier.
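If your tool doesn't offer this, a rough first pass at sentiment tagging can be scripted; the sketch below uses NLTK's VADER analyzer on two invented responses and assumes the vader_lexicon resource has been downloaded.

```python
# Rough sentiment tagging of free-text responses with NLTK's VADER analyzer.
# Setup (one-time): pip install nltk, then nltk.download('vader_lexicon')
from nltk.sentiment import SentimentIntensityAnalyzer

responses = [
    "The checkout flow was confusing and slow.",
    "Love the new dashboard, great job!",
]

analyzer = SentimentIntensityAnalyzer()
for text in responses:
    score = analyzer.polarity_scores(text)["compound"]   # -1 (negative) to +1 (positive)
    label = "negative" if score < -0.05 else "positive" if score > 0.05 else "neutral"
    print(f"{label:8s} {score:+.2f}  {text}")
```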


3. Avoiding Leading Questions in the Survey

Leading questions are framed in such a way that they allude to a specific direction. The problem with these questions is that they can influence the respondents to choose a particular answer from the options.

For example: 'What is your favorite fast food?'

The above question implies that the respondent eats fast food. A respondent who doesn't may either skip the question or answer it randomly.

You can add a screening question to improve the data quality and disqualify irrelevant respondents.

  • How often do you eat fast food?
  • What is your favorite fast food?

Another way to avoid leading questions is to get an extra pair of eyes. Share your survey with other teams or a control group to test it out.

4. Survey Fatigue

Survey fatigue is a real challenge that can affect both the response rate and feedback quality. People constantly receive surveys in their SMS, emails, website visits, and apps.

So, it is possible that the visitors may completely ignore the survey or answer it randomly without properly reading the questions.


There are a few ways to combat survey fatigue while conducting a survey.

  • Add an incentive to encourage people to take the survey. As discussed earlier, it can improve the response rate.
  • Keep your survey short.
  • Separate critical surveys like NPS or CSAT from general surveys that take longer to complete.

5. Duplicate Responses

You cannot completely avoid duplicate responses in online surveys, but there are a few ways to reduce them (a small dedup sketch follows this list).

  • Use cookies to identify repeat visitors and prevent them from retaking the survey.
  • You can target IP addresses to prevent visitors from the same IP from retaking it.
  • A lot of tools also provide inbuilt duplicate response protection techniques.
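For responses you have already exported, a simple post-hoc pass can also catch duplicates; the sketch below keeps only the first response per (IP, user agent) pair, with invented column names and rows.

```python
# Drop duplicate survey responses that share the same IP address and user agent.
import pandas as pd

responses = pd.DataFrame([
    {"ip": "203.0.113.7",  "user_agent": "Chrome", "answer": "Very satisfied"},
    {"ip": "203.0.113.7",  "user_agent": "Chrome", "answer": "Very satisfied"},   # duplicate
    {"ip": "198.51.100.4", "user_agent": "Safari", "answer": "Neutral"},
])

deduped = responses.drop_duplicates(subset=["ip", "user_agent"], keep="first")
print(len(responses), "->", len(deduped), "responses after dedup")
```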

6. Avoid Double-Barreled Questions

Double-barreled questions are those that pack two issues into a single question. The respondent may have conflicting views about the two, making it harder to choose one response.

For example, how satisfied are you with our services and customer support?

Here, the respondent may have positive sentiment toward the service quality but may be dissatisfied with the customer support agent.

To make the question more precise, you can split it into two questions. It will also let you add follow-up questions to each answer to find more details about customers’ issues and delights.

5 Best Tools to Conduct Online Surveys in 2021

It's easy to get overwhelmed while looking for the right survey tool because of the sheer number of options in the market. That's why we have listed the top 5 tools to lighten your load and help you get started with your survey campaigns.

1. Qualaroo

Qualaroo is a complete customer feedback management solution that can help you create and manage the survey data under one dashboard. You can conduct surveys on your website, app, SaaS product, social media, and email. 

With features like skip-logic, 40+ pre-built question templates, 12+ question types, 50+ language translations, customer survey design options, rebranding, and advanced targeting options, you can start collecting feedback in a few hours. 

The tool also supports advanced AI-based data analysis techniques – sentiment analysis and text analytics to categorize the survey responses and produce valuable insights automatically. 

Price: Free forever account. Paid plans start at $80/month

2. ProProfs Survey Maker

ProProfs Survey Maker brings more than surveys to the table. It lets you create interactive scored quizzes, polls, assessments, and survey forms as well. You can also add a feedback sidebar on your website. 

It also supports multi-channel deployment, i.e., you can add the survey to your website, mobile app, email, and social media. You can analyze and track the responses using a detailed dashboard.

Price: Forever free account. Paid plans start at $0.05/response/month

3. SurveyMonkey

SurveyMonkey is one of the best survey tools in the market that offers skip logic, multiple answer types, survey language translation, progress bar, scoring mechanism, question randomization, pre-built templates, and design customization options. 

You can deploy surveys on your website, app, product, or email. It also features the sentiment analysis tool and word cloud generator to analyze the feedback and extract valuable insights. You can filter the data using custom charts and feedback summaries to study the desired data points.

Price: Forever free account. Paid plans start at $31/month

4. Typeform

If you are looking for a simple yet effective survey tool for your website, Typeform is the one to go for. The tool lets you build surveys, forms, polls, and quizzes for your website. You can show the survey on your website as a popup, popover, slider, or sidebar button. 

Like other tools in the list, Typeform also has a plethora of survey personalization features like pre-built templates, question randomizer, progress bar, skip logic, customization themes, and mobile-responsive design. The targeting options are a little simpler than other tools.

Price: Forever free account. Paid plans start at $25/month

5. Survicate

Last but not least, Survicate is another one-stop customer feedback tool with features to conduct surveys on websites, web apps, mobile apps, and emails. Use advanced features like audience targeting, question branching, pre-built templates, 15+ question types, and white labeling to create highly targeted surveys. 

With the inbuilt AI-based text analytics engine and analytical dashboard, you can extract valuable customer insights and track the survey campaign’s performance.

Price: Free forever plan. Paid plans start at $89/month

Ask. Analyze. Act.

As you can see, a survey is not just a list of questions but an entire strategic approach to establish a line of conversation between you and your customers. With the survey tools becoming less expensive and more versatile, learning how to conduct a survey effectively can help you gain new customers, retain existing ones, optimize your products and develop ideas to increase conversions. 

So, what are you waiting for?

Understand what data you want to collect, pick the right survey tool, and follow these steps to build survey campaigns that produce tangible results.

What Question Types Can I Use in Surveys?

You can choose from different survey answer types like 

  • Multiple answer selection (checkboxes)
  • Single answer selection (radio button)
  • Single answer selection (dropdown)
  • Text-based answer
  • Text-based answer (single line)
  • Star Rating selection
  • Net Promoter Score

What Are the Different Types of Surveys?

The survey types depend on the type of data they collect from the respondents, like:

  • Market Research Surveys
  • Post-Purchase Surveys
  • Customer Satisfaction Surveys
  • Exit-Intent Surveys
  • NPS (Net Promoter Score) Surveys
  • Lead Generation Surveys
  • Website Polls

Read More: Types of Website Surveys

What Are Common Survey Challenges and Errors?

Here are some common challenges and mistakes people make while conducting surveys:

  • Asking too many questions 
  • Framing assumptive questions
  • Making selection errors
  • Not adding enough response options
  • Using negative question wording
  • Assuming prior knowledge



How to Conduct Surveys: A Step-by-Step Guide

  • Arlene Fink - UCLA, Los Angeles, USA


NEW TO THIS EDITION:

  • How to properly use artificial intelligence (AI) to identify entire surveys and survey questions through appropriate querying techniques and rigorous evaluation.
  • Coverage of recruiting participants and administering surveys through social media gives readers contemporary and effective options for recruiting participants while understanding the implications for sampling.
  • Guidance for conducting surveys on smartphones helps readers consider the appropriate situations for these now-ubiquitous devices.
  • Discussions of videoconference platforms such as Zoom and Microsoft Teams can help students maximize recruitment for surveys and their effectiveness.
  • Additional coverage of ethical survey research, including cultural validity and privacy concerns, encourages readers to consider a survey's impact on both research and participants.
  • Updated language throughout reflects contemporary standards and will appeal to students.
  • Learning Objectives now structure the reading experience.

KEY FEATURES:

  • Helpful examples make it easy to learn how to apply relevant concepts.
  • Summing Up sections that highlight each chapter’s most important concepts help readers master key content.
  • Making the Decision sections help readers make informed choices by citing the advantages and disadvantages at each waypoint.
  • Standard checklists for writing transparent survey reports prepare students to write and report on their own rigorous surveys.


How to Create a Marketing Survey? (+ 10 Essential Tips)


Today’s business world is highly competitive and, because of this, it’s more crucial than ever before to understand your customers.

To succeed, you need to know the needs and preferences of your customers. Market surveys that are crafted with care and intent serve as valuable tools for gaining insights into each of these areas — without the stress of cold-calling!

Anyone can create a survey, but not everyone can create a survey that customers actually enjoy taking – and those are two completely different things.

Creating a survey that target customers will want to take involves a measure of thoughtfulness, planning, and a mindset that prioritizes user experience.

In the following article, we’re digging into what it means to create a marketing survey that doesn’t bore your customers and what you should do to get started conducting your own market research.


1. Clearly define the survey objectives

Before diving into the fun part – creating your marketing survey – it's important to clearly define the objectives for the project at hand. What specific information do you want to gather from your customers?

Are you looking for feedback on a new product, exploring the current market trends, or do you want customer satisfaction information that you can evaluate?

Outlining your goals with certainty kick-starts the survey creation process and ensures that your market research questions are targeted and relevant.

2. Know your target markets well

The first step to creating successful marketing surveys is to build a solid foundation by growing an understanding of your target audience.

Understanding your target audience’s customer demographics is the key foundation to crafting surveys that they identify with. If you want people to take your survey, it needs to be relevant.

Your target audience won’t be willing to take time out of their day to fill out a survey that has nothing to do with them, which further goes to show how important it is to know your target market.

Tailor your survey questions to address the things they’re interested in and like. Focus on their personal needs, too. Personalization is key here.

It’s also a good idea to take into consideration how specific your market research survey needs to be.

You don’t want to be targeting a demographic that is too broad (too big), but you also don’t want to be too specific in terms of who you’re targeting with your surveys.

Unfortunately, there’s a fine line (that can be very blurry) between these two extremes. Using too many filters to vet potential respondents before the survey will make it harder for you to find enough customers who qualify to complete the entire survey.

Say you filter respondents down to females who are 18-29 years old, live in Redding, California, and own a 2005 Buick. Using this set of highly specific criteria, you're going to get very few respondents who qualify to complete your survey.

In fact, the chances are high that you won't find anyone at all to complete the survey.


3. Craft clear, concise survey questions

The quality of your survey questions significantly impacts how effective your online survey ends up being. You should aim for questions that are clear, straight-to-the-point, and free of “filler” (that is, extra words and unnecessary details).

Your questions don’t need to be detailed or contain descriptions unless they are for specification purposes.

Avoid using language that is too technical or industry-specific, so as not to confuse your target audience or dissuade them from continuing your survey.

For example, if you’re in the space industry, you wouldn’t want to put out a survey that contains long, relatively uncommon words like ‘Enceladus’ if you’re trying to gauge the opinions and preferences of people who aren’t astronomers by trade.

Instead, opt for straightforward questions that are easy to understand and answer.

Don’t worry about making your questions sound professional or impressive; instead, put the emphasis on gathering the appropriate information without overwhelming your customers. Simple questions receive better and more thoughtful input.

A market research survey that is poorly worded or uses the wrong question format can quickly render all of the data you collect useless.

If a question you write sees most respondents answering with 'I don't know' or similar, you have missed your goal of gaining customer feedback.

There are a host of different kinds of market research questions that can be used to gather data from target customers. Here are a few of the different kinds that you could consider using.

Use Categorical Questions

Categorical questions can also be called nominal questions. They produce numbers and percentages that can be used for easy data visualization.

They are great for creating visual data like bar graphs and pie charts, but don’t give enough information to gather numerical averages. In addition, they can’t test correlations.

Some of the most basic survey questions, yes/no questions, are categorical.

Multiple choice questions are, as well. However, they offer slightly more information than yes/no questions do, since you can add as many possible answers as you want.

Checkbox questions are extremely flexible and give respondents the freedom to choose as many answers as fit them. These are the questions that have respondents tick any number of boxes to select their answer(s).

They are also a type of categorical question.
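As a quick illustration of how categorical answers become the percentages behind a bar or pie chart (the answers here are invented):

```python
# Tally categorical (nominal) answers into percentages ready for charting.
from collections import Counter

answers = ["Yes", "No", "Yes", "Yes", "No", "Yes"]

counts = Counter(answers)
total = len(answers)
for option, count in counts.items():
    print(f"{option}: {100 * count / total:.0f}%")
```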

4. Choose the right format

There are various survey formats available for use. Each one is suited to different objectives and types of information, which makes it important to know which type should be used for what purpose. Common survey types include:

  • Customer satisfaction: used for understanding the level of satisfaction your customers have with your product, service, or company as a whole.
  • NPS: used to measure how likely a customer is to recommend your business. NPS surveys are also used to gauge customer loyalty.
  • Market research: used to gain accurate insights and information about your target market.

Consider your goals and their nature, as well as the type of data you're looking to collect, when selecting the appropriate type of survey for your needs.

5. Use open and close-ended questions

Marketing surveys that are effective have a balance of closed-ended and open-ended questions. Closed-ended questions, such as multiple choice or rating questions, provide quantitative data that is easy to analyze and act on.

However, open-ended questions are great for allowing respondents to give feedback in more detail by using their own words.

By including both types of questions in your surveys, you're rewarded with valuable sneak peeks into how your customers perceive your business on a personal, individual level.

a. Open-ended questions

There are an endless number of market research questions we could use as examples for open-ended questions. Questions that start with the following phrases are open-ended:

  • What do you think of/about
  • How do you feel about
  • How would you describe
  • Tell us about

b. Close-ended questions

While close-ended questions are not as fun for respondents to answer (and don’t give results that are as detailed as their open-ended counterparts), there is a time and place for using them.

Examples of close-ended questions include those that start with:

  • Do you like…
  • Have you ever…
  • Are you going to…

These kinds of questions are best answered with simple yes/no answers, which is what makes them close-ended. Since the question doesn't prompt the respondent to elaborate on their answer, a yes or no will suffice, effectively 'closing' the conversation.


6. Consider user experience

A user’s experience plays a critical role in the success of your market research survey.

Surveys should always be easy to access and be designed for navigation across a wide variety of different platforms and devices.

Designs should be intuitive and clean, boasting potent visual cues and clear instructions that work to effortlessly guide customers through the survey.

Special attention should be paid to page layout, load time, and interactive elements — each of these facets should work with the others like a well-oiled machine.

7. Launch a pilot market research survey

Before launching your market research survey to your target market and customer base, conduct a quick pilot test using a group of willing respondents who are interested in helping you out.

A pilot survey is a trial run that will help you easily identify flaws in your survey design, allowing you to make any adjustments that might be necessary before the survey goes live.

After the pilot has launched, pay attention to how your respondents interact with it. Collecting feedback after the pilot will leave you with a ton of valuable information to work with moving forward.

8. Analyze and interpret survey results

Once you've gathered enough market research data to work with, it's time to move on to the next step: analyzing and interpreting the data to make improvements!

In the data you’ve collected, be sure to watch out for patterns and recurring themes.

You can use pie charts, bar graphs, and other visualization tools to showcase your findings. Visualization tools are an easy way to draw attention to important information, since humans are highly visual creatures and we are drawn to images.

Asking your existing customers follow-up questions is a smart way to put market research data to good use.

Take note of ways that you could improve the effectiveness of your existing marketing strategies and marketing activities; don’t simply ask the questions and try to remember the results!

Recording the data on paper (or your computer) is the best way to track it.

Are there any strategies that you could make better? More effective? How about more engaging?

Carefully analyzing the results of your market research surveys uncovers insights that can be used to make future decisions surrounding your business’ growth.


9. Implement feedback loops

Once you’ve gathered your information, it’s important to use the information for something. So, make use of the valuable insights you gained from your survey by using them to implement feedback loops!

A feedback loop is the process of gathering information from customers and then using it to make changes related to the operation of your business. It’s a loop, like the name suggests.

Share your market survey research with your team (if you have one) and discuss potential changes that could be made based on the feedback received, successfully bringing the loop to completion.

By incorporating feedback loops, you demonstrate to your customers that their opinions are valued – and valid.

It also reassures them that your business is dedicated to improving their experience, which goes a long way when it comes to retaining loyal customers.

Implementing feedback loops can be done in a number of different ways, but one great example comes from everyone’s favorite streaming service: Netflix.

Netflix implements watcher feedback on a regular basis.

Their loop starts when they publish a piece of content for streaming. From there, they monitor how it performs by tracking how many people watch it and, if they don’t finish it, how far they got before they switched it off.

Once they’ve gathered those numbers they use the data to decide whether they keep the content live or if they delete and replace it with something else. This feedback also plays into which content is posted in the future and helps to draw in potential customers.

10. Repeat and improve

Successful marketing research surveys are not simply one-time jobs. They’re an ongoing process of repetition and improvement that should rarely be left to stagnate.

Survey questions should be regularly reviewed and updated to reflect current customer needs and the dynamic objectives of your business.

Feedback about survey experiences should be gathered from both loyal customers and new customers. This feedback should then be incorporated and considered when making future changes, regardless of the reason for them.

By continually refining your marketing efforts, you can ensure that your market research surveys remain effective, relevant, and engaging to your target market and that your data quality never suffers.

To create a marketing survey that your customers love, you must take the time to plan, utilize thoughtful designs, and gain a deep understanding of your target audience.

By carefully following the steps we’ve outlined above, you’ll be able to design and run market research surveys that capture valuable data while simultaneously engaging and resonating with your audience.

Don’t forget to define clear objectives for your market research survey, learn about your audience, craft appropriate questions while striking just the right balance between open and close-ended questions, and choose the right survey format for your needs.

A focus should be put on user experience, and a pilot test should be run for maximum insight.

Using the right approach, a well-designed marketing survey can be a powerful tool for gaining insights into consumer attitudes, driving customer engagement, and making informed decisions regarding your business.


J Korean Med Sci. 2020 Nov 23; 35(45).


Reporting Survey Based Studies – a Primer for Authors

Prithvi Sanjeevkumar Gaur

1 Smt. Kashibai Navale Medical College and General Hospital, Pune, India.

Olena Zimba

2 Department of Internal Medicine No. 2, Danylo Halytsky Lviv National Medical University, Lviv, Ukraine.

Vikas Agarwal

3 Department of Clinical Immunology and Rheumatology, Sanjay Gandhi Postgraduate Institute of Medical Sciences, Lucknow, India.

Latika Gupta

ABSTRACT

The coronavirus disease 2019 (COVID-19) pandemic has led to a massive rise in survey-based research. The paucity of perspicuous guidelines for conducting surveys may pose a challenge to the conduct of ethical, valid and meticulous research. The aim of this paper is to guide authors aiming to publish in scholarly journals regarding the methods and means to carry out surveys for valid outcomes. The paper outlines the various aspects, from planning, execution and dissemination of surveys followed by the data analysis and choosing target journals. While providing a comprehensive understanding of the scenarios most conducive to carrying out a survey, the role of ethical approval, survey validation and pilot testing, this brief delves deeper into the survey designs, methods of dissemination, the ways to secure and maintain data anonymity, the various analytical approaches, the reporting techniques and the process of choosing the appropriate journal. Further, the authors analyze retracted survey-based studies and the reasons for the same. This review article intends to guide authors to improve the quality of survey-based research by describing the essential tools and means to do the same with the hope to improve the utility of such studies.

Graphical Abstract: [image not shown]

INTRODUCTION

Surveys are the principal method used to address topics that require individual self-report about beliefs, knowledge, attitudes, opinions or satisfaction, which cannot be assessed using other approaches. 1 This research method allows information to be collected by asking a set of questions on a specific topic to a subset of people and generalizing the results to a larger population. Assessment of opinions in a valid and reliable way requires clear, structured and precise reporting of results. This is possible with a survey based on a meticulous design, followed by validation and pilot testing. 2 The aim of this opinion piece is to provide practical advice on conducting survey-based research. It details the ethical and methodological aspects to be undertaken while performing a survey, the online platforms available for distributing surveys, and the implications of survey-based research.

Survey-based research is a means to obtain quick data, and such studies are relatively easy to conduct and analyse, and are cost-effective (under a majority of the circumstances). 3 These are also one of the most convenient methods of obtaining data about rare diseases. 4 With major technological advancements and improved global interconnectivity, especially during the coronavirus disease 2019 (COVID-19) pandemic, surveys have surpassed other means of research due to their distinctive advantage of a wider reach, including respondents from various parts of the world having diverse cultures and geographically disparate locations. Moreover, survey-based research allows flexibility to the investigator and respondent alike. 5 While the investigator(s) may tailor the survey dates and duration as per their availability, the respondents are allowed the convenience of responding to the survey at ease, in the comfort of their homes, and at a time when they can answer the questions with greater focus and to the best of their abilities. 6 Respondent biases inherent to environmental stressors can be significantly reduced by this approach. 5 It also allows responses across time-zones, which may be a major impediment to other forms of research or data-collection. This allows distant placement of the investigator from the respondents.

Various digital tools are now available for designing surveys ( Table 1 ). 7 Most of these are free, with separate premium paid options. The analysis of data can be made simpler, and the cleaning process almost obsolete, by minimising open-ended answer choices. 8 Close-ended answers make data collection and analysis efficient by generating a spreadsheet which can be directly accessed and analysed. 9 Minimizing the number of questions and making all questions mandatory can further aid this process by bringing uniformity to the responses and making analysis simpler. Surveys are arguably also the most engaging form of research, conditional to the skill of the investigator.
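As a rough sketch (not from the source article), the snippet below shows how such an exported spreadsheet of close-ended answers might be tabulated directly with pandas; the file name survey_export.xlsx is an assumption, and reading .xlsx files additionally requires the openpyxl package.

```python
import pandas as pd

# Hypothetical export: one row per respondent, one column per close-ended question.
df = pd.read_excel("survey_export.xlsx")

# Print a frequency and percentage table for every question.
for question in df.columns:
    counts = df[question].value_counts(dropna=False)
    percent = (counts / len(df) * 100).round(1)
    print(question)
    print(pd.DataFrame({"count": counts, "percent": percent}), end="\n\n")
```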


Data protection laws now mandate anonymity while collecting data for most surveys, particularly when they are exempt from ethical review. 10 , 11 Anonymization has the potential to reduce (or at times even eliminate) social desirability bias which gains particular relevance when targeting responses from socially isolated or vulnerable communities (e.g. LGBTQ and low socio-economic strata communities) or minority groups (religious, ethnic and medical) or controversial topics (drug abuse, using language editing software).

Moreover, surveys could be the primary methodology to explore a hypothesis until it evolves into a more sophisticated and partly validated idea after which it can be probed further in a systematic and structured manner using other research methods.

The aim of this paper is to reduce the incorrect reporting of surveys. The paper also intends to inform researchers of the various aspects of survey-based studies and the multiple points that need to be taken under consideration while conducting survey-based research.

SURVEYS IN THE COVID-19 PANDEMIC

The COVID-19 pandemic has led to a distinctive rise in survey-based research. 12 The need to socially distance amid widespread lockdowns reduced patient visits to the hospital and brought most other forms of research to a standstill in the early pandemic period. A large number of level-3 bio-safety laboratories are being engaged for research pertaining to COVID-19, thereby limiting the options to conduct laboratory-based research. 13 , 14 Therefore, surveys appear to be the most viable option for researchers to explore hypotheses related to the situation and its impact in such times. 15

LIMITATIONS WHILE CONDUCTING SURVEY-BASED RESEARCH

Designing a fine survey is an arduous task and requires skill, even though clear guidelines are available. Survey design requires extensive thoughtfulness on the core questions (based on the hypothesis or the primary research question), with consideration of all possible answers, and the inclusion of open-ended options to allow recording other possibilities. A survey should be robust in regard to the questions asked and the answer choices available; it must be validated and pilot tested. 16 The survey design may be supplemented with answer choices tailored for the convenience of the responder, to reduce the effort while making it more engaging. Survey dissemination and engagement of respondents also require experience and skill. 17

Furthermore, the absence of an interviewer prevents clarification of responses to open-ended questions, if any. Internet surveys are also prone to survey fraud through erroneous reporting. Hence, the anonymity of surveys is both a boon and a bane. Sample sizes are skewed, as they lack representation of populations absent from the Internet, such as the elderly or the underprivileged. The illiterate population also lacks representation in survey-based research.

The “Enhancing the QUAlity and Transparency Of health Research” network (EQUATOR) provides two separate guidelines replete with checklists to ensure valid reporting of e-survey methodology. These include “The Checklist for Reporting Results of Internet E-Surveys” (CHERRIES) statement and “ The Journal of Medical Internet Research ” (JMIR) checklist.

COMMON TYPES OF SURVEY-BASED RESEARCH

From a clinician's standpoint, the common survey types include those centered around problems faced by patients or physicians. 18 Surveys collecting the opinions of various clinicians on a debated clinical topic, as well as feedback forms typically served after attending a medical conference, prescribing a new drug or trying a new method for a given procedure, are also surveys. The formulation of clinical practice guidelines entails Delphi exercises using paper surveys, which are yet another form of survey-mediated research.

The size of a survey depends on its intent; surveys can be large or small. Therefore, identification of the intent behind the survey is essential to allow the investigator to form a hypothesis and then explore it further. Large population-based or provider-based surveys are often done and generate mammoth data over the years, for example, the National Health and Nutrition Examination Survey, the National Health Interview Survey and the National Ambulatory Medical Care Survey.

SCENARIOS FOR CONDUCTING SURVEY-BASED RESEARCH

Despite all said and done about the convenience of conducting survey-based research, it is prudent to conduct a feasibility check before embarking on one. Certain scenarios may determine the fate of survey-based research ( Table 2 ).

ETHICS APPROVAL FOR SURVEY-BASED RESEARCH

Approval from the Institutional Review Board should be taken as per requirement according to the CHERRIES checklist. However, rules for approval differ between countries, and therefore local rules must be checked and followed. For instance, in India, the Indian Council of Medical Research released an article in 2017 stating that the concept of broad consent has been updated, which is defined as “consent for an unspecified range of future research subject to a few contents and/or process restrictions.” It talks about “the flexibility of Indian ethics committees to review a multicentric study proposal for research involving low or minimal risk, survey or studies using anonymized samples or data or low or minimal risk public health research.” The reporting of approvals received and applied for, and the procedure of written, informed consent followed, must be clear and transparent. 10 , 19

The use of incentives in surveys is also an ethical concern. 20 Incentives can be monetary or non-monetary. Monetary incentives are usually discouraged, as they may attract the wrong population due to the temptation of the monetary benefit. However, monetary incentives have been seen to make surveys receive greater traction, even though this is yet to be proven. Monetary incentives are provided not only as cash or cheque but also in the form of free articles, discount coupons, phone cards, e-money or cashback value. 21 These methods, though tempting, should be used sparingly. If used, their use must be disclosed and justified in the report. Non-monetary incentives, such as a meeting with a famous personality or access to restricted and authorized areas, can also help pique the interest of the respondents.

DESIGNING A SURVEY

As mentioned earlier, the design of a survey is reflective of the skill of the investigator curating it. 22 Survey builders can be used to design an efficient survey. These offer the majority of the basic features needed to construct a survey, free of charge. Therefore, surveys can be designed from scratch, using pre-designed templates or by using previous survey designs as inspiration. Taking surveys can be made convenient by using the various aids available ( Table 1 ). Moreover, the investigator should be mindful of the unintended response effects of the ordering and context of survey questions. 23

Surveys using clear, unambiguous, simple and well-articulated language record precise answers. 24 A well-designed survey accounts for the culture, language and convenience of the target demographic. The age, region, country and occupation of the target population are also considered before constructing a survey. Consistency is maintained in the terms used in the survey, and abbreviations are avoided to allow the respondents to have a clear understanding of the question being answered. Universal abbreviations or previously indexed abbreviations maintain the unambiguity of the survey.

Surveys beginning with broad, easy and non-specific questions, as opposed to sensitive, tedious and specific ones, receive more accurate and complete answers. 25 Questionnaires designed such that the relatively tedious and long questions requiring the respondent to do some nit-picking are placed at the end improve the response rate of the survey. This prevents the respondent from being discouraged at the outset and motivates them to finish the survey. All questions should provide a non-response option, and all questions should be made mandatory to increase the completeness of the survey. Questions can be framed in close-ended or open-ended fashion. However, close-ended questions are easier to analyze and less tedious for the respondent to answer, and therefore should be the main component of a survey. Open-ended questions have minimal use, as they are tedious, take time to answer and require fine articulation of one's thoughts. Their minimal use is also advocated because interpreting such answers requires considerable time and energy, owing to the diverse nature of the responses and the large sample sizes. 26 However, whenever the closed choices do not cover all probabilities, an open answer choice must be added. 27 , 28

Screening questions can require respondents to meet certain criteria before gaining access to the survey, in cases where inclusion criteria need to be established to maintain the authenticity of the target demographic. Similarly, logic functions can be used to apply an exclusion. This allows a clean and clear record of responses and makes the job of the investigator easier. Respondents may or may not be given the option to return to the previous page or question to alter their answers, as per the investigator's preference.

The range of responses received can be reduced in the case of questions directed towards the feelings or opinions of people by using slider scales or a Likert scale. 29 , 30 For questions having multiple answers, check boxes are efficient. When a large number of answers are possible, dropdown menus reduce the arduousness. 31 Matrix scales can be used to answer questions requiring grading or having a similar range of answers for multiple conditions. Maximum respondent participation and complete survey responses can be ensured by reducing the survey time. Quiz or weighted modes allow the respondent to shuffle between questions, allow scoring of quizzes and can be used to complement other weighted scoring systems. 32 A flowchart depicting a survey construct is presented as Fig. 1 .

[Fig. 1. Flowchart depicting a survey construct; image not shown]
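To illustrate how such answer formats translate into analysable data, here is a minimal sketch (not part of the source article) that maps hypothetical 5-point Likert labels to numeric scores with pandas; the file name and column contents are assumptions.

```python
import pandas as pd

# Hypothetical mapping for a 5-point Likert scale (an assumption, not from the article).
likert = {"Strongly disagree": 1, "Disagree": 2, "Neutral": 3,
          "Agree": 4, "Strongly agree": 5}

# One column per Likert item, one row per respondent (hypothetical file).
df = pd.read_csv("likert_items.csv")

# Convert the text labels to scores and coerce anything unexpected to NaN.
numeric = df.replace(likert).apply(pd.to_numeric, errors="coerce")

print(numeric.mean().round(2))  # mean score per item
print(numeric.describe())       # spread of scores per item
```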

Survey validation

Validation testing, though tedious and meticulous, is a worthwhile effort, as the accuracy of a survey is determined by its validity. It is indicative of the sample of the survey and the specificity of the questions, such that the data acquired are streamlined to answer the questions being posed or to determine a hypothesis. 33 , 34 Face validation determines the manner of construction of questions such that the necessary data are collected. Content validation determines the relation of the topic being addressed, and its related areas, with the questions being asked. Internal validation makes sure that the questions being posed are directed towards the outcome of the survey. Finally, test–retest validation determines the stability of questions over a period of time by testing the questionnaire twice and maintaining a time interval between the two tests. For surveys determining the knowledge of respondents pertaining to a certain subject, it is advised to have a panel of experts undertake the validation process. 2 , 35

Reliability testing

If the questions in the survey are posed in a manner so as to elicit the same or similar response from the respondents irrespective of the language or construction of the question, the survey is said to be reliable. It is thereby a marker of the consistency of the survey. This is of considerable importance in knowledge-based research, where recall ability is tested by making the survey available to the same participants at regular intervals. It can also be used to maintain the authenticity of the survey by varying the construction of the questions.
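As a hedged illustration (not from the source article), the snippet below computes a simple test–retest correlation between two administrations of the same numeric item; a high correlation suggests the item is stable over time. File and column names are assumptions.

```python
import pandas as pd

# Scores from the same respondents at two time points (hypothetical files/columns).
test = pd.read_csv("wave1.csv").set_index("respondent_id")["knowledge_score"]
retest = pd.read_csv("wave2.csv").set_index("respondent_id")["knowledge_score"]

# Keep only respondents who answered both waves, then correlate the two waves.
paired = pd.concat([test, retest], axis=1, keys=["t1", "t2"]).dropna()
print("Test-retest correlation:", round(paired["t1"].corr(paired["t2"]), 2))
```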

Designing a cover letter

A cover letter is the primary means of communication with the respondent, with the intent to introduce the respondent to the survey. A cover letter should include the purpose of the survey, details of those who are conducting it, including contact details in case clarifications are desired. It should also clearly depict the action required by the respondent. Data anonymization may be crucial to many respondents and is their right. This should be respected in a clear description of the data handling process while disseminating the survey. A good cover letter is the key to building trust with the respondent population and can be the forerunner to better response rates. Imparting a sense of purpose is vital to ideationally incentivize the respondent population. 36 , 37 Adding the credentials of the team conducting the survey may further aid the process. It is seen that an advance intimation of the survey prepares the respondents while improving their compliance.

The design of a cover letter needs much attention. It should be captivating, clear and precise, and use a vocabulary and language specific to the target population for the survey. Active voice should be used to make a greater impact. Crowding of the details must be avoided. Italics, bold fonts or underlining may be used to highlight critical information. The tone ought to be polite, respectful, and grateful in advance. The use of capital letters is best avoided, as it is a surrogate for shouting in verbal speech and may impart a bad taste.

The dates of the survey may be intimated, so the respondents may prepare themselves for taking it at a time conducive to them. While emailing a closed group in a convenience-sampled survey, using the name of the addressee may impart a customized experience and enhance trust building and possibly compliance. Appropriate use of salutations like Mr./Ms./Mrs. may be considered. Various portals such as SurveyMonkey allow the researchers to save an address list on the website. These may then be reached using an embedded survey link from a verified email address to minimize bouncing back of emails.

The body of the cover letter must be short and crisp, and should not exceed 2–3 paragraphs under ideal circumstances. Earnest efforts to protect confidentiality may go a long way in enhancing response rates. 38 While it is enticing to provide incentives to enhance response, these are best avoided. 38 , 39 In cases when indirect incentives are offered, such as provision of the results of the survey, these may be clearly stated in the cover letter. Lastly, a formal closing note with the signature of the lead investigator is welcome. 38 , 40

Designing questions

Well-constructed questionnaires are essentially the backbone of successful survey-based studies. With this type of research, the primary concern is the adequate promotion and dissemination of the questionnaire to the target population. The selection of the sample population, therefore, needs to be done carefully and with minimal flaws. The method of conducting the survey is an essential determinant of the response rate observed. 41 Broadly, surveys are of two types: closed and open. The method of conducting the survey must be determined according to the sample population.

Various doctors use their own patients as the target demographic, as this improves compliance. However, this is effective only in surveys aimed at a geographically specific, fairly common disease, as the sample size needs to be adequate. Response bias can be identified from the data collected from respondent and non-respondent groups. 42 , 43 Therefore, it is more efficacious to choose a target population whose baseline characteristics are already known. In the case of surveys focused on patients having a rare group of diseases, online surveys or e-surveys can be conducted. Data can also be gathered from multiple national organizations and societies all over the world. 44 , 45 Computer-generated random selection can be done from these data to choose participants, who can then be reached using emails or social media platforms like WhatsApp and LinkedIn. In both these scenarios, closed questionnaires can be conducted. These have restricted access, either through a URL link or through e-mail.
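A minimal sketch of such computer-generated random selection (not the authors' code; the file and column names are assumptions) could look like this:

```python
import pandas as pd

# Member database obtained from a national organization or society (hypothetical file).
members = pd.read_csv("society_members.csv")

# Draw a reproducible random sample of 200 participants to invite.
invited = members.sample(n=200, random_state=42)
invited["email"].to_csv("invite_list.csv", index=False)
```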

In surveys targeting an issue faced by a larger demographic (e.g. pandemics like COVID-19, flu vaccines and socio-political scenarios), open surveys seem like the more viable option, as they can be easily accessed by the majority of the public and ensure a large number of responses, thereby increasing the accuracy of the study. Survey length should be optimal to avoid poor response rates. 25 , 46

SURVEY DISSEMINATION

Uniform distribution of the survey ensures an equitable opportunity for the entire target population to access the questionnaire and participate in it. While deciding the target demographic, communities should be studied, and the process of “lurking” is sometimes practiced. Multiple sampling methods are available ( Fig. 1 ). 47

Distribution of the survey to the target demographic can be done using emails. Even though e-mails reach a large proportion of the target population, an unknown sender could be blocked, making the use of a personal or previously used email preferable for correspondence. Adding a cover letter along with the invite adds a personal touch and is hence advisable. Some platforms allow the sender to link the survey portal with the sender's email after verifying it. Noteworthily, despite repeated email reminders, personal communication over the phone or instant messaging improved responses in the authors' experience. 48 , 49

Distribution of the survey over other social media platforms (SMPs, namely WhatsApp, Facebook, Instagram, Twitter, LinkedIn etc.) is also practiced. 50 , 51 , 52 Distributing the survey on every available platform ensures maximal outreach. 53 Other smartphone apps can also be used for wider survey dissemination. 50 , 54 It is important to be mindful of the target population while choosing the platform for dissemination of the survey, as some SMPs such as WhatsApp are more popular in India, while others like WeChat are used more widely in China, and similarly Facebook among the European population. Professional accounts or popular social accounts can be used to promote and increase the outreach of a survey. 55 Incentives such as internet giveaways or meet-and-greets with a favorite social media influencer have been used to motivate people to participate.

However, social media platforms do not allow calculation of the denominator of the target population, resulting in an inability to gather the accurate response rate. Moreover, this method of collecting data may result in a respondent bias inherent to a community that has a greater online presence. 43 The inability to gather the demographics of the non-respondents (in a bid to identify and prove that they were no different from respondents) can be another challenge in convenience sampling, unlike in cohort-based studies.

Lastly, manual filling of surveys over the telephone, by narrating the questions and answer choices to the respondents, is used as a last-ditch resort to achieve a high desired response rate. 56 Studies reveal that surveys released on Mondays, Fridays, and Sundays receive more traction. Also, reminders set at regular intervals of time help receive more responses. Data collection can be improved in collaborative research by syncing surveys to fill out electronic case record forms. 57 , 58 , 59

Data anonymity refers to the protection of data received as a part of the survey. This data must be stored and handled in accordance with the patient privacy rights/privacy protection laws in reference to surveys. Ethically, the data must be received on a single source file handled by one individual. Sharing or publishing this data on any public platform is considered a breach of the patient's privacy. 11 In convenience sampled surveys conducted by e-mailing a predesignated group, the emails shall remain confidential, as inadvertent sharing of these as supplementary data in the manuscript may amount to a violation of the ethical standards. 60 A completely anonymized e-survey discourages collection of Internet protocol addresses in addition to other patient details such as names and emails.

Data anonymity gives the respondent the confidence to be candid and answer the survey without inhibitions. This is especially apparent in minority groups or communities facing societal bias (sex workers, transgenders, lower caste communities, women). Data anonymity aids in giving the respondents/participants respite regarding their privacy. As the respondents play a primary role in data collection, data anonymity plays a vital role in survey-based research.

DATA HANDLING OF SURVEYS

The data collected from the survey responses are compiled in a .xls, .csv or .xlsx format by the survey tool itself. The data can be viewed during the survey duration or after its completion. To ensure data anonymity, a minimal number of people should have access to these results. The data should then be sifted through to invalidate false, incorrect or incomplete entries. The relevant and complete data should then be analyzed qualitatively and quantitatively, as per the aim of the study. Statistical aids like pie charts, graphs and data tables can be used to report relative data.
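A minimal cleaning sketch of this sifting step (not the authors' code; the file name, the respondent_id and boolean completed columns, and the core question columns are assumptions) could look like this:

```python
import pandas as pd

# Raw export from the survey tool (hypothetical file and column names).
raw = pd.read_csv("responses.csv")

clean = raw.drop_duplicates(subset="respondent_id")  # one response per person
clean = clean[clean["completed"]]                    # drop incomplete submissions
clean = clean.dropna(subset=["q1", "q2", "q3"])      # require the core questions

# Write a single cleaned file for the analyst(s).
clean.to_csv("responses_clean.csv", index=False)
```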

ANALYSIS OF SURVEY DATA

Analysis of the responses recorded is done after the time made available to answer the survey is complete. This ensures that statistical and hypothetical conclusions are established after careful study of the entire database. Incomplete and complete answers can both be used, making the analysis conditional on the study. Survey-based studies require careful consideration of various aspects of the survey, such as the time required to complete it. 61 Cut-off points in the time frame allow authentic answers to be recorded and analyzed, as compared to disingenuously completed questionnaires. Methods of handling incomplete questionnaires and atypical timestamps must be pre-decided to maintain consistency. Since surveys are often the only way to reach people, especially during the COVID-19 pandemic, disingenuous survey practices must not be followed, as the results will later be used to form a preliminary hypothesis.
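As a hedged example of such a pre-decided cut-off (not from the source; the 60-second threshold and the column names are assumptions), responses submitted implausibly fast can be flagged and excluded:

```python
import pandas as pd

# Cleaned responses with start and submission timestamps (hypothetical columns).
df = pd.read_csv("responses_clean.csv", parse_dates=["started_at", "submitted_at"])

# Seconds each respondent spent on the survey.
df["seconds_taken"] = (df["submitted_at"] - df["started_at"]).dt.total_seconds()

MIN_SECONDS = 60  # pre-decided minimum plausible completion time (an assumption)
valid = df[df["seconds_taken"] >= MIN_SECONDS]

print(f"Kept {len(valid)} of {len(df)} responses after the time cut-off")
```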

REPORTING SURVEY-BASED RESEARCH

Reporting survey-based research is by far the most challenging part of this method. A well-reported survey-based study is a comprehensive report covering all the aspects of conducting survey-based research.

The design of the survey, mentioning the target demographic, sample size, language, type, methodology of the survey and the inclusion-exclusion criteria followed, comprises the descriptive report of a survey-based study. Details regarding the conduct of pilot testing, validation testing, reliability testing and user-interface testing add value to the report and support the data and analysis. Measures taken to prevent bias and ensure consistency and precision are key inclusions in a report. The report usually mentions approvals received, if any, along with the written, informed consent taken from the participants to use the data received for research purposes. It also gives detailed accounts of the different distribution and promotional methods followed.

A detailed account of the data input and collection methods, along with the tools used to maintain the anonymity of the participants and the steps taken to ensure singular participation from individual respondents, indicates a well-structured report. Descriptive information on the website used, the visitors received and the externally influencing factors of the survey is included. Detailed reporting of the post-survey analysis, including the number of analysts involved, the data cleaning required (if any), the statistical analysis done and the probable hypothesis concluded, is a key feature of a well-reported survey-based study. Methods used to make statistical corrections, if used, should be included in the report. The EQUATOR network has two checklists, “The Checklist for Reporting Results of Internet E-Surveys” (CHERRIES) statement and “ The Journal of Medical Internet Research ” (JMIR) checklist, that can be utilized to construct a well-framed report. 62 , 63 Importantly, self-reporting of biases and errors avoids carrying forward a false hypothesis as the basis of more advanced research. References should be cited using standard recommendations, and guided by the journal specifications. 64

CHOOSING A TARGET JOURNAL FOR SURVEY-BASED RESEARCH

Surveys can be published as original articles, brief reports or as a letter to the editor. Interestingly, most modern journals do not actively mention surveys in the instructions to authors. Thus, depending on the study design, the authors may choose the article category: cohort, case-control, interview or survey-based study. It is prudent to mention the type of study in the title. Titles, albeit not too long, should not exceed 10–12 words, and may feature the type of study design for clarity, after a semicolon, for greater citation potential.

While the choice of journal is largely based on the study subject and left to the authors' discretion, it may be worthwhile exploring trends in a journal archive before proceeding with submission. 65 Although the article format is similar across most journals, specific rules relevant to the target journal may be followed for drafting the article structure before submission.

RETRACTION OF ARTICLES

Articles that are removed from publication after being released are retracted articles. These are usually retracted when new discrepancies come to light regarding the methodology followed, plagiarism, incorrect statistical analysis, inappropriate authorship, fake peer review, fake reporting and such. 66 A substantial increase in such papers has been noticed. 67

We carried out a search for “surveys” on Retraction Watch on 31 August 2020 and received 81 search results published between November 2006 and June 2020, of which 3 were repeated. Of the remaining 78 results, 37 (47.4%) articles were surveys, 23 (29.5%) showed as unknown types and 18 (23.1%) reported other types of research ( Supplementary Table 1 ). Fig. 2 gives a detailed description of the causes of retraction of the surveys we found and their geographic distribution.

[Fig. 2. Causes of retraction of the identified surveys and their geographic distribution; image not shown]

CONCLUSION

A good survey ought to be designed with a clear objective, the design being precise and focused, with close-ended questions and all probabilities included. Use of rating scales, multiple-choice questions and checkboxes, and maintaining a logical question sequence, engages the respondent while simplifying data entry and analysis for the investigator. Conducting pilot testing is vital to identify and rectify deficiencies in the survey design and answer choices. The target demographic should be well defined, and invitations sent accordingly, with periodic reminders as appropriate. While reporting the survey, transparency should be maintained in the methods employed, and shortcomings and biases should be clearly stated to prevent advocating an invalid hypothesis.

Disclosure: The authors have no potential conflicts of interest to disclose.

Author Contributions:

  • Conceptualization: Gaur PS, Zimba O, Agarwal V, Gupta L.
  • Visualization: Gaur PS, Zimba O, Agarwal V, Gupta L.
  • Writing - original draft: Gaur PS, Gupta L.

SUPPLEMENTARY MATERIAL

Reporting survey based research

18 Different Types of Survey Methods + Pros & Cons


There are many reasons why surveys are important. Surveys help researchers find solutions, create discussions, and make decisions. They can also get to the bottom of the really important stuff, like, coffee or tea? Dogs or cats? Elvis or The Beatles? When it comes to finding the answers to these questions, there are 18 different types of survey methods to use.


18 Different Types of Survey Methods

Different surveys serve different purposes, which is why there are a number of them to choose from. “What are the types of surveys I should use,” you ask? Here’s a look at the 18 types of survey methods researchers use today.

1. Interviews

Also known as in-person surveys or household surveys, this used to be one of the most popular types of survey to conduct. Researchers like them because they involve getting face-to-face with individuals. Of course, this method of surveying may seem antiquated when today we have online surveying at our fingertips. However, interviews still serve a purpose. 

Researchers conduct interviews when they want to discuss something personal with people. For example, they may have questions that may require extensive probing to uncover the truth. Sure, some interviewees may be more comfortable answering questions confidentially behind a keyboard. However, a skilled interviewer is able to put them at ease and get genuine responses. They can often go deeper than you may be able to using other surveying methods. 

Often, in-person interviews are recorded on camera. This way, an expert can review them afterward. They do this to determine if the answers given may be false based on an interviewee’s change in tone. A change in facial expressions and body movements may also be a signal they pick up on. 

2. Intercept Surveys

While interviews tend to choose respondents and have controls in place, intercept surveys (or “man on the spot” surveys) are conducted at certain locations or events. This involves having an interviewer, or multiple interviewers, scoping out an area and asking people, generally at random, for their thoughts or viewpoints on a particular topic. 

3. Focus Groups

These types of surveys are conducted in person as well. However, focus groups involve a number of people rather than just one individual. The group is generally small but demographically diverse and led by a moderator. The focus group may be sampling new products or having a discussion around a particular topic, often a hot-button one. 

The purpose of a focus group survey is often to gauge people’s reaction to a product in a group setting or to get people talking, interacting—and yes, arguing—with the moderator taking notes on the group’s behavior and attitudes. This is often the most expensive survey method as a trained moderator must be paid. In addition, locations must be secured, often in various cities, and participants must be heavily incentivized to show up. Gift cards in the $75-100 range for each survey participant are the norm.   

4. Panel Sampling

Recruiting survey-takers from a panel maintained by a research company is a surefire way to get respondents. Why? Because people have specifically signed up to take them. The benefit of these types of surveys for research, of course, is that you can be assured of responses. In addition, you can filter respondents by a variety of criteria to be sure you’re speaking with your target audience.

The downside is data quality. These individuals get survey offers frequently. So, they may rush through them to get their incentive and move on to the next one. In addition, if you’re constantly tapping into the same people from the same panel, are you truly getting a representative sample?

5. Telephone Surveys

Most telephone survey research is conducted through random digit dialing (RDD). RDD can reach both listed  and  unlisted numbers, improving sampling accuracy. Surveys are conducted by interviewers through computer-assisted telephone interviewing (CATI) software. CATI displays the questionnaire to the interviewer with a rotation of questions.  
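To give a rough sense of the idea behind RDD (a toy sketch, not how any particular CATI system actually builds its sample), random suffixes within a chosen area code and exchange reach listed and unlisted numbers alike:

```python
import random

def rdd_sample(area_code: str, exchange: str, n: int) -> list[str]:
    """Generate n candidate phone numbers with random 4-digit suffixes."""
    return [f"({area_code}) {exchange}-{random.randint(0, 9999):04d}" for _ in range(n)]

print(rdd_sample("212", "555", 5))  # e.g. ['(212) 555-0417', '(212) 555-8823', ...]
```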

Telephone surveys started in the 1940s. In fact, in a  recent blog , we recount how the predictions for the 1948 presidential election were completely wrong because of sampling bias in telephone surveys. Rising in popularity in the late 50s and early 60s when the telephone became common in most American households, telephone surveys are no longer a very popular method of conducting a survey. Why? Because many people refuse to take telephone surveys or simply are not answering calls from a number they don’t recognize.

6. Post-Call Surveys

If a telephone survey is going to be conducted, today it is usually a post-call survey. This is often accomplished through IVR, or interactive voice response. IVR means there is no interviewer involved. Instead, customers record answers to pre-recorded questions using numbers on their touch-tone keypads. If a question is open-ended, the interviewee can respond by speaking and the system records the answer. IVR surveys are often deployed to measure how a customer feels about a service they just received. For example, after calling your bank, you may be asked to stay on the line to answer a series of questions about your experience.

Most post-call surveys are either  NPS surveys  or customer satisfaction (CSAT) surveys. The former asks the customer “How likely are you to recommend our organization to a friend or family based on your most recent interaction?” while the CSAT survey asks customers “How satisfied are you with the results of your most recent interaction?” NPS survey results reflect how the customer feels about the brand, while CSAT surveys are all about individual agent and contact center performance.
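As a small sketch (not from this blog) of how these two scores are conventionally computed: NPS is the percentage of promoters (9–10) minus the percentage of detractors (0–6), while CSAT is commonly the share of top ratings (e.g. 4–5 on a 5-point scale). The thresholds below follow those common conventions.

```python
def nps(scores: list[int]) -> float:
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    return 100 * (promoters - detractors) / len(scores)

def csat(scores: list[int]) -> float:
    """CSAT as the share of satisfied answers (4 or 5 on a 5-point scale)."""
    return 100 * sum(s >= 4 for s in scores) / len(scores)

print(nps([10, 9, 8, 7, 6, 3, 10, 9]))  # 25.0
print(csat([5, 4, 3, 5, 2, 4]))         # ~66.7
```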

7. SMS Text Surveys

Many people rarely use their phones to talk anymore, and ignore calls from unknown numbers. This has given rise to the SMS (Short Message Service) text survey. SMS surveys are delivered via text to people who have opted in to receive notifications from the sender. This means that there is usually some level of engagement, improving response rates. The one downside is that questions typically need to be short, and answers are generally 1-2 words or simply numbers (this is why many NPS surveys, gauging customer satisfaction, are often conducted via SMS text). Be careful not to send too many text surveys, as a person can opt out just as easily, usually by texting STOP.

8. Mail-in Surveys / Postal Surveys

These are delivered right to respondents’ doorsteps! Mail surveys were frequently used before the advent of the internet when respondents were spread out geographically and budgets were modest. After all, mail-in surveys didn’t require much cost other than the postage. 

So are mail-in surveys going the way of the dinosaur? Not necessarily. They are still occasionally more valuable compared to different methods of surveying. Because they are going to a specific name and home address, they often feel more personalized. This personalization can prompt the recipient to complete the survey. 

They’re also good for surveys of significant length. Most people have short attention spans, and won’t spend more than a few minutes on the phone or filling out an online survey. At least, not without an incentive! However, with a mail-in survey, the person can complete it at their leisure. They can fill out some of it, set it aside, and then come back to it later. This gives mail-in surveys a relatively high response rate.

9. Kiosk Surveys

These surveys happen on a computer screen at a physical location. You’ve probably seen them popping up in stores, hotel lobbies, hospitals, and office spaces. These days, they’re just about anywhere a researcher or marketer wants to collect data from customers or passers-by.  Kiosk surveys  provide immediate feedback following a purchase or an interaction. They collect responses while the experience is still fresh in the respondent’s mind. This makes their judgment more trustworthy. Below is an example of a SurveyLegend kiosk survey at McDonald’s. The kiosk survey collects information, thanks the respondent for their feedback, and then resets for the next customer. Read how to  create your own kiosk survey here .


10. Email Surveys

Email surveys are one of the most effective surveying methods as they are delivered directly to your audience via their online account. They can be used by anyone for just about anything, and are easily customized for a particular audience. Another good thing about email surveys is you can easily see who did or did not open the survey and make improvements to it for a future send to increase response rates. You can also A/B test subject lines, imagery, and so on to see which is more effective. SurveyLegend offers dozens of different types of online survey questions, which we explore in our blog  12 Different Types of Survey Questions and When to Use Them (with Examples) .


11. Pop-up Surveys

A pop-up survey is a feedback form that pops up on a website or app. Although the main window a person is reading on their screen remains visible, it is temporarily disabled until a user interacts with the pop-up, either agreeing to leave feedback or closing out of it. The survey itself is typically about the company whose site or app the user is currently visiting (as opposed to an intercept survey, which is an invitation to take a survey hosted on a different site).

A pop-up survey attempts to grab website visitors’ attention in a variety of ways, popping up in the middle of the screen, moving in from the side, or covering the entire screen. While they can be intrusive, they also have many benefits. Read about the  benefits of pop-up surveys here .

12. Embedded Surveys

The opposite of pop-up surveys, these surveys live directly on your website or another website of your choice. Because the survey cannot be X’ed out of like a pop-up, it takes up valuable real estate on your site, or could be expensive to implement on someone else’s site. In addition, although the  embedded survey  is there at all times, it may not get the amount of attention a pop-up does since it’s not “in the respondent’s face.”

13. Social Media Surveys

More than  3.5 billion people  are using social media worldwide, a number projected to increase to almost 4.5 billion in 2025. This makes social media extremely important to marketers and researchers. Using platforms such as Facebook, Twitter, Instagram, and the new Threads, many companies and organizations send out social media surveys regularly. Because people check their social media accounts quite regularly, it’s a good way to collect responses and monitor changes in satisfaction levels or popular opinion. Check out our blog on  social media surveys  for more benefits and valuable tips.

14. Mobile Surveys

Mobile traffic has now overtaken desktop computers as the most used device for accessing the internet, with more than 54% of the share. But don’t fret – you don’t have to create an entirely new survey to reach people on their phones or tablets. Online poll makers like SurveyLegend are responsive, so when you create a desktop version of a survey, it automatically becomes mobile-friendly. The survey renders, or displays, on any device or screen regardless of size, with elements on the page automatically rearranging themselves, shrinking, or expanding as necessary. Learn more about our  responsive surveys .

15. Mobile App Surveys

Today, most companies have a mobile app. These can be an ideal way to conduct surveys, as people have to willingly download your app; this means they already have a level of engagement with your company or brand, making them more likely to respond to your surveys.

16. QR Code Surveys

QR Code or QRC is an abbreviation of “Quick Response Code.” These two-dimensional encoded images, when scanned, deliver hidden information that’s stored on them. They’re different from barcodes because they can house a lot more information, including website URLs, phone numbers, or up to 4,000 characters of text. The recent QR code comeback provides a good opportunity for researchers to collect data. Place the QR code anywhere – on flyers, posters, billboards, commercials – and all someone has to do is scan it with their mobile device to have immediate access to a survey. Read more about the  benefits of QR code surveys .
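If you want to generate such a code yourself, here is a minimal sketch using the third-party Python package qrcode (installable with pip install "qrcode[pil]"); the survey URL below is a placeholder, not a real survey.

```python
import qrcode

# Encode a placeholder survey URL into a QR code image and save it to disk.
img = qrcode.make("https://example.com/my-survey")
img.save("survey_qr.png")  # place the image on flyers, posters, receipts, etc.
```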

17. Delphi Surveys

A Delphi survey is a structured research method used to gather the collective opinions and insights of a panel of experts on a particular topic. The process involves several rounds of questionnaires or surveys. Each round is designed to narrow things down until a consensus or hypothesis can be formed. One of the key features of Delphi survey research is that participants are unknown to each other, thereby eliminating influence.
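As a rough illustration of the between-round consensus check (a sketch only; the 75% agreement threshold and 5-point rating scale below are common conventions, not prescribed here):

```python
def has_consensus(ratings: list[int], threshold: float = 0.75) -> bool:
    """Treat an item as agreed when >= threshold of experts rate it 4 or 5."""
    agreeing = sum(r >= 4 for r in ratings)
    return agreeing / len(ratings) >= threshold

# Hypothetical expert ratings from one Delphi round.
round_two = {"Statement A": [5, 4, 4, 5, 3, 4], "Statement B": [2, 3, 4, 5, 2, 3]}
for statement, ratings in round_two.items():
    print(statement, "->", "consensus" if has_consensus(ratings) else "next round")
```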

18. AI Surveys

Artificial intelligence is the latest type of survey method. Using AI, researchers allow the technology to ask survey questions. These “chatbots” can even ask follow-up questions on the spot based on a respondent’s answer. There can be drawbacks, however. If a person suspects survey questions are coming from AI, they may be less likely to respond (or may respond incorrectly to mess with the AI). Additionally, AI is not good with emotions, so asking sensitive questions in an emotionless manner could be off-putting to people.  Read more about AI Surveys .

Online Surveys: Ideal for Collecting Data and Feedback

[Chart: Countries with the largest digital populations in the world as of January 2023, in millions (Statista)]

That’s not all. People can take online surveys just about anywhere thanks to mobile devices. The use of these devices across age groups is balancing out as well. Check out smartphone use by age group below.

[Chart: Share of adults in the United States who owned a smartphone from 2015 to 2021, by age group (Statista)]

With more and more people accessing the internet through their mobile devices, now you can reach teens while they’re between classes and adults during their subway commute to work. Can’t say that for those other types of surveys !

Online surveys are also extremely cost-efficient. You don’t have to spend money on paper, printing, postage, or an interviewer. This significantly reduces set-up and administration costs. This also allows researchers and companies to send out a survey very expeditiously. Additionally, many online survey tools provide in-depth analysis of survey data. This saves you from having to spend money on further research once the survey is complete. 

Researchers have their pick of options when it’s time to survey people. Which method you choose may depend upon cost, reach, and the types of questions.

Now, you may be wondering, “Where can I make free surveys?” You can get started with free online surveys using SurveyLegend! Here are a few things that make SurveyLegend the ideal choice for different types of surveys for research (or for fun).

  • When it comes to surveys, brief is best to keep respondents’ attention. So, SurveyLegend automatically collects some data, such as the participant’s location, reducing the number of questions you have to ask.
  • People like eye candy, and many surveys are just plain dull. SurveyLegend offers beautifully rendered pre-designed surveys that will get your participant’s attention – and keep it through to completion!
  • Today, most people take surveys on mobile devices. Often, desktop surveys don’t translate well, resulting in a high drop-off rate. SurveyLegend’s designs are responsive, automatically adjusting to any screen size.

What’s your favorite method of surveying people? (Hey… that’s a good topic for a survey!) Sound off in the comments!

Frequently Asked Questions (FAQs)

The 10 most common survey methods are online surveys, in-person interviews, focus groups, panel sampling, telephone surveys, post-call surveys, mail-in surveys, pop-up surveys, mobile surveys, and kiosk surveys.

Benefits of online surveys include their ability to reach a broad audience and that they are relatively inexpensive.

Kiosk surveys are surveys on a computer screen at the point of sale.

A focus group is an in-person interview or survey involving a group of people rather than just one individual. The group is generally small but demographically diverse, and led by a moderator. 

Jasko Mahmutovic






About Stop Overdose

  • Through preliminary research and strategic workshops, CDC identified four areas of focus to address the evolving drug overdose crisis.
  • Stop Overdose resources speak to the reality of drug use, provide practical ways to prevent overdoses, educate about the risks of illegal drug use, and show ways to get help.


Drugs take nearly 300 lives every day. 1 To address the increasing number of overdose deaths related to both prescription opioids and illegal drugs, we created a website to educate people who use drugs about the dangers of illegally manufactured fentanyl, the risks and consequences of mixing drugs, the lifesaving power of naloxone, and the importance of reducing stigma around recovery and treatment options. Together, we can stop drug overdoses and save lives.

What you can do

  • Get the facts on fentanyl
  • Learn about lifesaving naloxone
  • Understand the risks of polysubstance use
  • Reduce stigma around recovery and treatment

Explore and download Stop Overdose and other educational materials on CDC's Overdose Resource Exchange .

  • Centers for Disease Control and Prevention, National Center for Health Statistics. National Vital Statistics System, Mortality 2018-2021 on CDC WONDER Online Database, released in 2023. Data are from the Multiple Cause of Death Files, 2018-2021, as compiled from data provided by the 57 vital statistics jurisdictions through the Vital Statistics Cooperative Program. Accessed at http://wonder.cdc.gov/mcd-icd10-expanded.html on Mar 5, 2024




Tracking Changes in Program Implementation: Findings from Multiple Rounds of the Reemployment Services and Eligibility Assessments (RESEA) Implementation Survey


In 2018, amendments to Section 306(c) of the Social Security Act (SSA) permanently authorized the Reemployment Services and Eligibility Assessments (RESEA) program and introduced substantive changes, including formula-based funding to states and a series of requirements intended to increase the use and availability of evidence-based reemployment interventions and strategies. The RESEA program aims to help Unemployment Insurance (UI) claimants return to work quickly and improve employment outcomes. It is also intended to strengthen UI program integrity and promote alignment between UI and the broader workforce development system. This brief describes changes in the implementation of the RESEA program, drawing on findings from multiple rounds of a survey of states.

  • States consistently reported that they targeted claimants identified as most likely to exhaust UI benefits when selecting participants for the RESEA program.
  • The timing of the initial RESEA meeting relative to the notification of selection remained relatively consistent between Waves 1 and 4. Across the four waves, the initial RESEA meeting most often occurred within two weeks after notification of RESEA selection.
  • Overall, states provided more flexibility in the scheduling and location of RESEA meetings than they did before the COVID-19 pandemic, and the use of remote service delivery options, including phone calls and videoconferences, increased. Despite this added flexibility, the content and services provided during the initial RESEA meetings remained similar from 2020 to 2023.
  • In Waves 1, 3, and 4, more than half of states reported conducting a subsequent RESEA meeting after the initial RESEA meeting. In Wave 4, the number of subsequent meetings increased, with several states conducting more than one subsequent meeting.
  • States reported increases in activities designed to promote attendance and service delivery. Many states used letter, phone, email, and text reminders to increase attendance at mandatory RESEA meetings, thereby reducing the failure-to-report rate.
  • Before the pandemic, RESEA staff were required to review claimants’ work search logs; during the pandemic, states either suspended this requirement or moved the reviews to an online system. By Wave 4, nearly all states had resumed staff-led reviews while retaining the more flexible, online review procedures.
  • Relative to the first wave in 2020, states reported conducting more data analyses of RESEA participants to assess program effectiveness by Wave 4.
  • In Wave 4, 12 states reported having completed an evaluation of their RESEA program and 37 states reported planning for future RESEA evaluations of program components. Some states plan to conduct program component evaluations on job readiness workshops, intensive career services, RESEA meetings, or service delivery modes.

Brief: Tracking Changes in Program Implementation: Findings from Multiple Rounds of the Reemployment Services and Eligibility Assessments (RESEA) Implementation Survey

IMAGES

  1. Conduct Your Survey Easily

  2. 7 Steps To Conduct A Sample Survey

  3. HOW TO CONDUCT SURVEYS

  4. PPT

  5. Survey Research: Definition, Examples & Methods

  6. How to Conduct a Survey in 5 Easy Steps

VIDEO

  1. How to Conduct & Publish a Survey Study

  2. The Ultimate Guide to Conducting a Successful Survey: Everything You Need to Know

  3. What is a Survey and How to Design It? Research Beast

  4. How to Send Surveys: 4 Survey Distribution Methods

  5. How to conduct a survey

  6. Conducting Survey Research

COMMENTS

  1. Doing Survey Research

    Survey research means collecting information about a group of people by asking them questions and analysing the results. To conduct an effective survey, follow these six steps: Determine who will participate in the survey. Decide the type of survey (mail, online, or in-person) Design the survey questions and layout. Distribute the survey.

  2. Survey Research

    Survey research means collecting information about a group of people by asking them questions and analyzing the results. To conduct an effective survey, follow these six steps: Determine who will participate in the survey. Decide the type of survey (mail, online, or in-person) Design the survey questions and layout.

  3. 7 Steps In Conducting a Survey Research

    Step 3: Decide on the type of survey method to use. Step 4: Design and write questions. Step 5: Distribute the survey and gather responses. Step 6: Analyze the collected data. Step 7: Create a report based on survey results. These survey method steps provide a general framework for conducting research.

  4. How to Conduct Surveys

    Step 1: Set the Aims of your Research. Before conducting research, you need to form a clear picture of the outcomes of your study. Create a research question and devise the goals of your research. Based on the requirements of your research, you need to select the participants.

  5. Survey Research: Definition, Examples & Methods

    Survey research is the process of collecting data from a predefined group (e.g. customers or potential customers) with the ultimate goal of uncovering insights about your products, services, or brand overall. As a quantitative data collection method, survey research can provide you with a goldmine of information that can inform crucial business and product decisions.

  6. How to Create an Effective Survey (Updated 2022)

    The idea is to come up with a specific, measurable, and relevant goal for your survey. This way you ensure that your questions are tailored to what you want to achieve and that the data captured can be compared against your goal. 2. Make every question count.

  7. How to Conduct a Research Survey: A Step-by-Step Guide

    Identify your target audience: Determine the specific group of people you want to survey. Understanding your target audience is crucial for tailoring your questions and ensuring the data you collect is relevant and representative. Choose the appropriate survey method: Select a survey method that suits your research objectives and target audience.

  8. Guide: Conducting Survey Research

    Conducting Survey Research. Surveys represent one of the most common types of quantitative, social science research. In survey research, the researcher selects a sample of respondents from a population and administers a standardized questionnaire to them. The questionnaire, or survey, can be a written document that is completed by the person ...

  9. How to conduct your own market research survey (with example)

    How to write and conduct a market research survey. Once you've narrowed down your survey's objectives, you can move forward with designing and running your survey. Step 1: Write your survey questions. A poorly worded survey, or a survey that uses the wrong question format, can render all of your data moot. If you write a question that results ...

  10. How To Conduct A Survey That You Can Trust In 8 Steps

    Step 4: Design Your Questionnaire. Next, we're going to move on to designing the questionnaire itself. This will largely depend on the survey platform you use. As aforementioned, you'll need to use a strong market research SaaS platform that offers a variety of features and services to form a robust survey campaign.

  11. A quick guide to survey research

    Within the medical realm, there are three main types of survey: epidemiological surveys, surveys on attitudes to a health service or intervention and questionnaires assessing knowledge on a particular issue or topic.[1] Despite a widespread perception that surveys are easy to conduct, in order to yield meaningful results, a survey needs ...

  12. How to conduct a survey

    Demographic research - An analysis of economic and population data. Demographic research helps you gather an audience most likely interested in your product or service. Employee research - An analysis of your employees' beliefs, thoughts, and opinions about job and industry-related subjects. Market research - Industry or buyer research to determine how your business measures up to ...

  13. Understanding and Evaluating Survey Research

    Understanding and Evaluating Survey Research. A variety of methodologic approaches exist for individuals interested in conducting research. Selection of a research approach depends on a number of factors, including the purpose of the research, the type of research questions to be answered, and the availability of resources.

  14. Writing Survey Questions

    Perhaps the most important part of the survey process is the creation of questions that accurately measure the opinions, experiences and behaviors of the public. Accurate random sampling will be wasted if the information gathered is built on a shaky foundation of ambiguous or biased questions.

  15. Survey Research

    How to Conduct Survey Research. Conducting survey research involves several steps that need to be carefully planned and executed. Here is a general overview of the process: Define the research question: The first step in conducting survey research is to clearly define the research question. The research question should be specific, measurable ...

  16. Questionnaire Design

    Questionnaires vs. surveys. A survey is a research method where you collect and analyze data from a group of people. A questionnaire is a specific tool or instrument for collecting the data. Designing a questionnaire means creating valid and reliable questions that address your research objectives, placing them in a useful order, and selecting an appropriate method for administration.
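
    To make the "useful order" and response formats concrete, it can help to represent each question as structured data before building it in any survey platform. The short Python sketch below is purely illustrative: the Question fields and example items are assumptions for demonstration, not part of any particular questionnaire tool.

        from dataclasses import dataclass, field

        @dataclass
        class Question:
            text: str                   # wording shown to respondents
            response_type: str          # e.g. "multiple_choice", "likert_5", "open_ended"
            options: list = field(default_factory=list)  # only used for closed questions

        # Hypothetical questionnaire, stored in the order the questions will be asked
        questionnaire = [
            Question("How often do you use our service?", "multiple_choice",
                     ["Daily", "Weekly", "Monthly", "Rarely"]),
            Question("The service is easy to use.", "likert_5"),
            Question("What could we improve?", "open_ended"),
        ]

        for number, question in enumerate(questionnaire, start=1):
            print(f"Q{number} ({question.response_type}): {question.text}")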

  17. Designing, Conducting, and Reporting Survey Studies: A Primer for

    Good practice in the conduct and reporting of survey research: The checklist and recommendations focus on designing questionnaires and ensuring the reliability of non-web-based surveys only. The checklist and recommendations are not based on the Delphi method. Eysenbach (2004): Checklist for Reporting Results of Internet E-Surveys (CHERRIES).

  18. Survey research

    Survey research is a research method involving the use of standardised questionnaires or interviews to collect data about people and their preferences, thoughts, and behaviours in a systematic manner. Although census surveys were conducted as early as Ancient Egypt, survey as a formal research method was pioneered in the 1930-40s by sociologist Paul Lazarsfeld to examine the effects of the ...

  19. 7 Steps to Conduct a Survey- Best Practices, Tools, & More

    4. Set a Baseline & Compare. You can leverage the redeployment ability of surveys to compare customer experiences over time and track the changes. For example, a simple NPS survey on your product page can help you track customers' loyalty and repurchase probability with time.
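
    The baseline-and-compare approach described above relies on a simple formula: the Net Promoter Score is the percentage of promoters (ratings of 9-10) minus the percentage of detractors (ratings of 0-6). Here is a minimal Python sketch of that calculation, using made-up ratings from two survey waves; the function name and data are illustrative only, not taken from any survey tool.

        def net_promoter_score(ratings):
            """Percentage of promoters (9-10) minus percentage of detractors (0-6)."""
            if not ratings:
                raise ValueError("no ratings supplied")
            promoters = sum(1 for r in ratings if r >= 9)
            detractors = sum(1 for r in ratings if r <= 6)
            return 100 * (promoters - detractors) / len(ratings)

        # Hypothetical 0-10 ratings from two deployments of the same survey
        wave_1 = [10, 9, 8, 7, 6, 9, 10, 5, 8, 9]
        wave_2 = [10, 9, 9, 8, 7, 9, 10, 8, 9, 10]
        print(f"Wave 1 NPS: {net_promoter_score(wave_1):+.0f}")   # +30
        print(f"Wave 2 NPS: {net_promoter_score(wave_2):+.0f}")   # +70

    Comparing the two waves in this way is what "setting a baseline" amounts to in practice: the first score is the reference point, and later redeployments are judged against it.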

  20. How to Conduct Surveys

    The seventh edition of How to Conduct Surveys: A Step-by-Step Guide by Arlene Fink provides a concise and reliable resource for developing and running surveys. This practical guide walks the reader through the development of rigorous surveys and how to evaluate the credibility and usefulness of surveys created by others.

  21. How to Conduct Survey Research

    Survey research is a method of measuring attitudes, opinions and behaviours by asking people questions. These lists of questions are created to meet a particular aim, which varies based on the context of research. The population of interest will also differ between surveys based on the researcher's goals. For example, an academic survey would ...

  22. Likert scale surveys: Definition, examples, and how-tos

    Likert scale surveys: Definition, examples, and how-tos. Conducting Likert scale surveys can improve your research. Learn how this rating scale works and can give you insightful data. A Likert scale is a way to answer a survey question on a five- or seven-point scale. The choices range from Strongly Agree to Strongly Disagree so the survey ...
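
    Because each answer maps to a point on the scale, Likert responses are typically coded as numbers before analysis. A minimal sketch, assuming the five-point version of the scale and invented responses to a single question (the coding convention shown is a common one, not the only option):

        from collections import Counter
        from statistics import mean, median

        # Numeric coding for a five-point agreement scale (an assumed, common convention)
        LIKERT_CODES = {
            "Strongly Disagree": 1,
            "Disagree": 2,
            "Neutral": 3,
            "Agree": 4,
            "Strongly Agree": 5,
        }

        # Hypothetical responses to one Likert item
        responses = ["Agree", "Strongly Agree", "Neutral", "Agree",
                     "Disagree", "Strongly Agree", "Agree", "Neutral"]

        scores = [LIKERT_CODES[r] for r in responses]
        print("Distribution:", Counter(responses))
        print(f"Mean: {mean(scores):.2f}  Median: {median(scores)}")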

  23. how to conduct Survey research

    how to conduct Survey research | step by step guide | survey research. Survey Research is defined as the process of conducting research using surveys that rese...

  24. How to Create a Marketing Survey? (+ 10 Essential Tips)

    In the following article, we're digging into what it means to create a marketing survey that doesn't bore your customers and what you should do to get started conducting your own market research. Table of contents. 1. Clearly define the survey objectives. 2. Know your target markets well. 3.

  25. Reporting Survey Based Studies

    Abstract. The coronavirus disease 2019 (COVID-19) pandemic has led to a massive rise in survey-based research. The paucity of perspicuous guidelines for conducting surveys may pose a challenge to the conduct of ethical, valid and meticulous research. The aim of this paper is to guide authors aiming to publish in scholarly journals regarding the ...

  26. 10 Different Types of Survey Methods + Pros & Cons

    15. Mobile App Surveys. Today, most companies have a mobile app. These can be an ideal way to conduct surveys as people have to willingly download your app; this means, they already have a level of engagement with your company or brand making them more likely to respond to your surveys. 16. QR Code Surveys

  27. SPCMSC Team deploys instruments on Breton Island Louisiana

    By St. Petersburg Coastal and Marine Science Center, May 10, 2024. A team from the Saint Petersburg Coastal and Marine Science Center will conduct a field experiment at Breton Island, Louisiana, to test and improve the Total Water Level and Coastal Change Forecast in the Gulf of Mexico.

  28. Birmingham Water Works spends $69,000 for 'image improvement plan' survey

    And board members Wednesday voted to spend $69,000 on another survey to better understand how the public feels about them and to create an "image improvement plan." The board voted to hire ...

  29. About Stop Overdose

    Key points. Through preliminary research and strategic workshops, CDC identified four areas of focus to address the evolving drug overdose crisis. Stop Overdose resources speak to the reality of drug use, provide practical ways to prevent overdoses, educate about the risks of illegal drug use, and show ways to get help.

  30. Tracking Changes in Program Implementation: Findings from Multiple

    Some states plan to conduct program component evaluations on job readiness workshops, intensive career services, RESEA meetings, or service delivery modes. Brief: Tracking Changes in Program Implementation: Findings from Multiple Rounds of the Reemployment Services and Eligibility Assessments (RESEA) Implementation Survey