Survey Research & Your Master’s: Best Practices for Grad Students

Surveys can be a powerful research tool for graduate students, but proper execution is key. Our guide covers everything you need to know about survey design, administration, and analysis. Get started now and ace your research.

Author: Shannon Daigle

Editor: Staff Editor


Survey data can be a powerful way to test theories, gather evidence supporting or refuting claims, and draw data-driven conclusions on diverse topics from the meaning of life to the importance of COVID vaccination status in online dating profiles. Given its versatility, it’s no wonder that survey-based research is an integral tool for graduate students. Yet it can be intimidating, especially for survey rookies.

To assist you, we’ve compiled a comprehensive guide to help you navigate the world of survey research. We’ll cover the different types of surveys, provide a step-by-step roadmap for conducting them, and outline best practices for producing high-quality results. Keep reading for the complete guide, accompanied by a curated list of 15 resources to enhance your surveying pursuits.

Types of Surveys

Choosing the appropriate type of survey for your project is critical to the success of your research. When crafting a survey, it’s essential to understand the main formats available and the pros and cons of each so you can select the approach that best aligns with your research goals, target demographic, and available resources. While all surveys involve posing questions to respondents and analyzing their answers, the process can be carried out in several ways. The following section explores three primary survey techniques, along with their unique advantages and potential drawbacks.

Oral Surveys

Oral surveys involve a researcher directly asking questions to participants, either face-to-face or using technology such as a phone or computer. Oral surveys can be conducted in a group setting or one-on-one. Oral delivery is especially well-suited to surveys that ask complex or personally intrusive questions, given that interviewers have the ability to clarify questions, adapt inquiries during the survey, and establish rapport to encourage more honest answers. Cons of oral surveys include potential interviewer bias, higher costs, and time-consuming data collection. Additionally, many respondents may refuse phone or in-person interviews when approached.

Written Surveys

Written surveys are typically paper-based questionnaires that participants complete independently or as a group. With written surveys, it’s convenient to solicit answers from larger sample sizes over a wider geographical area at a low cost. Plus, because participants aren’t under pressure to respond personally, they can answer anonymously and take as long as they choose on each question. However, written surveys are inherently limited by the literacy, language, and physical ability of the potential respondent pool. They may also suffer from lower response rates, limited opportunities for clarification, and potential issues with handwriting legibility.

Electronic Surveys

In the past, survey researchers were limited to in-person or written surveys, but living in the digital age means it’s increasingly common to distribute surveys via the internet. Electronic surveys may be distributed and completed online via email, QR codes, social networks, or through platforms like SurveyMonkey or Google Forms. Pros of this method include ease of distribution, the ability to rapidly collect large amounts of data, automated data analysis, and environmental friendliness. The downsides include the need for internet access, potential sampling biases due to limited technology access or aptitude, and potential privacy concerns. Electronic surveys are increasingly popular due to their efficiency and adaptability to various research needs.

Conducting Surveys Step-by-Step

Beginning a survey-based research project might seem daunting, but when you use a systematic approach, you can feel confident in your ability to produce accurate, meaningful results. This section will guide you through each stage of conducting surveys, from defining a purpose to analyzing the data you collect. By following these time-tested steps, you’ll be well-equipped to design and execute an effective survey that will provide valuable data to support your research.

Step 1: Determine Your Purpose

The purpose of your survey is the foundation of every other decision you’ll make, so it’s important to clarify it as your first step. A well-defined purpose will guide you in creating relevant and targeted questions and will provide valuable insights into your chosen demographic. It will also help you stay focused and craft a cohesive set of questions. To determine your purpose, ask yourself what problem or question you’d like to address, then consider what information you’d need to collect to reach valuable conclusions.

Step 2: Narrow Down Your Sample Demographic

Once you’ve determined a purpose, clearly defining your sample demographic will ensure you collect data from a specific and relevant population. As a result, your survey results will be more accurate, reliable, and applicable to your central research purpose and question. Start by identifying the prominent characteristics of your target audience, such as age, gender, occupation, or education level. Then, tailor your survey questions and methodology to the needs and preferences of your respondents.

Step 3: Set a Goal Number of Responses

Establishing a desired number of responses helps you determine the necessary sample size for your survey. An appropriate sample size yields results that are representative of your target population and minimizes sampling error. With that clear goal in mind, you can plan effective survey distribution, analyze data more efficiently, and ensure you garner enough responses to draw sound conclusions. To set your goal, consider factors such as the overall population size, desired confidence level, and margin of error.
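One common way to combine these factors is Cochran's sample-size formula with a finite-population correction. The sketch below (in Python, with illustrative numbers) shows how population size, confidence level, and margin of error interact; the default z-score of 1.96 corresponds to 95% confidence, and 0.5 is the conservative proportion to assume when you don't know the true split.

```python
import math

def sample_size(population: int, confidence_z: float = 1.96,
                margin_of_error: float = 0.05, proportion: float = 0.5) -> int:
    """Estimate the number of survey responses needed.

    Uses Cochran's formula n0 = z^2 * p * (1 - p) / e^2, then applies
    a finite-population correction for smaller populations.
    """
    n0 = (confidence_z ** 2) * proportion * (1 - proportion) / margin_of_error ** 2
    # The correction shrinks the requirement when the population is small.
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# A population of 2,000 at 95% confidence and a 5% margin of error:
print(sample_size(2000))  # → 323
```

Tightening the margin of error (say, from 5% to 3%) raises the required number of responses considerably, which is worth knowing before you commit to a recruitment plan.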

Step 4: Choose Your Start & End Dates

Choosing start and end dates for your survey helps you efficiently and effectively manage the data collection process. Setting reasonable deadlines also increases response rates and encourages respondents to complete the survey promptly. When choosing dates, consider factors such as the urgency of your research, the availability of your target demographic, and any external factors that may influence response rates, such as holidays or events. A well-planned timeline ensures that your survey runs smoothly and generates quality data; build in a few days’ worth of wiggle room so you aren’t scrambling for last-minute results.

Step 5: Create a High-Quality Survey

Approach survey-building with a preference for clear, concise, and relevant questions. Be sure to cover all relevant areas of your chosen topic while avoiding leading questions and confusing language. Surveys shouldn’t be overly long or short, as both can negatively affect response rates. Using a mix of question types will keep respondents engaged until they reach the end of the survey. Once you’ve created a complete questionnaire that meets your criteria, put it through a test run to identify potential issues and make improvements before beginning official distribution.

Step 6: Distribute Your Survey

Now it’s time to make your survey available to your audience. Choosing an appropriate distribution strategy will maximize response rates and gather a diverse and representative sample. Consider the preferences and habits of your respondents, then select one or multiple survey types that are most likely to reach your audience and encourage participation. Remember to follow ethical guidelines, protect respondent privacy, and make the survey process as approachable and trustworthy as possible to promote engagement.

Step 7: Collect & Analyze the Data

This is the final step in the survey process. With a keen eye for detail, it’s time to organize, examine, and interpret your findings. Start by cleaning your data: remove duplicate responses and address missing or incomplete answers. You can also use data-gathering and survey software to help you sort through information and analyze it quickly. Remember to consider the limitations of your data and remain objective in your interpretation, allowing the evidence to guide your conclusions.
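As a rough illustration of the cleanup described above, this Python sketch drops duplicate submissions and incomplete rows from a small invented response set (the field names and values are hypothetical placeholders, not a real dataset):

```python
# Invented raw responses; "respondent_id", "q1", "q2" are placeholder fields.
raw_responses = [
    {"respondent_id": 1, "q1": "yes", "q2": "agree"},
    {"respondent_id": 2, "q1": "no",  "q2": ""},        # incomplete answer
    {"respondent_id": 1, "q1": "yes", "q2": "agree"},   # duplicate submission
    {"respondent_id": 3, "q1": "yes", "q2": "disagree"},
]

def clean(responses):
    """Keep the first submission per respondent; drop rows with blank answers."""
    seen, cleaned = set(), []
    for row in responses:
        if row["respondent_id"] in seen:
            continue  # duplicate: keep only the first submission
        if any(value == "" for value in row.values()):
            continue  # incomplete: discard (or flag for follow-up instead)
        seen.add(row["respondent_id"])
        cleaned.append(row)
    return cleaned

print(len(clean(raw_responses)))  # → 2
```

Real projects usually do this with a spreadsheet or the survey platform's export tools, but the logic — deduplicate, then handle missing answers deliberately — is the same.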

Building Your Survey

How you decide to build your survey is a pivotal step, directly influencing the quality and quantity of relevant data collected. Consequently, this phase requires meticulous attention and dedication. To develop an effective survey that resonates with your audience, you’ll need to choose the best format for delivery and question types. Below, you’ll find the necessary tools to help seamlessly bring these elements together and ensure survey success.

The Art of Questioning

The overall success of your survey depends on the questions you ask. Well-crafted questions can give accurate insights, while poorly designed questions can result in skewed or inadequate data. To construct good survey questions, consider the type of questions you want to ask and how you might structure them to encourage honest answers.

Open-Ended Questions

No predetermined answers are provided for open-ended questions. Instead, they allow participants to respond to questions freely and in their own words. People may express their answers in a few words or paragraphs, depending on the survey and the parameters set by researchers. These types of questions can result in detailed and nuanced insights that provide a deeper look into participants’ thoughts and feelings. However, they can be time-consuming to answer, and the resulting data will take more effort to categorize and analyze.

Closed Questions/Yes or No Questions

In closed questions, respondents are limited to yes/no or multiple-choice answers. When writing the survey, researchers decide on an available set of acceptable answers, and each respondent must choose which answer most closely aligns with their views. Yes/no, true/false, agree/disagree, and multiple-choice answers are all examples of closed questions. These questions are easy to answer, resulting in quantitative data that are convenient to analyze. However, they may limit the depth of responses or the full spectrum of opinion.

Basic Demographic Questions

Basic demographic questions gather information about participants’ characteristics, such as age, gender, education, and income. These questions help researchers describe the population of respondents, identify trends, make comparisons, and assess how representative their sample is. Often, researchers form hypotheses related to one or more demographic subsets, such as political affiliation or marital status, so these questions can help illuminate disparities or relationships. Demographic questions should be straightforward, inclusive, and respectful of participants’ privacy.


Formatting Your Questions

How a question is asked can be as significant a factor in research design as what you ask, impacting participant responses and answer quality. Elements like question order and available answer options actively shape the respondents’ experience, so they should be chosen with thought and care. A well-formatted survey will foster engagement and elicit well-thought-out answers.

Rating or Ranking Scales

Rating and ranking scales are ideal for allowing respondents to evaluate ideas, concepts, or items on a predetermined scale. Rating scales measure attitudes toward each item individually, while ranking scales require participants to rate the available items in an ordered list by preference. These scales make it easy to collect quantifiable data, but they may be limited in their ability to capture the full range of opinions.

Magnitude Estimation Scales

With magnitude estimation scales, participants rate the importance of a concept by assigning numeric values to subjective experiences. Usually, answers are fixed against an established point value. For example, respondents might be asked to rate their favorite item at 100 points, then rate other items against that 100-point standard. While magnitude estimation scales are helpful for gauging the relative weight of varying factors, they’re easily affected by individual bias and interpretation of the scale.
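For illustration only, here is a toy Python sketch of that idea: each respondent fixes a reference item at 100 points, rates a second item against it, and the researcher averages the relative ratings across respondents (the items and numbers are invented):

```python
from statistics import mean

# Each invented respondent fixes "text alerts" at 100 points and rates
# "email alerts" relative to that reference.
ratings = [
    {"text alerts": 100, "email alerts": 60},
    {"text alerts": 100, "email alerts": 45},
    {"text alerts": 100, "email alerts": 75},
]

# Average each item's points across all respondents.
avg = {item: mean(r[item] for r in ratings) for item in ratings[0]}

print(avg["email alerts"])  # → 60
```

The averaged score (60 vs. the 100-point reference here) expresses relative importance, which is exactly where individual interpretation of the scale can introduce noise.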

Funneling the Questions

Funneling refers to the process of strategically arranging questions so respondents are guided through complex issues and concepts in a particular way. Questions begin with broad, general inquiries and progressively narrow down to more specific topics. Funneling can help respondents gradually warm up to the survey process. Alternatively, an inverted funnel moves from specific questions to more general options, which can be helpful if participants are not previously knowledgeable about the survey content.

Simple Strategies for Effective Surveys

When conducting survey-based research, the most reliable path to obtaining accurate and valuable results is to craft effective surveys. In this section, we’ll outline practical tips and techniques to help you create the best surveys possible. By focusing on engaging content, user-friendly platforms, clarity, and avoiding bias, you’ll build a solid foundation for your research project. Let’s dive into these essential strategies for designing and executing high-quality surveys.

Make it Engaging

When you create engaging survey content, you’re more likely to maintain respondents’ interest and receive thoughtful, thorough answers that result in more valuable data. To encourage engagement, phrase questions in a conversational, easy-to-read tone, and incorporate visuals when possible. Add variety to your question formats to keep your audience interested and gain a range of response types. Additionally, explaining the purpose of the survey at the outset can get participants invested, connected, and curious about your cause.

Choose a User-Friendly Platform

You’ll have many platform choices for your survey, from traditional pen-and-paper questionnaires to face-to-face interviews to electronic survey platforms. When choosing an electronic modality, pick a user-friendly platform that respondents can navigate and understand without frustration or confusion. Google Forms and SurveyMonkey are two common examples that are easy for researchers to set up and feature intuitive interfaces, clear instructions, and compatibility across various devices. A seamless user experience encourages better overall participation, reduces dropout rates, and promotes better survey outcomes.

Keep it Short and Simple

The KISS philosophy — “keep it short and simple” — should be your guiding mantra when choosing survey questions. Simplicity is vital to keeping respondents’ attention and preventing question fatigue. Ask only essential questions that directly contribute to your study’s objectives and collect relevant data; forego the rest. Avoid redundancies or unnecessary questions that might distract participants. When your question-asking is streamlined, your respondents will maintain a stronger focus and provide higher-quality answers for your research.

Ask Clear and Direct Questions

Clarity is kind to researchers and survey participants, allowing for meaningful answers and reliable survey data. Strive for a balance of open- and closed-ended questions; too many closed questions can feel monotonous, while an excess of open questions can become exhausting. Additionally, be mindful of language, avoiding slang and adjectives or qualifiers that can skew participants’ opinions or viewpoints. The more specific and easily understood your survey questions are, the more accurate and reliable respondents’ answers will be.

Avoid Leading or Biased Questions

Leading or biased questions can inadvertently skew the reliability and accuracy of the data collected during your survey research. Avoid double negatives, confusing terms, or language that positions one option as more favorable or unfavorable than the others. Additionally, steer clear of words and phrases that might alienate respondents within a specific demographic. Remember that communication choices can unintentionally influence responses or compromise engagement and data quality. When you eliminate leading language and bias from your questions, you create an ideal environment for collecting quality data.

Keep Question Order in Mind

Generally, it’s best to begin with warm-up questions that will allow participants to ease into the survey process. Starting with closed questions can help establish a good rhythm. As the questions progress, you can build momentum toward more complex, controversial, or open-ended inquiry. It’s also preferable to arrange questions by topic, group similar concepts together, and use transitional questions to move from one topic to another. Avoid placing the most important questions on the survey last, because participants are at risk of losing interest and skipping those questions.

Avoid Questions That Are Too Personal

Avoid asking personal questions, especially about commonly sensitive issues, unless it’s necessary for the survey data you’re trying to collect. Unnecessarily intrusive questions may upset respondents and cause them to provide inaccurate responses or quit the survey altogether. If sensitive questions are essential to your research, plan to address them later in the survey, allowing participants to become more comfortable before encountering them. You may even consider adding a preface to the survey that prepares participants for potentially sensitive topics, clearly explaining the purpose of these questions while assuring confidentiality.

Test Drive Your Survey & Check for Errors

While creating the perfect survey is nearly impossible, trials can help you identify strengths and weaknesses in question format, wording, and order. You can conduct these trials in two ways. The first is to inform participants that they’re part of a practice run and gather active feedback as they work through the survey. Alternatively, you could administer the survey just as usual, then evaluate the results to determine any deviations in analysis or standardization. Researchers often use a combination of these methods, starting with the first, to fine-tune their surveys.

Resources for Conducting Surveys & Research

Now that you’re familiar with survey types, formats, question types, and strategies for success, it’s time to let you explore further. To that end, we’ve curated this list of 15 valuable resources on conducting surveys and research, with tips on building outstanding research surveys, collecting data, and interpreting results. Our collection includes a diverse mix of how-to guides, research resources, survey hosting platforms, and more. Dive into the resources on this list to expand your knowledge and enhance your survey research skills.

  • The American Association for Public Opinion Research – This professional organization is dedicated to promoting best practices and ethical standards in survey research. You’ll find newsletters, journals, standard definitions, and more.
  • The Art of Asking: Ask Better Questions, Get Better Answers – Use this book to learn how to ask questions more effectively, handle challenging situations, and use questions to identify problems and inspire innovation.
  • Conversations for Research Rockstars – This podcast focuses on market research and features conversations and lectures on research methods, analysis, and overall trends in the field.
  • Designing and Conducting Survey Research: A Comprehensive Guide – Researchers know this as the industry-standard resource book, covering the survey process, data analysis techniques, and practical guidance for every step of survey creation.
  • Good Practice in the Conduct and Reporting of Survey Research – This journal article informs readers about good research practices that will hold work to a higher standard.
  • LimeSurvey – Build intuitive, fast, and anonymous online surveys with this online tool, which includes multiple-choice, slider-based, rating, emoji, and text-based answer options.
  • Pew Research Center – A nonpartisan research organization, Pew conducts surveys on various topics and offers valuable insights and methodologies.
  • ResearchGate – Researchers can network, share papers, and find potential collaborators for research projects on this professional social networking site for scientists.
  • SurveyLegend Blog – This mobile-survey development platform offers a wealth of industry insights, research tips, and survey how-to’s in its blog content library.
  • SurveyMethods – Use this online survey tool that offers questionnaire design, data collection, and analysis tools. The news and insights section of the website also provides readers with ideas and tips for using online survey software.
  • Survey Research Methods – This book provides the latest methodological information on surveys, guides users on the process, and details in depth how every aspect of a survey can affect the precision and credibility of outcomes.
  • this IS research – Hosted by two professors from the University of Notre Dame and the University of Hamburg, the podcast explores topics in research methodology and trends.
  • Typeform – Check out this interactive survey platform that allows for engaging and personalized survey experiences, forms, and quizzes. One question at a time is displayed, allowing for a customized path all respondents must follow.
  • wikiHow: How to Create a Survey – You’ll find a step-by-step guide on how to construct, implement, and analyze survey-based research, complete with practical question examples and data-collection tips.
  • World Association for Public Opinion Research (WAPOR) – An international organization that promotes the understanding and use of survey research methodologies worldwide, WAPOR has promoted high standards for polling ethics and techniques for over 70 years.

Interview with a Research Expert

Ian Servin

Ian Servin is a Continuity Specialist on the Emergency Management team at the Department of Energy’s Lawrence Berkeley National Laboratory in Berkeley, California. He earned his Master’s in Emergency Management from Georgetown University in 2022 with a focus on emergency communication and community engagement, and his capstone project included a survey exploring the use of social media by emergency management practitioners. Ian also works with the Red Cross as a Government Operations and Disaster Action Team volunteer, where he coordinates mass care responses in partnership with government agencies and deploys nationwide to large-scale disasters.

1. What are some of the general best practices to follow for designing effective survey questions?

I think good survey questions start with establishing a clear understanding of your research objectives. This sounds straightforward, but in my experience translating my high-level research question into specific topics and components was challenging and took more time than I had anticipated. However, investing in this area paid dividends later on because I had a solid reference for when I developed my survey questions to make sure they were relevant and focused.

In a similar vein, it’s important to refine your questions so they are as simple and concise as possible. Even for open-ended questions, it’s important to stay focused, for a variety of reasons: it makes the questions easier for your respondents to understand, it leads to higher-quality, more relevant answers, and it makes analysis more straightforward. In my research, I used a mix of closed-ended and open-ended questions, using the latter in a more targeted manner to help add context to topics I was asking about with my closed-ended questions. This was helpful not just for drawing conclusions in my research, but also provided strong quotes that helped bring my quantitative results to life in the paper.

The structure of the questions also matters: arranging your questions in a logical order helps respondents complete the survey quickly and also impacts how they answer. In my research, I had several topics I wanted to cover, and there was a natural order in how one topic led to the next as I was structuring my research plan, so I made sure the finished survey reflected this. Within each subsection, it’s also useful to consider the individual order of questions. I found it helpful to start with more general questions and then follow up with more specific ones. This also makes it easier to use features like branching logic or conditional questions in your survey platform. Those features are useful for preventing respondents from answering questions that are not relevant to their experiences, which helps with answer quality and completion rate.

2. Based on your experience, what are some of the common pitfalls to avoid when conducting survey research?

I think one of the most significant things to look out for is sampling bias. Sampling bias is when the people who responded to your survey don’t actually represent the population you hoped to study. This is a really challenging aspect of survey research because completing surveys takes effort and it can be hard to convince people that it’s worth their time. There may be certain groups that are more likely to participate than others and it can be easy to leave people out. For this reason, it’s really important to spend an adequate amount of time on recruiting respondents, making sure your survey is straightforward and easy to understand, and building a survey that is as short as possible while adequately capturing the information you need for your analysis.

Failing to pre-test your survey is another common mistake researchers make. Even if you feel like you have a strong understanding of the people you are surveying (or are even part of that cohort yourself), sharing your survey with a few people in your study population before you actually conduct research is extremely helpful for catching issues like poorly worded questions, confusing instructions or technical errors. It’s so much easier to fix those problems before you send out your survey for real.

It’s also helpful to have at least one research expert review your questions for common issues and biases that may negatively impact your research. This could come from an advisor, faculty member, or even a peer. Again, doing this review in advance is vastly preferable to trying to fix issues after you’ve conducted your survey. At Georgetown, we had research experts in our library staff that were very helpful resources on the technical side of things.

3. How do you ensure the validity and reliability of survey data?

First and foremost, I think it’s very helpful to build your survey around tried-and-true research methods that are in common use. Looking at the research methodologies used in similar research can provide an extremely helpful starting point for your own research and help ensure that you’re following best practices and are producing good quality research. You’ll already be reviewing prior art in your literature review, so it’s easy to pay some extra attention to methodologies and think about what strategies seem to have worked well for researchers and what their benefits and limitations are and how those might apply to your research. Especially when it comes to quantitative analysis, it’s important to avoid reinventing the wheel and be conservative about your methodology choices.

Along the same lines, even though a good survey may use multiple types of questions and measurements, it is helpful to avoid too much variation to ensure there is some overall consistency in your survey questions. This is especially helpful if you are exploring multiple dimensions of a topic and wish to compare and contrast results between different topics or questions. It is much easier to make comparisons when the format and measurement methodology are the same.

4. What are some of the ethical considerations you've encountered when conducting survey research?

My research has been focused on the Emergency Management and Public Safety space and not only deals with some sensitive topics around emergency response, but also involves people who may not be able to talk about some aspects of their work in certain contexts. Being very clear about the purpose of the survey and my research was important for building trust with my respondents. It’s useful, when you are able, to allow respondents to answer your survey anonymously. It’s also important to understand the specific ethical guidelines and rules you’re operating within. Your academic program or institution likely has its own policies and procedures, and in some cases, you may need to present your research plan to a review board. Understanding these requirements at the start of your research is an important logistical step as well as crucial for ensuring you conduct research ethically.

5. Do you have any tips for encouraging higher survey response rates?

Two things have worked well for me. First, when you are conducting outreach to recruit participants it’s very helpful that your communications are clear and make a compelling case for why responding to the survey is important. While you want to avoid being overly wordy, it increases response rates when respondents feel some connection to the research. Second, all things being equal, shorter surveys have a higher likelihood of being completed.

Expect to go through an iterative process as you develop your question set, tweaking and removing questions that are not essential to your research question. Many online platforms will give you an estimate of how long your survey will take to complete on average; use this estimate, along with feedback from pre-testers, to determine whether your survey is too long. Like I mentioned earlier, it’s also helpful to use branching logic and/or conditional questions to make sure your respondents only need to answer questions that are relevant to them.

6. How can students use survey data to draw strong conclusions and make meaningful recommendations?

An important part of building your survey is determining the types of data analysis methods you plan to use. These are key to being able to actually draw conclusions from your data. Even simple analysis like calculating frequencies, percentages and averages can be helpful in drawing meaningful information out of your raw data. Similarly, identifying patterns, trends and relationships in the data is crucial to exploring your research question. Using visual aids like charts and graphs is usually extremely helpful for presenting your analysis clearly and illustrating your findings in a more impactful way to your audience.
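The simple analyses described here — frequencies, percentages, and averages — can be sketched in a few lines of Python. The answers and the numeric coding below are invented purely for illustration:

```python
from collections import Counter
from statistics import mean

# Invented responses to a single agree/disagree question.
answers = ["agree", "agree", "neutral", "disagree", "agree", "neutral"]
# Hypothetical numeric coding so responses can be averaged.
scores = {"agree": 3, "neutral": 2, "disagree": 1}

# Frequencies and percentages for each answer option.
counts = Counter(answers)
total = len(answers)
for option, count in counts.most_common():
    # e.g. "agree: 3 (50.0%)"
    print(f"{option}: {count} ({100 * count / total:.1f}%)")

# Average coded score across all respondents.
print("average score:", round(mean(scores[a] for a in answers), 2))
```

Even a table this simple is often enough to spot the patterns and relationships worth turning into charts for your final write-up.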

7. What are some techniques you used to report survey results in a clear and meaningful way in your capstone project?

I used a lot of visual aids in my paper and presentations. Especially when illustrating things like trends and comparisons, showing charts and graphs was a more impactful way of conveying my results compared to simply recounting numerical results. When doing data visualization, it is important to keep in mind basic best practices to ensure you’re illustrating your data in a clear and concise way that doesn’t overwhelm or confuse your audience. In my charts and graphs, I used simple color schemes and selected formats that were appropriate to the data and analysis I was depicting. I also reviewed my graphics to ensure they were relevant and did indeed help illustrate my findings and enhance my work rather than duplicate words already on the page.

8. How did you adapt traditional survey methodologies to your study of emergency management practitioners' use of social media?

For my specific research questions, there wasn’t a lot of prior study, so my focus was on conducting foundational research. That meant that I could use fairly simple survey and analysis methods and still draw useful conclusions from my data. I looked at other survey-based studies — how they structured their research, identified their study populations, and built their surveys — and used this information, along with established best practices, to develop my own research. My primary survey consisted of 40 closed-ended questions grouped into six thematically distinct sections, plus a set of screener questions to confirm respondents were in my target study population. In the screener section, I gave respondents the choice to answer anonymously or identify themselves and also asked for their consent for a follow-up qualitative survey conducted via phone to explore my area of study in further detail.

9. Did you face any unique challenges as a part of designing or conducting your survey? How did you overcome them?

Recruiting participants was very challenging. My goal was to include a representative sample of emergency management practitioners at all levels of government including federal, state, tribal, and local organizations. I was only able to achieve this goal through a significant amount of research to identify and validate potential respondents and I spent a lot of time building my outreach email lists. I also identified specific people that could themselves identify and recruit qualified respondents and used them to increase my overall reach. Overall respondent recruitment took a lot more time and effort than I had originally anticipated, but it was critical to ensuring I had the data I needed to adequately explore my research questions. Leveraging my own network within the Emergency Management industry, and using the resources of my advisor and other faculty helped immensely with the recruiting effort.

10. What advice would you give to researchers looking to study the impact of social media on specialized fields? How can they ensure their research is effective and relevant?

Social media is a unique area of study because it is both an established communication channel while still rapidly changing and evolving. For this reason, it’s important to understand some of the potential limitations of your research and that survey data you collect is really a snapshot in time and attitudes and behaviors are liable to change significantly over time. While this is something to watch out for, it also has some upsides in that social media users have lived through major changes and shifts in technology and usage. If you ask questions about how things have changed and their experiences over time, you can often readily identify clear trends and make strong comparisons that will shed light on your research topic.