How to survey
This page is currently a draft. More information may be available on the talk page.
This page in a nutshell: An overview of how to create effective surveys for Wikipedia projects and programs, with examples and templates to get you started. Provides advice on setting up a survey, distributing it, and analyzing and reporting results.
The Basics
Surveys are a great way to get feedback from people who have participated in a project, program, experiment or event. Surveys can also be a valuable tool for gathering community input for planning an event, proposing a new project, and generally identifying what kinds of things a group of people do, want or need.
Members of the Wikimedia movement—individual volunteers, community groups, grant recipients, WMF chapters and employees—have used surveys for many years, but we lack shared information resources that describe how to create and distribute surveys and how to analyze and interpret survey data. As a result, we may not be getting as much value out of our surveys as we could and we may sometimes struggle to interpret and act on our survey results effectively. This page is intended to serve as a foundation for easier and more effective survey research. It will:
- Provide an introduction for people creating their first survey, with steps, tips, considerations, use cases, example questions and links to helpful tools and other resources to get them started.
- Outline a set of basic standards, guidelines and templates so that we don't have to 'reinvent the wheel' every time we create a new survey, ensure that we get the highest quality data possible from our surveys, and make it easier to compare survey results with one another and make informed decisions.
This page is set up like a tutorial, but even those with extensive experience in survey research may find these guidelines and their related resources helpful when constructing surveys.
Survey setup: identifying why, who and when
Surveys can be used to gather all sorts of useful data about what people do, what they want, and what they think—but only if you ask the right questions to the right people at the right time. This section provides some guidance on the steps you should perform before you start writing out your list of questions.
Defining survey objectives
Consider the following questions and create a brief rationale for your survey. This will help you explain to others why you want to conduct research in the first place, and guide your own thinking throughout the process.
- what information do I want to get?
- what do I intend to do with that information?
- are there other ways (besides surveys) that would be better for gathering this information? Why or why not?
Identifying survey participants
To identify your survey participants, you first need to identify a population, and then a sample. The population is the full set of people whose experiences, opinions or activities you are potentially interested in learning about. The sample is the set of people you are actually going to ask to take your survey.
In some cases, the population and the sample will be the same: for example, you may want to get feedback from all 75 people who attended a workshop. In other cases, the population is huge and you want to survey a sample of it. In these cases, you might want to gather a random sample (e.g. 100 people who have attended at least one Wikimania since its inception), or a more targeted sample (e.g. the 50 most active dispute resolution mediators on fr.wikipedia.org).
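If your population is available as a list (of usernames, email addresses, etc.), drawing a random sample is easy to script. Here is a minimal sketch in Python, assuming a hypothetical list of attendee usernames:

```python
import random

# Hypothetical population: everyone who has attended at least one Wikimania.
population = ["UserA", "UserB", "UserC"]  # ...load your real list here

random.seed(42)  # fixing the seed makes the sample reproducible later
sample = random.sample(population, min(100, len(population)))
print(sample)
```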
Sometimes you want to give different surveys to different groups: for example, you may wish to give one survey to people who submitted an Individual Engagement Grant proposal, and a slightly different one to IEG reviewers. This approach can be useful if you want different kinds of input from people who participated in the same project or process, but filled different roles.
- Sample size
Another important consideration is sample size. You should never expect responses from everyone you ask to take your survey. The response rate can vary a great deal and can be hard to predict, but it's often between 10% and 20%, and could be even less! So when deciding how many people to ask to take the survey, consider (a short arithmetic sketch follows this list):
- ...the maximum number of eligible participants. How many people qualify for your survey? If the answer is "everyone who has ever edited Wikipedia", you might not need a 20% response rate. Then again, you might get some pretty skewed data from a target population that diverse, so you might want to narrow it down a bit. If there are only 45 possible respondents, you probably want to hear from more than half of them.
- ...the minimum number of responses you need to perform your analysis. This varies a lot based on the kind of analysis you want to do. For quantitative analysis such as comparing numerical editor satisfaction ratings between male and female editors, you will need in excess of 100 respondents (mind the gap!). If you are asking mostly open-ended questions that require respondents to write prose sentences, 100 respondents may actually be more data than you can realistically synthesize into "findings" within a reasonable timeframe.
- ...the number of people you realistically expect will respond. If your target population is small and distinct, you may get a higher response rate than if it is large and poorly-defined. If you are asking people to fill out a survey about something that is very recent (e.g. yesterday's Edit-a-thon) or very relevant (something you know they feel very passionate about), your response rate is also likely to be higher. The personal touch also matters: a survey request posted on someone's User Talk page or made in a face-to-face meeting will usually get a higher percentage of responses than email spam or a banner advertisement. Finally, offering a small honorarium may help in some cases, but usually isn't necessary.
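To put rough numbers on these considerations, here is a minimal sketch of the arithmetic, assuming (hypothetically) that your analysis needs 50 usable responses and you expect a 15% response rate:

```python
import math

needed_responses = 50   # assumed minimum your analysis requires
expected_rate = 0.15    # assumed response rate; often 10-20%, sometimes lower

invitations = math.ceil(needed_responses / expected_rate)
print(f"Invite at least {invitations} people")  # prints 334 for these numbers
```

If that number is larger than the set of people who actually qualify, that is a sign to plan for a smaller analysis or to work on your response rate (personal contact, a recent and relevant topic).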
Deciding when to deploy your survey
When deciding when to survey, it is important to think about whether you are asking people about something that is happening, will happen or has already happened. These are different kinds of surveys, and you will likely want to ask different questions in each case.
- Surveying before you start (formative evaluation)
If you are planning to put on an activity (such as an Edit-a-Thon) or are looking for ways to support the editors in your community, you can use a survey for formative evaluation. Formative evaluation is the process of figuring out the current state of affairs among your target population: What is the status quo? What do people like about it? What could be improved? How should this project move forward?
Some questions that can be answered by this kind of survey include:
- why do people volunteer in dispute resolution noticeboards?
- what kind of help resources do new editors want most?
- what is the best social media channel for reaching out to college-aged women who may be interested in editing Wikipedia?
- Surveying after you finish (summative evaluation)
Summative evaluation is the process of assessing something that has already happened: who participated, what they thought about it, and what lessons might be learned from it.
- Surveying as you go (iterative evaluation)
In some long-term projects, it can also be useful to deploy surveys while the project is still ongoing. This can help you get a sense of how well your approach is working, and how you might change it (assuming you have the means to do so). However, because surveys take time to create, deploy and analyze, you may not have the resources to survey people before, during and after your project, let alone perform multiple iterations. Realistically, using surveys for iterative evaluation may only be useful if you are able to gather and analyze your results very quickly and then make changes based on them.
Survey design: What and how to ask
Creating a list of questions
The most important step in performing a survey is figuring out what it is you want to learn. This sounds obvious, but many researchers realize after they have already deployed their survey that they failed to ask an important question.
To make sure you cover all of your bases, frame your survey around four types of question: background questions, activity questions, experience questions, and opinion questions. You don't need to have an equal number of questions in these four categories, and you might not need them all, but many surveys will be composed of a mix of all four types. For example, you may not be interested in people's general opinions about dispute resolution at all—just what forums they participate in, how regularly, and what role they play. But in general, thinking about your questions in terms of background, activity, experience and opinion is a good strategy.
- Background questions
Questions about the participant themselves: who they are, where they're from, what they do off Wikipedia, etc.
Example background questions:
- how old are you?
- what country do you live in?
- what do you do when you're not editing Wikipedia?
- Activity questions
Questions about what they do now, or have done in the past.
Example activity questions:
- how long have you been editing Wikipedia?
- in a normal month, how many edits do you make to Wikipedia?
- which WikiProjects or noticeboards do you participate in?
- Experience questions
Questions about how they feel (or felt), or what they think, about the specific activities they participated in.
Example experience questions:
- how satisfied were you with the quality of the help documentation, on a scale of 1 to 10?
- what did you find most useful about the event?
- tell me about a recent editing experience where you felt frustrated.
- Opinion questions
Questions about what they think in general about the subject you're interested in, beyond their own specific experiences.
Example opinion questions:
- what kind of help resources do you think new editors want most?
- do you think Wikipedia's dispute resolution processes work well? why or why not?
- Tips & tricks
When deciding which questions to ask, and how many, it is also helpful to keep these considerations in mind:
- Keep your survey as short as possible. People have short attention spans. If you ask them to fill out a 50-question survey, many of them will get bored or frustrated and quit partway through, leaving you with gaps in your data and lingering ill will. Don't ask every possible question; figure out which questions are necessary and ask only those. Realistically, keep the length of time it takes to fill out your survey (in the amount of detail you desire) under 10 minutes.
- Make multiple-choice answers required and free-text answers optional. No one likes being forced to write. Making basic questions multiple choice allows respondents to give you critical information quickly and easily. Requiring answers to multiple-choice questions before the survey can be submitted (most online survey tools allow this) gets you good data, as long as your questions are clear and the choices you provide make sense. Free-response questions can be very useful, and most surveys will contain at least a few, but they should be optional: if people have strong opinions about something, you can be sure they will take advantage of them; no need to force anyone.
Framing your questions
To get good quality responses from a survey, you should make sure you're asking questions the right way.
Framing multiple-choice questions
Make sure your multiple-choice questions cover all the reasonable options, but avoid asking for more detail than you actually need or more detail than the participant can reasonably provide.
- Examples
- 1a. (needs some work)
- How active are you on Wikipedia?
- 1-10 edits a month
- 10-20 edits a month
- 20-30 edits a month
- 30-40 edits a month
- 40-50 edits a month
- 50-60 edits a month
- ...etc...
- 1b. (much better)
- In a normal month, how many edits do you make to Wikipedia?
- 1-10 edits
- 11-100 edits
- 101-1000 edits
- more than 1000 edits
- I don't regularly edit Wikipedia
The second example is an improvement over the first example in several ways:
- Ask for an appropriate level of detail. Most people won't be able to accurately tell you off the top of their heads whether they average 20 or 30 edits per month. And if you're just trying to get a sense of how active your participants are, you probably don't need that much detail. Someone who averages 500 edits per month is a lot more active than someone who makes 20. But someone who makes 400 edits and someone who makes 900 edits are both "highly active Wikipedians" by anybody's definition.
- Use clear language. The first example asks "how active are you?" and then talks about edit counts. This is a good measure, but it is only one measure of activity on Wikipedia! Make sure you are clear about what you are asking. The first example also repeats "per month" in every option; it would be more readable if it just specified this once, like the second example does.
- Capture important exceptions. What if one (or more) of your participants doesn't edit Wikipedia at all? This might not be the case if you're doing a dispute resolution participant survey, but what if you are surveying conference participants, some of whom are not yet editors? You don't always have to cover every edge case, but you should try your best to anticipate the range of possible responses when you set up a multiple choice question.
Framing open-ended questions
Open-ended questions are used to draw out more detailed, personal responses. People may skip over these because they don't have anything to add or they don't feel like writing a lot. However, you can increase both the number and the quality of your open-ended responses by phrasing them in a way that encourages storytelling.
Again, let's consider two different ways of asking the same questions:
- Examples
- 2a. (needs work)
- Has the kind of editing you do changed since you first joined Wikipedia?
- 2b. (much better)
- How has the kind of editing you do changed since you first joined Wikipedia?
- 3a. (needs work)
- What is the most difficult thing about editing Wikipedia?
- 3b. (much better)
- Tell me about a recent editing experience where you felt frustrated.
- Tips & tricks
- Avoid the yes/no trap. It's easy to imagine someone just answering "Yes" or "No" to Example 2a. If that's all you want to know, you should probably make this question multiple choice! But if you actually want to get some detail, ask them how their work has changed, as Example 2b does. This will prompt them to think for at least a few seconds before they answer. Their answer may still be "It hasn't changed", but they are more likely to explain why it hasn't changed, which is a useful thing to know.
- Ask for concrete examples. People generally find it easier to remember concrete things than abstract ones. Specific examples are especially useful data, but it can be hard to think one up on command. Asking people to think of things that happened recently—things that are still fresh and clear in their memory—makes it easier to recall a specific example. If that makes them think of a better example that's less recent, even better!
- Ask for their experience, not their opinions. In most cases, personal experiences are more valuable survey results than vague impressions, gut reactions or unsubstantiated opinions. Example 3a could be interpreted in any number of ways: does "difficult" mean a negative experience, or one that was challenging but ultimately positive? Asking someone how they were affected by an experience (i.e. how they felt) will usually lead to more accurate and useful feedback.
Survey deployment: where and by what means
Contacting participants
Once you have your survey written, you need to get it in front of your participants. There are many ways to do this; here are a few options.
- Email. If you have the email addresses of the people you want to contact, email is a great way to reach them and may yield a higher response rate than other methods. Include the link to the survey in the email. You can also use the "Email this user" feature on MediaWiki, although keep in mind that not all editors have that feature enabled.
- In person. If you are hosting an offline event such as a conference, workshop or edit-a-thon, you may consider asking participants to fill out your survey at the event itself. You can either set up a table in a prominent location (such as an entrance or exit), or walk around with a clipboard. If you want to get information on the background of your event participants, ask them at the beginning of your event. If you want to get feedback on the event itself (while it's still fresh in people's minds), you can ask them as they leave.
- Global message delivery. If you know the home wiki and usernames of your participants, you can use Wikimedia's Global Message Delivery bot, which posts messages to users' talk pages. You will need to either request to be added to the bot's access list, or ask one of the editors already listed there to distribute your survey using Global Message Delivery.
- Manual posting. If you are only surveying a few editors, your fastest option may be to just manually post a templated message including a link to the survey on the editors' talk pages.
- Banners. If you would like to get your survey out to a large number of editors on one or more Wikimedia wikis, but don't care who takes the survey, you may be able to use Central Notice. However, this option is not appropriate for most surveys because you will usually want to poll a specific group of editors. You will also need to get an administrator to set up your Central Notice banner, and you may need additional approvals as well.
- Noticeboards and WikiProjects. If you are interested in polling specific groups of editors, such as administrators or editors interested in particular subjects or activities (such as public art or dispute resolution), consider posting the survey request on the talk page of a noticeboard or WikiProject.
Eliciting responses
- phrasing of the survey solicitation (with examples)
- legal considerations
Closing your survey
If you are conducting an online survey, you will have to decide when to stop accepting responses. Usually, you will want to keep your survey open for at least a week or two to make sure all potential participants have an opportunity to respond. If you are using a survey tool that allows you to see how many responses have come in, check back every few days to see how many new responses you are getting. If after a week or so you are no longer getting any new responses, you may decide to close your survey early. However, be careful not to close your survey too early: you may lose valuable data from people who intend to fill out your survey but haven't gotten to it yet. It's a good idea to state in your solicitation message how long the survey will be open, and to stick to that schedule.
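If your survey tool exports response timestamps, a quick per-day tally shows whether responses have tapered off. A minimal sketch, assuming a hypothetical CSV export with an ISO-formatted "timestamp" column:

```python
import csv
from collections import Counter

daily_counts = Counter()
with open("responses.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        daily_counts[row["timestamp"][:10]] += 1  # keep the YYYY-MM-DD part

for day in sorted(daily_counts):
    print(day, daily_counts[day])
# A tail of days with very few (or no) new responses suggests it is safe to close.
```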
Analysis and reporting: understanding and communicating your results
Once you've gathered your survey data, the next steps are analysis and reporting. It's important to spend some time looking through your data and crafting a report: you don't ever just want to dump your data onto a wiki page and say "this is what happened." Remember: the reason you used a survey in the first place is to discover things you couldn't have discovered otherwise, and to share that information with others. In order for you or anyone else to learn from your survey, you need to study your results and describe them.
Analysis and reporting may sound intimidating and complex, but they don't need to be. If it helps, think of the process as interpreting and storytelling instead. The kind of analysis you perform depends on the kind of data you collected. If your survey contains only open-ended questions, you'll want to share examples of your participants' responses. If it contains only multiple-choice questions, you'll want to organize your data into tables and charts, and possibly present statistics. Most surveys will include both kinds of data.
And no matter what kind of data you have, there are certain details you should include in all survey reports.
- Basic details to include in all reports
- Who you sent your survey to. What were the criteria for being offered a survey?
- The number of people you sent the survey to. How many did you offer the survey to?
- The number of people who responded to the survey. How many people responded? If you surveyed 40 people and got 35 responses, people may interpret your results differently than if you surveyed 700 people and got 35 responses.
- Demographic data about survey participants. If you asked respondents to provide demographic data (such as their age, gender, country of origin, home wiki, etc.), you should report this information either in aggregate (e.g. 22 male/15 female) or in a list (e.g. countries represented: Indonesia, Kenya, Mexico, Italy, Japan). A short tallying sketch follows this list.
- A link to the survey questionnaire. The basic survey questionnaire, ideally on a wiki page or in a flat format such as PDF.
- A link to the full dataset of the survey. All responses to all questions, especially if you only include aggregate responses or responses to selected questions in your report. A text file, comma-separated value file, spreadsheet file or wiki page are fine. However, if you ask participants for personally-identifiable information such as their real name or their email address, these should not be included.
- A description of your methods and the tools you used. How did you decide who to offer the survey to? How did you contact them (e.g. email, talk page post)? How long was the survey open for? What survey tool did you use (e.g. SurveyMonkey, LimeSurvey, printed sheets)?
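Aggregating demographic answers takes only a line or two in most environments. A minimal sketch, assuming a hypothetical list with one answer per respondent:

```python
from collections import Counter

# Hypothetical country-of-origin answers, one per respondent.
countries = ["Indonesia", "Kenya", "Mexico", "Kenya", "Italy", "Japan", "Kenya"]

print(Counter(countries))
# Counter({'Kenya': 3, 'Indonesia': 1, 'Mexico': 1, 'Italy': 1, 'Japan': 1})
```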
- Reporting quantitative data
- Highlight important findings with simple visualizations. If you have a large survey, you don't need to create a chart or a graph for every one of your questions. But any quantitative findings you want people to notice should be displayed in a table or a chart with a descriptive caption. These visualizations don't have to be complex to be useful: in many cases simple line and bar charts are the best option.
- Provide simple counts and statistics in tables. If you collected more than 10 responses to your survey, include simple descriptive statistics; a short sketch of computing them follows below. These may include sample size, averages, percentages and (in some cases) standard deviations and cross-tabulations. For example, if 33 people responded to your question How satisfied were you with the quality of the help documentation on a scale of 1 to 10?, provide their mean satisfaction score (e.g. 7.3). For larger surveys (especially ones attempting to answer research questions), you may decide to also include inferential statistics such as t-tests, chi-squares or correlations. But if you don't already have a background in statistics and/or don't need to prove or disprove a hypothesis, you may want to stick to what you know.
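As an illustration of the descriptive statistics above, here is a minimal sketch using Python's standard library, with hypothetical 1-10 satisfaction ratings:

```python
import statistics

# Hypothetical satisfaction ratings (1 to 10), one per respondent.
ratings = [8, 7, 9, 6, 8, 7, 10, 5, 7, 8]

print("n =", len(ratings))                             # 10
print("mean =", round(statistics.mean(ratings), 1))    # 7.5
print("stdev =", round(statistics.stdev(ratings), 1))  # 1.4
print("% rating 8+ =", 100 * sum(r >= 8 for r in ratings) // len(ratings))  # 50
```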
- Reporting qualitative data
- Highlight important themes with quotes. If, in reading through your responses, you see different respondents providing similar answers to the same question, or if you notice people mentioning the same thing over and over, you may have identified a theme; a rough keyword-tally sketch for surfacing candidate themes follows this list. If multiple people mention that they appreciated the one-on-one support they received at your editing workshop, that is probably a theme you want to highlight in your report, either because it indicates that one-on-one support is a useful strategy for workshops, or because you think it explains why people gave the workshop such high satisfaction ratings in the multiple-choice responses. In this case, you could include one or two quotes from respondents in your report under a header like "Attendees appreciated one-on-one editing support", and then talk about why you think that strategy worked so well.
- Highlight important exceptions. In your report, you want to give the reader a sense of the range of responses that were given to your questions, not just the most common ones. In the previous example, this could mean including a quote from a workshop participant who didn't appreciate the hands-on support they received (they thought it was too invasive, or condescending, or they generally prefer to figure things out themselves, etc.). You don't have to report every single exception, just the ones that are really exceptional (e.g. reports of exceptionally negative or positive experiences), or that you feel are exceptionally important!
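Reading every response yourself is irreplaceable, but a crude keyword tally can help you spot candidate themes to read for. A minimal sketch, assuming a hypothetical list of free-text answers:

```python
from collections import Counter

# Hypothetical free-text answers to one open-ended question.
responses = [
    "The one-on-one support was fantastic.",
    "I liked the one-on-one help at my table.",
    "Wifi was slow, but the mentors were great.",
]

# Count how many responses each word appears in (rough theme-spotting only;
# common words like "the" will dominate, so skim past them).
word_counts = Counter()
for text in responses:
    word_counts.update({w.strip(".,!?").lower() for w in text.split()})

print(word_counts.most_common(5))  # "one-on-one" appears in 2 of 3 responses
```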
- Summing up
You should conclude your survey report with a section that sums up what you did and what you found. The kind of information you include in this conclusion depends on why you conducted the survey and how you would like your findings to be used. Some sections that are often useful to include in this summary are:
- Key takeaways. This is recommended for all reports. This section can be a short paragraph or a bulleted list of the things that you really want your reader to remember about your survey. Examples: you found that women preferred video tutorials 3 times more than men did; you found that satisfaction with Dispute Resolution processes is 25% lower this year than last year; you found that 15 of your 30 workshop participants did not know how to edit a page at the beginning of your workshop.
- Reflections. This is recommended for all reports. This section can include your own reflection on the survey process or the findings. What would you do differently if you had to deploy the survey again? What surprised you, encouraged you or disappointed you most about your findings?
- Recommendations. This is recommended for most reports. In most cases, you will want to end your report with some specific recommendations. These can be based on your findings (e.g. include more video tutorials in help documentation to support female editors) or your experience (e.g. ask survey respondents to list both their country of origin and their home wiki in international surveys). It's perfectly okay for these recommendations to be re-statements of your Key takeaways and Reflections; these are your most important points, after all.
Tips & tricks
- Clean your data. If you are using a survey tool that logs information such as how long it took someone to complete a survey, or whether the respondent stopped answering in the middle, you may want to clean your data. Cleaning your data involves removing responses that seem to be invalid. For example, if a respondent took only 10 seconds to complete a 20-question survey, you may not want to include their data in your findings, because it is unlikely that they read and answered all the questions. If you do remove invalid responses from your survey, you should say so in your report. (A minimal filtering sketch follows this list.)
- Spread the word about your findings. Let people who may be interested in your results know about your report. Did you run a post-workshop survey? Then tell other people who are contemplating putting on a similar workshop about your survey. Send a message on relevant email lists and noticeboards. Tweet about it! Facebook about it! You may even want to write a short blog post for the Wikimedia blog. And always make sure to reach out to your survey participants specifically and let them know that their responses have been reported.
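As an illustration of the data-cleaning tip above, here is a minimal sketch that drops suspiciously fast responses, assuming a hypothetical export with a "seconds_to_complete" column (the column name and the 60-second threshold are assumptions, not standards):

```python
import csv

MIN_SECONDS = 60  # assumed threshold; pick one appropriate to your survey length

kept, dropped = [], 0
with open("responses.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if float(row["seconds_to_complete"]) >= MIN_SECONDS:
            kept.append(row)
        else:
            dropped += 1

print(f"Kept {len(kept)} responses; dropped {dropped} as likely invalid")
# Remember to say in your report how many responses you removed, and why.
```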
Survey research: ethics and standards
When performing research on any community it is important to consider the ethical implications and potential impact of your research activities and your results. In a diverse volunteer-driven movement like ours, it is likewise of paramount importance that we as researchers hold ourselves to a high standard in the way we motivate, conduct and report the research we do.
- Research ethics
- Respect for privacy and confidentiality. How you ask for information determines how you may use it. You need to explain in surveys how you will use the responses. Participants must agree, and you must keep any promises about use and confidentiality. In general, do not ask users to provide personally identifiable information in surveys, or ask for demographic information beyond what you need for the purposes of your research.
- Respect for time. Surveys should only be offered to users for whom the survey is relevant—avoid spamming thousands of random users with a survey. Requests to opt out of such solicitations should be respected. To avoid survey fatigue, surveys should only be as long as they need to be to gather the necessary information. Less is more.
- Inclusivity. Surveys should be designed as much as possible to allow all target participants to provide their views and relate their experiences. Participants should never be intentionally excluded from participating in a survey as a means of muting or de-emphasizing a valid perspective on the issue under investigation.
These ethical considerations do not constitute legal advice and are not legally binding. Furthermore, research ethics policies have been proposed and articulated in other spaces on Wikimedia wikis. The considerations presented here are not intended to supersede existing policies or proposals, but rather to provide guidance for members of our movement who engage in survey research.
- Research standards
- Transparency. All relevant documentation for a survey project should be accessible. The rationale for conducting the survey should be clearly stated, the methods used for sampling, analysis, etc. should be described in detail, and the findings should be published openly. Survey participants in particular should be made aware of the findings.
- Compatibility. Every survey should increase our knowledge about our movement and its members. We can't learn very much from research findings that cannot be verified, reproduced or reconsidered in the light of other information. To the fullest extent possible, surveys should be conducted in such a way that their findings can be compared to the results of other surveys or to research conducted using other methods (such as quantitative analysis, interviews, or focus groups).
- Quality. We perform research to help us make better decisions. Poor quality research can lead to bad decisions. Respect your participants, yourself and your wiki by being the best researcher you can be.
Sample surveys
Sample question sets
Below are some survey question sets from previous Wikimedia surveys.
- IEG Round 1 surveys for reviewers and proposal submitters
- Teahouse Pilot wrap up survey questions
- Dispute resolution project survey questions
- Help Project Fellowship survey questions and results
- Wikimedia Strategic Planning former administrators survey questions
- Wikimedia Strategic Planning former contributors survey questions
- 2011 Wikimedia Editors survey questionnaire (PDF)
Sample survey reports
Below are some examples of project reports that include data from surveys.
- Wikipedia Dispute Resolution Community Fellowship survey
- Wikipedia Help Project Community Fellowship survey
- Wikimedia Foundation Editor Survey 2011
- Research Committee expert involvement survey
- Wikimedia Strategic Planning Former Contributors survey
- Individual Engagement Grants participant and reviewer survey - April 2013
- Wikipedia Teahouse project research reports (including survey methodology and responses) (pilot research report, phase 2 research report)
- Berlin Hackathon 2012 retrospective