Online Survey Checklist

Prior to starting my PhD program, I worked in customer service, technical support, and account management at two tech startups in the San Francisco Bay Area; designed websites on the side; and was a volunteer webmaster for two nonprofits.

I was first introduced to the notion of using online surveys distributed to panels of paid workers for academic research in early 2020, when I volunteered for a research lab at UC Berkeley. Since then, I have been fascinated by the process, and in particular I see great opportunity in improving the design and distribution of online surveys to increase the quality of responses received.

Checklist

Survey Design

I will expand on these tips in future posts, but in the meantime, here is some general advice for surveys used in academic research:

  1. Unless you have a very good reason, do not design a survey that takes more than about 10 minutes to complete.

  2. Plan on paying at least $10/hour. (At the time of writing, the U.S. Federal minimum wage is $7.25/hour.) For a 10-minute survey, that works out to roughly $1.67 per participant.

  3. Always include an informed consent page and, as appropriate, a debriefing page. I recommend giving participants an explicit option to opt out on the consent page—rather than just telling them they can close the window.

  4. When applicable and available, have the participant’s panel ID and confirmation code auto-populate in the survey. This reduces the burden on participants and helps ensure the information entered is accurate. The panel provider you are using (e.g. MTurk or Prolific) should provide instructions on how to do this; see the sketch after this list for one common approach.

  5. As appropriate, ‘brand’ your survey with your institution’s logo—at least on the consent page.

  6. Use standard fonts, font sizing, and text formatting. All too often, I see text that is too small, and every once in a while, I encounter text that is too big. Both are distracting.

  7. Whenever possible, use established items/scales rather than creating your own questions on the fly. This applies to demographic questions as well.

  8. Avoid any feature that is likely to cause problems on smaller screens (e.g. mobile devices), such as slider and matrix-style questions.

  9. Avoid using any feature you don’t understand. Either invest the time to learn how to use the feature properly, or skip it!

  10. Rename your blocks and questions so that when you are reviewing the survey or analyzing data, you can intuitively distinguish between sections.

  11. As appropriate for your survey, randomize the display order of blocks using Survey Flow and questions/answer choices using block and question randomization settings.
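
To make the auto-population in tip 4 concrete, here is a minimal sketch of how a Prolific study link can pass the participant’s ID to a Qualtrics survey as URL query parameters. The Qualtrics URL is a placeholder, and Prolific substitutes the `{{%...%}}` placeholders with each participant’s real values when redirecting them to your survey.

```python
from urllib.parse import urlencode

# Hypothetical survey URL -- replace with your own Qualtrics link.
BASE_URL = "https://yourschool.qualtrics.com/jfe/form/SV_xxxxxxxx"

# Prolific fills in these {{%...%}} placeholders with each
# participant's actual values at redirect time.
params = {
    "PROLIFIC_PID": "{{%PROLIFIC_PID%}}",
    "STUDY_ID": "{{%STUDY_ID%}}",
    "SESSION_ID": "{{%SESSION_ID%}}",
}

# safe='{}%' keeps the placeholder syntax from being percent-encoded.
print(f"{BASE_URL}?{urlencode(params, safe='{}%')}")
```

On the Qualtrics side, add embedded data fields with the same names (e.g. PROLIFIC_PID) at the top of Survey Flow; Qualtrics then captures the query-string values automatically and stores them with each response, so participants never have to type their ID by hand.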

Survey Distribution

Again, I will expand on these topics in future posts, but in the meantime, here is some general guidance on distributing your survey.

  1. Select the appropriate panel platform. Two popular choices are MTurk and Prolific. Each has its own pros and cons.

  2. Set up prescreeners as needed, including prescreening by age, employment status, quality rating, or gender (a scripted example follows this list).

  3. If using MTurk, consider using CloudResearch to access additional prescreening tools, such as the ability to screen for “CloudResearch-Approved” participants only.

  4. Thoughtfully name and describe your survey. This is the primary information participants consider prior to deciding whether to take your survey.

  5. Enable the feature that auto-fills worker IDs, when possible, to reduce unnecessary burden on participants as well as user error.
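
For those publishing to MTurk programmatically, here is a minimal sketch of attaching built-in prescreeners (approval rate and location) to a HIT using boto3. The title, reward, survey URL, and counts below are placeholder values for illustration, not recommendations.

```python
import boto3

mturk = boto3.client("mturk", region_name="us-east-1")

# Your survey link, wrapped in MTurk's ExternalQuestion XML schema.
external_question = """<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
  <ExternalURL>https://yourschool.qualtrics.com/jfe/form/SV_xxxxxxxx</ExternalURL>
  <FrameHeight>600</FrameHeight>
</ExternalQuestion>"""

qualification_requirements = [
    {   # Built-in qualification: HIT approval rate of at least 95%
        "QualificationTypeId": "000000000000000000L0",
        "Comparator": "GreaterThanOrEqualTo",
        "IntegerValues": [95],
    },
    {   # Built-in qualification: worker located in the United States
        "QualificationTypeId": "00000000000000000071",
        "Comparator": "EqualTo",
        "LocaleValues": [{"Country": "US"}],
    },
]

response = mturk.create_hit(
    Title="10-minute academic survey on decision making",
    Description="Answer a short questionnaire (about 10 minutes).",
    Reward="1.67",  # USD per assignment; see the pay guidance above
    AssignmentDurationInSeconds=30 * 60,
    LifetimeInSeconds=3 * 24 * 60 * 60,
    MaxAssignments=100,
    Question=external_question,
    QualificationRequirements=qualification_requirements,
)
print("Created HIT:", response["HIT"]["HITId"])
```

Note that CloudResearch layers its own prescreening tools on top of this; the sketch above covers only what MTurk offers natively.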

Final Checks

Internal Pilot

Once your survey is ready, conduct an internal pilot (e.g. in your lab or department). Ask for feedback on how long the survey took to complete; whether the instructions, questions, and answer choices were clear; and whether anyone encountered typos or other errors. When the internal pilot is complete, review the response data for possible issues, such as missing data (for example, because participants weren’t presented with a question as you’d intended due to an issue in Survey Flow or branching logic), unexpected responses (perhaps due to a data validation issue), or an imbalance in experimental conditions.
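
As a concrete starting point, here is a short sketch of those post-pilot checks using pandas. The file name and the column names (condition, age) are hypothetical; substitute the fields from your own export.

```python
import pandas as pd

# Pilot responses exported to CSV (file and column names assumed).
df = pd.read_csv("pilot_responses.csv")

# Missing data: questions some participants never saw (e.g. due to
# a Survey Flow or branching issue) show up here as NaN counts.
print(df.isna().sum().sort_values(ascending=False).head(10))

# Condition balance: the randomizer should yield roughly equal
# counts per experimental condition.
print(df["condition"].value_counts())

# Unexpected responses: e.g. a numeric field that validation
# should have constrained to a sensible range.
print(df["age"].describe())
```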

Paid Review

I strongly recommend paying an expert to review your survey, both as a participant and administrator. There are several sellers on Fiverr, for example, who will thoroughly review your survey starting at $5 or $10. Aside from their expertise, having a fresh, independent set of eyes on your survey can help catch mistakes that those more familiar with it might repeatedly overlook. I also offer QSF review services—either a complimentary ‘quick check’ or a more comprehensive review. Check out my QSF Review page for more information.

External Pilot

When you feel that your survey is ready for the next step, consider running an external pilot on the platform you plan to use. Depending on your survey and the sorts of issues you are looking for, 10-20 participants is probably sufficient.

Preregister

Preregister your survey file in a format that will allow someone to fully reproduce your survey if they would like to (e.g. QSF for a Qualtrics survey). To download a QSF version of your Qualtrics survey:

  1. Log in to Qualtrics and open your survey

  2. Click Tools at the top

  3. Click Import/Export and then Export Survey. This should download a QSF version of your survey.
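
Before uploading the QSF to your preregistration, it is worth a quick sanity check that the export contains what you expect. A QSF file is plain JSON; the sketch below assumes the standard layout, in which each survey question is a SurveyElements entry with Element set to "SQ".

```python
import json

# File name is a placeholder; use your exported QSF.
with open("my_survey.qsf") as f:
    survey = json.load(f)

# Survey name lives under the SurveyEntry block.
print(survey["SurveyEntry"]["SurveyName"])

# List each question's export tag and question type.
for el in survey["SurveyElements"]:
    if el.get("Element") == "SQ":
        payload = el.get("Payload") or {}
        print(payload.get("DataExportTag"), "-", payload.get("QuestionType"))
```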
