Best practices - Improving response rates
1. Keep Your Email Invitation Short
Keep your email invitation short and simple, with a single link: the one to the survey. Be sure to explain the following:
- Who you are and the purpose of your study.
- The survey's benefit to the individual.
- The length of the survey: if it is short, emphasize that aspect. Always be truthful about the time required; people are more likely to stick with a longer survey if they know in advance how long it will take.
- A privacy statement, if required by your organization.
2. Make Your First Survey Page Simple
Once people have decided to take your survey, they will want to get started as soon as possible. Studies show most people do not read extensive instructions, so keep the welcome message at the start of the survey concise and to the point.
3. Be Clear About Privacy Protections
The first page of the survey is the place to explain how you will use the responses. People are more comfortable sharing information online if they know how it will be used. Are responders anonymous? Are responses confidential, or will they be shared with others? If your organization has a "Research Involving Human Subjects" policy, you can include it here as well.
4. Send Reminder Emails
Some people will take your survey right away. You will get increased responses, however, if you send follow-up email reminders with the survey link included. It is best not to send more than two reminder emails. Be sure to filter out email addresses of people who do not wish to be contacted again.
5. Consider Offering Incentives (gifts, prizes, etc.)
Studies show that incentives do not need to be large to increase response rates. A small token, gift certificate, or similar reward can increase responses considerably.
6. Some People Just Need To Share Their Opinion
Many people will complete a survey to share their opinions rather than to receive an incentive. Consider making your incentive optional; for example, let respondents choose whether to be entered in a raffle or to have the incentive donated to a relevant cause.
7. Use Graphics and Internet Features Strategically
Surveys generally do not need fancy graphics, and graphics can sometimes distract from the content of the survey or influence answers. There are, however, a few ways to use graphics to improve your survey responses, such as providing an image and web link for a prize or incentive, or embedding multimedia content in the survey.
8. Publish Results Online to Survey Participants
People who respond often want to see the results, and knowing they will receive the results will encourage them to complete the survey.
More tips
For further insight on improving response rates, we recommend reading the following articles.
Increase response rates
- How to Increase Your Course Evaluation Response Rates
- 8 Tips to Increasing Course Evaluation Completion Rates
Student communication
- 8 Points You Must Communicate to Students In Order to Achieve A Successful Course Evaluation Process
Bluenotes group community-led webinars
- Singapore University of Technology and Design (SUTD) - How SUTD Reached a 95% Response Rate on their Course and Instructor Evaluations in 6 Months - [watch webinar] [slide deck]
- Washtenaw Community College - Yes, it is possible to achieve 85% response rates to course evaluations for online programs - [watch webinar]
- University of Pennsylvania - Linking course evaluations to grade access: Penn’s approach to raising response rates - [watch webinar]
- Washington State University - Increasing engagement in online course evaluations at WSU – [watch webinar]
- Virginia Commonwealth University - Take your course evaluation email notifications to the next level - [watch webinar]
- Virginia Commonwealth University - Expert module evaluation email notifications that boost response - [watch webinar]
Bluenotes group case studies
- Universidad de las Américas Puebla (UDLAP) - [read case study]
- University of Sharjah - [read case study]
Bluenotes group conference presentations & slide decks
- University of Kentucky (Bluenotes APAC 2018) - Experiences of integrating Blue with Blackboard and Canvas: A deeper integration - [slide deck] [video presentation]
- University of Louisville (Bluenotes Americas 2018) - Using Blue Text Analytics to Measure Student Sentiment – Priority Grade Access Implementation - [slide deck]
- University of Louisville (Bluenotes Americas 2017) - Response Rate Monitoring: Technical Overview and Implementation Case Study - [video presentation]
- Duquesne University, University of Pittsburgh, University of Louisville (Bluenotes Americas 2017) - Marketing outreach to improve response rates - [video presentation]
References
- Young, K., Joines, J., Standish, T., & Gallagher, V. (2019). Student evaluations of teaching: the impact of faculty procedures on response rates. Assessment & Evaluation in Higher Education, 44(1), 37-49. [Read here].
- Standish, T. (2019). Marketing students at work: organisation citizenship behaviour training as a tool to minimise survey non-response. Assessment & Evaluation in Higher Education, 44(2), 203-215. [Read here].
- Walsh, M. (2019). New SEI platform results in greater student response rate, more representative results. The Lantern. [Read here].
- Standish, T., Joines, J. A., Young, K. R., & Gallagher, V. J. (2018). Improving SET Response Rates: Synchronous Online Administration as a Tool to Improve Evaluation Quality. Research in Higher Education, 59(6), 812-823. [Read here].
- Minimal Response Rates by Population Size. Metrics that Matter, 2018.
- Online Course Evaluations and Response Rates. University of Saskatchewan.
- Carlozzi, M. (2018). Rate my attitude: research agendas and RateMyProfessor scores. Assessment & Evaluation in Higher Education, 43(3), 359-368. [Read here].
- Oermann, M. H., Conklin, J. L., Rushton, S., & Bush, M. A. (2018). Student evaluations of teaching (SET): Guidelines for their use. Nursing Forum. [Read here].
- Linse, A. R. (2017). Interpreting and using student ratings data: Guidance for faculty serving as administrators and on evaluation committees. Studies in Educational Evaluation, 54, 94-106. [Read here].
- Spooren, P., & Christiaens, W. (2017). I liked your course because I believe in (the power of) student evaluations of teaching (SET). Students’ perceptions of a teaching evaluation process and their relationships with SET scores. Studies in Educational Evaluation, 54, 43-49. [Read here].
- Chapman, D. D., & Joines, J. A. (2017). Strategies for Increasing Response Rates for Online End-of-Course Evaluations. International Journal of Teaching and Learning in Higher Education, 29(1), 47-60. [Read here].
- Al Kuwaiti, A., AlQuraan, M., & Subbarayalu, A. V. (2016). Understanding the effect of response rate and class size interaction on students evaluation of teaching in a higher education. Cogent Education, 3(1), 1204082. [Read here].
- Sundstrom, E. D., Hardin, E. E., & Shaffer, M. J. (2016). Extra Credit Micro-Incentives and Response Rates for Online Course Evaluations: Two Quasi-Experiments. Teaching of Psychology, 43(4), 276-284. [Read here].
- Gerbase, M. W., Germond, M., Cerutti, B., Vu, N. V., & Baroffio, A. (2015). How Many Responses Do We Need? Using Generalizability Analysis to Estimate Minimum Necessary Response Rates for Online Student Evaluations. Teaching and Learning in Medicine, 27(4). [Read here].
- Zumrawi, A. A., Bates, S. P., & Schroeder, M. (2014). What response rates are needed to make reliable inferences from student evaluations of teaching? Educational Research and Evaluation, 20(7-8), 557-563. [Read here].
- Nulty, D. D. (2008). The adequacy of response rates to online and paper surveys: what can be done? Assessment & Evaluation in Higher Education, 301-314. [Read here].