Foreword

Public Works and Government Services Canada gratefully acknowledges the work of Steve Kiar and Alethea Woods of Phoenix Strategic Perspectives, Inc., who reviewed a wide range of practices throughout the discipline and industry of telephone survey research, analyzed the best practices described herein and wrote the report.

Executive Summary

This best practices document, Improving Respondent Cooperation for Telephone Surveys, is intended to provide public servants undertaking public opinion research on behalf of the Government of Canada with a practical guide to achieving and maintaining high response rates in telephone surveys. The Public Opinion Research Directorate (PORD) of Public Works and Government Services Canada (PWGSC) produced this report to help government departments and agencies conduct or obtain high-quality public opinion research, in order to ensure that they consider the needs and concerns of Canadians when designing and implementing policies, programs and services. This report was conceived primarily in response to two concerns: the ongoing need for high-quality information, expressed by suppliers and clients throughout the research industry; and declining participation in telephone surveys worldwide, which may ultimately compromise survey quality and yield samples that are not representative of their target populations.

While the theme of this report is increasing response rates in telephone surveys, the report is not intended to be solely a guide to increasing response rates to meet an arbitrary objective. The focus is rather on strategies to help ensure that telephone surveys conducted for the Government of Canada achieve the highest possible response rates within the parameters of each study. These best practices incorporate guidelines and procedures to be used in the different phases of survey research.

This report is based on a combination of reviews of academic studies and interviews with, and written feedback from, knowledgeable persons in government, the market research industry and academia. It presents 50 best practices that can help to improve response rates in telephone surveys. Based on their primary and secondary research, the authors consider the following best practices to be the most effective of the 50 presented.

  • Select the most appropriate survey method: For public opinion surveys conducted for the Government of Canada on public policy issues where a national sample of the adult population is required, randomly selected telephone samples, at the time of writing, normally cover a larger proportion of the population and are therefore more representative than samples available on most Internet panels; however, many specialized populations are now more effectively sampled via the Internet.
  • Consider alternative methods: Alternative data collection methods may be more appropriate than traditional ones for hard-to-reach respondents. Mixed-mode surveys, which are based on more than one data collection method, have been found to yield higher response rates.
  • Lengthen the survey period: The length of the data collection period can have a direct impact on response rates. Studies quoted in this report have found that longer interviewing periods can double or even triple response rates. The length of time allotted for data collection should reflect incidence level, target audience and research objectives.
  • Keep the interview short: Longer interviews, especially those over 20 minutes, are widely thought to have a negative impact on response rates. In practical terms, surveys of 10 minutes or less are considered not overly burdensome. Controlling survey length necessarily involves weighing the relative priority of questionnaire content.
  • Include a good introduction at the beginning of the interview: Studies have found that the majority of refusals occur during the first minute of the call. Therefore, effective introductions may increase the likelihood that a potential respondent will become a participating respondent. The report recommends that interviewers use personalization, identify the sponsor, describe the survey objectives, and confirm that confidentiality and privacy will be respected.
  • Reveal the sponsor's identity: Telling potential respondents who is sponsoring the survey may increase survey response rates. Research suggests that government-sponsored or government-conducted surveys achieve higher response rates than surveys sponsored by most other organizations.
  • Consider incentives: There is a general consensus among researchers that monetary and non-monetary incentives are an effective way to increase response rates. For special-audience research, the distribution of a research summary is a valuable and relatively common type of non-monetary incentive. Where possible, the interviewer should offer the incentive when first contacting the respondent.
  • Vary the call scheduling: Varying the timing of calls can reduce the number of call attempts required to reach the respondent and increase the likelihood of reaching a household or business. Maximizing response rates requires calling at times that are most suitable for the survey sample, while still spreading interviewing across different time periods, such as different hours of the day or days of the week, so that the sample is representative of the targeted population.
  • Increase the number of callbacks: An adequate number of callbacks can also improve the response rate. Increasing the number of callbacks, up to a certain point, will result in higher response rates. This approach should be combined with varying the call scheduling (see above); a simple scheduling sketch follows this list.
  • Ensure that interviewers are well trained and well briefed: The use of well-trained and professional interviewers will improve response rates. Project-specific interviewer briefings should be provided for all telephone surveys.
  • Consider refusal conversions: The survey organization should attempt to convert respondents who have initially refused to participate. Refusal conversions are normally done in subsequent telephone calls by more senior, experienced interviewers.
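To illustrate how the call scheduling and callback practices above might be operationalized in a call centre's dialing rules, the following is a minimal Python sketch that rotates callback attempts across different day parts. The day parts and the attempt limit are hypothetical assumptions for illustration, not values prescribed by this report.

```python
# Purely illustrative sketch: rotate callback attempts across different day parts so that
# repeated attempts to the same number do not all fall at the same time of day or week.
# The day parts and the attempt limit below are hypothetical, not prescribed by this report.

DAY_PARTS = ["weekday afternoon", "weekday evening", "weekend afternoon", "weekend evening"]
MAX_ATTEMPTS = 8  # hypothetical callback limit

def callback_schedule(max_attempts=MAX_ATTEMPTS):
    """Assign each successive attempt to a different day part, cycling through the list."""
    return [DAY_PARTS[attempt % len(DAY_PARTS)] for attempt in range(max_attempts)]

print(callback_schedule())
```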

The above best practices are those that the authors consider to have the greatest impact on survey response. Many other practices discussed in this report are thought to have a medium or low impact.

Please also see: Checklist of Best Practices and Assessment of Relative Impact of Best Practices on Response Rates, which follows.

This set of best practices was compiled to provide users of public opinion research in the Government of Canada with the information necessary to understand issues related to survey response and the factors that affect response rates. For more information on the 50 best practices, we invite the reader to examine the full set of best practices outlined in the following pages.

Checklist of Best Practices and Assessment of Relative Impact of Best Practices on Response Rates

This set of best practices is designed to help maximize response rates for Government of Canada telephone surveys. Use this checklist to guide decision-making at each stage of the research project. Remember, not all of these best practices will be appropriate or feasible for all POR studies. However, adopting as many best practices as possible when doing a study can be expected to increase response rates.

Just as not all of the best practices will apply to all telephone surveys, each best practice is not equal in terms of its impact on response rates. Some of the best practices will have a greater impact on maximizing response rates than others. For example, response rates are best addressed during the design and data collection phases of a study; efforts undertaken during analysis and reporting will do nothing directly to improve response rates. In addition, none of the best practices on its own can be expected to have a significant impact on response rates. Rather, adopting as many best practices as possible during a study can be expected to increase response rates. Conversely, not incorporating the best practices appropriate to a study can decrease response rates.

Given the differential impact of the best practices, and the unique constraints of budget and time for each POR telephone survey, it might be necessary to make trade-offs when designing research. The following guide to the approximate relative impact of the best practices on response rates can help organizations make those decisions. Estimates of the impact of these 50 best practices are based on qualitative assessments by the authors of this study, Phoenix Strategic Perspectives, Inc. In turn, the authors based these qualitative assessments on interviews with experts and practitioners in the field and on an extensive literature review.

Research Design/Assessment of Impact

Choose an appropriate data collection method (BP 1.0)

  • Select the most appropriate survey method. (BP 1.0.1) High
  • Consider alternative methods to contact hard-to-reach respondents. (BP 1.0.2) High
  • Consider allowing proxy respondents. (BP 1.0.3) Low
  • Collect the data at the most appropriate time of year. (BP 1.0.4) Medium
  • Allow adequate time to collect the data. (BP 1.0.5) High

Ensure adequate population coverage (BP 1.1)

  • Define the research population. (BP 1.1.1) Medium
  • Select an adequate sample size. (BP 1.1.2) Low
  • Reduce coverage error. (BP 1.1.3) Low

Minimize respondent burden (BP 1.2)

  • Keep the interview as short as possible. (BP 1.2.1) High
  • Design a well-structured questionnaire. (BP 1.2.2) Medium
  • Review the translated questionnaire. (BP 1.2.3) Medium
  • Pre-test the questionnaire. (BP 1.2.4) Medium

Incorporate methods to encourage participation (BP 1.3)

  • Notify potential respondents in advance of the fieldwork, where possible. (BP 1.3.1) Medium
  • Use effective survey introductions. (BP 1.3.2) High
  • Offer assurances of confidentiality. (BP 1.3.3) Low
  • Consider using incentives, where possible. (BP 1.3.4) High
  • Reveal survey sponsorship. (BP 1.3.5) High
  • Offer a validation source. (BP 1.3.6) Medium
  • Inform relevant government call centres or offices about the survey. (BP 1.3.7) Low

Data Collection/Assessment of Impact

Ensure effective sample management (BP 2.0)

  • Hire a data collection firm that submits to recognized field audits. (BP 2.0.1) Medium
  • Ration sample resources. (BP 2.0.2) Medium
  • Accurately track the disposition of calls. (BP 2.0.3) Low

Make efforts to maximize contact rates (BP 2.1)

  • Vary the call scheduling. (BP 2.1.1) High
  • Offer flexible callbacks and appointments. (BP 2.1.2) Medium
  • Ensure an adequate number of callbacks. (BP 2.1.3) High
  • Schedule extra callbacks to households with an initial language barrier. (BP 2.1.4) Low
  • Leave messages for some studies. (BP 2.1.5) Medium
  • Provide a toll-free number for studies with hard-to-reach respondents. (BP 2.1.6) Medium

Take steps to minimize refusals and terminations (BP 2.2)

  • Ensure use of well-trained, effective interviewers. (BP 2.2.1) High
  • Request monitoring of data collection at all times. (BP 2.2.2) Medium
  • Monitor reasons for non-response during data collection. (BP 2.2.3) Low
  • Monitor non-response levels among different segments of the target population. (BP 2.2.4) Low
  • Attempt refusal conversions. (BP 2.2.5) High

Analysis/Assessment of Impact

Address survey non-response (BP 3.0)

  • Compare response rates across sub-groups. (BP 3.0.1) Low
  • Weight survey data, where possible (illustrated in the sketch after this list). (BP 3.0.2) Low
  • Compare respondents and non-respondents. (BP 3.0.3) Low
  • Conduct non-respondent follow-ups. (BP 3.0.4) Low
  • Compare "early" to "later" respondents. (BP 3.0.5) Low

Reporting/Assessment of Impact

Document the response rate (BP 4.0)

  • Ensure the research supplier provides the record of calls. (BP 4.0.1) Low
  • Calculate the response rate using an approved method (illustrated in the sketch after this list). (BP 4.0.2) Low
  • Ensure the response rate is recorded in the final report. (BP 4.0.3) Low
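As an illustration of calculating the response rate (BP 4.0.2), the following sketch uses the general structure shared by common empirical response rate formulas: responding units divided by the sum of responding, in-scope non-responding, and unresolved units attempted. The exact approved formula and disposition categories should be taken from the body of this report; the counts shown are hypothetical.

```python
# Illustrative sketch only: BP 4.0.2 calls for calculating the response rate using an
# approved method; the exact formula and disposition categories should be taken from the
# body of this report or the relevant industry standard. The counts below are hypothetical.

def response_rate(responding, in_scope_nonresponding, unresolved):
    """Responding units / (responding + in-scope non-responding + unresolved units)."""
    return responding / (responding + in_scope_nonresponding + unresolved)

# Hypothetical call dispositions taken from a record of calls.
responding = 1000              # e.g., completed interviews
in_scope_nonresponding = 1500  # e.g., refusals, language barriers, respondent unavailable
unresolved = 2500              # e.g., no answer, busy, answering machine after all callbacks

print(f"Response rate: {response_rate(responding, in_scope_nonresponding, unresolved):.1%}")
```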

Document "Improving Respondent Cooperation for Telephone Surveys" Navigation