Standards and Guidelines for Pre-Field Planning, Preparation and Documentation

This section details the standards and guidelines for the pre-field components of an online survey project: the Statement of Work, Proposal Documentation, Questionnaire Design, Survey Accessibility and Pre-testing.

Statement of Work

Most of the standards and guidelines for the Statement of Work (SOW) apply to both online and telephone surveys. Accordingly, most are taken from the report by the Advisory Panel on Telephone Public Opinion Survey Quality (the Telephone report, for short; Public Opinion Research in the Government of Canada), with some minor modifications to adapt them specifically to online surveys.

As well, the following commentary from the Telephone report is pertinent:

  • The Telephone Advisory Panel endorsed a principle stated by the European Society of Opinion and Market Research (ESOMAR) on the role of the Statement of Work (SOW) in the research process:

    The more relevant the background information the client can give the researcher, the greater the chances are that the project will be carried out effectively and efficiently.

  • The SOW is an important document as an internal Government of Canada (GC) tool, because it is:
    • A guide to the overall research process for the department
    • A central document stating the needs of the department

Standards for the Statement of Work

A Statement of Work must be a written plan that provides the research supplier with the following information:

Background

  • To provide context for the research, describe the events and decisions that explain why the research is required or being considered.
  • Include information/available resources to help the contractor better understand the subject matter of the survey (e.g., past research, web sites).

Purpose, how the research will be used

  • Provide information on the types of decisions or actions that are to be based on the findings, i.e., (a) what activities it will support; (b) how; (c) who will use the information.
  • Include any internal or external commitments regarding scheduling/timelines that may rely on the research findings (e.g., reporting requirements, events).

Objectives, research questions

  • Include, in the information requirements, the broad research questions that the study needs to answer. This will help in the development of the survey questionnaire, the data analysis and the report outline.
  • If relevant, prioritize the information required to ensure data quality in the event of budgetary or scheduling constraints.

Target Population

  • Wherever necessary and possible, indicate:
    • The demographic, behavioural and/or attitudinal characteristics of the target population for the survey
    • Whether or not Internet non-users are part of the target population
    • Information or estimates available on the size/incidence of these groups

Data Collection Method

  • If relevant, ask for input on other data collection approaches.

Deliverables

  • List major project milestones with anticipated timelines.
  • At minimum, details of reporting should reference all requirements identified by the Public Opinion Research Directorate (PORD).

Sample Size Assumptions

  • To help the supplier generate a reasonable sample size assumption for costing purposes, at least one of the following indicators must be included (an illustrative precision-to-sample-size calculation follows this list):
    • Sample size required
    • Level of precision required, if applicable
    • Study budget
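
As an illustration of how a stated level of precision can be translated into a sample size assumption, the sketch below applies the conventional formula for estimating a proportion at 95% confidence (n = z^2 * p(1-p) / e^2), with an optional finite population correction. This is a simplified example under assumed values, not a prescribed method; real costing assumptions (design effects, expected incidence, completion rates) belong in the supplier's proposal.

    import math

    def required_sample_size(margin_of_error, confidence_z=1.96,
                             proportion=0.5, population=None):
        """Approximate sample size needed to estimate a proportion at a
        given margin of error, using n = z^2 * p(1-p) / e^2 with an
        optional finite population correction. Illustrative only."""
        n = (confidence_z ** 2) * proportion * (1 - proportion) / (margin_of_error ** 2)
        if population:
            # Finite population correction for small target populations.
            n = n / (1 + (n - 1) / population)
        return math.ceil(n)

    # Example: +/- 3.1 percentage points at 95% confidence -> about 1,000 completes.
    print(required_sample_size(0.031))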

Guidelines for the Statement of Work

Other useful information that may be included in a Statement of Work includes the following:

Sample Considerations

  • Provide any relevant information on the sampling frame, e.g., the availability of lists.
  • Indicate the expected sampling method - i.e., probability, non-probability, or attempted census.
  • Indicate any requirements to be taken into consideration in finalizing the total sample size and structure/composition of the sample, e.g., regional requirements, demographic groups, population segments (those aware vs. those not aware; users vs. non-users, etc.).

Data Analysis

  • Identify any need for special analyses, e.g., segmentation.

Proposal Documentation

Many of the standards for Proposal Documentation found in the Telephone report apply equally to online surveys, although some modifications specific to online surveys were necessary.

As general context for the standards on Proposal Documentation, the following commentary from the Telephone report is pertinent:

  • There is a clear delineation between the SOW and the Research Proposal:
    • The SOW is what the GC needs to know, from whom and when it needs this information
    • The Research Proposal is what the research firm will do to meet the needs of the GC and how this will be done

Therefore, there is much more detail required from research firms in the Proposal Documentation than is required from the GC in the SOW.

  • There is a need to balance the information the GC requires in response to a SOW with the need to cover all data quality issues, without overburdening either the research supplier or the GC.
  • There is a need for consistency in Proposal Documentation to make it easier to assess/confirm that the research firm has provided all the categories of information and the detail required in each proposal.

In proposal documentation, the different ways the Government of Canada contracts public opinion research also must be considered. Some contracts for online surveys will be issued to firms using a Standing Offer, while others may be awarded through competition on MERX or as sole source contracts (e.g., syndicated studies, omnibus surveys). To get on a Standing Offer, firms are required to go through a rigorous competitive bidding process. Firms selected through such a process will have already committed to certain practices which are also required elements in a proposal. For example, there may be various quality control procedures required in proposal documentation to which firms on a Standing Offer will have already committed as their standard practices. In these cases, it is suggested the research firms not be required to describe these again in each proposal they submit.

The approach used in the Telephone report is maintained here - that is, an asterisk has been placed next to items that might already have been addressed by firms in their Standing Offer submissions and which they would not be required to address again in each proposal submission against a call-up. Firms on a Standing Offer would be required to address only the non-asterisked items.

Firms awarded online survey contracts that are not on a Standing Offer would be required to address all required elements in their proposals.

Standards for Proposal Documentation

The Research Proposal must be a written document that uses the following headings and provides the following information, at a minimum. Note that an asterisk (*) identifies the areas that apply only to proposals from firms not awarded PWGSC's Quantitative Standing Offer.

A: Introduction

Purpose

  • Describe the firm's understanding of the problem/issues to be investigated and how the GC will use this information.

Research Objectives

  • Detail the information needs/research questions the research will address.

B: Technical Specifications of the Research

Overview

  • Provide a brief statement summarizing:
    • Data collection method, including rationale for proposed methodology
    • Total sample size
    • Target population

Sample/Sampling Details

  • Provide details related to target population:
    • The definition of the target population in terms of its specific characteristics and geographic scope, including the assumed incidence of the population and any key sub-groups
    • Whether or not Internet non-users are part of the target population
    • The total sample size and the sample sizes of any key sub-groups
  • Describe the sample frame, including:
    • The sample source
    • Sampling procedures, including what sampling method will be used - i.e., probability, non-probability, attempted census
    • Any known sampling limitations and how these might affect the findings
  • Explain respondent selection procedures.
  • Indicate the number of recontact attempts and explain recontact attempt procedures.
  • Define respondent eligibility/screening criteria, including any quota controls.
  • For non-probability samples:
    • Provide the rationale for choosing a non-probability sample
    • Describe the steps that will be taken to maximize the representativeness of the non-probability sample

Response/Success Rate and Error Rate

  • State the target response/success rate or the target response/success rate range for the total sample for online and multi-mode surveys and, if relevant, for key sub-groups.
  • For probability samples, state the level of precision, including the margin of error and confidence interval for the total sample and any key sub-groups (an illustrative calculation follows this list).
  • For non-probability samples, state whether or not analyses of non-response bias are planned. If they are planned, the general nature of the analyses should be described. If they are not planned, a rationale must be stated.
  • Indicate any other potential source of error based on the study design that might affect the accuracy of the data.
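
As a simplified illustration of the precision statements above, the sketch below computes the conventional margin of error and 95% confidence interval for a simple random sample proportion (e = z * sqrt(p(1-p)/n)). It assumes simple random sampling with no design effect; the sample sizes and the 40% estimate used in the example are hypothetical.

    import math

    def margin_of_error(n, proportion=0.5, confidence_z=1.96):
        """Conventional margin of error for a simple random sample proportion."""
        return confidence_z * math.sqrt(proportion * (1 - proportion) / n)

    def confidence_interval(estimate, n, confidence_z=1.96):
        """Approximate 95% confidence interval around an estimated proportion."""
        e = margin_of_error(n, estimate, confidence_z)
        return (estimate - e, estimate + e)

    # Hypothetical total sample of 1,000 and a key sub-group of 250.
    print(round(margin_of_error(1000) * 100, 1))   # about 3.1 percentage points
    print(round(margin_of_error(250) * 100, 1))    # about 6.2 percentage points
    print(confidence_interval(0.40, 1000))         # interval around a 40% estimate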

Description of Data Collection

  • State the method of data collection.
  • *For Access Panels, a description of the following must be provided, at minimum:
    • Panel size
    • Panel recruitment
    • Project management
    • Panel monitoring
    • Panel maintenance
    • Privacy/Data protection
    Note: When multiple panels are to be used in the execution of the survey, this must be disclosed and standards for use of multiple panels followed.
  • Provide details on any incentives/honoraria, including rationale.
  • Describe how language requirements will be addressed.
  • *Describe quality control procedures related to data collection, including at minimum:
    • Detecting and dealing with satisficing; as a guideline, it is recommended that the cost impact of any measures taken to detect and deal with satisficing be described (an illustrative set of detection checks follows this list)
    • Fieldwork validation methods and procedures
  • *Describe how:
    • The rights of respondents will be respected, including if relevant the rights of children, youth and vulnerable respondents
    • Respondent anonymity and confidentiality will be protected
  • Describe any accessibility provisions in the research design to facilitate participation by respondents who are visually or physically disabled and who may be using adaptive technologies.
  • For multi-mode surveys, provide a rationale for using a multi-mode method rather than a single-mode method.
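
As an illustration of the kinds of satisficing checks a supplier might describe, the sketch below flags straightlining across a grid question and unusually fast completions in respondent-level data. The column names (q1_a to q1_e, duration_sec) and the speed threshold of one third of the median duration are assumptions for this example, not prescribed standards.

    import pandas as pd

    # Hypothetical respondent-level data: a five-item grid question
    # (q1_a to q1_e) plus interview duration in seconds.
    df = pd.DataFrame({
        "resp_id": [1, 2, 3],
        "q1_a": [4, 5, 2], "q1_b": [4, 5, 3], "q1_c": [4, 5, 2],
        "q1_d": [4, 1, 4], "q1_e": [4, 2, 3],
        "duration_sec": [180, 95, 600],
    })
    grid_items = ["q1_a", "q1_b", "q1_c", "q1_d", "q1_e"]

    # Straightlining: identical answers across every item of the grid.
    df["straightlined"] = df[grid_items].nunique(axis=1).eq(1)

    # Speeding: completion in under one third of the median duration
    # (the threshold is an assumption, not a prescribed standard).
    df["speeder"] = df["duration_sec"] < df["duration_sec"].median() / 3

    flagged = df[df["straightlined"] | df["speeder"]]
    print(flagged[["resp_id", "straightlined", "speeder"]])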

Questionnaire Design

  • Provide either an outline of the survey questionnaire or list the topics that will be covered in the questionnaire, including the number of open-ended questions.
  • Provide an estimate of the length of the questionnaire. If the survey is estimated to require more than 20 minutes to complete, state the rationale for the length.
  • Describe how the questionnaire will be pre-tested, including:
    • The objectives of the pre-test
    • The method for the pre-test
    • The number of pre-test questionnaires to be completed in total and by key sub-groups (e.g., language, age, gender)
    • How the results of the pre-test will be documented and communicated to the GC

    Note: A rationale must be provided if:

    • No pre-test is to be conducted
    • Fewer or more than 30 pre-test questionnaires are to be completed

Description of Data Processing/Data Management

  • Describe any weighting required (a simplified weighting illustration follows this list).
  • *Describe quality control procedures related to data processing/data management, including at minimum:
    • Coding/coding training
    • Data editing
    • Data tabulation
    • File preparation/electronic data delivery
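
As a simplified illustration of the weighting descriptions referenced above, the sketch below applies basic cell weighting, adjusting an achieved sample's regional distribution to assumed population benchmarks. Actual weighting schemes (e.g., raking across several dimensions) may be more elaborate; the regions and shares shown are hypothetical.

    import pandas as pd

    # Hypothetical achieved sample with a regional skew.
    sample = pd.DataFrame({
        "resp_id": range(1, 7),
        "region": ["East", "East", "East", "East", "West", "West"],
    })

    # Assumed population benchmarks (e.g., Census shares).
    population_share = {"East": 0.5, "West": 0.5}
    sample_share = sample["region"].value_counts(normalize=True)

    # Cell weight = population share / achieved sample share for each region.
    sample["weight"] = sample["region"].map(
        lambda region: population_share[region] / sample_share[region]
    )

    weighted_shares = sample.groupby("region")["weight"].sum() / sample["weight"].sum()
    print(sample)
    print(weighted_shares)   # weighted distribution now matches the benchmarks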

Data Analysis/Reporting

  • Describe how the data will be analyzed related to the objectives/research questions, including any special analyses (e.g., segmentation).
  • Provide an outline of the sections of the report.

Deliverables

  • List all deliverables, specifying their coverage, scope, format, means of delivery and number of copies, including at minimum:
    • Questionnaire(s), including pre-test, if relevant
    • Data tabulation/processing
    • The report format(s), including the number of copies and the language(s) of the report
    • The nature, location and number of presentations, including the language of presentations

Project Schedule

  • Provide a detailed workplan with dates and identify responsibilities.

C: Project Cost

Project Cost

  • Cost information must be presented in the format designated by PWGSC.

Questionnaire Design

The starting point for the Panel's deliberations was the section in the Telephone report on Questionnaire Design and the standards and guidelines that had been developed for telephone surveys. The Panel was asked to consider:

  • If any changes were required to the general standards and guidelines for online surveys
  • The appropriateness of the guideline on questionnaire length (a) for online surveys as opposed to telephone surveys, and (b) in light of new guidelines about questionnaire length issued by the Marketing Research and Intelligence Association (MRIA) in their Code of Conduct and Good Practice, December 2007
  • The need for a standard on questionnaire approval for online surveys

The Online Advisory Panel agreed that, with the exception of the guideline on survey length, the standards and guidelines for questionnaire design for telephone surveys apply equally to online surveys, the general principle being that only very broad standards and guidelines are required.

With regard to survey length, the Panel supported maintaining the designation of 20 minutes as a "reasonable" length for online surveys, but suggested adding language to flag that shorter online surveys may be preferable. A few Panelists suggested referring to the possibility of doing longer online surveys, albeit with the understanding that this should be more the exception than the rule.

There was consensus by the Panel to add a standard on questionnaire approval for online surveys. This would serve as a reminder to GC research buyers of the importance of approving both the wording/content of the survey and its online appearance and functionality.

There was general agreement by the Panel on the following standards and guidelines for Questionnaire Design.

Standards for Questionnaire Design

  • Survey questionnaires must be designed:
    1. To collect only the information essential to the objectives of the study, and
    2. To minimize the burden placed on respondents while maximizing data quality
  • The following are required elements of all Government of Canada online survey questionnaires:
    1. Inform respondents of (i) the subject and purpose of the study and (ii) the expected length of the interview
    2. Identify the research firm and either the Government of Canada or the department/agency sponsoring the survey
    3. Inform respondents that their participation in the study is voluntary and the information provided will be administered according to the requirements of the Privacy Act
    4. Inform respondents briefly of their rights under the Access to Information Act, most importantly the right to access a copy of the report and their responses
  • Unless the client provides the translation, firms are required to translate the questionnaire into the other official language (unless interviewing is to be unilingual), and where required into other languages. All translations must be in written form.
  • Government of Canada approval of an online survey questionnaire must include approval of the appearance and functionality of the questionnaire in its online form - i.e., as it would be experienced online by respondents.

Guidelines for Questionnaire Design

The following strategies may be used to achieve the standards:

  • The questionnaire is of reasonable length, i.e., requiring 20 minutes or less to complete. Shorter surveys are preferred over longer surveys.

    Longer surveys may be acceptable in some circumstances, depending on such factors as the target group, the subject, the possibility of respondents completing the questionnaire in parts, or where permission has been obtained in advance. However, the risk posed by an overly long questionnaire is that it may well result in significant non-response or drop-offs, which in turn can adversely affect data quality.

    The rationale for surveys longer than 20 minutes should be discussed in the Research Proposal.

  • The introduction to the survey and the respondent screening section are well-designed and as short as possible in order to maximize the likelihood people will agree to complete the questionnaire.
  • Questions are clearly written and use language appropriate to the target group.
  • Methods to reduce item non-response are adopted (e.g., answer options match question wording; "other," "don't know" and "refused" categories are included, as appropriate).
  • The questionnaire is designed for clear and smooth transition from question to question and from topic to topic.

Survey Accessibility

The Treasury Board of Canada Secretariat has issued accessibility standards for GC websites and these standards will impact online surveys hosted on GC websites. Depending on how the mandate and scope of the standards are interpreted, they may also impact GC surveys hosted on third-party websites and possibly syndicated online surveys purchased by the GC.

Accessibility is a very important matter for GC online surveys. However, the Advisory Panel did not include representatives from the group within Treasury Board of Canada Secretariat that enforces GC accessibility standards, nor did the Panel have access to legal advice pertaining to accessibility requirements. Therefore, the Advisory Panel did not feel it could unilaterally attempt to interpret how the Government of Canada's Common Look and Feel requirements should be applied to online surveys, whether for surveys hosted on a GC website or for surveys hosted by other parties.

The Panel did agree to make the following recommendations to the Public Opinion Research Directorate (PORD), and also agreed to add a standard to Proposal Documentation (Description of Data Collection) requiring online survey proposals to address any survey accessibility provisions.

Recommendations

  • Recommendation to PORD re clarification: The Advisory Panel recommends that PORD explore with the relevant program and legal authorities within Treasury Board of Canada Secretariat both the best practices with respect to online survey accessibility and the minimum acceptable practices.
  • Recommendation re Standing Offer requirements: The Advisory Panel recommends that, in the upcoming Request for Standing Offers, bidders proposing to provide online survey research discuss in their proposals how they will work with respondents who are visually or physically disabled and who may be using adaptive technologies online. Bidders must demonstrate how these individuals can be included in the research.

With regard to these recommendations, there were two additional comments made by some members of the Panel:

  • There should be provision for consultation with the research industry to better understand the types of software currently in use by third parties (and in which some companies have heavily invested) before setting minimum requirements.
  • Once minimum acceptable standards are established, there will be a need for PORD to implement a plan to communicate these to potential online survey providers.

Pre-testing

There was consensus by the Panel to adopt the standards and guidelines for pre-testing that had been recommended for GC telephone public opinion surveys, with the following modifications:

  • Related to the role of pre-testing for online survey questionnaires, language has been added to reflect that there are multiple aspects of an online survey that should be considered in a pre-test, including both content and appearance/functionality.

  • Related to stating a minimum number of pre-test questionnaires, the Panel generally supported specifying a number that must be completed. However, there was some debate about what the minimum number should be, particularly in those instances when the sample is limited or when major revisions may be required to various aspects of a survey. Language has been added to the standard below stating a minimum target number of pre-tests, while noting that other numbers may be justifiable.

    There was discussion among the Panel on the potential value of a step-wise approach when pre-testing an online questionnaire - that is, start by completing a subset of pre-tests, then modify the questionnaire as appropriate based on the preliminary results, then complete another subset of pre-tests using the modified questionnaire. The specification of a target minimum number does not preclude a step-wise pre-test process. The only requirement is that the total target number of pre-tests be completed.

  • Related to pre-test documentation, the Panel felt the summary of pre-test results needs to include documentation of specific aspects of the pre-tests (e.g., both observational and factual data). Specific reporting requirements have been added to the standard below.

Standards for Pre-testing

  • In-field pre-testing of all components that may influence data quality and respondent behaviour is required for a new online survey questionnaire or a substantially revised questionnaire used in a previous survey.

    For online surveys, this includes both the content/wording of the questionnaire and the online appearance/functionality of the survey.
    • A periodic review of questionnaires used in ongoing or longitudinal surveys is required.
  • A minimum of 30 pre-tests is to be completed in total, 15 in English and 15 in French. When fewer or more than 30 pre-tests are to be completed, this must be justified in the Research Proposal.
  • The result(s) of the pre-test(s) must be documented, i.e., at minimum (an illustrative summary of drop-offs and completion times follows this list):
    • A description of the pre-test approach and the number of pre-tests completed
    • A summary of results including:
      • Observations on how respondents answered the questions
      • Occurrence and description of drop-offs
      • Questionnaire completion time
      • Responses to any special pre-test questions (e.g., respondents' comments on the survey questionnaire/experience)
    • A record of the decisions/changes made as a result of the pre-test findings
  • For syndicated online studies, research firms are required (a) to demonstrate that the survey questionnaire has been pre-tested, and (b) to provide details on the pre-test approach and number of pre-tests completed.
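
As an illustration of how the drop-off and completion-time elements of the pre-test summary might be produced, the sketch below tabulates them from hypothetical paradata exported by the survey platform. The field names (completed, last_question, minutes) are assumptions for this example, not a required format.

    import pandas as pd

    # Hypothetical paradata exported from the survey platform
    # (abbreviated to five pre-test starts for illustration).
    paradata = pd.DataFrame({
        "resp_id": [1, 2, 3, 4, 5],
        "completed": [True, True, False, True, False],
        "last_question": ["Q20", "Q20", "Q7", "Q20", "Q12"],
        "minutes": [14.5, 18.0, 6.0, 21.5, 9.0],
    })

    completes = paradata[paradata["completed"]]
    dropoffs = paradata[~paradata["completed"]]

    print("Drop-off rate: {:.0%}".format(len(dropoffs) / len(paradata)))
    print("Drop-off points:", dropoffs["last_question"].value_counts().to_dict())
    print("Median completion time (minutes):", completes["minutes"].median())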

Guidelines for Pre-testing

  • For complex studies, highly influential surveys or surveys that are planned to be ongoing or longitudinal, a more complete in-field test of other components of a survey, not just the survey questionnaire, may be desirable. This may be a pilot test that, on a small scale, duplicates the final survey design including such elements as data capture, analysis of results, etc.
  • If there is a need to pre-test the questionnaire on criteria other than language, at least 4 pre-tests should be completed with each sub-group.
  • Pre-tests should not be included in the final dataset unless (a) there were no changes to the questionnaire, and (b) the pre-test was implemented in the exact same manner as in the final survey design.
  • Cognitive pre-testing (using qualitative methods) should be considered prior to field testing for new survey questionnaires or where there are revisions to wording or content of existing questionnaires, and particularly for complex surveys, highly influential surveys or surveys that are planned as ongoing or longitudinal. The main uses of cognitive pre-testing are:
    • To provide insight into how respondents react to a questionnaire:
      • Their understanding of the wording of questions and the flow of the questionnaire
      • Their ability to respond to questions accurately
      • Their thought processes as they answer the questions
    • To identify the impact of changes to an existing questionnaire (e.g., a tracking survey)
  • Whenever possible, schedule and budget permitting, omnibus survey questions should at least be pre-tested in-field. Whenever a pre-test has been conducted, the details of the pre-test should be documented, including the number of pre-tests completed.

Document "The Advisory Panel on Online Public Opinion Survey Quality - Final Report June 4, 2008" Navigation