Standards and Guidelines for Data Management and Processing

The report of The Advisory Panel on Telephone Public Opinion Survey Quality (the Telephone report, for short) dealt with standards and guidelines for data management and processing.

The Online Advisory Panel was not asked to comment on these standards and guidelines on the basis that most aspects apply equally to both telephone and online surveys.

This section reproduces the standards and guidelines from the Telephone report.

Coding

Standards for Coding

Use of Coding Software

  • If automated coding software is used, the error rate should be estimated. If the error rate exceeds 5%, the research firm shall:
    • Inform the Project Authority
    • Revise the dictionary
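The 5% threshold above can be checked by comparing a manually verified subsample against the software's output. The following is a minimal sketch; the function name, data structure, and example codes are illustrative assumptions, not part of the standard.

```python
def coding_error_rate(auto_codes, verified_codes):
    """Proportion of sampled responses that the automated coding software
    assigned differently from the human verifier."""
    if len(auto_codes) != len(verified_codes):
        raise ValueError("samples must be the same length")
    errors = sum(1 for a, v in zip(auto_codes, verified_codes) if a != v)
    return errors / len(auto_codes)

# Illustrative sample: five auto-coded responses checked by hand.
auto = ["PRICE", "SERVICE", "PRICE", "OTHER", "SERVICE"]
manual = ["PRICE", "SERVICE", "OTHER", "OTHER", "SERVICE"]

rate = coding_error_rate(auto, manual)
if rate > 0.05:
    # Per the standard: inform the Project Authority and revise the dictionary.
    print(f"Error rate {rate:.0%} exceeds the 5% threshold")
```

In practice the verified subsample should itself be drawn systematically, so the estimated rate generalizes to the full set of auto-coded responses.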

Developing code frames

  • The initial code list/frame shall be developed based on a systematic review of a minimum of 10% of open-ended responses and 50% of partial open-ended responses, where a frame does not already exist.
  • The research service provider shall ensure that coders working on the project are provided with instructions and training that shall include, as a minimum:
    • An overview of the project
    • Identification of questions or variables to be coded
    • The minimum proportion or number of a sample (and its make-up) used to produce code frames
    • Where necessary or appropriate, specific sub-groups required to develop code frames (e.g., by region, user or non-user)
    • Guidelines for the inclusion of codes in the code frame (e.g., decisions or rules regarding what should be included or excluded from a given code)
    • Any use to be made of code frames from a previous project or stage
    • Any other requirements or special instructions specific to the project
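One simple way to meet the "minimum of 10% of open-ended responses" requirement is a systematic (every k-th record) selection. The sketch below is an illustrative assumption about method, not a prescribed procedure; the standard only fixes the minimum fraction.

```python
import math

def systematic_sample(responses, min_fraction=0.10):
    """Select at least `min_fraction` of responses by taking every k-th one,
    a simple systematic method for building an initial code list."""
    n = len(responses)
    target = max(1, math.ceil(n * min_fraction))  # minimum number to review
    step = max(1, n // target)                    # interval between selections
    return responses[::step]
```

For partial open-ended responses the same function can be called with `min_fraction=0.50`. Where sub-groups (e.g., region, user vs. non-user) must be represented, the selection would be applied within each sub-group separately.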

Code frame approval/coding procedures

  • The research firm project manager responsible for the project shall approve the initial code frame prior to the commencement of coding and shall document this approval. This approval may involve the netting, abbreviating, rewording, recoding or deletion of codes.
  • Also:
    • Where "don't know" and "no answer" responses have been used, these shall be distinguishable from each other
    • The research service provider shall have clear rules or guidelines for the treatment of responses in "other" or catch-all categories; if the "other" or catch-all category exceeds 10% of responses to be coded, the responses should be reviewed with a view to reducing the size of the group.
  • After initial code frame approval, when further codes become appropriate in the process of coding, all copies of the code frame shall be updated and any questionnaires already coded shall be amended accordingly.
  • Upon request, the research firm shall provide the Project Authority with the initial code frame and any updated versions.
  • The research firm shall provide the Project Authority with the final version of the code frame.
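The 10% rule for catch-all categories reduces to a simple share calculation over the coded responses. A minimal sketch, assuming coded responses are held as a flat list and the catch-all code is labelled "OTHER" (both illustrative choices):

```python
from collections import Counter

def other_share(coded_responses, other_code="OTHER"):
    """Share of coded responses falling in the catch-all category."""
    counts = Counter(coded_responses)
    return counts[other_code] / len(coded_responses)

codes = ["PRICE", "OTHER", "SERVICE", "OTHER", "PRICE", "SERVICE", "PRICE", "OTHER"]
if other_share(codes) > 0.10:
    # Per the standard: review the "other" responses with a view to
    # splitting out new codes and reducing the size of the group.
    print("Catch-all category exceeds 10% of coded responses")
```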

Coding Verification

  • The research service provider shall have defined procedures for the verification of the coding for each project, including documenting the verification approach to be used. Procedures shall ensure that there is a systematic method of verifying a minimum of 10% of questionnaires coded per project and the verification shall be undertaken by a second person.
  • If a coder's work contains frequent errors, that coder's work (on the project) shall be 100% verified/re-worked. If necessary, appropriate retraining shall be given to that coder until error rates are acceptable. The effectiveness of the retraining shall be reviewed and documented.
  • The research service provider shall define the meaning of frequent errors and document that definition.
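The per-coder check described above can be sketched as follows. The 10% error-rate threshold in the example is an illustrative assumption; the standard requires only that the firm define and document its own meaning of "frequent errors".

```python
from collections import defaultdict

def coders_needing_review(verifications, threshold=0.10):
    """Given (coder_id, was_error) pairs from the verified subsample,
    return the coders whose error rate exceeds the documented threshold;
    their work on the project would then be 100% verified/re-worked."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for coder, was_error in verifications:
        totals[coder] += 1
        errors[coder] += was_error
    return {c for c in totals if errors[c] / totals[c] > threshold}
```

The same tally, re-run after retraining, gives a documented basis for reviewing whether the retraining was effective.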

Guidelines for Coding

Developing Code Frames

  • For some variables, the research service provider should use existing established classification standards, such as those for industry, occupation and education.

Coding Verification

  • There are two basic approaches to verification: dependent and independent. Dependent verification means that the second person has access to the original coding. Independent verification means that the second person does not have access to the original coding. In independent verification, the original coding and the verification coding are compared and if they differ, the correct code is decided by an adjudication process. Independent verification detects more errors than dependent verification.

    Independent coding verification should be used wherever possible.

  • The final coded dataset should be reviewed, at least once, to ensure the internal consistency of the coding, and be corrected as necessary.
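The comparison step of independent verification can be sketched in a few lines. The dictionary-of-codes representation (questionnaire id mapped to assigned code) is an illustrative assumption:

```python
def find_disagreements(original, independent):
    """Compare original coding with independent verification coding;
    return the questionnaire ids whose codes differ and therefore
    need to go to adjudication."""
    return [qid for qid in original if original[qid] != independent.get(qid)]

original = {101: "PRICE", 102: "SERVICE", 103: "OTHER"}
independent = {101: "PRICE", 102: "OTHER", 103: "OTHER"}
to_adjudicate = find_disagreements(original, independent)  # questionnaire 102
```

Because the second coder never sees the original codes, any disagreement surfaces as a difference here rather than being silently confirmed, which is why this approach detects more errors than dependent verification.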

Data Editing/Imputation

Standards for Data Editing/Imputation

  • An accurate record of any changes made to the original data set shall be kept. No data shall be assumed/imputed without the knowledge and approval of the research firm project manager. Comparison to the original data source shall be the first step in the process. Any imputation processes, including the logic of the imputation method(s) used shall be documented and available to the client on request. All edit specifications shall be documented.
  • Where forced editing is used, the logic of the forcing shall be documented and test runs carried out, with the results documented to show that the forcing has the desired effect.
  • Data editing/imputation should be used cautiously. The degree and impact of imputation should be considered when analyzing the data, as the imputation methods used may have a significant impact on distributions of data and the variance of estimates.
  • The research firm shall include documentation of any imputation/forced editing, both in a technical appendix and in the final report.
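The requirement to keep an accurate record of changes pairs naturally with the imputation step itself: every imputed value is logged against the record it changed, so the original data set can always be reconstructed and the degree of imputation reported. A minimal sketch, assuming records are dictionaries and missing values are `None` (both illustrative assumptions):

```python
def impute_with_log(records, field, fill_value):
    """Fill missing values in `field` with `fill_value`, returning the
    edited records plus an audit log of every change made:
    (record index, old value, new value)."""
    log = []
    edited = []
    for i, rec in enumerate(records):
        rec = dict(rec)  # leave the original data set untouched
        if rec.get(field) is None:
            log.append((i, None, fill_value))
            rec[field] = fill_value
        edited.append(rec)
    return edited, log
```

The log length divided by the number of records gives the imputation rate, one of the figures that would be reported in the technical appendix alongside the logic of the imputation method used.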

The Advisory Panel on Online Public Opinion Survey Quality - Final Report, June 4, 2008