Standards and Guidelines for: Data Management and Processing

Data Entry

There was consensus that a revised version of the ISO standards for Data Entry be adopted.

STANDARDS

6.2 Electronic data entry

  • It shall be the responsibility of the research service provider to ensure that the data entry or capture specifications for CATI are correct and accurate, based on the client-approved questionnaire.
  • The research service provider shall establish and maintain procedures to test both the design and the implementation of the electronic forms of questionnaires. The type of tests and the persons involved shall be documented.
  • Upon request, the research firm shall provide the project authority with the CATI version of the questionnaire.

6.3.1 Hard copy data entry

  • Where logic data entry is used, the in-built checks shall be documented and tested prior to use. The nature of the tests used and the results obtained shall be documented. Irresolvable attempted entries (which are not accepted because of the in-built logic checks) shall be referred to the project manager/executive responsible for the project for a decision and resolution, with a record kept of any changes made to the data.
  • Where simple data entry is used, data shall be keyed in as recorded on the questionnaire unless otherwise specified. A record of any instructions shall be kept on file.

6.3.2 Data entry verification for paper documents

  • The research firm shall document the level of verification to be carried out. A systematic method of verifying data entry shall be applied to each project or stage/wave. The minimum total verification per project shall be 10% of entries. Procedures shall ensure that there is a systematic method of verifying each operator's work, and the verification shall be undertaken by a second person (see the sketch after this list).
  • If an individual operator's work contains frequent errors, that individual's work (on the project) shall be 100% verified/re-worked. If necessary, appropriate retraining shall be given to that operator until error rates are acceptable. The effectiveness of the retraining shall be reviewed and documented.
  • The research service provider shall define the meaning of frequent errors and document that definition.
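
Illustrative sketch (not part of the standard): a minimal Python example of a systematic 10% verification draw and a per-operator error-rate check. The record identifiers, operator field and 5% threshold are hypothetical placeholders; the research service provider must define and document its own threshold for "frequent errors".

    from collections import defaultdict

    def systematic_sample(record_ids, rate=0.10, start=0):
        """Select every k-th keyed record (k = 1/rate) for second-person verification."""
        k = int(round(1 / rate))
        return [rid for i, rid in enumerate(sorted(record_ids)) if (i - start) % k == 0]

    def operator_error_rates(verified):
        """verified: list of (operator, record_id, error_found) results from the second keyer."""
        checked, errors = defaultdict(int), defaultdict(int)
        for operator, _rid, error_found in verified:
            checked[operator] += 1
            errors[operator] += int(error_found)
        return {op: errors[op] / checked[op] for op in checked}

    # Flag operators whose error rate exceeds the documented threshold; their work
    # on the project must then be 100% verified/re-worked.
    THRESHOLD = 0.05  # placeholder for the provider's documented definition of "frequent errors"
    verified = [("op_A", 101, False), ("op_A", 111, True), ("op_B", 102, False)]
    flagged = [op for op, rate in operator_error_rates(verified).items() if rate > THRESHOLD]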

Coding

There was consensus that a revised version of the ISO standards and guidelines for coding be adopted.

STANDARDS and GUIDELINES

6.5.1 Use of Coding Software

  • If automated coding software is used, the error rate should be estimated (see the sketch after this list). If the error rate exceeds 5%, the research firm shall:
    • Inform the project authority
    • Revise the dictionary
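
Illustrative sketch (not part of the standard): one way the automated-coding error rate might be estimated, by drawing a check sample of auto-coded responses and comparing the machine-assigned codes against codes assigned manually by a coder. The field names, sample fraction and seed are hypothetical.

    import random

    def check_sample(auto_codes, sample_fraction=0.10, seed=2007):
        """Draw a random subset of auto-coded response ids for manual coding."""
        rng = random.Random(seed)
        ids = list(auto_codes)
        return rng.sample(ids, max(1, int(len(ids) * sample_fraction)))

    def estimate_error_rate(auto_codes, manual_codes):
        """Both arguments map response_id -> code for the same check sample."""
        disagreements = sum(1 for rid in manual_codes if auto_codes.get(rid) != manual_codes[rid])
        return disagreements / len(manual_codes)

    # If the estimated rate exceeds 5%, the standard requires informing the
    # project authority and revising the coding dictionary.
    ERROR_LIMIT = 0.05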

6.5.2 Developing code frames

  • Where a frame does not already exist, the initial code list/frame shall be developed based on a systematic review of a minimum of 10% of open-ended responses and 50% of partial open-ended responses (see the sketch after this list).
  • The research service provider shall ensure that coders working on the project are provided with instructions and training that shall include, as a minimum:
    • An overview of the project
    • Identification of questions or variables to be coded
    • The minimum proportion or number of the sample (and its make-up) used to produce code frames
    • Where necessary or appropriate, specific sub-groups required to develop code frames (e.g., by region, user or non-user)
    • Guidelines for the inclusion of codes in the code frame (e.g., decisions or rules regarding what should be included or excluded from a given code)
    • Any use to be made of code frames from a previous project or stage
    • Any other requirements or special instructions specific to the project
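
Illustrative sketch (not part of the standard): drawing the review sample used to draft an initial code frame (a minimum of 10% of open-ended responses and 50% of partial open-ended responses) and tallying frequent verbatims as candidate codes. The data structures, normalisation step and minimum-mention cut-off are hypothetical.

    import random
    from collections import Counter

    def review_sample(open_ends, partial_open_ends, seed=2007):
        """Draw at least 10% of open-ended and 50% of partial open-ended responses."""
        rng = random.Random(seed)
        draw = lambda items, frac: rng.sample(items, max(1, int(len(items) * frac))) if items else []
        return draw(open_ends, 0.10) + draw(partial_open_ends, 0.50)

    def candidate_codes(responses, min_mentions=3):
        """Count normalised verbatims; frequent ones become draft codes for review."""
        counts = Counter(r.strip().lower() for r in responses if r.strip())
        return [text for text, n in counts.most_common() if n >= min_mentions]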

Guideline

  • For some variables, the research service provider should use existing established classification standards, such as those for industry, occupation and education.

6.5.3 Code frame approval/coding procedures

  • The research firm project manager responsible for the project shall approve the initial code frame prior to the commencement of coding, and this approval shall be documented. The approval may involve the netting, abbreviating, rewording, recoding or deletion of codes.
  • Also:
    • Where "don't know" and "no answer" responses have been used, these shall be distinguishable from each other
    • The research service provider shall have clear rules or guidelines for the treatment of responses in "other" or catch-all categories; if the "other" or catch-all category exceeds 10% of responses to be coded, the responses should be reviewed with a view to reducing the size of the group (see the sketch after this list).
  • After initial code frame approval, when further codes become appropriate in the process of coding, all copies of the code frame shall be updated and any questionnaires already coded shall be amended accordingly.
  • Upon request, the research firm shall provide the project authority with the initial code frame and any updated versions.
  • The research firm shall provide the project authority with the final version of the code frame.
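
Illustrative sketch (not part of the standard): checking whether the "other" or catch-all category exceeds 10% of responses to be coded and, if so, listing its most common verbatims as candidates for new codes. The code label and record layout are hypothetical.

    from collections import Counter

    def other_category_review(coded, other_code="OTHER", limit=0.10, top_n=10):
        """coded: list of (code, verbatim_text) pairs for one question."""
        total = len(coded)
        other = [text for code, text in coded if code == other_code]
        share = len(other) / total if total else 0.0
        review_needed = share > limit
        candidates = Counter(t.strip().lower() for t in other).most_common(top_n)
        return share, review_needed, candidates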

6.5.7 Coding Verification

  • The research service provider shall have defined procedures for the verification of the coding for each project, including documenting the verification approach to be used. Procedures shall ensure that there is a systematic method of verifying a minimum of 10% of questionnaires coded per project and the verification shall be undertaken by a second person.
  • If a coder's work contains frequent errors, that coder's work (on the project) shall be 100% verified/re-worked. If necessary, appropriate retraining shall be given to that coder until error rates are acceptable. The effectiveness of the retraining shall be reviewed and documented.
  • The research service provider shall define the meaning of frequent errors and document that definition.

Guidelines

  • There are two basic approaches to verification: dependent and independent. Dependent verification means that the second person has access to the original coding; independent verification means that the second person does not. In independent verification, the original coding and the verification coding are compared and, where they differ, the correct code is decided by an adjudication process. Independent verification detects more errors than dependent verification and should be used wherever possible (see the sketch after these guidelines).
  • The final coded dataset should be reviewed, at least once, to ensure the internal consistency of the coding, and be corrected as necessary.
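
Illustrative sketch (not part of the standard): independent verification with adjudication. The second coder codes without sight of the original codes; the two codings are then compared, and disagreements are referred to an adjudicator, with a record kept of each ruling. The record layout and the adjudication callback are hypothetical.

    def compare_codings(original, independent):
        """Both arguments map response_id -> code assigned by each coder."""
        disagreements = [rid for rid in independent if original.get(rid) != independent[rid]]
        agreement_rate = 1 - len(disagreements) / len(independent) if independent else 1.0
        return agreement_rate, disagreements

    def adjudicate(disagreements, original, independent, decide):
        """decide(rid, code_a, code_b) returns the adjudicator's ruling; keep a record of each."""
        log = []
        for rid in disagreements:
            final = decide(rid, original.get(rid), independent[rid])
            log.append({"id": rid, "original": original.get(rid),
                        "independent": independent[rid], "final": final})
        return log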

Data Editing/Imputation

There was consensus that a revised version of the ISO standards for Data Editing/Imputation be adopted.

STANDARDS

6.6.1 Editing data/imputation

  • An accurate record of any changes made to the original data set shall be kept (see the sketch after this list). No data shall be assumed/imputed without the knowledge and approval of the research firm project manager. Comparison to the original data source shall be the first step in the process. Any imputation processes, including the logic of the imputation method(s) used, shall be documented and made available to the client on request. All edit specifications shall be documented.
  • Where forced editing is used, the logic of the forcing shall be documented and test runs carried out, with the results documented to show that the forcing has the desired effect.
  • Data editing/imputation should be used cautiously. The degree and impact of imputation should be considered when analyzing the data, as the imputation methods used may have a significant impact on distributions of data and the variance of estimates.
  • The research firm shall include documentation of any imputation/forced editing, both in a technical appendix and in the final report.
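
Illustrative sketch (not part of the standard): applying a documented edit or imputation rule while keeping an accurate, auditable record of every change to the original data set, so that the changes can be reported in the technical appendix and final report. The same change log can also document test runs of forced editing. The rule identifier and field names are hypothetical.

    import copy
    import datetime

    def apply_edits(records, rule_id, rule, approved_by):
        """rule(record) returns (field, new_value) when an edit applies, else None."""
        edited = copy.deepcopy(records)          # original records are kept untouched for comparison
        change_log = []
        for rec in edited:
            result = rule(rec)
            if result is None:
                continue
            field, new_value = result
            change_log.append({
                "record_id": rec["id"], "field": field,
                "old_value": rec[field], "new_value": new_value,
                "rule_id": rule_id, "approved_by": approved_by,
                "timestamp": datetime.datetime.now().isoformat(),
            })
            rec[field] = new_value
        return edited, change_log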

6.6.2 Data editing of paper documents prior to data entry

  • Where paper documents are hand edited prior to data entry, it shall be possible to distinguish the original answers of the respondent or interviewer from the codes or answers allocated by the person(s) carrying out the editing.
  • When this type of editing is used, the logic and rules being applied shall be documented and any staff working on this element of the project shall be briefed as to the types of checks and corrections they may carry out.

Document "Advisory Panel On Telephone Public Opinion Survey Quality - Final Report February 1, 2007" Navigation