Quality assurance


Results

All (15) (results 1 to 10 of 15)

  • Articles and reports: 11-522-X200800010950
    Description:

    The next census will be conducted in May 2011. Being a major survey, it presents a formidable challenge for Statistics Canada and requires a great deal of time and resources. Careful planning has been done to ensure that all deadlines are met. A number of steps have been planned in the questionnaire testing process. These tests apply to both census content and the proposed communications strategy. This paper presents an overview of the strategy, with a focus on combining qualitative studies with the 2008 quantitative study so that the results can be analyzed and the proposals properly evaluated.

    Release date: 2009-12-03

  • Articles and reports: 11-522-X200800010954
    Description:

    Over the past year, Statistics Canada has been developing and testing a new way to monitor the performance of interviewers conducting computer-assisted personal interviews (CAPI). A formal process already exists for monitoring centralized telephone interviews: monitors listen to telephone interviews as they take place, assess the interviewer's performance against pre-defined criteria, and give the interviewer feedback on what was done well and what needs improvement. For the CAPI program, we have developed and are piloting an approach whereby interviews are digitally recorded and a monitor later listens to the recordings to assess the field interviewer's performance and provide feedback aimed at improving data quality. This paper presents an overview of the CAPI monitoring project at Statistics Canada, describing the monitoring methodology and the plans for implementation.

    Release date: 2009-12-03
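
    Below is a minimal sketch of how a monitor's criterion-based assessment could be aggregated into a score and a feedback list. The criterion names and weights are hypothetical; the abstract does not publish Statistics Canada's actual assessment form.

        # Hypothetical sketch: aggregating a monitor's pass/fail ratings of one
        # recorded CAPI interview against pre-defined assessment criteria.
        # Criterion names and weights are illustrative, not Statistics Canada's.

        CRITERIA = {
            "read_question_as_worded": 0.4,
            "probed_neutrally": 0.3,
            "entered_data_correctly": 0.3,
        }

        def score_interview(ratings):
            """Weighted score in [0, 1] from pass/fail ratings per criterion."""
            return sum(w for name, w in CRITERIA.items() if ratings.get(name))

        def feedback(ratings):
            """Criteria the interviewer should work on."""
            return [name for name in CRITERIA if not ratings.get(name)]

        ratings = {"read_question_as_worded": True,
                   "probed_neutrally": False,
                   "entered_data_correctly": True}
        print(score_interview(ratings))  # ~0.7
        print(feedback(ratings))         # ['probed_neutrally']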

  • Articles and reports: 11-522-X200800010955
    Description:

    Survey managers are still discovering the usefulness of digital audio recording for monitoring and managing field staff. Its value so far has been in confirming the authenticity of interviews, detecting curbstoning (interviewers fabricating data), offering a concrete basis for feedback on interviewing performance, and giving data collection managers an intimate view of in-person interviews. In addition, computer audio-recorded interviewing (CARI) can improve other aspects of survey data quality by corroborating or correcting response coding by field staff. Audio recordings may replace or supplement in-field verbatim transcription of free responses, and speech-to-text technology might make this technique more efficient in the future.

    Release date: 2009-12-03

  • Articles and reports: 11-522-X200800010956
    Description:

    The use of Computer Audio-Recorded Interviewing (CARI) as a tool to identify interview falsification is quickly growing in survey research (Biemer, 2000, 2003; Thissen, 2007). Similarly, survey researchers are starting to expand the usefulness of CARI by combining recordings with coding to address data quality (Herget, 2001; Hansen, 2005; McGee, 2007). This paper presents results from a study conducted as part of the National Center for Health Statistics' establishment-based National Home and Hospice Care Survey (NHHCS), which used CARI behavior coding and CARI-specific paradata to: 1) identify and correct problematic interviewer behavior or question issues early in the data collection period, before either could negatively affect data quality; and 2) identify ways to reduce measurement error in future implementations of the NHHCS.

    During the first 9 weeks of the 30-week field period, CARI recorded a subset of questions from the NHHCS application for all interviewers. Recordings were linked with the interview application and output, and then coded in one of two modes: Code by Interviewer or Code by Question. The Code by Interviewer method provided visibility into problems specific to an interviewer as well as more general problems potentially applicable to all interviewers. The Code by Question method yielded data on the understandability of the questions and other response problems; in this mode, coders coded multiple administrations of the same question across multiple interviewers. Using the Code by Question approach, researchers identified issues with three key survey questions in the first few weeks of data collection and gave interviewers guidance on how to handle those questions as data collection continued. Results from coding the audio recordings will inform question wording and interviewer training in the next implementation of the NHHCS, and guide future enhancement of CARI and the coding system.

    Release date: 2009-12-03
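
    A minimal sketch of the two coding modes described above, assuming a hypothetical flat record of (interviewer, question, behavior code); the actual NHHCS coding system and code frame are not public.

        # Hypothetical sketch of the two CARI coding views described above.
        # Record layout and behavior codes are illustrative assumptions.
        from collections import defaultdict

        coded = [
            ("int01", "Q3", "major_wording_change"),
            ("int01", "Q7", "exact_wording"),
            ("int02", "Q3", "major_wording_change"),
            ("int02", "Q7", "leading_probe"),
        ]

        def code_by_interviewer(rows):
            """Group problem codes by interviewer, for targeted feedback."""
            out = defaultdict(list)
            for interviewer, question, code in rows:
                if code != "exact_wording":
                    out[interviewer].append((question, code))
            return dict(out)

        def code_by_question(rows):
            """Group problem codes by question, to flag poorly worded items."""
            out = defaultdict(list)
            for interviewer, question, code in rows:
                if code != "exact_wording":
                    out[question].append((interviewer, code))
            return dict(out)

        # Q3 shows the same problem across interviewers: likely a wording issue.
        print(code_by_question(coded))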

  • Articles and reports: 11-522-X200800010968
    Description:

    Statistics Canada has embarked on a program to increase and improve the use of imaging technology for paper survey questionnaires. The goal is to make imaging an efficient, reliable and cost-effective method of capturing survey data. The objective is to continue using Optical Character Recognition (OCR) to capture the data from questionnaires, documents and faxes received, while improving process integration and the quality assurance/quality control (QA/QC) of the data capture process. These improvements are discussed in this paper.

    Release date: 2009-12-03
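
    One standard QC technique for OCR capture is to re-key a random sample of fields and estimate the field-level error rate. The sketch below assumes that technique with illustrative data; the abstract does not detail the specific checks Statistics Canada uses.

        # Hypothetical QC sketch: verify a random sample of OCR-captured fields
        # against independently keyed values and estimate the field error rate.
        import random

        def qc_error_rate(ocr_values, keyed_values, sample_size, seed=0):
            """Compare OCR output to keyed values on a random sample of fields."""
            rng = random.Random(seed)
            sample = rng.sample(sorted(ocr_values), min(sample_size, len(ocr_values)))
            errors = sum(1 for f in sample if ocr_values[f] != keyed_values.get(f))
            return errors / len(sample)

        ocr = {"f1": "1250", "f2": "OTTAWA", "f3": "19"}
        keyed = {"f1": "1250", "f2": "OTTAWA", "f3": "91"}  # f3 was misread
        print(qc_error_rate(ocr, keyed, sample_size=3))  # ~0.33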

  • Articles and reports: 11-522-X200800010969
    Description:

    In a multi-divisional initiative within the U.S. Census Bureau, a highly sophisticated system was developed and implemented for capturing, tracking, and scanning respondent data. It combines Intelligent Character Recognition (ICR), Optical Character Recognition (OCR), Optical Mark Recognition (OMR), and keying technology, with a heavy emphasis on error detection and control. The system, known as the integrated Computer Assisted Data Entry (iCADE) System, produces digital images of respondent questionnaires, which are then processed by a combination of imaging algorithms: check-box data are collected through OMR, and only the write-in areas are automatically extracted and sent to data-keying staff for capture. These capabilities have produced great efficiencies in the data capture process and have led to a novel and efficient approach to post-collection activities.

    Release date: 2009-12-03
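
    The description implies a routing rule: check boxes are read by OMR, and only write-in areas are queued for manual keying. Below is a minimal sketch of that rule with hypothetical field records; the real iCADE interfaces are not public.

        # Hypothetical sketch of iCADE-style routing. Field records and the
        # 0.5 mark-density threshold are illustrative assumptions.

        def route_fields(fields):
            """Split scanned questionnaire fields by capture path."""
            omr_results, keying_queue = {}, []
            for field in fields:
                if field["type"] == "checkbox":
                    # OMR: treat mark density above a threshold as 'checked'.
                    omr_results[field["name"]] = field["mark_density"] > 0.5
                elif field["type"] == "write_in":
                    # Only the cropped write-in image goes to keying staff.
                    keying_queue.append(field["name"])
            return omr_results, keying_queue

        fields = [
            {"name": "box_yes", "type": "checkbox", "mark_density": 0.82},
            {"name": "box_no", "type": "checkbox", "mark_density": 0.03},
            {"name": "occupation", "type": "write_in"},
        ]
        print(route_fields(fields))
        # ({'box_yes': True, 'box_no': False}, ['occupation'])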

  • Articles and reports: 11-522-X200800010974
    Description:

    This paper focuses on establishment survey questionnaire design guidelines. More specifically, it discusses the process of transitioning a set of guidelines written for a broad survey-methodology audience to a narrower, agency-specific audience of survey managers and analysts. The work was carried out cooperatively by a team of individuals from across the Census Bureau's Economic Directorate. The team decided what needed to be added, modified, and deleted from the broad starting point, and determined how much of the theory and experimental evidence found in the literature needed to be included in the guidelines. In addition to discussing the process, the paper describes the end result: a set of questionnaire design guidelines for the Economic Directorate.

    Release date: 2009-12-03

  • Articles and reports: 11-522-X200800010975
    Description:

    A major issue in official statistics is the availability of objective measures to support fact-based decision making. Istat has developed an Information System to assess survey quality. Among other standard quality indicators, nonresponse rates are systematically computed and stored for all surveys. Such a rich information base permits analysis over time and comparisons among surveys. The paper analyzes how data collection mode, in interaction with other survey characteristics, affects total nonresponse. Particular attention is devoted to the extent to which multi-mode data collection improves response rates.

    Release date: 2009-12-03
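
    A minimal sketch of the kind of cross-survey comparison such an information base enables, using illustrative stored indicators; Istat's actual schema is not described in the abstract.

        # Hypothetical sketch: total nonresponse rate by data collection mode,
        # computed from stored per-survey indicators (illustrative numbers).
        surveys = [
            {"survey": "A", "mode": "CAPI",       "sample": 1000, "respondents": 870},
            {"survey": "B", "mode": "CATI",       "sample": 1000, "respondents": 780},
            {"survey": "C", "mode": "multi-mode", "sample": 1000, "respondents": 905},
        ]

        def nonresponse_by_mode(rows):
            """Aggregate total nonresponse rate per collection mode."""
            totals = {}
            for r in rows:
                samp, resp = totals.get(r["mode"], (0, 0))
                totals[r["mode"]] = (samp + r["sample"], resp + r["respondents"])
            return {mode: 1 - resp / samp for mode, (samp, resp) in totals.items()}

        print(nonresponse_by_mode(surveys))
        # ~{'CAPI': 0.13, 'CATI': 0.22, 'multi-mode': 0.095}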

  • Articles and reports: 11-522-X200800010976
    Description:

    Many survey organizations use the response rate as an indicator of the quality of survey data. As a consequence, a variety of measures are implemented to reduce non-response or to maintain response at an acceptable level. However, the response rate is not necessarily a good indicator of non-response bias: a higher response rate does not imply smaller non-response bias. What matters is how the composition of the response differs from the composition of the sample as a whole. This paper describes the concept of R-indicators for assessing potential differences between the sample and the response. Such indicators may facilitate analysis of survey response over time, across fieldwork strategies, or between data collection modes. Some practical examples are given.

    Release date: 2009-12-03
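
    A minimal sketch of an R-indicator, assuming the standard definition R = 1 - 2*S(rho), where S(rho) is the standard deviation of response propensities estimated from auxiliary variables known for the whole sample. The logistic propensity model and data below are illustrative assumptions, not the paper's.

        # Sketch of an R-indicator: R = 1 means the response composition
        # matches the sample (all propensities equal); lower values signal
        # a less representative response. Data and model are illustrative.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 3))        # auxiliary variables, known for all
        responded = rng.random(500) < 0.7    # response outcome per sample unit

        rho = LogisticRegression().fit(X, responded).predict_proba(X)[:, 1]
        print(f"R-indicator: {1 - 2 * rho.std():.3f}")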

  • Articles and reports: 11-522-X200800010985
    Description:

    In Canada, complex businesses represent less than 1% of the total number of businesses but contribute more than 45% of total revenue. Recognizing that the quality of the data collected from them is of great importance, Statistics Canada has adopted several initiatives to improve it. One of these is the evaluation of the coherence of the data collected from large, complex enterprises. The findings of these recent coherence analyses have been instrumental in identifying areas for improvement which, once addressed, should increase the quality of the data collected from large, complex enterprises while reducing the response burden imposed on them.

    Release date: 2009-12-03
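
    A minimal sketch of one kind of coherence check the abstract alludes to: comparing an enterprise's reported total with the sum reported by its establishments. The variable names and the 2% tolerance are illustrative assumptions.

        # Hypothetical coherence check for a large, complex enterprise:
        # does the enterprise-level total agree with the sum of its parts?

        def coherence_check(enterprise_total, establishment_values, tol=0.02):
            """Return (is_coherent, relative_discrepancy)."""
            parts = sum(establishment_values)
            if enterprise_total == 0:
                return parts == 0, 0.0
            rel = abs(enterprise_total - parts) / abs(enterprise_total)
            return rel <= tol, rel

        ok, rel = coherence_check(10_000_000, [4_200_000, 3_100_000, 2_400_000])
        print(ok, f"{rel:.1%}")  # False 3.0% -> route to analyst follow-up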