Quality assurance

Results

All (9)

  • Articles and reports: 82-003-X201501114243
    Description:

    A surveillance tool was developed to assess dietary intake collected by surveys in relation to Eating Well with Canada’s Food Guide (CFG). The tool classifies foods in the Canadian Nutrient File (CNF) according to how closely they reflect CFG. This article describes the validation exercise conducted to ensure that CNF foods determined to be “in line with CFG” were appropriately classified.

    Release date: 2015-11-18

  • Articles and reports: 11-522-X201300014267
    Description:

    Statistics Sweden has, like many other National Statistical Institutes (NSIs), a long history of working with quality. More recently, the agency decided to adopt a number of frameworks to address organizational, process and product quality. It is important to consider all three levels, since we know that the way we do things, e.g., when asking questions, affects product quality; process quality is therefore an important part of the quality concept. Further, organizational quality, i.e., systematically managing aspects such as staff training and leadership, is fundamental to achieving process quality. Statistics Sweden uses EFQM (European Foundation for Quality Management) as a framework for organizational quality and ISO 20252 for market, opinion and social research as a standard for process quality. In April 2014, Statistics Sweden became the first National Statistical Institute to be certified to ISO 20252. One challenge that Statistics Sweden faced in 2011 was to systematically measure and monitor changes in product quality and to clearly present them to stakeholders. Together with external consultants, Paul Biemer and Dennis Trewin, Statistics Sweden developed a tool for this called ASPIRE (A System for Product Improvement, Review and Evaluation). To ensure that quality is maintained and improved, Statistics Sweden has also built an organization for quality comprising a quality manager, quality coaches, and internal and external quality auditors. In this paper I will present the components of Statistics Sweden’s quality management system and some of the challenges we have faced.

    Release date: 2014-10-31

  • Articles and reports: 11-522-X200800010954
    Description:

    Over the past year, Statistics Canada has been developing and testing a new way to monitor the performance of interviewers conducting computer-assisted personal interviews (CAPI). A formal process already exists for monitoring centralized telephone interviews. Monitors listen to telephone interviews as they take place to assess the interviewer's performance using pre-defined criteria and provide feedback to the interviewer on what was well done and what needs improvement. For the CAPI program, we have developed and are testing a pilot approach whereby interviews are digitally recorded and later a monitor listens to these recordings to assess the field interviewer's performance and provide feedback in order to help improve the quality of the data. In this paper, we will present an overview of the CAPI monitoring project at Statistics Canada by describing the CAPI monitoring methodology and the plans for implementation.

    Release date: 2009-12-03

  • Surveys and statistical programs – Documentation: 75F0002M2008005
    Description:

    The Survey of Labour and Income Dynamics (SLID) is a longitudinal survey initiated in 1993. The survey was designed to measure changes in the economic well-being of Canadians as well as the factors affecting these changes. Sample surveys are subject to sampling errors. To account for these errors, each estimate presented in the "Income Trends in Canada" series comes with a quality indicator based on the coefficient of variation. However, other factors must also be considered to ensure the data are properly used. Statistics Canada devotes considerable time and effort to controlling errors at every stage of the survey and to maximizing fitness for use. Nevertheless, the survey design and data processing may restrict fitness for use. It is Statistics Canada's policy to furnish users with measures of data quality so that they can interpret the data properly. This report summarizes the quality measures for SLID data, including sample composition and attrition rates, sampling errors, coverage errors in the form of slippage rates, response rates, tax permission and tax linkage rates, and imputation rates.

    Release date: 2008-08-20
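A CV-based quality indicator of the kind the SLID report describes can be sketched in a few lines. The letter grades and CV thresholds below are illustrative assumptions for demonstration only, not Statistics Canada's actual cut-offs.

```python
# Illustrative sketch of a CV-based quality indicator, in the spirit of the
# SLID approach described above. The rating labels and thresholds are
# hypothetical, not Statistics Canada's published cut-offs.

def coefficient_of_variation(estimate: float, standard_error: float) -> float:
    """CV expressed as a percentage of the estimate."""
    if estimate == 0:
        raise ValueError("CV is undefined for a zero estimate")
    return 100.0 * standard_error / abs(estimate)

def quality_indicator(cv_percent: float) -> str:
    """Map a CV (in percent) to a hypothetical quality rating."""
    if cv_percent <= 5.0:
        return "A (excellent)"
    if cv_percent <= 10.0:
        return "B (very good)"
    if cv_percent <= 15.0:
        return "C (good)"
    if cv_percent <= 25.0:
        return "D (acceptable)"
    if cv_percent <= 35.0:
        return "E (use with caution)"
    return "F (too unreliable to publish)"

cv = coefficient_of_variation(estimate=42500.0, standard_error=3400.0)
print(round(cv, 1), quality_indicator(cv))  # 8.0 B (very good)
```

The point of such an indicator is that a single published letter grade lets a non-specialist reader judge an estimate's reliability without inspecting standard errors directly.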

  • Articles and reports: 11-522-X20030017707
    Description:

    The paper discusses the structure and the quality measures Eurostat uses to provide the European Union and the euro zone with seasonally adjusted economic series.

    Release date: 2005-01-26

  • Articles and reports: 11-522-X20010016238
    Description:

    This paper discusses in detail technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    Research programs building on population-based, longitudinal administrative data and record-linkage techniques are found in England, Scotland, the United States (the Mayo Clinic), Western Australia and Canada. These systems can markedly expand both the methodological and the substantive research in health and health care.

    This paper summarizes published Canadian data quality studies regarding registries, hospital discharges, prescription drugs, and physician claims. It makes suggestions for improving registries, facilitating record linkage, and expanding research into social epidemiology. It also notes new trends in case identification and health status measurement using administrative data, and highlights the differing needs for data quality research in each province.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016244
    Description:

    This paper discusses in detail technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    Over the past few years, Statistics New Zealand (SNZ) has experienced an increase in the volume of business survey data supplied by e-mail. However, until now, SNZ has not had the business processes to support electronic collection in a way that meets the needs of both SNZ and data suppliers. To this end, SNZ has invested considerable effort over the last year in investigating how best to approach the problems and opportunities presented by electronic data collection. This paper outlines SNZ's plans to move the e-mail-supplied data to a secure lodgement facility, along with the future development of an Internet-based data collection system. It also presents a case study of the Monthly Retail Trade Survey data currently supplied by e-mail. This case study illustrates some of the benefits of electronic data, but also examines some of the costs to the organization and the data quality problems encountered. It also highlights the need to consider the data collection methodology within the wider context of the total survey cycle.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016262
    Description:

    This paper discusses in detail technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    The demand for information on the electronic economy requires statistical agencies to assess the relevance and improve the quality of their existing measurement programs. Innovations at the U.S. Census Bureau have helped the Bureau meet users' urgent needs for this information and improve the quality of the data. Through research conducted at the U.S. Census Bureau, as well as by tapping into the expertise of academia, the private sector, and other government agencies, the new data on electronic commerce and electronic business processes have been strengthened. Using both existing and new data, the Bureau's research has provided key new estimates of the size, scope, and impact of the new economy.

    Release date: 2002-09-12

  • Articles and reports: 12-001-X198900114573
    Description:

    The Census Bureau makes extensive use of administrative records information in its various economic programs. Although the volume of records processed annually is vast, even larger numbers will be received during the census years. Census Bureau mainframe computers perform quality control (QC) tabulations on the data; however, since such a large number of QC tables are needed and programming resources are limited and costly, a comprehensive mainframe QC system is difficult to attain. Add to this the sensitive nature of the data and the potentially very negative ramifications of erroneous data, and the need becomes quite apparent for a sophisticated quality assurance system at the microcomputer level. Such a system is being developed by the Economic Surveys Division and will be in place for the 1987 administrative records data files. The automated quality assurance system integrates microcomputer and mainframe technology. Administrative records data are received weekly and processed initially through mainframe QC programs. The mainframe output is transferred to a microcomputer and formatted specifically for importation into a spreadsheet program. Systematic quality verification occurs within the spreadsheet structure, as data review, error detection, and report generation are accomplished automatically. By shifting processes from the mainframe to the microcomputer environment, the system eases the burden on the programming staff, increases the flexibility of the analytical staff, reduces processing costs on the mainframe, and provides a comprehensive quality assurance component for administrative records.

    Release date: 1989-06-15
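The automated data review and error detection the abstract describes amounts to validating each weekly batch against predefined rules and generating an error report. The sketch below illustrates the idea in modern terms; the field names and validity ranges are hypothetical, chosen only for demonstration.

```python
# Minimal sketch of automated data review and error detection of the kind
# described above. Field names ("employees", "payroll") and the validity
# ranges are hypothetical assumptions, not the Census Bureau's actual rules.
import csv
import io

# Hypothetical validity rules: (field, minimum, maximum)
RULES = [("employees", 0, 500_000), ("payroll", 0, 10**10)]

def run_qc(csv_text: str):
    """Split a weekly batch into clean rows and an error report."""
    clean, errors = [], []
    for i, row in enumerate(csv.DictReader(io.StringIO(csv_text)), start=1):
        row_errors = []
        for field, lo, hi in RULES:
            try:
                value = float(row[field])
            except (KeyError, ValueError):
                row_errors.append(f"{field}: missing or non-numeric")
                continue
            if not (lo <= value <= hi):
                row_errors.append(f"{field}: {value} outside [{lo}, {hi}]")
        if row_errors:
            errors.append((i, row_errors))   # flagged for analyst review
        else:
            clean.append(row)
    return clean, errors

batch = "ein,employees,payroll\n111,25,1200000\n222,-3,50000\n"
clean, errors = run_qc(batch)
print(len(clean), errors)  # 1 [(2, ['employees: -3.0 outside [0, 500000]'])]
```

Separating the pass (clean rows) from the error report mirrors the paper's design: routine records flow through automatically, while only flagged records consume analyst time.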