Principles of quality assurance at Statistics Canada

Statistical information is critical to the functioning of a modern democracy. Without good data, the quality of decision-making, the allocation of billions of dollars of resources, and the ability of governments, businesses, institutions and the general public to understand the social and economic reality of the country would be severely impaired. A national statistical agency, such as Statistics Canada, plays an essential role in the production and dissemination of statistical information.

The credibility of a statistical agency in fulfilling this key role rests on the following pillars: production of high quality statistical information; cost efficiency; privacy; confidentiality; and the maintenance of a highly capable and motivated workforce.

More specifically, the quality of the information it produces, and its relevance in particular, is of fundamental importance to a statistical agency. Unless the statistical agency is capable of producing high quality data, both the users and the suppliers of statistical data will soon lose confidence in it, making its job impossible. As an introduction to the guidelines, this section presents the principles of quality assurance at Statistics Canada within which the guidelines are applied.

Principles of quality assurance at Statistics Canada

The Agency's management structure, policies and guidelines, consultative mechanisms, project development and management approach, and environment have been developed to facilitate and assure effective management of quality. The basic mechanisms for managing quality are described in Statistics Canada's Quality Assurance Framework (Statistics Canada, 2002c).

The framework consists of a wide variety of mechanisms and processes acting at various levels throughout the Agency's programs and across its organization. The effectiveness of this framework depends not on any one mechanism or process but on the collective effect of many interdependent measures. These build on the professional interests and motivation of the staff. They reinforce each other as means to serve client needs. They emphasize the Agency's objective professionalism, and reflect a concern for data quality. An important feature of this strategy is the synergy resulting from the many players in the Agency's programs operating within a framework of coherent processes and consistent messages. Within the framework, the Quality Guidelines are an accompanying document that describes a set of best practices for all of the "steps" of a statistical program, and is aimed at project team members who are charged with the development and implementation of statistical programs.

Underlying all of these mechanisms, processes and practices are eight guiding principles.

I. Quality is relative, not absolute

A significant feature of the management of quality is the balancing of quality objectives against the constraints of financial and human resources, the goodwill of respondents in providing source data, and competing demands for greater quantities and detail of information.  The management of quality is not the maximization of quality at all costs, but the achievement of an appropriate balance between the quantity and quality of information yielded by the Agency's programs and the resources available.  Within individual programs the challenge is to make the appropriate trade-offs between the evolving needs of clients, costs, respondent burden, and the various elements or dimensions of quality.

Statistical data are important because of the use to which they are put. It follows that statistical data can only be judged against their relevance and how well they represent the world we seek to describe. It also follows that it is important for the statistical agency to have a thorough understanding of the uses to which its data are put, and to do so it must maintain ongoing relations with its user community.

This principle also recognizes that achieving "perfect" quality is neither desirable nor affordable (in fact it is rarely even possible). Data are subject to numerous sources of error, both sampling and non-sampling, and it is the job of the statistical agency to balance factors such as accuracy, cost and burden on respondents in developing a statistical program. Minimizing error itself is not the goal; each statistical program must be designed within the context of what is feasible and how important the data are to users.

Statistics Canada strives to build relevance and quality into all its programs and products. The quality of its official statistics is founded on the use of sound scientific methods adapted over time to changing client needs, to the changing reality that the Agency aims to measure, and to the capacity or willingness of respondents to supply reliable and timely data. The Quality Guidelines are one of the tools that will aid in building quality into the design of each program.

II. Quality is multidimensional

During the past twenty years, statistical agencies have arrived at a consensus that the concept of "quality" of statistical information is multidimensional. Statistics Canada defines quality in terms of six dimensions; other statistical agencies and organizations have defined similar frameworks. While these definitions may differ slightly, they all recognize that there is no one single measure of data quality.

At Statistics Canada, the dimensions of quality are defined as follows:

The relevance of statistical information reflects the degree to which it meets the real needs of clients. It is concerned with whether the available information sheds light on the issues of most importance to users. Hence, relevance is the most important dimension of quality; one could even consider it among the pillars of a statistical agency. It is largely in the domain of users of the information; it is not something that a statistical agency can establish by itself. Comparatively, the other dimensions of quality are much more within the control of the statistical agency.

The accuracy of statistical information is the degree to which the information correctly describes the phenomena it was designed to measure. It is usually characterized in terms of error in statistical estimates and is traditionally decomposed into bias (systematic error) and variance (random error) components. It may also be described in terms of the major sources of error that potentially cause inaccuracy (e.g., coverage, sampling, nonresponse, response).
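The decomposition of error into bias and variance components can be illustrated with a small simulation (a sketch only: the population, the estimator and its systematic offset below are hypothetical, not drawn from any Statistics Canada program):

```python
import random
random.seed(42)

# Hypothetical population: the true value we are trying to estimate.
TRUE_MEAN = 100.0

def biased_estimate(sample):
    """A deliberately biased estimator: the sample mean plus a fixed offset."""
    return sum(sample) / len(sample) + 2.0  # systematic error of +2

# Repeat the "survey" many times to separate bias from variance.
estimates = []
for _ in range(10_000):
    sample = [random.gauss(TRUE_MEAN, 15.0) for _ in range(50)]
    estimates.append(biased_estimate(sample))

mean_est = sum(estimates) / len(estimates)
bias = mean_est - TRUE_MEAN                      # systematic error
variance = sum((e - mean_est) ** 2 for e in estimates) / len(estimates)
mse = sum((e - TRUE_MEAN) ** 2 for e in estimates) / len(estimates)

# Total error (mean squared error) decomposes as bias^2 + variance.
assert abs(mse - (bias ** 2 + variance)) < 1e-6
```

Reducing the offset lowers the bias component; enlarging the sample lowers the variance component; accuracy in the sense above improves only when the combined mean squared error falls.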

The timeliness of statistical information refers to the delay between the reference point (or the end of the reference period) to which the information pertains, and the date on which the information becomes available. It is typically involved in a trade-off against accuracy. The timeliness of information will influence its relevance.

The accessibility of statistical information refers to the ease with which it can be obtained from the Agency. This includes the ease with which the existence of information can be ascertained, as well as the suitability of the form or medium through which the information can be accessed. The cost of the information may also be an aspect of accessibility for some users.

The interpretability of statistical information reflects the availability of the supplementary information and metadata necessary to interpret and utilize it appropriately. This information normally covers the underlying concepts, variables and classifications used, the methodology of data collection and processing, and indications of the accuracy of the statistical information.

The coherence of statistical information reflects the degree to which it can be successfully brought together with other statistical information within a broad analytic framework and over time. The use of standard concepts, classifications and target populations promotes coherence, as does the use of common methodology across surveys. Coherence does not necessarily imply full numerical consistency.

These dimensions are overlapping and interrelated; in managing quality all of them must be considered. A serious failure in any one dimension can compromise the usefulness of the entire statistical program.

III. Every employee has a role to play in assuring quality

Through its policies, guidelines and internal communications, Statistics Canada has made it clear to its employees that everyone has a role to play in assuring quality, from the employees working on daily production tasks to the highest level of management. This reflects the philosophy of Deming (1982) that quality is not something that can be "inspected into" the process, but should be built into the process to begin with. Thus there is no one body at Statistics Canada explicitly charged with quality assurance.

As well, the human resources practices of the Agency reflect the principle that quality is everyone's business. The recruitment, training and development programs of the Agency put considerable emphasis on technical competencies and an understanding of what constitutes high quality data.

IV. Balancing of the dimensions of quality is best achieved through a project team approach

Because quality is multidimensional, it follows that different dimensions tend to be the area of expertise of different groups in the Agency. For many years, Statistics Canada has realized that the trade-offs among the various dimensions of quality are best achieved through a project team approach.

The management of quality at Statistics Canada occurs within a matrix management framework – project management operating within the functional organization.  The Agency is functionally organized into six Fields.  Three of these are primarily responsible for statistical programs of data production and analysis in various subject-matter areas (e.g. social statistics, business statistics and national accounts). The other three Fields are primarily involved in the provision of infrastructure and services to be used by the statistical programs (e.g. methodology, informatics, collection operations, dissemination and management systems).  A typical statistical program is managed by one of the subject-matter divisions and draws heavily on the resources of infrastructure and service areas for inputs to the program.

The use of an interdisciplinary project team approach for the design or redesign of a statistical program is important in ensuring that quality considerations relating to all the components and steps in the program receive appropriate attention during design, implementation and assessment. It is the responsibility of the functional organizations to ensure that project teams are adequately staffed with people able to speak with expertise and authority for their functional area.  Subject-matter staff bring knowledge of content, client needs, and relevance.  Methodologists bring their expertise in statistical methods and data quality trade-offs, especially with respect to accuracy, timeliness and cost.  Operations experts bring experience in operational methods, and concerns for practicality, efficiency, field staff and respondents.  The system experts bring a systems view, and knowledge of technology standards and tools.

It is within such a project team that the many decisions and trade-offs necessary to ensure an appropriate balance between concern for quality and considerations of cost and response burden are made.  Together the team has to balance the conflicting pressures in order to develop an optimal design. The fact that each member of the team is a part of a specialized functional organization, from which a variety of more specialized and management resources can be called upon when warranted, helps in resolving both technical challenges and conflicts arising in a project.

Projects are normally guided by a more senior Steering Committee that may include managers from each of the major participating areas.  This Committee, which is part of the formal approval mechanism for the design and implementation of the program, provides overall guidance, broad budgetary and design parameters, and helps to ensure that appropriate resources are available to the project.  It also provides a forum for resolving any issues that cannot be satisfactorily resolved within the project team.

V. Quality must be built in at each phase of the process

Statistical agencies have often modeled the statistical process as an aid in managing it. The second edition (April 1987) of Statistics Canada's Quality Guidelines contained a schematic of the statistical survey process. More recently, agencies such as Statistics New Zealand, the Australian Bureau of Statistics, Statistics Sweden, Statistics Norway, Statistics Netherlands, and the Joint UNECE/Eurostat/OECD Work Sessions on Statistical Metadata (METIS) have developed various drafts of a Generic Statistical Business Process Model (GSBPM). The model developed by METIS (UNECE Secretariat 2008), which is based on that of Statistics New Zealand but with input from several other agencies, including Statistics Canada, is shown in Figure 1.

These various models have in common the division of the process into a number of phases or steps. While the details of the various models vary, all contain common elements: the specification of user needs, the design of the program, the implementation or "build" phase (specifications, systems, operations manuals, training, etc.), the execution phase (collection, verification, etc.) and the evaluation phase. A basic principle of quality assurance is that it must be considered at all of these phases. If the user needs are not understood or are incorrectly specified, then all the steps that follow will only result in data that are not relevant. Measures must also be taken to ensure that the design is done properly; if it is not, then no amount of perfect implementation and execution will compensate. However, good design is not enough; if it is not implemented or executed correctly, then the good design has gone to waste. And without proper evaluation the statistical agency will not know whether the statistical program has met its objectives or not.

The principle that quality must be built into each phase, together with the notion that quality is multidimensional, leads logically to the conceptualization of quality assurance management as a matrix defined by the dimensions of quality (relevance, accuracy, etc.) along one axis and the phases of a survey (specification of needs, design, implementation, execution, evaluation) along the other. A comprehensive approach to the management of quality demands that all the cells of this matrix be considered.
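As an illustration only (the checklist structure and function names below are not part of the Quality Guidelines), the matrix can be sketched as a simple coverage tracker whose cells must all be visited:

```python
# Dimension and phase names follow the text; the data structure is hypothetical.
DIMENSIONS = ["relevance", "accuracy", "timeliness",
              "accessibility", "interpretability", "coherence"]
PHASES = ["specification of needs", "design", "implementation",
          "execution", "evaluation"]

# Every (dimension, phase) cell starts unreviewed; a comprehensive approach
# to managing quality requires that every cell eventually be considered.
matrix = {d: {p: False for p in PHASES} for d in DIMENSIONS}

def mark_reviewed(dimension, phase):
    matrix[dimension][phase] = True

def unreviewed_cells():
    return [(d, p) for d in DIMENSIONS for p in PHASES if not matrix[d][p]]

mark_reviewed("accuracy", "design")
print(len(unreviewed_cells()))  # 29 of the 30 cells remain to be considered
```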

VI. Quality assurance measures must be adapted to the specific program

At Statistics Canada, statistical program managers are responsible and accountable for delivering their programs. Each statistical activity manager has the responsibility to ensure that the Agency's concern for quality is adequately reflected in the statistical program's methods and procedures. It has always been clear that the Quality Guidelines are exactly that: guidelines to assist program managers, and not rules to be followed. It is clearly not the expectation that every program will adhere to every guideline; such a proposition would be prohibitively expensive and unnecessary, given the variation in the importance of the various statistical programs. Instead, statistical program managers and the project teams that support them are expected to make the necessary decisions.

VII. Users must be informed of data quality so that they can judge whether the statistical information is appropriate for their particular use

Finally, in order for the user to make informed use of the statistical information provided, he or she must be able to assess whether the data are of sufficient quality. For some dimensions of quality, such as timeliness, users are able to assess the quality for themselves. Other aspects, such as interpretability, coherence and even relevance may not be as obvious. The dimension of accuracy in particular is one which users may often have no way of assessing and must rely on the statistical agency for guidance.

Over the years the Agency has developed policies and tools to assist users. It has long had a Policy on Informing Users of Data Quality and Methodology (PIUDQM), which prescribes minimum amounts of information on data quality and methodology that are to be provided to users.  All data are released through The Daily and are accompanied by a link to the Integrated Meta Database (IMDB) that provides information on concepts, definitions, data sources and methodology for each statistical program.

Concerning accuracy, the Policy also specifies that all data releases are to be accompanied by information on three common sources of error: coverage (the difference between the target population and the frame used to conduct the survey), nonresponse (the portion of the sample who did not respond), and sampling error (a source of error when a sample rather than a census was conducted), as well as any other significant sources of error (e.g., response errors, processing errors, errors introduced for disclosure control).
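By way of a hedged illustration (all figures below are invented, and simple random sampling without replacement is assumed), the three common error indicators might be computed as follows:

```python
import math

frame_size = 10_000        # units on the survey frame
target_pop = 10_400        # estimated size of the target population
sample = [12.0, 15.5, 9.0, 14.2, 11.8, 13.3, 10.1, 16.0]  # responding units
sample_size = 10           # units selected (two did not respond)

# Coverage: difference between the target population and the frame.
undercoverage_rate = (target_pop - frame_size) / target_pop

# Nonresponse: the portion of the sample that did respond.
response_rate = len(sample) / sample_size

# Sampling error: estimated standard error of the mean, with the
# finite population correction appropriate to sampling without replacement.
n = len(sample)
mean = sum(sample) / n
s2 = sum((y - mean) ** 2 for y in sample) / (n - 1)
fpc = 1 - n / frame_size
se = math.sqrt(fpc * s2 / n)
cv = se / mean  # coefficient of variation, often published with estimates

print(f"undercoverage {undercoverage_rate:.1%}, "
      f"response rate {response_rate:.0%}, CV {cv:.1%}")
```

In practice each indicator would be computed from the program's actual frame, sample design and weights; the point of the sketch is only the kind of quantity each indicator measures.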

VIII. Quality must be at the forefront of all activities

Finally, unless proactive actions are taken, the quality of data can, and does, deteriorate over time.  For example, a "relevance gap" often occurs given the time it takes between the emergence of a need for data and the ability to produce them. A gap on other quality dimensions can also manifest itself as response rates decline over time due to changes in societal attitudes, or as systems become outdated or methodology becomes in need of redesign.

A statistical agency must keep quality at the forefront of all its activities in order to minimize the relevance gap and prevent a significant decrease in quality over time. The quality of program processes and outputs must be assessed, checked or reviewed constantly, and such review mechanisms must be integrated into the agency's planning and decision-making processes. Statistics Canada's National Statistics Council and senior bilateral arrangements with key departments and agencies (effective dialogue with stakeholders), Integrated Program Reporting (systematic assessment of statistical outputs) and Quality Review Program (formal assessment of statistical processes) are examples of such quality assessment or review mechanisms.

Purpose and scope of the guidelines

Section 2 of this document brings together guidelines and checklists on many issues that need to be considered in the pursuit of quality objectives in the execution of statistical activities. Its focus is on how to assure quality through effective and appropriate design and implementation of a statistical program from inception through to data evaluation, documentation and dissemination. These guidelines draw on the collective knowledge and experience of many Statistics Canada employees. It is expected that the guidelines will be useful to staff engaged in the planning and design of surveys and other statistical programs, as well as to those who evaluate, analyze and use the outputs of these programs.

The main purpose of the Quality Guidelines is to provide a comprehensive list of guiding principles and good practices in survey design. To better appreciate the scope of the guidelines, it is important to define its use of the words survey and design.

The term survey is used generically to cover any activity that collects or acquires statistical data. Included are:

  1. a census, which attempts to collect data from all members of a population;
  2. a sample survey, in which data are collected from a (usually random) sample of population members;
  3. collection of data from administrative records, in which data are derived from records originally kept for non-statistical purposes;
  4. a derived statistical activity, in which data are estimated, modeled, or otherwise derived from existing statistical data sources.

The guidelines are written with censuses and sample surveys as the main focus. The quality of derived statistical activities is, of course, largely determined by the quality of the component parts, and as such, derived statistical activities are not the direct focus of this document.

The term design is used to cover the delineation of all aspects of a survey from the establishment of a need for data to the production of final outputs (the microdata file, statistical series, and analysis).

The core of this document (Section 2) concentrates on quality issues as they relate to the design of individual surveys. It is, however, important to keep in mind that the context in which each individual survey is developed imposes constraints on its design. Each new survey, while aiming to satisfy some immediate information needs, is also contributing information to a base of statistical data that may be used for a range of purposes that go well beyond those identified at the time of the survey's design. It is therefore important to ensure that the output from each individual survey can, to the extent possible, be integrated with, and used in conjunction with, data on related topics derived from other surveys. This implies a need to consider and respect the statistical standards on content or subject-matter that have been put in place to achieve coherence and harmony of data within the national statistical system. These include statistical frameworks (such as the System of National Accounts), statistical classification systems (such as those for industry or geography), as well as other concepts and definitions that specify the statistical variables to be measured. The usefulness of new statistical data is enhanced to the extent that they can be utilized in conjunction with existing data.

Figure 1. Levels 1 and 2 of the Generic Statistical Business Process Model


References

Brackstone, G. 1999. "Managing data quality in a statistical agency." Survey Methodology. Vol. 25. p. 139-149.

Deming, W.E. 1982. Quality, Productivity, and Competitive Position. Cambridge, MA: Massachusetts Institute of Technology.

Fellegi, I. 1996. "Characteristics of an effective statistical system." International Statistical Review. Vol. 64. p. 165-197.

Statistics Canada. 1987. Quality Guidelines. Second Edition. Ottawa, Ontario.

Statistics Canada. 2000c. "Integrated Metadatabase – Guidelines for Authors." Standards Division Internal Communications Network. http://stdsweb/standards/imdb/imdb-menu.htm.

Statistics Canada. 2000d. "Policy on Informing Users of Data Quality and Methodology." Statistics Canada Policy Manual. Section 2.3. Last updated March 4, 2009.

Statistics Canada. 2002c. Statistics Canada's Quality Assurance Framework - 2002. Statistics Canada Catalogue no. 12-586-X.

Statistics Canada. 2003d. Statistics Canada Policy Manual.  Last updated May 27, 2009.

Trewin, D. 2002. "The importance of a quality culture." Survey Methodology. Vol. 28. p. 125-133. 

UNECE Secretariat. 2008. "Generic Statistical Business Process Model: Version 3.1 – December 2008". Joint UNECE/Eurostat/OECD Work Session on Statistical Metadata (METIS).
