Chapter 1.5: Quality Management
To adequately inform decisions and public debate, National Statistical Offices (NSOs) need to provide credible statistical information to the public. Credibility can be achieved only if users have faith in the quality of the data produced and in the integrity of the statistical system. An NSO's reputation as an independent source of trustworthy information could be undermined if the quality of its statistical products is suspect; effective quality management is, therefore, imperative.
While there are several general definitions of "quality" in the statistical context, one of the most succinct and commonly used is fitness for use, or fitness for purpose. In other words, the concept of data quality refers to the degree to which a set of inherent characteristics fulfils requirements. It is generally recognized that the quality of statistical information is multi-dimensional and cannot be measured along a single dimension.
The six generally accepted dimensions, or components, of data quality are the following:
- Relevance: the degree to which information meets the needs or requirements of clients, users, stakeholders or the audience (refer to Chapter 1.4: Understanding users' needs and maintaining relationships).
- Accuracy and reliability: the degree of closeness of estimates to true values.
- Timeliness and punctuality: Timeliness refers to how fast the data are released or made available, while punctuality refers to whether data are delivered on the dates promised, advertised or announced (refer to two chapters: Chapter 2.4: The Departmental Project Management Framework and Chapter 3.1: The Corporate Business Architecture).
- Accessibility and clarity: the degree to which statistics can be found or obtained without difficulty. Data are presented clearly and in such a way that they can be accessed and understood, by all types of users, on an impartial and equal basis. The data are available in various convenient formats, as well as affordable, if not offered free of charge (refer to two chapters: Chapter 4.1: Disseminating data through the website and Chapter 4.2: External communications and outreach).
- Coherence and comparability: statistics should be consistent internally and comparable over time, and should be produced using common standards with respect to scope, definitions, classifications and units (refer to Chapter 1.3: Following international standards).
- Interpretability and metadata: information about the underlying concepts, variables and classifications used, the methodology of data collection and processing, and indications of quality of the statistical information are available for users (refer to Chapter 4.3: Access to, and management of, metadata).
Other dimensions of quality, such as integrity, topicality, serviceability and methodological soundness, can also be considered, or included as subsets of the above dimensions. It is important to note that these dimensions are complementary and must be balanced against one another. Finally, it can be argued that the relevance dimension is paramount: if the information produced responds to the needs of users, the other dimensions become very important; if it does not, the other dimensions are irrelevant.
Because quality is so fundamental to statistical practices, every statistical organization should have a quality management system (or equivalent) to ensure that the desired level of quality is achieved through management of institutional aspects, processes and outputs (Endnote 1). This chapter focuses on strategies, mechanisms and tools that statistical agencies could adopt to improve their quality management practices.
Strategies, mechanisms and tools
NSOs typically use various mechanisms and tools to manage quality and develop a quality culture within their organization. These tools can be grouped into four categories:
- a strong quality management governance
- a quality management framework and guidelines
- quality assessments and reporting tools
- other management frameworks contributing to quality enhancement.
1. Quality management governance
In order to champion quality management initiatives, statistical agencies or systems should consider establishing a neutral focal point or resource centre for quality management that is supported and empowered by senior management. Having such a quality unit ensures that quality remains a constant priority and that quality assurance mechanisms and tools are continuously improved upon to meet the demands resulting from a changing environment.
It is with this perspective that Statistics Canada has created a quality unit called the Quality Secretariat, whose mandate is to promote and support the use of sound quality management practices across the organization. The role of the Secretariat is to
- support the development, revision and implementation of key quality management documents, such as a quality assurance framework, quality guidelines, and policies;
- support the development and implementation of new quality improvement initiatives or procedures;
- promote sound quality management practices;
- provide advice and assist programs as regards compliance with good practices that support quality;
- support corporate management in the preparation of performance reports on quality;
- answer requests from other statistical agencies for information or assistance related to quality management.
The activities of the Quality Secretariat are overseen by the Methods and Standards Committee, whose role is to act as the focal point for the review and monitoring of corporate data quality practices and initiatives and to ensure that these activities are coherent with other management frameworks and policies. The Methods and Standards Committee reports to Statistics Canada's most senior management committee, the Executive Management Board. Through this governance structure, the Quality Secretariat is neutral with respect to statistical programs, and is supported by senior management. Neutrality ensures that quality assurance tools are uniformly relevant across statistical programs, and senior management support guarantees compliance.
It is also a recommended practice to have a quality unit or focal point within statistical programs. The objective is not to absolve all other team members from taking responsibility for quality assurance. On the contrary, the responsibilities of a program-level quality unit include the following:
- encouraging and monitoring quality assurance activities;
- monitoring and interpreting performance indicators;
- investigating quality issues when they arise;
- staying up-to-date on quality-related policy instruments and how they apply to the program;
- liaising with the Quality Secretariat; and
- tailoring quality management practices, as necessary, to the subject-matter particularities of the program.
2. Quality Management Framework and Guidelines
Although various quality management approaches, models and frameworks exist within the statistical community, systematic quality management within a national statistical office or system usually takes the form of a national quality assurance framework (NQAF). An NQAF is typically an overarching framework that provides context for quality concerns, activities and initiatives, and explains the relationships between the various quality concerns. Recognizing the importance of such a document in quality management, the statistical community has asked an expert group to prepare a Template for a Generic National Quality Assurance Framework and accompanying guidelines to assist national statistical systems in developing, implementing and enhancing their NQAFs. While the template should not be approached as a one-size-fits-all framework, it provides examples of the elements required to manage quality at four levels: statistical system, institutional environment, statistical processes and statistical outputs.
Statistics Canada's first Quality Assurance Framework was published in 1997. It was followed by a second version, in 2002, and a third one in 2016. As the statistical environment is not static, the subsequent versions of Statistics Canada's quality assurance framework have taken into account the evolution of the organization's programs, methodology, technology and policies, as well as the progress made internationally in quality management practices.
While this document was originally intended for reference and training purposes, it has become central to developing a quality culture within the organization and to providing a baseline for the continuous improvement of quality practices. It also contributes to greater transparency and, therefore, reinforces the image of the organization as a credible source of statistics. Finally, it allows Statistics Canada to share ideas and best practices on quality management nationally and internationally.
Another key document in managing quality at Statistics Canada is the Quality Guidelines (Endnote 2). The first version was published in 1998, followed by new versions in 2003 and 2009. These guidelines are expected to be revised periodically as methodologies and approaches evolve. They are highly valuable in terms of ensuring the use of sound and consistent methodologies throughout the various programs. The focus is on how to assure quality with respect to the effective and appropriate design or redesign of a statistical project or program from inception through to data evaluation, dissemination and documentation. The guidelines provide strategies and tools to minimize the risk of introducing sampling and non-sampling errors in the survey process and to address some of the quality concerns related to the use of administrative data.
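To illustrate the kind of quantitative accuracy check that such guidelines promote, the sketch below estimates the coefficient of variation (CV) of a sample mean under simple random sampling. This is a minimal, hypothetical example, not a Statistics Canada tool; the sample values and population size are invented.

```python
from math import sqrt
from statistics import mean, stdev

def cv_of_mean(sample, population_size):
    """Coefficient of variation (%) of the sample mean under simple
    random sampling without replacement, with the finite population
    correction applied."""
    n = len(sample)
    fpc = 1 - n / population_size              # finite population correction
    se = sqrt(fpc) * stdev(sample) / sqrt(n)   # standard error of the mean
    return 100 * se / mean(sample)

# Hypothetical survey responses drawn from a population of 1,000 units.
sample = [12, 15, 11, 14, 13, 16, 12, 15]
print(f"CV of the mean: {cv_of_mean(sample, 1000):.1f}%")  # prints: CV of the mean: 4.6%
```

A published CV of this kind is one common way to communicate the accuracy dimension to users alongside the estimate itself.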
In 2009, Statistics Canada developed and adopted a reference model describing the sub-processes in the production of statistical products based on the Generic Statistical Business Process Model (GSBPM) developed by the United Nations Economic Commission for Europe (UNECE), whose membership includes non-European countries, such as Canada and the United States. This process model, which was revised in 2013 (Endnote 3), is now used to establish and communicate quality assurance practices for particular activities in the production of statistical products; it is also used for reference and training. The GSBPM will be applied to the development of future versions of Statistics Canada's quality guidelines.
It is important to mention that the array of possible quality measures proposed by the quality guidelines is not necessarily applied uniformly to every program: program managers have the responsibility to determine which measures should be applied and to make these choices very explicit. However, most programs should be subject to a periodic, independent and systematic quality review to ensure proper checks and balances.
Most recently, in September 2015, the Organisation for Economic Co-operation and Development (OECD) issued a draft Recommendation of the Council on Good Statistical Practice (Endnote 4).
From an OECD perspective, the quality of its statistics and analytical work depends largely on the quality of official statistics produced and transmitted by countries. This draft Recommendation would provide a common reference against which the quality of national statistical systems can be assessed. In addition, the good practices contained in this draft Recommendation would also serve as a reference for the statistical work done by the OECD. Further, the draft Recommendation and its good practices would be referred to in the context of applying the Organization's Quality Framework and Guidelines for OECD Statistical Activities.
3. Quality Assessment and Reporting Tools
Quality can be assessed through multiple means: self-assessments, audits, peer reviews or certification processes. This work can be performed by internal or external experts. Assessment processes can lead to the awarding of an official statistics or certification label. The breadth, depth and length of these evaluations can vary depending on the scope. However, the objective of all quality assessment is more or less identical: "the identification of improvement actions or opportunities in processes and products" (Endnote 5).
Statistics Canada has a long tradition of assessing statistical quality, and its approach has evolved over time. Through the 1990s and early 2000s, this was done by means of the Quality Assurance Framework. User feedback has also been an important feature in quality assessment at Statistics Canada, directly reflecting the relevance, coherence, accessibility, interpretability and timeliness of statistical products. In 2006, the agency started conducting quality assurance reviews, which were aimed at identifying risks to data quality, and sharing good practices to mitigate those risks. Independent program evaluations conducted by a specialized internal organizational unit that follows a formal protocol to evaluate relevance, quality and efficiency were reinitiated in 2011, following a period during which this function was carried out by program managers as part of their ongoing responsibilities (see Chapter 2.8: Program Evaluation). Recently, beginning in 2015, the focus has been on strengthening the quality assessment aspect of Program Evaluation, particularly with regard to how the implementation and execution of statistical programs is assessed.
Quality reporting is also an important aspect of quality management. First, product quality indicators shared with data users allow them to determine whether the statistical product fulfills the requirements of their intended use. Second, process and intermediate product indicators can be used by producers and managers to monitor data production and improve quality practices on an ongoing basis. Finally, at the corporate level, quality performance indicators allow an organization to assess how well it is performing on the quality front and to address any risks that might prevent it from successfully fulfilling its mandate.
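Two of the indicators described above can be sketched in a few lines of code: a punctuality rate (a product-quality indicator) and the mean absolute revision that appears in Appendix A. The function names and the release records are hypothetical illustrations, not Statistics Canada's actual tooling.

```python
from statistics import mean

def punctuality_rate(releases):
    """Share of releases issued on or before their announced date."""
    on_time = sum(1 for r in releases if r["actual"] <= r["scheduled"])
    return on_time / len(releases)

def mean_absolute_revision(preliminary, final):
    """Average absolute difference between preliminary and final estimates."""
    return mean(abs(p - f) for p, f in zip(preliminary, final))

# Invented release records; ISO dates compare correctly as strings.
releases = [
    {"scheduled": "2016-03-01", "actual": "2016-03-01"},  # on time
    {"scheduled": "2016-04-01", "actual": "2016-04-04"},  # three days late
]
print(punctuality_rate(releases))                               # 0.5
print(mean_absolute_revision([100.0, 205.0], [102.0, 204.0]))   # 1.5
```

In practice, the preliminary and final series would come from successive vintages of the same statistical product, and the punctuality record from the agency's release calendar.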
Statistical agencies communicate about quality using quality indicators and quality reports. At Statistics Canada, quality indicators and reports are produced by the various programs during data production. While the type of quality indicators produced internally might differ from one program to another, the adoption of the GSBPM and a corporate business architecture will serve to rationalize the development of a common set of quality indicators for data production in the near future.
Quality reporting for both internal and external audiences is standardized in both its structure and content through the Standard on Statistical Metadata. Quality of statistical products is communicated to users in a common format prescribed by the Policy on Informing Users of Data Quality and Methodology (see Chapter 4.3: Management of, and access to, metadata).
Finally, in order to know how well Statistics Canada performs in terms of quality management, performance measures have been developed. The corporate suite of performance indicators includes several indicators of product and process quality (for the detailed list of performance indicators, refer to Appendix A). A dashboard will allow program managers to view their performance indicators at any time. They can tailor their display to compare performance against targets and to monitor levels. Corporate-level indicators required by Treasury Board Secretariat—the federal department responsible for overseeing corporate reporting and performance measurement—are produced on a quarterly and annual basis.
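The target comparison at the heart of such a dashboard can be sketched as follows. All indicator names, values and targets below are invented for illustration; a real dashboard would pull them from the corporate performance database.

```python
# Hypothetical indicator readings: (indicator, actual, target, higher_is_better).
# The direction flag matters because some indicators (e.g., corrections)
# are better when lower.
indicators = [
    ("Data products released as scheduled (%)", 96.0, 98.0, True),
    ("Post-release corrections for accuracy", 3, 2, False),
    ("Visits to the website (millions)", 21.4, 20.0, True),
]

def status(actual, target, higher_is_better):
    """Traffic-light style comparison of an indicator against its target."""
    met = actual >= target if higher_is_better else actual <= target
    return "on target" if met else "below target"

for name, actual, target, better in indicators:
    print(f"{name}: {actual} (target {target}) -> {status(actual, target, better)}")
```

Keeping the direction of each indicator explicit avoids the common dashboard bug of flagging a drop in error counts as a shortfall.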
In the context of decentralized statistical systems, quality assessment processes can also be a tool to improve the quality of statistical products across the system. Box 1.5.2 describes how Colombia's National Administrative Department of Statistics (DANE) implemented a quality certification process for all official statistics.
Box 1.5.2: How is quality assessment of the statistical process carried out in Colombia?
The quality assessment of the statistical process is defined as the set of procedures that determines the level of compliance of a statistical operation with the quality requirements established by DANE. These, in turn, are based on the quality attributes defined by the United Nations (UN), the International Monetary Fund (IMF), the Organisation for Economic Co-operation and Development (OECD), and EUROSTAT.
With this initiative, DANE seeks the improvement of its statistical production and various other entities of the Colombian State, and the generation of statistical information that is credible, reliable and transparent to be used by the general public. The methodology for quality assessment establishes tools, processes and instruments for the measurement of conformity with the attributes and quality requirements adopted and adapted by DANE.
The process is structured in four stages: awareness-raising, collection, assessment and certification. It was developed by a Committee of Independent Experts (CEI, its acronym in Spanish), consisting of a subject-matter expert, a statistical expert and an expert in the statistical process, and is supported by DANE. This committee is responsible for verifying compliance with the quality requirements related to relevance, accuracy, punctuality and timeliness, accessibility, interpretability, coherence, integrity and consistency in statistical operations, such as censuses, sampling, and operations based on administrative records. There are three types of certification based on the grade obtained in the assessment process: A, B and C.
Positive effects and impact
This initiative has renewed and strengthened the image of DANE as the governing and coordinating body of the National Statistical System (NSS). DANE has identified the main weaknesses in the production process for statistics from NSS information-producing entities, and promoted the implementation of standards and best practices recognized internationally. NSS statistical operations have improved in the following respects:
- Documentation of statistical operations (methodological sheet and general methodology) and of databases (data dictionary, validation and consistency manual, etc.) generated.
- Integrity and consistency of databases (correction of inconsistencies found in the databases of the statistical operations, as well as in fields, registers and variables).
- Timely dissemination of results with respect to the period in which the phenomenon is being measured. Dissemination of historic series begun, and access to microdata provided.
- Emphasis on the needs of major users, and development of new ways to contact them and increase their participation in technical committees.
- The number of NSS entities implementing nomenclatures and classifications has increased.
Lessons learned in the implementation of the quality certification process in Colombia
The first regulations regarding quality assessment and certification of statistics were issued on November 2, 2006. Since then, DANE has faced a number of challenges, from which it has drawn the following lessons:
- It was not appropriate to have the same CEI to assess statistical operations of different subjects. Consequently, the composition of the CEI was changed to include a subject-matter expert with expertise in the subject of the operation under assessment.
- To encourage producers of statistics to agree to have their statistical processes assessed, it was necessary to strengthen the regulatory framework for statistical quality. With this in mind, in 2011, regulations were issued regarding the reassignment of the "coordination and certification of good practices in the process of statistical production" to DANE, and a regulated assessment process was instituted. These regulations resulted in greater participation by entities in the assessment process for their statistical operations.
- Given that the assessment and certification process is voluntary for entities, a campaign to raise awareness was needed. For this reason, the first stage was included.
- To achieve better quality statistics, it was necessary to follow up on the implementation of the recommendations from the assessment. Hence, a team to monitor the implementation of improvement plans was established.
The road to consolidation
Consolidation of the quality assessment of the statistical process requires that technical and methodological capacities within DANE's technical team be enhanced to enable the organization to carry out its new responsibilities, under the National Development Plan. The goal is to improve the assessment model and confirm DANE's transparency, credibility and capacity in the implementation of this model.
4. Other management frameworks contributing to quality enhancement
Most operations and functions of a statistical agency have an impact on the quality of the agency's information. The management of quality is therefore an integral part of the management of almost every statistical activity carried out by the agency and an important component of the agency's management as a whole. Frameworks are used to manage a statistical agency's quality, human resources, financial resources and overall performance towards achieving its objectives or mandate.
The extent to which a statistical agency can fulfill its mandate and related objectives depends on its ability to optimize its management and operations through organizational efficiency. A significant feature of the management of quality is the balancing of quality and quantity objectives against financial and human resources constraints. These trade-offs, while inevitable in the real world, must be made as explicit as possible so that users can understand the limitations of certain data and why they exist. Ethical standards, fundamental values and principles, and utmost transparency should guide the personnel of a statistical agency in fulfilling their official duties and responsibilities. These principles serve to maintain and enhance public and user confidence in the integrity of the agency. To achieve its mandate and objectives, a statistical agency needs an effective governance and management structure, one that integrates strategic priority setting and decision making and ensures accountability.
At Statistics Canada, quality-management strategies are coherent with the mandate and objectives of management committees for such things as information management, communications and dissemination, administrative data management, collection planning, corporate business architecture and human resources management. Ultimately, this coherence is achieved through strong governance mechanisms, including coordination of all management committees through the Executive Management Board and clear and consistent promotion of corporate priorities and strategies.
Key success factors
One of the most important factors contributing to successful quality management is having a quality culture. At Statistics Canada, this culture is developed and maintained through various means:
- The acknowledgement that every employee involved in the production and the dissemination of statistics has a role to play in quality assurance and that quality should be an integrated dimension of every management practice;
- The inclusion of the concept of quality in Statistics Canada's mission statement and mandate. Quality is also considered to be a core value in the corporate management framework;
- Quality-assurance good practices are incorporated into flagship training courses;
- The Quality Guidelines and the Quality Assurance Framework serve as training and reference material;
- Investment proposals (see Chapter 2.2: Integrated Strategic Planning) are assessed in terms of their contribution to enhancing quality of statistical products or their potential for reducing risks that could impact quality.
Effective quality management requires clear governance, as well as engagement and buy-in from senior management. In fact, success depends not only on strong leadership and clear governance, but also on the professionalism, dedication and diligence of employees. A positive work environment that respects human values and promotes career development and innovation is the atmosphere in which people will exercise their commitment to quality on a daily basis.
Finally, having a permanent statistical unit (the Quality Secretariat for Statistics Canada) allows the agency to ensure that quality management is considered in all modernization initiatives, and that quality management tools reflect modernized business processes and structure.
Developing and maintaining credibility is a particular challenge for NSOs. Without the trust of the public and policy makers, statistical products have little value. To achieve this level of trust, quality needs not only to be ensured, but also to be continually improved. One way to do this is through certification from a recognized source, such as the International Organization for Standardization or the IMF, or through satisfying the requirements for OECD membership. Although the stamp of approval is a clear demonstration that certain clearly defined quality measures are in place, the effort required to achieve and maintain the certification can be significant, and the effectiveness (impact on actual quality) is not guaranteed. The quality improvement principles of Lean (Endnote 6) and Six Sigma (Endnote 7) were developed for application in manufacturing; however, they are a good fit with the core business of producing official statistics, and some NSOs have adopted them, in whole or in part.
It is often a challenge for people to recognize what quality assurance practices they can incorporate into their work activities. It might take a culture shift to get people to think in terms of validating their deliverables, producing diagnostics to demonstrate the effectiveness of their efforts, following a checklist of tasks, or signing off that a particular activity has been completed. These seemingly simple checks represent the most basic examples of quality assurance.
Another challenge, still on the theme of staff resistance, relates to quality assessment and reporting. There is often a misconception that an outcome of quality assessment will be some form of punishment for poor performance. Nothing could be further from the truth. The commitment to quality needs to permeate from senior management all the way down to the most junior and inexperienced staff members, and the message needs to be consistent: the goal of quality assessment is to maintain and improve the quality of processes and products, not to manage people.
Another challenge is getting and sustaining buy-in, particularly at the middle-management level. The commitment of middle managers to quality management is absolutely essential because they have the authority to supervise and direct the activities of their staff and the opportunity to lead by example. If quality assurance activities are reduced when budgets are cut or timelines shortened, the implicit message to staff is that quality assurance is not a priority. It is a challenge for the entire organization to resist the temptation to cut corners on quality when times get tough.
Finally, compliance is a challenge as there is a growing body of policies, directives, guidelines and best practices, many of them related to quality. Program managers sometimes need time and resources to adopt new practices or migrate to new tools and concepts. Compliance challenges should be considered in the resource allocation process and supported by a team of quality experts if necessary.
The organization participates in an international working group to develop quality indicators for all phases of the GSBPM. With indicators proliferating, the agency is reaching a state of indicator overload. To make the most effective use of indicators, there is a need to organize and streamline them and to present them in a format that facilitates comprehension and appropriate reaction. This should be automated to the extent possible.
Considerable resources have been dedicated towards maximizing the use of administrative data, in particular, of Big Data. Frameworks have been developed guiding the agency's practices for acquiring, exploiting and maintaining large data holdings. Although the quality principles are the same as for survey data, the challenges, expectations and uses are quite different. It is therefore important to ensure that quality assurance practices for alternative data sources be relevant and effective.
Appendix A: Examples of corporate performance indicators
| Quality dimension | Performance indicator |
| --- | --- |
| Punctuality and timeliness | Number of data products released as scheduled |
| | Timeliness of release |
| Accuracy | Post-release corrections for reasons of accuracy |
| | Mean absolute revision |
| | Level of accuracy achieved |
| Interpretability | Number of pages viewed in the sources and methods documents available on the Statistics Canada website |
| | Up-to-date metadata in the Integrated Metadatabase |
| | User guide documentation up to date |
| Coherence | Compliance with standard variables and classifications |
| Accessibility | Number of visits to the Statistics Canada website |
| | Number of visits to The Daily |
| | Number of analytical and data products accessed |
| | Number of media citations |
| | Number of professional citations |
| | Percent of clients who receive the information that they requested |
| | Percent of website visitors who found the information they were looking for |
| | Number of users engaged in Statistics Canada's social media |
| | Number of postsecondary institutions and governmental and other organizations receiving access to microdata files |
| | Number of cycles of confidential microdata files and public-use microdata files available to Canadian postsecondary institutions, research data centres and other institutions |
| | Number of active deemed-employee researcher contracts |
| | Real-time Remote Data Access: number of account submissions |
| Cost efficiency | Annual operational costs |
| | Percent of respondents that were offered an e-questionnaire |
| | Index of response burden hours |
| | Business surveys using tax/administrative data |
| | Volume of cost-recovery contracts conducted by Statistics Canada: statistical surveys and related surveys |
| | Value of cost-recovery contracts conducted by Statistics Canada: statistical surveys and related surveys |
| | Value of cost-recovery contracts conducted by Statistics Canada: custom requests and workshops |
| Relevance (to measure the effectiveness of statistical infrastructure programs) | Percent of programs using methodology services |
| | Percent of programs using statistical infrastructure services |
| | Percent of programs using operational statistical services |
| | Number of programs that undergo a review of their methodology and/or statistical infrastructure |
| | Percent of programs reviewed to which the methodology and/or statistical infrastructure provided approved solutions |
| | Proportion of proposed solutions adopted by the programs |
| | Percent of Collection and Operations Services Agreements (COSA) components that are met (main estimates) |
| | Percent of Collection and Operations Services Agreements (COSA) components that are met (cost recovery) |
| | Percent of investments in the Continuity and Quality Maintenance Investment Plan implemented as planned |
- Endnote 1: United Nations Statistics Division. 2012.
- Endnote 2: Statistics Canada. 2009.
- Endnote 3: United Nations Economic Commission for Europe. 2013.
- Endnote 4: Organisation for Economic Co-operation and Development. 2015.
- Endnote 5: United Nations Statistics Division. 2012.
- Endnote 6: Cardiff University. 2015.
- Endnote 7: Wikipedia. 2015.
Brodeur, Doherty, Dufour, Lussier, Mayda, Mcauley, Ménard, Ridgeway, Royce and Smith. 2009. Report of the Task Force on Corporate Business Architecture. Statistics Canada.
Cardiff University. 2015. Lean Principles. Retrieved from http://www.cardiff.ac.uk/lean/principles
Organisation for Economic Co-operation and Development (OECD). September 30, 2015. Draft Recommendation of the Council on Good Statistical Practice (Note by the Secretary-General).
United Nations Economic Commission for Europe (UNECE). 2013. Generic Statistical Business Process Model. Retrieved March 11, 2016, from http://www23.statcan.gc.ca/standards-normes/gsbpm-msgpo/2013/index-eng.html
United Nations Statistics Division (UNSD). 2012. Guidelines for the Template for a Generic National Quality Assurance Framework (NQAF). Retrieved March 11, 2016, from http://unstats.un.org/unsd/dnss/QualityNQAF/nqaf.aspx
Wikipedia. 2015. Lean Six Sigma. Retrieved from http://en.wikipedia.org/wiki/Lean_Six_Sigma