

Statistics Canada Quality Guidelines

The management context

The quality assurance framework

Managing relevance
Managing accuracy
Managing timeliness
Managing accessibility
Managing interpretability
Managing coherence
Environment

The Quality Assurance Framework is the set of management, operating and consultative practices, procedures and mechanisms that Statistics Canada uses to manage the quality of its information products. This framework has been developed and adapted over many years, and continues to evolve. It links user needs with program products and provides for feedback, performance review, and ongoing planning and development. It gives direction and guidance to project and program managers and, in turn, to their teams, to achieve overall coherence and balance within programs among what may be conflicting priorities, constraints, and design and quality issues. The Quality Assurance Framework is summarized below in the context of the six elements of quality (relevance, accuracy, timeliness, accessibility, interpretability and coherence), with a brief discussion of key supports to quality under the topic of the Environment of the Agency.

Managing relevance

The management of relevance embraces the processes that determine what information the Agency produces and the level of resources devoted to each program. It deals essentially with the translation of user needs into program approval and budgetary decisions within the Agency. The processes used to assure relevance also permit basic monitoring of the other elements of quality and, correspondingly, assessment of user requirements in those other dimensions.

To fulfill its mandate, it is paramount that the Agency’s programs and outputs properly and continuously reflect the country’s most important information needs. Since these needs evolve over time, a process for continuously reviewing programs in the light of client needs and making the necessary adjustments is essential.

User needs are identified through bilateral and multilateral liaison with major users, through information and advice provided by statistical organizations and consultative groups and through user feedback on existing products and services. Regular reviews of all programs are conducted through biennial and quadrennial program reports, as well as through ad hoc reviews or audits.

Data analysis also provides feedback on information gaps and limitations: directly from analysts; through published articles, the peer review processes for those articles, and reaction to and commentary on analytical results; and through the use of analytical frameworks, such as the System of National Accounts, that integrate and reconcile data from different sources within Statistics Canada.

Program decisions and adjustments usually take place through an annual strategic and long-term planning process that examines new and developing information needs. In addition to user needs and costs, respondent burden, public sensitivities, and the Agency’s capacity and expertise have to be taken into account. Judgements have to be made in light of current public policy priorities as to which statistical programs are in most need of redevelopment or of new or additional investment.

There are, however, constraints on change or adjustment. It has been estimated that more than 90% of the Agency’s budgetary resources are devoted to ongoing programs that are non-discretionary at a given point in time. These programs serve the information needs of a broad clientele through provision of basic information on Canadian society and the Canadian economy, and they meet the legislative and regulatory needs specified in approximately two dozen Acts of Parliament.

A second constraint on adjustment is the interdependency between different programs. In many cases information from one program feeds another (e.g., retail sales information feeds into GDP calculations, vital statistics are used in population estimates) so that the impact of adjustments in one program on other programs has to be considered.

New or emerging information needs must therefore be funded through savings within non-discretionary programs that do not imperil their outputs, through redirection of resources within the discretionary component, or through persuading clients (particularly federal government clients) to finance such worthy additions to the national database.

Managing accuracy

Processes described under relevance determine which programs will be carried out, their broad objectives, and the resource parameters within which they must operate. Within those “program parameters”, the management of accuracy requires particular attention during the design, implementation and assessment phases of a statistical activity, each building on the one before.

Program design and implementation

The accuracy achieved through program design - as well as the degree of timeliness and coherence - will depend on the explicit methods put in place and on the quality assurance processes built in to identify and control potential errors at the various stages of implementation of the program. What constitutes acceptable accuracy is left to the individual program to determine and justify in light of its knowledge of user requirements and of the circumstances, budget and other constraints, opportunities and objectives within which it has to work.

These Quality Guidelines describe specific practices, methods and considerations that should be taken into account in designing programs, and indicate where formal standards or guidelines exist. While the strength of the survey methodology will depend on the judgements of the survey design team, whatever specific methods are applied must fall within the realm of commonly accepted and defensible statistical practices under the given circumstances. The use of new technologies and innovations to improve quality and efficiency is encouraged, but they must be adequately tested to minimize risk. It must be possible to monitor quality, to react effectively to unanticipated problems, to verify or support the credibility of the results, and to understand their limitations.

The results of implementation depend not only on the specific design and the survey instruments (e.g., the sample design and the questionnaire), but also on the instruments of implementation: the resource and material plans, the supervisory structure, the schedules, the operations, procedures and checks, the training, the publicity, and so on, all developed and specified during the design phase. Mechanisms for monitoring implementation should be built into survey processes as part of design. Two types of information are required. The first is information to monitor and correct, in real time, any problems arising during implementation. The second is information to assess, after the event, whether the design was carried out as planned, whether some aspects of the design were problematic in operation, and what lessons were learned from the operational standpoint to aid future designs. Information pertaining directly to accuracy, and information on the costs and efficiency of operations, are equally important to the consideration of accuracy in future designs.

Accuracy assessment

The assessment of accuracy entails determining what level of accuracy has actually been achieved. It needs to be a consideration at the design stage since the measurement of accuracy often requires information to be recorded as the survey is taking place.

As with design, the extent and sophistication of accuracy assessment measures will depend on the size of the program and on the significance of the uses of the estimates. Statistics Canada’s Policy on Informing Users of Data Quality and Methodology (Statistics Canada, 2000d) requires at least the following four primary areas of accuracy assessment to be considered in all programs:

a) assessment of the coverage of the survey;
b) assessment of sampling error where sampling was used (standard errors, or coefficients of variation, should be provided for key estimates);
c) nonresponse rates and estimates of the impact of imputation; and
d) descriptions or measures of other serious accuracy or consistency problems with the survey results.

Measures of accuracy are also an important input to Program Review, for assessing whether user requirements are being met and for allowing appropriate analytic use of the data. They are also a crucial input to the management of interpretability, as elaborated below.
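To illustrate items b) and c) concretely, the Python sketch below computes an estimated total with its standard error and coefficient of variation under simple random sampling, together with an unweighted nonresponse rate. It is a minimal sketch of the general calculations only, not Statistics Canada code; the function names, the sample data and the finite population correction shown are illustrative assumptions.

# A minimal sketch, not Statistics Canada code: the function names and the
# simple-random-sampling assumption are illustrative only.
import math

def estimate_total_with_cv(values, population_size):
    """Estimate a population total under simple random sampling, and
    return it with its standard error and coefficient of variation."""
    n = len(values)
    mean = sum(values) / n
    total = population_size * mean
    # Sample variance of the observed values.
    s2 = sum((y - mean) ** 2 for y in values) / (n - 1)
    # Standard error of the estimated total, with finite population correction.
    se_total = population_size * math.sqrt((1 - n / population_size) * s2 / n)
    cv = se_total / total  # often reported as a percentage for key estimates
    return total, se_total, cv

def nonresponse_rate(units_sampled, units_responding):
    """Unweighted nonresponse rate: share of sampled units that did not respond."""
    return 1 - units_responding / units_sampled

# Invented examples: four sampled values from a population of 1,000 units,
# and a survey in which 170 of 200 sampled units responded.
print(estimate_total_with_cv([12.0, 9.5, 14.2, 11.1], 1000))
print(nonresponse_rate(200, 170))  # 0.15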

In light of the high technical content of many design issues, programs are encouraged to incorporate independent technical reviews into their design, implementation and accuracy assessment plans. This may be done, for example, through an internal technical review committee for major programs, through referral of issues of technical standards or of general methods and approaches to the Methods and Standards Committee, or through referral to an Advisory Committee.

Managing timeliness

Timeliness of information refers to the length of time between the reference point, or the end of the reference period, to which the information relates, and its availability to users. Information that is available to users well within the period during which it remains useful for its main purposes is considered to be timely.

Planned timeliness is a design decision, often based on trade-offs with accuracy and cost. Improved timeliness is not, therefore, an unconditional objective. However, timeliness is an important characteristic that should be monitored over time, to warn of deterioration, and across programs, to recognize extremes of tardiness and to identify good practices. Major information releases should have release dates announced well in advance. The achievement of planned release dates should also be monitored as a timeliness performance measure, as should changes in planned release dates over longer periods.

For some programs, the release of preliminary data followed by revised and final figures is used as a strategy for making data timelier. In such cases, the tracking of the size and direction of revisions can serve to assess the appropriateness of the chosen timeliness-accuracy trade-off. It also provides a basis for recognizing any persistent or predictable biases in preliminary data that could be removed through estimation.
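The sketch below shows one simple way such revision tracking might be done: it computes the mean revision, whose persistent sign would suggest a removable bias, and the mean absolute revision as a gauge of the timeliness-accuracy trade-off. It is illustrative only; the figures and the function name are invented.

# A minimal, illustrative sketch: summarizing revisions between preliminary
# and final estimates. The figures are invented.

def revision_summary(preliminary, final):
    """Return the mean revision (a persistent sign suggests a bias that
    could be removed through estimation) and the mean absolute revision
    (a gauge of the timeliness-accuracy trade-off)."""
    revisions = [f - p for p, f in zip(preliminary, final)]
    mean_revision = sum(revisions) / len(revisions)
    mean_abs_revision = sum(abs(r) for r in revisions) / len(revisions)
    return mean_revision, mean_abs_revision

# Preliminary vs. final figures for the same four reference months.
print(revision_summary([101.2, 98.7, 103.5, 100.1],
                       [102.0, 99.5, 103.1, 101.4]))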

For ad hoc surveys and new surveys, and for programs that offer customized data retrieval services, the appropriate timeliness measure is the elapsed time between the receipt of a clear request and the delivery of the information to the client. Service standards should be in place for such services, and their achievement monitored.
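The sketch below illustrates how achievement of such a service standard might be monitored. The 10-day standard, the dates and the function name are hypothetical assumptions for illustration, not an actual Statistics Canada standard.

# A minimal sketch of service-standard monitoring; the 10-day standard
# and the dates are hypothetical.
from datetime import date

def share_within_standard(requests, standard_days=10):
    """requests: (date received, date delivered) pairs for clear requests.
    Returns the proportion delivered within the service standard."""
    met = sum(1 for received, delivered in requests
              if (delivered - received).days <= standard_days)
    return met / len(requests)

requests = [(date(2014, 3, 3), date(2014, 3, 10)),
            (date(2014, 3, 5), date(2014, 3, 21))]
print(share_within_standard(requests))  # 0.5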

Improvements in timeliness might be expected as new technologies are developed and as uses of data change. There is thus an ongoing need to assess current practices for achieving and improving timeliness through operational evaluations, experimentation, testing and process measurement. The ability to inform users of timeliness constraints is also an important aspect of the management of timeliness.

Managing accessibility

Accessibility of information refers to the ease with which users can learn of its existence, locate it, and import it into their own working environment. Statistics Canada’s dissemination objective is to maximize the use of the information it produces while ensuring that dissemination costs do not reduce the Agency’s ability to collect and process data in the first place. Corporate-wide dissemination policies and delivery systems determine most aspects of accessibility.

Program managers are responsible for designing statistical products, choosing the appropriate delivery systems and ensuring that statistical products are properly included within corporate catalogue systems. In determining what information products and services to offer, program managers must liaise with clients, research and take careful account of client demands, and monitor client feedback on the content and medium of their products. (The Agency’s Marketing Division provides services to assist in or facilitate these processes.) Program managers must also ensure that products comply with the requirements of the policies and standards on Highlights of Publications, Informing Users of Data Quality and Methodology, Presentation of Data, and Review of Information Products (Statistics Canada, 2003d).

At the corporate level, the primary dissemination vehicles include: The Daily for the initial release of all data; CANSIM as the repository of all publicly available data; the Statistics Canada website as a primary entry point for those seeking data; and an extensive program of publications and analytical reports for specific client groups.

Advisory Services provides a single point of access to Statistics Canada information and services through a network of Regional Reference Centres across the country. The Government’s depository libraries program ensures that all the Agency’s products are available to libraries across the country. The Agency’s Data Liberation Initiative ensures that universities have access to an array of Agency products for educational and research purposes at a reasonable cost.

A variety of options are open to program managers to make their data files more accessible for analytical purposes, including: the production of public-use microdata files that have been screened (and approved by the Microdata Release Committee) to protect confidentiality; the provision of a custom retrieval service; contracting with an external analyst under the Statistics Act; and referral to the Research Data Centres program administered by the Social Sciences and Humanities Research Council of Canada.

Managing interpretability

Providing sufficient information to allow users to properly interpret statistical information is a responsibility of the Agency. Managing interpretability is primarily concerned with the provision of metadata or ‘information about information’.

The information needed to understand statistical data falls under three broad headings:

a) the concepts, variables and classifications that underlie the data;
b) the methodology used to collect and compile the data; and
c) indicators of the accuracy of the data.

In the case of public-use microdata files, information on the record layout and on the coding/classification systems used to code the data on the file is an essential tool for allowing users to understand and use the files.
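As a concrete, hypothetical illustration, the sketch below pairs a fixed-width record layout with a small reader that decodes records using it. The variables, column positions and sample record are invented; the Standard Geographical Classification and the North American Industry Classification System are, however, real Statistics Canada coding standards.

# A minimal, hypothetical record layout for a fixed-width public-use
# microdata file; variables and column positions are invented.
RECORD_LAYOUT = [
    # (variable, start column (1-based), width, coding/classification used)
    ("PROVINCE", 1, 2, "Standard Geographical Classification"),
    ("AGE_GROUP", 3, 2, "five-year age groups"),
    ("INDUSTRY", 5, 4, "North American Industry Classification System"),
]

def parse_record(line):
    """Decode one fixed-width record into a dict of coded values."""
    return {name: line[start - 1:start - 1 + width]
            for name, start, width, _classification in RECORD_LAYOUT}

print(parse_record("35152362"))
# {'PROVINCE': '35', 'AGE_GROUP': '15', 'INDUSTRY': '2362'}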

Statistics Canada’s standards and guidelines for the provision of metadata derive from the Policy on Informing Users of Data Quality and Methodology (Statistics Canada, 2000d). Program managers are responsible for ensuring that their products meet the requirements of this policy and for documenting their programs within the Integrated Metadatabase (Statistics Canada, 2000c).

A further aid to Statistics Canada’s clients is the interpretation of data as they are released, through commentary in The Daily and through the highlighting of principal findings in all statistical publications, as required by the Policy on Highlights of Publications (Statistics Canada, 1985b). The Agency also responds, as a matter of policy, to serious public misinterpretations of its data (Statistics Canada, 1986b).

Managing coherence

Coherence of statistical data includes coherence between different data items pertaining to the same point in time, coherence between the same data items for different points in time, and international coherence. Three complementary approaches are used for managing coherence in Statistics Canada.

The first approach is the development and use of standard frameworks (e.g., the System of National Accounts), concepts, variables and standard classification systems for all major variables as well as consideration of international standards where these exist.

The second approach aims to ensure that the process of measurement does not introduce inconsistency between data sources even when the quantities being measured are defined in a consistent way: e.g., through the use of a common business register as the frame for all business surveys; the use of commonly formulated questions; the application of “harmonized” methodologies and systems; the use of the Quality Guidelines; the use of established centres of expertise in certain methodologies and technologies; reference to international codes of best practice.

The third approach analyses the data themselves and focuses on the comparison and integration of data from different sources or over time (e.g., the integration of data in the national accounts, or the benchmarking or calibration of sub-annual and annual estimates). This kind of analysis attempts to recognize situations where variation or inconsistency exceeds the levels implied by the expected accuracy of the data. Feedback from external users and analysts that points out coherence problems with current data is also an important component of coherence analysis.
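The sketch below illustrates the simplest form of such benchmarking: pro-rating quarterly estimates so that they sum to an annual benchmark. It is a minimal sketch of the idea only; the figures are invented, and production benchmarking typically relies on more sophisticated, movement-preserving methods.

# A minimal sketch of the idea only: simple pro-rata benchmarking of
# quarterly estimates to an annual total. Figures are invented.

def prorate_to_benchmark(quarterly, annual_total):
    """Scale quarterly estimates to sum to the annual benchmark while
    preserving their relative quarter-to-quarter pattern."""
    factor = annual_total / sum(quarterly)
    return [q * factor for q in quarterly]

# Sub-annual survey estimates vs. a more reliable annual benchmark.
print(prorate_to_benchmark([240.0, 255.0, 250.0, 265.0], 1040.0))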

Environment

The management of the six dimensions of quality takes place, of course, in an organizational environment. Measures are in place that aim to create an environment and culture that recognize the importance of quality to the Agency’s effectiveness and that promote quality.

The measures include a program of entry-level recruitment and development for major occupational groups, and an overall training and development framework. They include a variety of communication vehicles to provide employees with information and to seek employee feedback on how to improve programs and the organizational environment. They include explicit measures to develop partnerships and understandings with the Agency’s suppliers. Particular attention is paid to following up on respondent complaints. Questionnaires are tested to ensure minimal intrusion on privacy, to respect public sensitivities and to gain overall social acceptability. Cooperative arrangements with data respondents are pursued through a number of means, including a respondent relations program and a response burden management program.

They also include programs of data analysis and methodological research that encourage a continuous search for improvement. Conducting data analysis promotes the relevance, accuracy and coherence of the Agency’s statistical data while allowing staff to obtain broader contacts and experience. Similarly, research and development of methods and tools of a statistical, subject matter, informatics or operational nature helps to achieve high quality and to create a culture of quality improvement, in addition to yielding efficiency gains.
