Audit of Labour Force Survey

Audit Report

June 2019
Project Number: 80590-109

Executive summary

The Labour Force Survey (LFS) is a household survey carried out monthly by Statistics Canada. The LFS is a key economic indicator, and is one of the Agency's twelve mission-critical programs (programs necessary to meet Statistics Canada's legislated mandate or essential to the functioning of government). It is also the only source of monthly estimates of total employment in Canada. In addition to the standard labour market indicators such as the unemployment rate, employment rate and participation rate, the LFS provides information on the personal characteristics of the working-age population, including age, sex, marital status, educational attainment, and family characteristics.

Preparing the LFS is a significant undertaking. Each month, over a ten-day period, Statistics Canada interviewers contact a random sample of 56,000 households. Interviews are conducted by telephone interviewers working out of regional offices or by personal visits from field interviewers. Once the survey data is collected, it is processed and published in The Daily ten days following the completion of data collection.

The Labour Force Survey Section, within the Labour Statistics Division (LSD), is responsible for overseeing the production of the LFS each month. In January 2019, the Agency implemented a new computer system to edit, code and process LFS data. This new system, the Social Survey Processing Environment (SSPE), is a generalized survey data processing system used by a number of other Statistics Canada surveys, and replaces the Head Office Processing System (HOPS), which was in place for more than 25 years.

Why is this important?

The Labour Force Survey is a mission critical survey that is undergoing significant redesign. Ongoing efforts to maintain quality and manage the risk of error remain a priority during this transition, particularly given the importance and public visibility of this survey and the information it produces.

Key audit findings

The LFS program is compliant with subject matter and process based quality assurance guidelines. However, a formal validation strategy has not been developed, and formal validation reports are not prepared for each production cycle.

Risks to the launch of SSPE were well managed. Management was proactive in informing users of the change in systems, and was transparent in explaining the differences between the new and old systems.

With only minor exceptions, the recommendations from management's Review of the July 2014 Labour Force Survey Release were applied in the implementation of the new processing system (SSPE).

The increase in LFS nonresponse is being well managed. All key stakeholders are actively involved in addressing this challenge. Response rates and their impact on quality are actively monitored throughout the collection period, and further analyzed after collection is completed. However, some work remains to better understand the degree of bias in nonresponse and how it affects the quality of LFS outputs.

LFS management consults with key users/stakeholders of LFS outputs through a variety of formal and informal means. In 2018, these consultations included a broad base of internal and external stakeholders, including federal, provincial, and territorial government users, as well as municipalities, non-governmental organizations, and private sector companies. Management considers, and where possible, applies user and stakeholder feedback.

Key evaluation findings

Data from the LFS are a key economic indicator providing vital labour market information on the national/provincial/local landscape. Data from the survey were used for many purposes including monitoring, reporting, forecasting, and policy formulation. The LFS generally provided respondents with the information they needed; however, the sample size limited its usefulness. To mitigate some of these limitations, respondents used complementary data sources such as the census of population and the Survey of Employment, Payrolls and Hours.

Respondents believed that the LFS accurately reflected the labour market at the national and provincial levels. Sample size limitations came into play, however, for more detailed levels of analysis, such as at lower levels of geography and/or for sub-population groups. Finally, timeliness and consistency of release of the LFS was viewed as a strength.

Examined across several dimensions (use, meeting needs, accuracy, timeliness), respondents had a significant level of trust in the LFS, especially with more aggregate results such as at the national and provincial levels.

Data tables from the Statistics Canada website and customized tabulations were the primary mechanisms used to access LFS data. Respondents also viewed The Daily; however, it served as a secondary source providing a first glance at national and provincial results.

Finally, Statistics Canada's increased engagement of stakeholders was noted and well received by respondents. Better communication of consultation results, action items and next steps would be welcomed.

Overall conclusion

Management has implemented an adequate quality control framework to ensure the accuracy and relevance of Labour Force Survey outputs. Subject matter based and process based quality assurance activities are performed as required by the Validation Guidelines; however, a formal validation strategy and validation reports are required to better support LSD management in their assessment of data quality each month. Additionally, some work remains to better understand the bias in LFS nonresponse and how it might affect the quality of the survey should nonresponse rates change.

Users are generally satisfied with the LFS. It is a trusted data source, is considered timely, and is used for many purposes including monitoring, reporting and forecasting. However, opportunities exist to address data gaps at the sub-provincial level. Further, results from consultation activities could be better communicated to users and consultation participants.

Conformance with professional standards

The audit was conducted in accordance with the Mandatory Procedures for Internal Auditing in the Government of Canada, which include the Institute of Internal Auditors' International Standards for the Professional Practice of Internal Auditing.

Sufficient and appropriate audit procedures have been conducted, and evidence has been gathered to support the accuracy of the findings and conclusions in this report, and to provide an audit level of assurance. The findings and conclusions are based on a comparison of the conditions as they existed at the time, against pre-established audit criteria. The findings and conclusions are applicable to the entity examined, and for the scope and period covered by the audit.

Steven McRoberts
Chief Audit and Evaluation Executive

Introduction

Background

The Labour Force Survey (LFS) is a household survey carried out monthly by Statistics Canada. The LFS is a key economic indicator, and is one of the Agency's twelve mission-critical programs (programs necessary to meet Statistics Canada's legislated mandate or essential to the functioning of government). Since its inception in 1945, the objectives of the LFS have been to divide the working-age population into three mutually exclusive categories in relation to the labour market — employed, unemployed, and not in the labour force — and to provide descriptive and explanatory data on each of these groups.

Initially a quarterly survey, the LFS became a monthly survey in 1952. In 1960, the Interdepartmental Committee on Unemployment Statistics recommended that the LFS be designated as the source of the official measure of unemployment in Canada. This endorsement was followed by demand for a broader range of labour market statistics, particularly more detailed regional data. The scope of the survey has expanded considerably over the years, with major redesigns of the survey content in 1976 and again in 1997, and the LFS now provides a rich and detailed picture of the Canadian labour market.

The LFS is the only source of monthly estimates of total employment in Canada. In addition to the standard labour market indicators such as the unemployment rate, employment rate and participation rate, the LFS provides information on the personal characteristics of the working-age population, including age, sex, marital status, educational attainment, and family characteristics.

LFS process in brief

The Labour Force Survey Section, within the Labour Statistics Division (LSD), is responsible for overseeing the production of the LFS each month.

Data collection for the LFS is carried out each month over the ten days following the LFS reference week. The reference week is normally the week containing the 15th day of the month. During collection, Statistics Canada interviewers contact a random sample of 56,000 households to obtain data on the labour force. Interviews are conducted by telephone interviewers working out of regional office computer-assisted telephone interview (CATI) sites or by personal visits from a field interviewer. The sample of households consists of six panels of dwellings. Each panel is contacted for six consecutive months or survey occasions. Every month, the oldest panel is dropped and a new panel is introduced. Hence, two consecutive survey occasions have five panels in common. Nonresponse to the LFS averaged about 12% of eligible households in 2018. Consistent with global trends, this rate has been increasing over time (approximately 11% in 2017, and 10% in 2016).
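
The panel rotation described above can be made concrete with a short sketch. The following minimal Python simulation (illustrative only, not Statistics Canada code; panel identifiers are hypothetical) rotates one panel per month and confirms that two consecutive survey occasions share five of their six panels.

    from collections import deque

    def simulate_rotation(months=8):
        """Simulate the six-panel rotation: one panel is replaced each month."""
        panels = deque(range(1, 7))        # hypothetical panel IDs in the first month
        next_id = 7
        samples = []
        for _ in range(months):
            samples.append(set(panels))    # record the panels in sample this month
            panels.popleft()               # drop the oldest panel
            panels.append(next_id)         # introduce a new panel
            next_id += 1
        return samples

    samples = simulate_rotation()
    for current, previous in zip(samples[1:], samples):
        assert len(current & previous) == 5   # five panels in common month to month
    print("Consecutive survey occasions share 5 of 6 panels")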

Once the survey data is received, it is processed prior to its release. This includes editing and imputing the data, manually applying industry and occupation coding, deriving certain variables by combining items on the questionnaire, weighting the data to enable tabulations of estimates at national, provincial, and sub-provincial levels of aggregation, and applying various adjustments to the data (i.e., seasonal adjustment, reference week effect adjustment, holiday effects on hours worked adjustment).

LFS survey results are then published in The Daily ten days following the completion of data collection.

Revisions and redesigns

The LFS is periodically subject to revision and redesign. Every five years, population estimates are rebased or reweighted to the most recent census population counts. The LFS undergoes a sample redesign every ten years to reflect changes in population characteristics and new definitions of geographical boundaries. Redesign of the questionnaire, data collection, processing and dissemination systems occur approximately every 20 years.

The last rebasing of population estimates and sample redesign occurred in January 2015. Also in 2015, a project was undertaken to transition to a new computer system to edit, code and process LFS data. This new system, the Social Survey Processing Environment (SSPE), is a generalized survey data processing system used by a number of other Statistics Canada surveys, and replaces the Head Office Processing System (HOPS), which was in place for more than 25 years. This new processing system was implemented in January 2019.

Audit objective

The objective of the audit was to provide the Chief Statistician and the Departmental Audit Committee with reasonable assurance that management has implemented an adequate quality control framework to ensure the accuracy and relevance of Labour Force Survey outputs.

Evaluation objective

The audit also included an evaluation component. The objective of the evaluation component was to provide the Chief Statistician and Statistics Canada's Performance Measurement and Evaluation Committee with insight on the survey's relevance and performance (accuracy) from the user's perspective.

Scope

The audit scope included an examination of the governance structures, risk management activities, and quality controls for managing LFS activities. Specific areas that were examined were:

  • Quality assurance processes for validating LFS outputs.
  • Processes for managing risks related to the launch of the new LFS processing system.
  • Follow-up on the implementation of the recommendations resulting from management's Review of the July 2014 Labour Force Survey Release, with respect to the new LFS processing system.
  • Processes for monitoring and managing the declining survey response rate.
  • Processes for soliciting feedback from users and other stakeholders, and applying this feedback to program planning.

The period under review was April 1, 2017 to December 31, 2018.

Approach and methodology

This audit was conducted in accordance with the Mandatory Procedures for Internal Auditing in the Government of Canada, which include the Institute of Internal Auditors' International Standards for the Professional Practice of Internal Auditing.

The evaluation component of the audit was conducted in compliance with the Policy on Results and related Standards on Evaluation.

Field work consisted of a review of applicable LFS processes, activities and tools to ensure compliance with Statistics Canada Quality Guidelines.

Authority

The review was conducted under the authority of the approved Statistics Canada Integrated Risk-based Audit and Evaluation Plan 2018/2019 to 2022/2023.

Audit findings, recommendations and management response

Quality controls

The LFS program performs the subject matter based and process based quality assurance activities as required by the Validation Guidelines. However, a formal validation strategy has not been developed, and formal validation reports are not prepared for each production cycle.

Risks to the implementation of SSPE were well managed, including risk related to user acceptance of the new system.

With only minor exceptions, the recommendations from management's Review of the July 2014 Labour Force Survey Release were applied in the implementation of the new processing system (SSPE).

The increase in nonresponse is being well managed, with all key functions actively involved in addressing the challenge. However, some work remains to better understand the degree of bias in the nonresponse and how it affects the quality of LFS outputs.

A formal validation strategy for LFS has not been developed.

Validation is the activity of assessing the quality of a statistical output in terms of its accuracy, coherence and overall reasonableness. It is where statisticians validate the quality of the outputs produced, in accordance with a general quality framework and with statisticians' expectations based on their cumulative knowledge of current conditions and the specific statistical domain. Validation is meant to challenge rather than rationalize estimates. It also confirms important correlations and relationships, overall and within smaller domains. Therefore, it is important that unusual results or movements from previous results are understood and that atypical estimates are a trigger for more detailed investigation.

The Agency's Directive on the Validation of Statistical Outputs ("Directive") makes the necessary provisions to ensure that data validation is consistently applied and documented on all the Agency's statistical outputs. It requires that all mission critical programs perform all the validation steps described in the Agency's Guidelines for the Validation of Statistical Outputs ("Validation Guidelines"), unless a justification can be given as to why the step could not be completed. As a mission critical program, LFS is expected to comply with the Validation Guidelines.

The Validation Guidelines state that "directors shall ensure that each program in their Division has a validation strategy in place which meets the requirements of the Directive on the Validation of Statistical Outputs." The strategy should include the planned validation activities to be conducted and the specifics on what will be done (e.g., which files to confront, which external partners will be implicated). In most instances, the validation strategy will remain stable over time, and will need to be updated only as changes are made to the validation measures used. Program directors are expected to review each program's validation strategy at least every three years to ensure the validation activities being conducted are sufficient to mitigate risks to data quality.

The audit found that management has not documented a complete and cohesive validation strategy, although various processes are in place to ensure that validation activities take place each month (discussed below). The production team and the analysis team each maintain their own checklists of key operational and validation activities. However, these checklists do not include a comprehensive list of all validation activities completed each month, and do not provide full detail on how the validation activities are to be performed. Further, validation activities performed by Methodology and Demography are not included. In such cases, the teams performing the validation activities provide the production team with an email notification confirming the results of their validation activities. No single repository of all validation activities has been developed to support management in its assessment of their adequacy. Given the multiple functions involved in validation activities, a documented validation strategy would better support LSD management in their required assessment of the adequacy of the validation activities being performed.

The LFS program performs the subject matter based and process based quality assurance activities as required by the Validation Guidelines.

The Validation Guidelines define specific validation steps which all mission critical programs must follow, unless a justification can be given as to why the step could not be completed. The Validation Guidelines divide validation activities into eight subject matter based activities and two process based activities. The subject matter based validation activities are (see Appendix C for a description of the validation activities):

  1. Analysis of changes over time;
  2. Verification of seasonally adjusted estimates;
  3. Verification of estimates through cross-tabulations;
  4. Coherence analysis based on known current events;
  5. Confrontation with other similar sources of data published by Statistics Canada;
  6. Consultation with stakeholders internal to Statistics Canada;
  7. Review of The Daily by Senior Management; and
  8. Formal briefing to the Strategic Management Committee.

The process based validation activities for mission critical surveys are: review of production processes; and coherence analysis based on quality indicators.

Interviews were held with management and key program staff from processing, methodology, analysis and dissemination to gain an understanding of the process based validation exercises/activities. Key documentation supporting LFS validation activities during a sample of three months (October, November and December 2018) was also reviewed to confirm whether the required subject matter based and process based validation activities were performed.

The audit found that the LFS program performs the subject matter based and process based quality assurance activities as required by the Validation Guidelines. Validation activities were being conducted for each of the required steps; however, they were not documented in validation reports (as discussed below). This was due in part to a lack of understanding of the requirements in the Validation Guidelines, and also to the tight timelines associated with the LFS, which require that data be validated and published within 10 days of the close of collection. That said, many of the validation activities are being performed, and potential issues being resolved, in various team meetings. These include the weekly LFS Operations Committee meetings, the monthly LFS Analysis team meeting, and regular meetings between LFS management and the Principal Researcher, Analytical Studies Branch (ASB).

Each week, the LFS Operations Committee meets to discuss any issues or concerns related to LFS that have arisen. This includes, but is not limited to, discussing and resolving issues identified in data validation. These meetings are attended by representatives from key functions involved in the LFS, including personnel from collections, methodology, information technology, analysis, and LFS management. These meetings enable the team to apply a multi-disciplinary approach to quickly identifying and resolving any potential issues.

The analysis team also meets once each month to review the LFS data and perform several validation activities. Although no records are kept of these meetings, interviews confirmed that the team performs their validation activities by comparing various data sets and, once satisfied, the Chief of Analysis provides her signoff on the analysis team checklist.

In addition, LFS management meets with the Principal Researcher, ASB, to review the LFS data. The Principal Researcher applies his knowledge of results from existing surveys and economic indicators to help management identify areas of interest or potential data issues. The LFS production team then conducts investigations into certain areas to validate the data. No records are kept of these discussions or of the resulting investigations, but once completed, the relevant validation activities are signed off in the production team checklist.

Finally, several validation activities are performed by the methodology team. After completing their validation, they confirm their results by email to the LFS production team. Any potential issues that are identified are discussed further at the weekly LFS Operations Committee meetings.

Formal validation reports are not prepared for each production cycle.

The Validation Guidelines also require that a Validation Report be prepared to record the results of the program's validation activities each production cycle. This report should record and communicate the outcome of the validation investigations. If perceived or actual anomalies were found, these discrepancies, the investigation that was conducted, and either the correction strategy implemented or the reason for accepting the discrepancy must be documented. The report need only capture the main points or the significant elements of the validation process, but with enough detail so that it is clear why the data quality conclusion has been reached.

The audit found that formal validation reports are not being prepared. As noted, two checklists are used to track some, but not all, validation activities. Further, these checklists do not record whether any perceived or actual anomalies were found, what the discrepancies were, what investigations were conducted, what corrective strategy was implemented or what the reason for accepting the discrepancy was. While this information was reported to be discussed during the weekly LFS Operations Committee meeting and the monthly Analysis Committee meeting, no records were kept of the discussions or the ultimate resolution of identified discrepancies.

Risks to the implementation of SSPE were well managed, including risk related to user acceptance of the new system.

Risk management is an essential component in the delivery of information technology projects. When changes or updates to systems are being implemented, particularly for mission-critical surveys, risks to the successful implementation of the project should be identified and assessed, and mitigation plans should be developed to address risks that exceed management's tolerance. The audit examined the risk management processes in place to identify, assess and respond to risks related to the launch of the new processing system, SSPE.

Although formal risk assessment processes were not in place for all aspects of the launch, the audit found that risks were identified and assessed, and that mitigation strategies were implemented to address them. In particular, risks related to user acceptance of the new system were well managed.

LFS management, in consultation with Statistical Information Systems Division (SISD), identified, assessed and mitigated risks throughout the development and implementation of SSPE. In 2016, the Security Authorization Committee (SAC) assessed the project as 'high' risk, with the most significant risk areas being: change management; hardcoding issues; combined development, test, and production environments; documentation; and best practices in programming and coding. Management worked to mitigate these risks, and in 2018, SAC reduced the project risk rating to 'medium'. SAC also approved the SSPE implementation, conditional on ensuring adequate version control for any changes made to the system.

In addition to programming, coding and security concerns addressed at SAC meetings, other risks were also considered throughout the development and implementation of SSPE. Of note, risks related to user acceptance of the new system were also identified and managed. To mitigate these risks, LFS management met with key LFS users (Bank of Canada, Employment and Social Development Canada, and others who receive advance release of LFS data) to discuss the SSPE implementation and address any questions or concerns they might have. A Transition of Labour Force Survey Data Processing to the SSPE Environment document was also published and made available publicly. This document explained the testing process followed in implementing SSPE and described the differences between the two systems. Management also developed media lines and FAQs in preparation for the release of the new system, ensuring users were well informed of the change.

With minor exceptions, the recommendations from management's Review of the July 2014 Labour Force Survey Release were applied in the implementation of the new processing system (SSPE).

In 2014, an error was made in the LFS release for the July reference period. Following this error, management conducted an internal review of the incident to determine what happened, why it occurred, why it was not caught in the quality assurance process, and what could be done to mitigate the risk of similar errors occurring in the future. The resulting report made five recommendations, focused on the Agency's information technology change management and data validation practices. The recommendations addressed the areas of: governance over system changes; protocols for testing system changes; diagnostic and error reporting; systems documentation; and informing survey analysts and management responsible for data validation of events affecting the survey production cycle.

HOPS was the processing system in use at the time of the error. Given that this system was replaced in January 2019, the audit assessed whether the five recommendations were applied in the implementation of the new SSPE system. With only minor exceptions, the audit found that the recommendations were applied in its implementation.

Governance and oversight of the SSPE implementation was robust, and roles and responsibilities regarding the implementation were documented and well understood. They are also appropriately segregated, with only one exception noted: the roles of the re-design unit head and processing unit head, while previously segregated, were combined. This change was the result of an employee departure towards the end of the SSPE implementation period. Although inadequate segregation of these roles could result in re-design errors not being flagged in processing due to the same approval authority, this risk is mitigated by the extensive levels of review, monitoring and approval of changes by other levels of staff and management.

While a formal testing protocol was not developed prior to testing the new system as recommended, management did conduct thorough testing of the new system over an 18-month period. The tests performed and results obtained were fully documented, and six months of parallel testing was conducted, whereby results from HOPS and SSPE were compared using the same data sets. Significant differences between the systems were identified, examined, and where necessary, rectified.

Diagnostic and error reporting in SSPE has been significantly improved over that of HOPS. SSPE allows for a flexible and modular approach to LFS processing, with individual steps being easier to isolate, analyze and enhance as required. After each step is run in SSPE, an error report is produced and staff are able to verify that all specifications that form part of the step ran successfully. This level of diagnostic reporting was not available in HOPS.

Systems documentation for SSPE was also significantly improved over HOPS. Documentation of the SSPE system is up to date and covers each step in the process. It includes a specifications document that outlines what each step of the process does, and a user document that outlines the procedures to follow for each step, providing users with clear guidance on using the system.

The audit also found there are strong communication and information-sharing practices among the many functional areas involved in LFS production, analysis and validation. The weekly LFS Operations Committee meetings provide an effective forum for information sharing by regularly bringing together the key participants in the LFS process. The participants interviewed expressed their satisfaction with this forum and stated that the meetings are useful and provide them with the information needed to do their jobs.

The increase in nonresponse is being well managed, with all key functions actively involved in addressing the challenge.

Nonresponse occurs when a dwelling is identified as eligible and selected for participation in the survey, but does not participate. This can occur due to any number of reasons such as: no one at home, temporary absence, interview not possible (inclement weather, unusual circumstances in the household, etc.), technical problems, or refusal. Nonresponse is one of the most common sources of non-sampling error, and can introduce bias into the results if the characteristics of the responding households are significantly different from the nonresponding households. While high rates of nonresponse do not necessarily result in bias, rising nonresponse rates increase the likelihood of bias.
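
As a standard illustration of this point (a textbook approximation, not a formula taken from LFS documentation), the bias of a mean estimated from respondents only can be written as follows, where W_nr is the proportion of nonrespondents and the two means are taken over respondents and nonrespondents respectively:

    \[
      \mathrm{bias}(\bar{y}_r) \approx W_{nr}\,\bigl(\bar{Y}_r - \bar{Y}_{nr}\bigr)
    \]

The approximation makes explicit that bias depends on both the share of nonrespondents and how different they are from respondents, which is why rising nonresponse rates increase the likelihood of bias without guaranteeing it.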

Nonresponse rates on household surveys have been increasing worldwide in recent decades, and LFS has not been immune. As recently as 1995, LFS nonresponse rates were at 5%. By 2018, this rate had risen to over 13%.

The audit examined how this increase is being managed. The audit expected to find that management established targets for nonresponse rates, monitored achievement against these targets, and took action as required to ensure the quality of LFS outputs was maintained.

Overall, the audit found that the increase in nonresponse rates is being well managed, although some work remains to better understand how nonresponse affects LFS survey quality.

Response rate targets are established in the Collection and Operations Service Agreement (COSA). Each year, the COSA is reviewed by the Collection, Planning and Research Division (CPRD) and any updates are made in consultation with the key functions in the collection, processing and analysis of the LFS, including the Labour Statistics Division (LSD), Collection Systems Division (CoSD), Operations and Integration Division (OID), Collection Planning and Research Division (CPRD), and the regional collection offices.

Nonresponse is closely monitored both during and after collection. Collections and LFS Operations each produce comprehensive daily reporting to track nonresponse. These results are discussed daily within the Collections and LFS Operations teams, and any issues or concerns are discussed at the weekly LFS Operations Committee meetings. Methodology also tracks an estimate of the coefficient of variation (CV) daily, providing management with an indication of the quality of data at any given point during collection. This reporting also helps management decide on whether to extend the collection period, or where to allocate more collection resources to maximize survey quality.
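
For reference, the coefficient of variation tracked here is the usual relative measure of sampling variability; a generic definition (not specific to the LFS estimation system) for an estimate \hat{\theta} is:

    \[
      \mathrm{CV}(\hat{\theta}) = \frac{\sqrt{\widehat{\mathrm{Var}}(\hat{\theta})}}{\hat{\theta}}
    \]

Lower values indicate more precise estimates, so a daily estimate of this ratio gives management an early read on whether data quality targets are on track before collection closes.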

Management has also been active in addressing the increase in nonresponse. CPRD has undertaken several initiatives, resulting from their December 2018 study to understand the effect of different collection methods on nonresponse rates. These actions include implementing a new national production plan, implementing a best-time-to-call method of reaching respondents after the initial contact, and reducing differences across regions by implementing best practices nationwide.

Some work remains to better understand the bias in the nonresponse and how it affects the quality of LFS outputs.

Although the audit found the increase in nonresponse is being well managed and LFS response rates remain high relative to most surveys, interviews revealed that management does not have a full understanding of the impact of nonresponse on statistical biases in LFS estimates. To the extent that the labour market characteristics of nonrespondents differ from those of respondents, and that these differences are not controlled for during data processing, imputation and weighting, nonresponse could cause systematic differences between LFS results and the results that would be obtained from a survey with an otherwise identical design and no nonresponse.

As a result of this incomplete understanding of the potential statistical bias, management is unable to fully assess the risk of nonresponse, or to establish the most cost-effective targets necessary to achieve acceptable results. Methodology had planned to conduct such a study in 2018-19, but reported that it was delayed to 2019-20 due to budgetary constraints and higher-priority activities. Such a study would provide management with the information necessary to better plan collection activities and to manage the risk of bias in survey results caused by nonresponse.

Recommendation 1

It is recommended that the Assistant Chief Statistician, Social, Health and Labour Statistics ensure that the Agency documents its validation strategy for the Labour Force Survey.

Management response

Management agrees with the recommendation.

Documentation of LFS validation activities, including validation activities performed by subject matter and methodology, will be consolidated into a single validation strategy document.

Deliverables and timeline

The Director, Labour Statistics Division will prepare a consolidated LFS validation strategy document by March 31, 2020.

Recommendation 2

It is recommended that the Assistant Chief Statistician, Social, Health and Labour Statistics ensure that the Agency prepares a validation report to record results of the program's validation activities each production cycle in compliance with the Guidelines for the Validation of Statistical Outputs.

Management response

Management agrees with the recommendation.

A review of current monthly validation activities (e-mail and paper-based sign-offs and monthly Data Quality Control meeting minutes) will be conducted and consolidated, and a monthly validation report will be developed. A process will also be established to ensure that all validation activities and outcomes are approved on an annual basis by the Director, Labour Statistics Division.

Deliverables and timeline

The Director and Assistant-Director, Labour Statistics Division will:

  • Prepare monthly validation reports by March 31, 2020; and
  • Implement an annual process for director-level approval of LFS validation activities by March 31, 2020.

Recommendation 3

It is recommended that the Assistant Chief Statistician, Analytical Studies, Methodology and Statistical Infrastructure ensure that the Agency complete its planned study on the impact of nonresponse rates on the quality of statistical outputs for the Labour Force Survey.

Management response

Management agrees with the recommendation.

The Methodology team will complete a nonresponse bias analysis within 6 months. The analysis will include: a comparison of the LFS with the 2001 Census (when response rates were high) and with the 2016 Census (after response rates had declined); and matching of LFS and Census records using the Social Data Linkage Environment to evaluate the bias.

Deliverables and timeline

The Director General, Methodology will complete an LFS nonresponse bias analysis by November 30, 2019.

Consulting with users and stakeholders

LFS management consults with key users/stakeholders of LFS outputs through a variety of formal and informal means. In 2018, these consultations included a broad base of internal and external stakeholders.

Management considers, and where possible, applies user and stakeholder feedback.

The relevance of statistical information reflects the degree to which it meets the needs of data users and stakeholders. Managing relevance requires ensuring that the Agency's programs remain aligned with information needs as they evolve. Being aware of changing priorities and having the flexibility to respond to them are vital to ensuring continued relevance.

One means by which the Agency ensures the ongoing relevance of its products is through consultation with data users and stakeholders. In its Spring 2014 report, "Meeting the Needs of Key Statistical Users – Statistics Canada", the Office of the Auditor General (OAG) found that, while the Agency consulted regularly with other federal government departments and provincial and territorial governments, LFS did not consult sufficiently with its other users (e.g., private sector, municipalities, and non-governmental organizations).

The audit examined whether management had addressed the OAG's finding by consulting with the full range of LFS users and stakeholders. The audit also examined how the feedback obtained through these consultations was applied to ensure the ongoing relevance of the LFS. To do this, the audit team reviewed documentation on LFS planning, and user and stakeholder consultation and outreach activities in 2016, 2017 and 2018. The audit team also conducted interviews with key program staff and senior management.

LFS management consults with key users/stakeholders of LFS outputs through a variety of formal and informal means. In 2018, these consultations included a broad base of internal and external stakeholders.

Management consults with LFS users/stakeholders through a variety of formal and informal means. These include formal consultations, twice-annual outreach meetings with key users, and a number of other events where Statistics Canada representatives interact with LFS users and stakeholders. Although formal consultations on LFS were not conducted in 2016 and 2017, the Client Consultations Unit from the Strategic Communications & Stakeholder Relations Division (SCSRD) held a National Engagement Week in October 2018 to solicit feedback on all of Statistics Canada's products, services and programs. This week included a consultation session with LSD product users, including LFS, led by LSD management. Roughly 85 people participated in this consultation, including representatives from the private sector, municipalities, non-governmental organizations, and federal, provincial, and territorial governments. While this consultation covered all LSD products, the discussions also touched on, and were applicable to, LFS. The topics covered included the structure and utility of LSD's data products, existing gaps in LSD data, and tools used to present the data. While no formal consultations on LFS are planned for 2019, the Agency tentatively plans to hold another National Consultation Week in 2020.

LSD management also stays informed of user needs by meeting with certain key users and stakeholders twice annually to discuss upcoming initiatives, program changes or other activities related to the labour statistics program. These sessions typically include representatives from federal, provincial, and territorial governments, banks, and the private sector. Although not considered formal consultations, these meetings provide opportunities for users to give feedback to management on LSD products, and to receive answers to any questions they might have. Management reported planning to continue with these sessions in 2019.

Finally, interactions with users and stakeholders each month at the LFS release help keep management informed of user needs. This includes monthly calls with federal departments that receive early access to the pre-release information, question and answer sessions following each release where LFS analysts respond to user questions about the data, and user queries on LFS made to the Statistics Canada media line. These regular interactions keep management apprised of shifting user needs or any deficiencies in the LFS product.

Management considers, and where possible, applies user and stakeholder feedback

A central repository of user and stakeholder feedback is not maintained, however feedback from consultations and outreach sessions is generally documented, reviewed and informally assessed by LSD management. User and stakeholder feedback is regularly discussed at LFS Operations Committee and Dissemination Committee meetings, and potential solutions are identified and proposed to LSD management for approval. Key elements of the feedback are then reflected in the LSD operational plan, which identifies the key divisional priorities for the year.

Over the period examined, several trends emerged in user/stakeholder requests: that certain data gaps in LSD products be addressed; that more information on trends, confidence levels, and annual averages be provided; and that detailed LFS information be made more easily accessible. The audit found several examples of management taking action to address this feedback, including the launch of an interactive web app that provides users with a visualization of LFS data, making LFS data more accessible and customizable to their needs, and the introduction of several new survey questions to respond to emerging user needs.

Evaluation findings, recommendations and management response

Data from the LFS were viewed as being timely, accurate and, for the most part, meeting the needs of users; however, the sample size limited their usefulness.

Several data sources were used to complement the LFS including the census of population and the Survey of Employment, Payrolls and Hours.

The Agency's increased level of engagement and consultations was well received; better communication of consultation results, action items and next steps would be welcomed.

The evaluation conducted interviews with a sample of regular LFS data users to gather their perspectives on relevance, quality and access. The following findings reflect their specific points of view.

Relevance of LFS data

LFS data are used for many purposes including monitoring, reporting and forecasting.

Data from the LFS were viewed as a key economic indicator providing vital information on the national/provincial/local landscape. LFS data are used for many purposes including monitoring, reporting and forecasting. Specifically, respondents noted they used the data for the following purposes:

  • Conduct general analysis of labour market conditions
  • Prepare briefings, presentations, and reports for management, government officials and other stakeholders
  • Update relevant data tables, reports and website information for stakeholders or general public
  • Contribute to economic forecasting and modelling
  • Complement data from other surveys/sources
  • Undertake analysis of economic conditions (e.g., national, provincial, regional)
  • Inform labour force policies/investments
  • Answer specific questions as needed/to conduct research in specific industries or areas (e.g., agriculture, rural areas)

The majority of government respondents reported that the data were required immediately upon release, specifically for the purpose of briefing senior managers and elected officials on changes and trends in the labour market. For example, provincial and sub-provincial respondents noted they used the information to monitor key sectors to track the impact of events such as a local plant shutdown.

LFS data provide critical information for policy development and business planning. Some specific examples included planning and investment decisions to address labour shortages and creating industrial profiles to monitor performance and direct planning. Provincial and sub-provincial government officials also used LFS data to compare with other jurisdictions as a measure of relative economic performance and to gauge potential shortages in labour supply.

Researchers and data analysts, in both the private sector and the public sector, reported using LFS data as an important component for forecasting and economic modelling. For example, LFS data were used as input into Gross Domestic Product (GDP) forecasts and to estimate usage of specific government programs. Some respondents also reported using the data to follow the outcomes of sub-groups such as immigrants.

While the LFS generally provides respondents with the information they need, the sample size limits its usefulness

The evaluation found that while the LFS generally provides respondents with the information they need, the sample size limits its usefulness. Using the scale strongly agree / agree / disagree / strongly disagree, 21 out of 24 respondents reported they agreed (17) or strongly agreed (4) with the statement 'The LFS provides me with the information I need.' (Footnote 1)

The LFS provides me with the information I need (number of respondents)
  • Strongly agree: 4
  • Agree: 17
  • Disagree: 2
  • Strongly disagree: 1
  • Declined to answer: 0

While the results appear to be very positive based on the scale, many qualified their response by noting that the usefulness of the data was limited by the sample size and the scope. (Footnote 2) They noted several limitations including: volatility for smaller provinces and sub-provincial regions, no data for smaller municipalities, and data suppressions below certain levels. Respondents also remarked that questions could be included to cover topics such as place of work information, mobility, and turnover. For the respondents who disagreed or strongly disagreed, their reasoning was directly related to a lack of information at the level of granularity they required (a need for more data for smaller provinces and sub-provincial regions and for small population groups).

Other data sources are used to complement the LFS

As noted previously, while respondents acknowledged and valued the data provided from the LFS, they were also well aware of its limitations. As such, in order to mitigate any deficiencies or fill gaps, virtually all respondents reported also using other data sources (23 out of 24 respondents) to complement the LFS. Data sets used included:

  • Survey of Employment, Payrolls and Hours (SEPH) (17)
    • Purpose: complement information from the LFS (for example, earnings by industry); validate LFS trends.
  • Census of Population (16)
    • Purpose: larger sample size allows analysis of sub-provincial geographies and community-level data; provides breakdowns by industry classification and occupation.
  • Job Vacancy and Wage Survey (10)
    • Purpose: fills a gap and can answer more specific research questions.
  • Employment Insurance Data (4)
    • Purpose: unemployment data in communities suppressed in LFS.
  • Provincial government data (4)
    • Purpose: regional labour market data; payroll information from employers; other provincial administrative data.
  • General Social Survey (2)
    • Purpose: specific research questions; work related data on education, skills, well-being, etc.
  • Conference Board of Canada (2)
    • Purpose: forecast data to proxy labour costs; more detailed GDP information at the municipal level.

Some respondents also reported using data sources such as tax-filer data, national accounts data and industry information.

The majority of respondents viewed the data sources as complementary (19) or partially complementary (2), which, when combined with the LFS, provided a more holistic view of the labour market. A few respondents noted that other data sources were used as substitutes, given limitations arising from sample size. For example, census data, which can be customized, were used to examine issues such as occupation and industry at low levels of geography.

Recommendation 4

It is recommended that the Assistant Chief Statistician, Social, Health and Labour Statistics ensure that opportunities to address data gaps at the sub-provincial level be explored in consultation with key stakeholders. This could include examining potential data linkages and the use of modelling. Engagement of key stakeholders will ensure that efforts are targeted and that resulting products provide the greatest value added.

Management response

Management agrees with the recommendation.

The Labour Statistics Division is working in partnership with the Labour Market Information Council (LMIC) to develop a strategy to fill data gaps at the sub-provincial level by producing more local and granular data using a combination of traditional and non-traditional statistical methods.

This project will include:

  • Involving at least one province in order to better understand and address specific local and granular information requirements.
  • Producing at least one STC information product (such as a Power BI dashboard) to illustrate the extent to which existing data sources and methods can generate local and granular data.

Deliverables and timeline

The Director and Assistant-Director, Labour Statistics Division will:

  • Present a project plan at the LMIC board meeting in June 2019;
  • Prepare a report summarizing the options for the development of alternative methodologies to fill gaps in local and granular data needs by March 31, 2020; and
  • Prepare at least one STC information product (such as a Power BI dashboard) using existing local and granular data by March 31, 2020.

Effectiveness – Quality

Respondents believe that the LFS accurately reflects the labour market at the national and provincial levels

On the scale strongly agree / agree / disagree / strongly disagree, 21 out of 24 respondents reported they agreed (14) or strongly agreed (7) with the statement 'The LFS accurately reflects changes in the Canadian labour market.' (Footnote 3)

The LFS accurately reflects changes in the Canadian labour market (number of respondents)
  • Strongly agree: 7
  • Agree: 14
  • Disagree: 1
  • Strongly disagree: 0
  • Declined to answer: 2

Overall, respondents believed that LFS estimates at the national and provincial levels accurately reflected changes in the labour market. However, numerous respondents expressed concerns about the month-to-month volatility of the LFS, especially for small population groups, lower levels of geography and detailed industries, while acknowledging that this is characteristic of sample surveys. As such, some said that they focused more on trends than on month-to-month movements. Several respondents commented that it was at times a challenge to explain to officials and stakeholders why estimates fluctuated and lacked precision. For example, the use of moving averages for some sub-provincial areas masked movements or introduced lags, making it difficult to reconcile the LFS data with what people were observing.

The respondent who disagreed noted that the LFS provided trend information rather than absolute movements. Those who declined to answer responded that they had no reference point for comparison or that it depended on the level of detail being examined.

LFS is meeting or exceeding expectations in terms of timeliness of delivery

Of the 24 respondents, 18 required data immediately after release for a number of purposes including briefing and reporting. Although the other 6 respondents did not require the data immediately, they noted that others within their organizations did.

Overall, respondents viewed the timeliness and consistency of release to be a strength. In terms of the statement 'Results from the LFS are released in a timely fashion', 23 of 24 respondents either strongly agreed (18) or agreed (5) with it. (Footnote 4)

Results from the LFS are released in a timely fashion (number of respondents)
  • Strongly agree: 18
  • Agree: 5
  • Disagree: 1
  • Strongly disagree: 0
  • Declined to answer: 0

There was an acknowledgement that the task of reliably collecting the information and producing the results within a very short timeframe on a monthly basis was a noteworthy accomplishment. Respondents were pleased they had a schedule of releases provided well in advance and recognized the consistency with which the schedule was met. The one respondent who disagreed noted they had received their custom requests late at times.

Effectiveness – Access

The Daily serves as a "first glance" with data tables being the principle access vehicle

All respondents reported accessing LFS data through data tables on the Statistics Canada website, customized tables, or both, on the day of release and in general.

How LFS data are accessed (number of respondents)
  • Data tables on the Statistics Canada website only: 5
  • Customized tables from Statistics Canada only: 5
  • Both data tables and customized tables: 14

In addition, 18 respondents reported using The Daily. They noted that it primarily served as a broad overview of national or provincial results and that the data tables were their main avenue for accessing data. Other avenues mentioned were media (2) and third parties (3) such as an industry association or the provincial statistics office.

Respondents using The Daily indicated they were generally satisfied with it. A few suggestions for improvement included:

  • Greater use of visuals and infographics, with the ability to export the graphics
  • More trend analysis, rather than month-to-month changes
  • More industry data
  • Ability for users to customize (set preferences in advance so that charts reflect the interests of the specific user)
  • Provision of high-level annual data

Several respondents commented on the loss of Beyond 20/20 tables with little advance consultation or notice, and a number mentioned challenges with the website.

Trust

There is a significant degree of trust in the LFS, especially with more aggregate results such as at the national and provincial levels

A picture of the level of trust in the LFS can be drawn by combining results along a number of dimensions (Footnote 5):

Evaluation findings
  • Are they using the LFS? A wide variety of uses was noted, including: briefing officials and stakeholders, reporting to local citizens, monitoring for impacts such as those arising from plant closures, modelling and forecasting labour supply and demand, developing labour market policies, informing investments, and researching trends in specific industries or occupations.
  • Does it meet their needs? While they recognized there are limitations with the LFS, 21 out of 24 respondents reported they agreed (17) or strongly agreed (4) with the statement 'The LFS provides me with the information I need'.
  • Are they using substitutes? Virtually all respondents reported using other data sources (23 out of 24) and viewed them as complementary (19) or partially complementary (2) to fill gaps in the LFS.
  • Do they believe it provides an accurate picture? 21 out of 24 respondents reported they agreed (14) or strongly agreed (7) with the statement 'The LFS accurately reflects changes in the Canadian labour market.' This was especially true at the Canada and provincial levels.
  • Do they believe it is timely? 23 of 24 respondents either strongly agreed (18) or agreed (5) with the statement 'Results from the LFS are released in a timely fashion'.

Based on these results, it can be concluded that there is a significant degree of trust in the LFS, especially with more aggregate results such as at the national and provincial levels.

Other Findings

There is generally a good understanding of the differences between the LFS and SEPH

As noted previously, 17 of 24 respondents used SEPH and viewed it as a complementary data source. SEPH data were used by officials to inform the general public, managers used the data for policy development and business planning, and analysts and economists used it for economic modelling and forecasting. In many ways, SEPH data were used in a similar fashion as LFS data.

LFS data were viewed by some as being prone to volatility on a month-to-month basis, whereas SEPH data were viewed as more stable. Several respondents reported using SEPH data to validate LFS trends in order to determine whether fluctuations represented real changes or simply sampling noise inherent in a household survey. SEPH users reported that the industry-level data were useful for providing insights into labour market movements by sector and that the data were often used by managers and human resources staff during labour negotiations.

Seven respondents reported they did not use SEPH. Their reasons included a lack of sub-provincial data, lower timeliness than the LFS, a relatively short history, and simply not being aware of it.

Most respondents, whether they used SEPH or not, were able to identify some of the key differences between SEPH and the LFS. For example, most knew that SEPH is an employer-based survey using administrative and survey data while the LFS is a household survey. They also knew that the LFS provides information on employment rates by various demographic characteristics whereas the focus of SEPH is on employment and earnings by industry. Several also noted that the populations covered are different.

Awareness and use of the Labour Market Observatory is low

The Canadian Labour Market Observatory is a relatively new product which consists of interactive data visualization applications showcasing the vast amount of publicly available labour market information produced by Statistics Canada.

Only 11 out of 24 respondents were aware of the application. Of this group, only 4 indicated they used it; the rest were either still evaluating it or viewed it as having limited use for them. Comments provided by the 11 respondents included: the information is limited and not as in-depth as needed; longer trends are required; and more functionality is needed, such as the ability to export data.

Engagement through consultations was welcomed; however, results and follow-up were missing

During the open-ended portion of the interview, some respondents indicated that they had participated in consultations held by Statistics Canada on the LFS. They mentioned they appreciated the opportunity to contribute their points of view and were encouraged by the engagement. However, they also commented that they had not received any information on findings or next steps arising from the consultations.

Recommendation 5

It is recommended that the Assistant Chief Statistician, Social, Health and Labour Statistics ensure that results from consultation activities be made available to users and participants, including action items and next steps.

Management response

Management agrees with the recommendation.

  • A report summarizing the consultation activities, including action items and next steps, will be provided to consultation participants and to the broader set of LFS stakeholders and data users.

Deliverables and timeline

The Director, Labour Statistics Division will prepare a consultation report to be provided to consultation participants and data users by January 31, 2020.

Appendices

Appendix A: Audit criteria

Audit objective: Statistics Canada has established an adequate quality control framework to ensure the accuracy and relevance of Labour Force Survey outputs.

Control objectives / core controls / criteria and sub-criteria:
1.1 Management has addressed the recommendations of the Review of the July 2014 Labour Force Survey Release in the new processing system.
  • 1.1.1 Governance and oversight processes are in place and operating as intended to oversee changes to LFS processing systems.
  • 1.1.2 Roles and responsibilities with respect to changes to LFS processing systems are documented and understood.
  • 1.1.3 A formal testing protocol is implemented for changes to LFS processing systems.
  • 1.1.4 Diagnostics and error reporting are implemented to ensure LFS processing systems are operating as expected, and corrective action is taken to address any identified errors.
  • 1.1.5 Accurate and up-to-date systems documentation has been developed for LFS processing systems.
  • 1.1.6 Contextual events related to the survey environment, as well as management's related risk mitigation strategies, are communicated to parties (survey analysts, management, and oversight committees) responsible for reviewing the quality of LFS outputs.
Policy instruments/Sources:
  • Audit Criteria related to Management Accountability Framework (MAF): A tool for Internal Auditors
  • Statistics Canada's Quality Assurance Framework
  • Statistics Canada's Quality Guidelines
  • Statistics Canada's Directive for the Validation of Statistical Outputs
  • Statistics Canada's Guidelines for the Validation of Statistical Outputs
  • Review of the July 2014 Labour Force Survey Release
  • 2014 Spring Report of the Auditor General of Canada (Chapter 8 – Meeting Needs for Key Statistical Data – Statistics Canada)
1.2 Quality assurance processes to validate LFS outputs are compliant with Agency expectations and are applied consistently.
  • 1.2.1 A formal Validation Report has been developed for the LFS program.
  • 1.2.2 The LFS program is compliant with subject matter based quality assurance requirements.
  • 1.2.3 The LFS program is compliant with process based quality assurance requirements.
1.3 Response rate targets and tolerance levels are established, periodically reviewed, and monitored.
  • 1.3.1 Response rate targets and tolerance levels are established and periodically reviewed.
  • 1.3.2 Management monitors actual response rates against planned results, and adjusts course as needed.
1.4 Risk management processes are in place to identify, assess and respond to risks to the launch of the new processing system.
  • 1.4.1 Risk management processes are in place to identify, assess and respond to risks to the launch of the new processing system.
1.5 Management ensures the ongoing relevance of LFS outputs.
  • 1.5.1 Management consults regularly with users and other external stakeholders to gather feedback on the LFS.
  • 1.5.2 Management explicitly considers user and external stakeholder feedback and applies it to ensure LFS remains relevant and aligned with user needs.

Appendix B: Evaluation Methodology

Following discussions with program management and Audit and Evaluation management, three areas were identified for review in the evaluation:

Evaluation Topics

  • Relevance
    • What is LFS information used for?
    • To what extent do LFS data meet the needs of users?
    • To what extent do LFS users use other sources of information and for what purpose(s)?
  • Effectiveness – Quality
    • To what extent do users believe LFS information is accurate?
    • To what extent do users believe LFS information is timely?
  • Effectiveness - Access
    • What avenues do users utilize to access LFS information?
    • To what extent do users access The Daily and for those who do, how can it be improved?

By combining results from the first two evaluation topics (relevance and effectiveness – quality), a portrait of the level of trust in the LFS can be drawn. This method is used rather than direct questioning (i.e., 'Do you trust LFS data?') given the inherently subjective nature of such questioning. The premise is that if respondents are using the LFS, are not turning to substitutes, are satisfied with its timeliness, and are having their information needs met, then there is implicitly a level of trust.
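
Purely as an illustration of this triangulation, the sketch below (Python) rolls the agreement figures reported in the Trust section into per-dimension shares. The dimension labels, the 24-out-of-24 figure for regular use (all interviewees were regular LFS users), and the cut-off are assumptions made for the example; they are not part of the evaluation's method.

    # Illustrative roll-up of the interview results; labels and threshold are
    # assumptions for this sketch only.
    results = {
        "meets information needs": (21, 24),  # agreed or strongly agreed
        "accurate picture":        (21, 24),
        "timely release":          (23, 24),
        "regular use of the LFS":  (24, 24),  # all interviewees were regular users
    }

    THRESHOLD = 0.75  # assumed cut-off for calling a dimension "positive"

    for dimension, (agree, n) in results.items():
        share = agree / n
        verdict = "positive" if share >= THRESHOLD else "mixed"
        print(f"{dimension:25s} {share:6.1%}  {verdict}")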

In addition to the evaluation topics (noted above), at the request of the program, questions were included to explore the use and understanding of the Survey of Employment, Payrolls and Hours and the use and awareness of the Labour Market Observatory.

Interviews were conducted with a sample of regular LFS data users to gather their perspectives on these questions. Semi-structured telephone interviews were held with individuals from the private sector, non-profit organizations, federal government departments, provincial government departments, and regional/municipal government departments (see Collection tool, below).

Finally, it should be noted that the findings are limited to the specific user group interviewed for the evaluation. Efforts were made, however, to cover as broad a range within this group as possible, and findings from a general client satisfaction survey conducted by Statistics Canada are included in footnotes where possible.

Collection tool

Key Informant Interviews
n = 24 (29 participants)

Semi-structured interviews undertaken in February and March 2019:

  • Federal (n = 6);
  • Provincial/Territorial (n = 7);
  • Sub-provincial (n = 4);
  • Private Sector (n = 4);
  • Non-profit (n = 3)

Evaluation issues and indicators

Issue 1: Relevance
  • 1.1 Identification of uses
  • 2.1 Identification of uses specific to day of release
  • 3.1 Level based on a 4 point scale
  • 3.2 Identification of factors impacting level
  • 4.1 Proportion using other sources, including SEPH
  • 4.2 Identification of other sources
  • 4.3 Identification of uses for other sources, including SEPH
  • 5.1 Self-evaluation of ability to explain the difference between the two sources

Issue 2: Effectiveness – Quality
  • 6.1 Level based on a 4 point scale
  • 6.2 Identification of factors impacting level
  • 7.1 Level based on a 4 point scale
  • 7.2 Identification of factors impacting level

Issue 3: Effectiveness – Access and Release
  • 8.1 Proportion of users using The Daily on day of release
  • 8.2 Identification of improvements to The Daily
  • 9.1 Identification of other avenues
  • 9.2 Identification of reasons for using other avenues
  • 10.1 Proportion of users aware
  • 10.2 Use and frequency
  • 10.3 Suggestions for improvement

Appendix C: Validation Activities

The Validation Guidelines define the specific validation steps that all mission-critical programs must follow, unless a justification can be given as to why a step could not be completed. The Validation Guidelines divide validation activities into two groups:

  • Subject matter based activities; and
  • Process based activities.

The subject matter based validation activities required of mission critical surveys are:

  1. Analysis of changes over time: To analyse changes over time, a consistent time series of a particular statistic over a sequence of time points is created and examined for period-to-period movements (an illustrative sketch follows this list).
  2. Verification of seasonally adjusted estimates: For monthly or quarterly data presenting seasonally adjusted estimates, a validation of the results can be appropriate. Seasonal adjustment is designed to eliminate the effect of seasonal and calendar influences in infra-annual data to allow for more meaningful comparisons of economic conditions from period to period.
  3. Verification of estimates through cross-tabulations: This analysis is normally done at a finer level of disaggregation than that of the published estimates. Such tabulations allow checking of the internal consistency of the data file and provide the ability to explore the underlying characteristics associated with the estimates.
  4. Coherence analysis based on known current events: Coherence analysis based on known current events means validating estimates against domain intelligence and recent events affecting the sector.
  5. Confrontation with other similar sources of data published by Statistics Canada: Confrontation with other similar sources of data, either published by Statistics Canada or external to the organization, can provide insight into whether reported micro-data and aggregate estimates are reasonable.
  6. Consultation with stakeholders internal to Statistics Canada: Stakeholders who have either direct knowledge of the specific subject matter being studied or are experts in a related subject matter could be consulted.
  7. Review of The Daily by Senior Management: One of the final steps of validation is a review of The Daily release by Senior Management. All releases should be validated by the division's Assistant Director and Director and can go up through the responsible Director General and Assistant Chief Statistician. A final sign-off of The Daily by the Chief Statistician will be done as per the Policy on Official Release.
  8. Formal briefing to the Strategic Management Committee: Each Mission Critical Program must present their pre-release results to the Strategic Management Committee (SMC). The presentation includes an overview of any changes implemented or issues encountered during the production operations, the impact that those changes may have had on the estimates, and the risk mitigation strategy utilized.
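
The first and third of these subject matter activities lend themselves to a short illustration. The sketch below, in Python with pandas, uses hypothetical figures and column names rather than actual LFS production data; it shows a month-over-month change check with an assumed review tolerance, and a cross-tabulation whose marginal totals are checked for internal consistency.

    import pandas as pd

    # Hypothetical monthly estimates (employment in thousands); real validation
    # would draw on the production estimates file.
    est = pd.DataFrame({
        "ref_month": pd.period_range("2018-01", periods=12, freq="M"),
        "employment": [18500, 18520, 18510, 18560, 18590, 18570,
                       18600, 18620, 18610, 18650, 18640, 18660],
    })

    # Analysis of changes over time: month-over-month movements, flagging
    # anything beyond an assumed tolerance for analyst follow-up.
    est["mom_change"] = est["employment"].diff()
    est["mom_pct_change"] = est["employment"].pct_change() * 100
    TOLERANCE_PCT = 0.5  # assumed review threshold
    est["needs_review"] = est["mom_pct_change"].abs() > TOLERANCE_PCT

    # Verification through cross-tabulations: hypothetical province-by-sex cells
    # are tabulated with margins so the totals can be checked for consistency.
    cells = pd.DataFrame({
        "province": ["ON", "ON", "QC", "QC"],
        "sex": ["Male", "Female", "Male", "Female"],
        "employment": [3900, 3700, 2200, 2100],
    })
    cross = cells.pivot_table(index="province", columns="sex",
                              values="employment", aggfunc="sum", margins=True)

    print(est[["ref_month", "employment", "mom_change", "needs_review"]])
    print(cross)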

The process based validation activities required of mission critical surveys are:

  1. Review of production processes: Production processes include, for example, frame creation, sample design, collection, coding, editing, imputation, and weighting systems as well as other interventions such as the linking of administrative data or the calculation of derived variables.
  2. Coherence analysis based on quality indicators: The quality indicators associated with a program should be used to determine if an estimate is sound. For statistical surveys, quality indicators based on survey design, collection results and processing are used. Examples of quality indicators are: coverage of the target population, coefficient of variation, response rate, imputation rate, metadata and paradata, and revision rates of estimates (an illustrative sketch follows this list).
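
The quality indicators named in the last item are simple ratios. The sketch below, in Python, shows how a response rate, an imputation rate, and a coefficient of variation could be computed; every figure is an assumed placeholder rather than an actual LFS value, and in practice these indicators come from the survey's own estimation and processing systems.

    # All counts and estimates below are assumed placeholders.
    in_scope_households = 56000      # assumed in-scope sample size
    responding_households = 49300    # assumed number of responding households
    imputed_items = 1200             # assumed number of imputed item responses
    expected_items = 245000          # assumed total expected item responses

    response_rate = responding_households / in_scope_households * 100
    imputation_rate = imputed_items / expected_items * 100

    # Coefficient of variation (CV): the standard error relative to the estimate.
    # The standard error is assumed here; in production it would come from the
    # survey's variance estimation (e.g., replicate weights).
    estimate = 18600.0               # assumed employment estimate, in thousands
    standard_error = 55.8            # assumed standard error, in thousands
    cv_pct = standard_error / estimate * 100

    print(f"Response rate:   {response_rate:.1f}%")
    print(f"Imputation rate: {imputation_rate:.1f}%")
    print(f"CV of estimate:  {cv_pct:.2f}%")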

Appendix D: Initialisms

CATI
Computer-assisted telephone interview
COSA
Collection and Operations Service Agreement
CoSD
Collection Systems Division
CPRD
Collection, Planning and Research Division
CV
Coefficient of variation
EQ
Electronic questionnaire
HOPS
Head Office Processing System
ICOS
Integrated Collection and Operating Systems
LFS
Labour Force Survey
LMIC
Labour Market Information Council
LSD
Labour Statistics Division
OAG
Office of the Auditor General
OID
Operations and Integration Division
SAC
Security Authorization Committee
SCSRD
Strategic Communications & Stakeholder Relations Division
SSID
Statistical Information Systems Division
SSPE
Social Survey Processing Environment