In June 2024, questions measuring the Labour Market Indicators were added to the Labour Force Survey as a supplement.

Questionnaire flow within the collection application is controlled dynamically based on responses provided throughout the survey, including answers to certain LFS questions. As a result, some respondents will not receive all questions, and a small number of households may not receive any questions at all.

Labour Market Indicators

ENTRY_Q01 / EQ 1 - From the following list, please select the household member that will be completing this questionnaire on behalf of the entire household.
WFH_Q01 / EQ 2 - At the present time, in which of the following locations [do/does] [you/Respondent name/this person] usually work as part of [your/his/her/their] main job or business?

  1. At a fixed location outside the home
  2. Outside a home with no fixed location
  3. At home

WFH_Q03 / EQ 3 - Among those locations, where [do/does] [you/Respondent name/this person] usually work the most hours?

  1. At a fixed location outside the home
  2. Outside a home with no fixed location
  3. At home

CWFH_Q01 / EQ 4 - You mentioned that [you/Respondent name/this person] usually work[s] most of [your/his/her/their] hours at home as part of [your/his/her/their] main job or business.

Is that mostly:

  1. In [your/his/her/their] own home
  2. At a farm owned or operated by [yourself/Respondent name/this person] or another member of the household
  3. In a separate business premises on the same property as [your/Respondent’s name/this person’s] home
  4. On someone else’s property, farm or business 

CWFH_Q02 / EQ 5 - Where is the office or work site which [you/Respondent name/this person] report[s] to in [your/his/her/their] main job located?

  1. In [your/his/her/their] own town or nearby
  2. In another part of the province
  3. In another province or territory
  4. Outside Canada

CWFH_Q03 / EQ 6 - In which province or territory is the office or work site which [you/Respondent name/this person] report[s] to located?

Select province or territory.

CWFH_Q04 / EQ 7 - Where are the [colleagues or co-workers/the people] with whom [you/Respondent name/this person] interact[s] in [your/his/her/their] main [job/business] mostly located?

  1. In [your/Respondent’s name/this person’s] own town or nearby
  2. In another part of the province
  3. In another province or territory
  4. Outside Canada

Canadian Economic News, May 2024 Edition

This module provides a concise summary of selected Canadian economic events, as well as international and financial market developments by calendar month. It is intended to provide contextual information only to support users of the economic data published by Statistics Canada. In identifying major events or developments, Statistics Canada is not suggesting that these have a material impact on the published economic data in a particular reference month.

All information presented here is obtained from publicly available news and information sources, and does not reflect any protected information provided to Statistics Canada by survey respondents.

Resources

  • Calgary-based Trans Mountain Corporation announced that May 1st signified the commercial commencement date for the Expanded System and that tankers would be able to receive oil from Line 2 by mid-May.
  • Calgary-based Imperial Oil Ltd. announced that its Grand Rapids oil sands project had started production at Cold Lake using lower emission technology. Imperial said production will continue to ramp up over the next few months to achieve full rates of 15,000 gross barrels per day later this year.
  • Calgary-based AltaGas Ltd. and Royal Vopak of the Netherlands announced a positive final investment decision on the Ridley Island Export Facility, a large-scale liquefied petroleum gas (LPG) and bulk liquids terminal with rail, logistics, and marine infrastructure on Ridley Island, British Columbia. The companies said the projected gross Joint Venture capital cost of the project is $1.35 billion, excluding governmental incentives and support, and is expected to come online near 2026 year-end.
  • Calgary-based Canadian Utilities Limited announced a new $2 billion energy infrastructure project, called the Yellowhead Mainline project, which will consist of building approximately 200 kilometres of high-pressure natural gas pipeline and related control and compression facilities that will run from Peers, Alberta, to the northeast Edmonton area. The company said construction is expected to commence in 2026, subject to regulatory and company approvals, and is planned to be on-stream in the fourth quarter of 2027.
  • Vancouver-based Canfor Corporation announced the permanent closure of its Polar sawmill in Bear Lake, British Columbia and the suspension of its planned reinvestment in Houston, British Columbia. Canfor said this follows the announcement by its subsidiary company, Canfor Pulp, that one line of production will be indefinitely curtailed at the Northwood Pulp Mill.

Manufacturing

  • Winnipeg-based NFI Group Inc. (NFI) announced that NFI subsidiary New Flyer of America Inc. (New Flyer) had been awarded a contract from the New Jersey Transit Corporation for the purchase of up to 1,300 Xcelsior® 40-foot clean-diesel transit buses. NFI said New Flyer had received an initial firm order for 175 buses with the option to purchase up to 1,125 additional 40-foot buses.
  • Unifor announced on May 5th that 461 workers at the Toronto Nestle plant had decided to strike. Unifor said the plant produces Kit Kat, Aero, and Coffee Crisp chocolate bars, as well as Smarties. On May 29th, Unifor announced that workers had ratified a new three-year contract.
  • Japan-based Asahi Kasei announced it will construct its previously announced integrated lithium-ion battery (LIB) separator plant in Port Colborne, Ontario. Asahi had announced an initial investment of approximately $1.56 billion and said the start of commercial production is currently slated for 2027. The Governments of Canada and Ontario said the project is expected to benefit from federal support through tax credits, while Ontario expects to support the project with both direct and indirect incentives.

Other news

  • The Government of Canada announced it was launching the first phase of a national pharmacare plan, which includes making contraception medications and devices, as well as diabetes medications such as insulin, free.
  • On May 24th, the Government of the Northwest Territories released Budget 2024-25, which included a fiscal strategy to restore fiscal sustainability by increasing operating surpluses, reducing short-term debt, realigning health spending to make the programs more sustainable, allocating funding to priorities, and increasing fiscal capacity to ensure the Government of the Northwest Territories (GNWT) can respond to future fiscal shocks. The Government forecasts a $294 million operating surplus for 2024-2025 and a contraction in real GDP of 1.1% in 2024.
  • Quebec's minimum wage increased from $15.25 to $15.75 per hour on May 1st.
  • Minnesota-based ALLETE, Inc., which includes utilities and renewable energy companies, and a partnership led by Canada Pension Plan Investment Board of Toronto and Global Infrastructure Partners of New York announced they had entered into a definitive agreement under which the partnership will acquire ALLETE for USD $6.2 billion including the assumption of debt. ALLETE said the transaction is expected to close in mid-2025, subject to the approval of ALLETE's shareholders, the receipt of regulatory approvals, and other customary closing conditions.

United States and other international news

  • United States President Joseph R. Biden, Jr. announced an increase in tariffs under Section 301 of the Trade Act of 1974 on $18 billion of imports from China across strategic sectors such as steel and aluminum, semiconductors, electric vehicles, batteries, critical minerals, solar cells, ship-to-shore cranes, and medical products.
  • The U.S. Federal Open Market Committee (FOMC) maintained the target range for the federal funds rate at 5.25% to 5.50%. The last change in the target range was a 25 basis points increase in July 2023. The Committee also said it will continue reducing its holdings of Treasury securities and agency debt and agency mortgage-backed securities and that, beginning in June, the Committee will slow the pace of decline of its securities holdings by reducing the monthly redemption cap on Treasury securities from $60 billion to $25 billion.
  • The Monetary Policy and Financial Stability Committee of Norway's Norges Bank left the policy rate unchanged at 4.5%. The last change in the policy rate was a 25 basis points increase in December 2023.
  • The Bank of England's Monetary Policy Committee (MPC) voted to maintain the Bank Rate at 5.25%. The last change in the Bank Rate was a 25 basis points increase in August 2023.
  • The Reserve Bank of Australia (RBA) left the cash rate target unchanged at 4.35%. The last change in the cash rate target was a 25 basis points increase in November 2023.
  • The Executive Board of Sweden's Riksbank lowered the repo rate by 25 basis points to 3.75%. The last change in the repo rate was a 25 basis points increase in September 2023.
  • The Reserve Bank of New Zealand (RBNZ) left the Official Cash Rate (OCR), its main policy rate, unchanged at 5.50%. The last change in the OCR was a 25 basis points increase in May 2023.
  • Florida-based Red Lobster Management LLC, owner and operator of the Red Lobster® restaurant chain, announced that the Company had voluntarily filed for relief under Chapter 11 of the Bankruptcy Code in the United States. The Company said it intends to use the proceedings to drive operational improvements, simplify the business through a reduction in locations, and pursue a sale of substantially all of its assets as a going concern. The company also said Red Lobster's restaurants will remain open and operating as usual during the Chapter 11 process.
  • Texas-based ConocoPhillips and Marathon Oil Corporation announced they had entered into a definitive agreement pursuant to which ConocoPhillips will acquire Marathon Oil in an all-stock transaction with an enterprise value of USD $22.5 billion. The companies said the transaction is expected to close in the fourth quarter of 2024, subject to the approval of Marathon Oil stockholders, regulatory clearance, and other customary closing conditions.

Financial market news

  • West Texas Intermediate crude oil closed at USD $76.99 per barrel on May 31st, down from a closing value of USD $81.93 at the end of April. Western Canadian Select crude oil traded in the USD $63 to $67 per barrel range throughout May. The Canadian dollar closed at 73.33 cents U.S. on May 31st, up from 72.75 cents U.S. at the end of April. The S&P/TSX composite index closed at 22,269.12 on May 31st, up from 21,714.54 at the end of April.

Information for survey participants (ISP)

Notice

Due to possible Canada Post service disruptions, communications sent by mail may be delayed. Your timely participation is important. To assist you in completing your survey, a Statistics Canada employee may call or visit you in person.

We are committed to protecting the data you entrust to us. Please contact us or visit our page Frequently asked questions for further information.

Start your survey

Some Statistics Canada surveys can be completed online. If you have been invited, via email or mail, to participate in an online survey, complete the following steps to log in to the survey.

Step 1: Visit the electronic questionnaire portal and select Start my survey.

Step 2: Enter your secure access code (you will find this code in the invitation letter or email you previously received from Statistics Canada).

Step 3: Complete the survey.

Are you missing your secure access code, or do you need help?

Phone:

1-877-949-9492
(TTY: 1-800-363-7629)
Monday to Friday (except holidays) from 8:00 a.m. to 7:00 p.m., Eastern time

If you use an operator-assisted relay service, you can call us during regular business hours. You do not need to authorize the operator to contact us.

Email:

Frequently asked questions

Do you have general questions about Statistics Canada or surveys being conducted? Have a look at our FAQs to find some answers.

List of surveys in collection

To search for a particular survey, view the list of surveys currently in collection.

My community

There's power in numbers. Discover the benefits that various population groups enjoy and the challenges they face, thanks to a broad range of information from Statistics Canada surveys. You'll find presentations, articles, videos and infographics on subjects of interest.

Did you know?

Search box

Infographic - The Journey of Statistics Canada Data

Statistics Canada collects data on all aspects of life in Canada, and you have a key role to play.

Adopting a High-Level MLOps Practice for the Production Applications of Machine Learning in the Canadian Consumer Price Index

By Christian Ritter, William Spackman, Todd Best, Serge Goussev, Greg DeVilliers, Stephen Theobald, Statistics Canada

Introduction

As Statistics Canada (StatCan) has shifted to an administrative-data-first strategy (Statistics Canada Data Strategy), the agency has also increasingly researched and leveraged Machine Learning (ML) (Data science projects), not only due to the volume and velocity of the new data, but also due to its variety as certain parts of the data are unstructured or semi-structured. The agency's adoption of ML has been guided by the Framework for Responsible ML (Responsible use of machine learning at Statistics Canada) to make investments to support sound application and methods, and has also included a focus on developing production level code and other best practices (Production level code in Data Science). One of the types of best practices that has been developed is a focus on transparency, reproducibility, rigour, scalability, and efficiency – processes that are also known as Machine Learning Operations or MLOps. This article provides an overview of the investments made and plans for future development in MLOps at Statistics Canada, focusing on the price statistics use case, specifically for the Canadian Consumer Price Index (CPI).

Similar to many National Statistical Organizations (NSOs), Statistics Canada's CPI program leverages supervised Machine Learning to classify new unique products (UN Task Team on Scanner Data (2023)) from alternative data sources (ADS), such as scanner, web-scraped, or administrative data, to the product classes used for its CPI. A particular concern with this administrative data from the consumer economy is how dynamic it is over time, as new products enter the market and old products exit. From an ML point of view, this change over time can be considered a gradual shift of the distribution of each month's data to be classified, compared to the distribution on which an ML model was trained, resulting in gradual model degradation (see Figure 1 for a visualization of the concept). As shown by Spackman et al (2023) and Spackman et al (2024), misclassification can also affect a price index; thus a reliable approach is needed that combines data and model monitoring, validation of records to correct for misclassification, and frequent retraining, to make sure that price indices are unaffected by any misclassifications. MLOps helps address these concerns through a robust ML production process.

Figure 1: Concept Drift and retraining: Impact on Model Performance.
Description - Figure 1: Concept Drift and retraining: Impact on Model Performance.

Classification performance on new products observed in web-scraped data decreases steadily over time; however, retraining the model on a periodic basis (every 3 months, for instance) brings the performance back up.
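The pattern described above can be sketched in a few lines of code. This is a stylized illustration only: the starting F1 score, decay rate, and retraining cadence below are invented numbers, not measured results from the CPI system.

```python
# Stylized sketch of model degradation under drift, with and without
# periodic retraining (all numbers are invented for illustration).

def simulate_f1(months, start_f1=0.90, decay_per_month=0.02, retrain_every=None):
    """Return a monthly F1 trajectory; retraining resets performance."""
    scores, f1 = [], start_f1
    for month in range(months):
        if retrain_every and month > 0 and month % retrain_every == 0:
            f1 = start_f1  # retraining on fresh labelled data restores performance
        scores.append(round(f1, 3))
        f1 -= decay_per_month  # drift slowly erodes accuracy on new products
    return scores

no_retrain = simulate_f1(12)
quarterly = simulate_f1(12, retrain_every=3)
print("no retraining:       ", no_retrain)
print("quarterly retraining:", quarterly)
```

Without retraining the score drifts steadily downward; with quarterly retraining it oscillates within a narrow band near its starting value, which is the behaviour Figure 1 depicts.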

Why does MLOps bring value in this situation?

The application of ML to a production process requires detailed consideration of several end-user business needs, such as:

  • Overall efficiency, such as cost-effectiveness of maintenance.
  • The ability to perform model inference in production on big data with its five V's (volume, veracity, velocity, variety, and value).
  • Transparent and efficient usage of highly complex algorithms.
  • Continuous, reliable, and acceptable model performance with low risk.
  • Support for the use of many models, in parallel or through updates.
  • Handling of the complex dependencies of open-source ML frameworks and packages.
  • Compliance with organizational governance requirements.

Addressing such business needs makes the actual ML model a very small component of the overall ML production process, as highlighted in Sculley et al (2015) and shown in Figure 2.

Figure 2: Hidden Technical Debt in Machine Learning Systems, adapted from Figure 1 in Sculley et al (2015)
Description - Figure 2: Hidden Technical Debt in Machine Learning Systems, adapted from Figure 1 in Sculley et al (2015).

Machine Learning, as in the modeling itself, is actually a small component among the many larger components of a robust production system. Configuration, for example, is a big topic, and having processes for serving, monitoring, and analyzing the model and its performance is key. Processes for data verification, resource management, process management, and data collection and extraction all require investment as well.

In general, there have been significant efforts to address similar business needs for decades (particularly in the finance industry), but this effort received a significant boost in relevance when ML went mainstream across industries and additional needs emerged, e.g., the need to handle big data and increasingly complex neural networks. MLOps describes the ML production process which aims to address all the common business needs previously listed. The MLOps of today can be understood as a paradigm or approach that enables automated, effective, transparent, and iterative delivery of ML models into production while also focusing on business and regulatory requirements. Hence it has many benefits, especially for StatCan. MLOps covers statistical methodologies (e.g., drift detection, misclassification), best practices in software engineering (e.g., DevOps), as well as (cloud) infrastructure and tools (e.g., system design, monitoring, scalable solutions).

The automated and iterative approach in MLOps borrows from the DevOps practices of software engineering, which in the ML context enables cost savings and increased speed of model creation and deployment, allowing the use of many models. Furthermore, governance and transparency increase through the use of MLOps. The model creation process, with its complex multi-step approach involving data and software dependencies, is represented through training pipelines. Frequent model (re)training (continuous retraining) is often necessary to mitigate the accuracy degradation of ML models due to data drift.

Complementarily, the model deployment process often borrows from software engineering with the concept of 'deployment' of software, the model, and its dependencies, into an isolated production environment. Often, the deployment occurs on scalable infrastructure (i.e. easy to increase or decrease compute resources) to handle the big-data characteristics; with the model encapsulated and integrated through a clear, easy-to-use interface (i.e. interactable via an Application Programming Interface or API). This process has its ML-specific components, such as the need to log and monitor data and model behavior and potentially allow for ML interpretability or explainability functionality.

How does MLOps approach integrate within price statistics needs?

Supervised machine learning is a key tool within consumer price statistics processing pipelines. Specifically, as the CPI is built up from price indices for elementary aggregates, which are pairings of product and geography classes (The Canadian Consumer Price Index Reference Paper, Chapter 4 – Classifications), unique products must be classified to their correct respective product codes (i.e. categories). From a technical point of view, this is a Natural Language Processing problem, similar to other examples at Statistics Canada (2021 Census Comment Classification), whereby the text describing a product is used by the model to classify the product to a group used for price indices. As classification is not perfect, manual validation of records is necessary to correct errors prior to compiling price indices. This validation additionally helps create new examples that can be used as retraining data. Statistics Canada has rich experience with this process, with more detail outlined in step 5.2 of the Generic Statistical Business Process Model (GSBPM) (2021 Census Comment Classification). Figure 3 demonstrates this key step as central to a processing pipeline for a typical alternative data source such as scanner data, prior to the file being usable in production.
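To make the Natural Language Processing setting concrete, the following minimal sketch trains a bag-of-words text classifier on product descriptions. The descriptions, labels, and model choice are all illustrative assumptions; they are not actual CPI categories or the production model.

```python
# Minimal sketch of product-text classification (toy data and labels,
# not the CPI's actual product classes or production model).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy product descriptions mapped to illustrative product classes
train_texts = [
    "2% milk 4l jug", "whole milk 1l carton",
    "cheddar cheese block 400g", "mozzarella shredded cheese",
    "white sandwich bread 675g", "whole wheat bread loaf",
]
train_labels = ["milk", "milk", "cheese", "cheese", "bread", "bread"]

# TF-IDF features over unigrams and bigrams, fed to a linear classifier
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(train_texts, train_labels)

# Classify previously unseen product descriptions
print(clf.predict(["skim milk 2l", "multigrain bread loaf"]))
```

In production, each predicted class corresponds to a product code used for elementary aggregates, and the manually validated corrections feed back into the retraining data described above.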

Figure 3: Application of MLOps within the consumer price statistics processing pipeline. Manual record quality assurance (QA) is useful not just for validation of the classification step, but also to create retraining data. To support this manual QA, a specific misclassification and outlier flagging process is designed.
Description - Figure 3: Application of MLOps within the consumer price statistics processing pipeline. Manual record quality assurance (QA) is useful not just for validation of the classification step, but also to create retraining data. To support this manual QA, a specific misclassification and outlier flagging process is designed.

The consumer price statistics processing pipeline starts with dataset quality assurance. It then continues with the standardization of the data and its ingestion into a central storage location for analytics. From here, near-identical but homogeneous products can be linked together and geography is mapped for price statistics needs. This is where the MLOps system is required – to classify each unique product to the product class it should belong to. Records are then manually reviewed, with specific attention paid to records flagged as misclassified or outliers. Finally, elementary price aggregation can be done on this data, depending on the strata that the specific price aggregation methodology needs.

Adopting MLOps principles to this use case would add considerable maturity to the process. Table 1 below summarizes the value promised (left) and how it addresses concrete price statistics needs (right).

Table 1: Summary of key capabilities of MLOps and how each supports the price statistics use case.
  • General cost savings: Automation of processes saves cost on compute and on data scientists' work.
  • Stabilizes the pipeline for the entire ML lifecycle through DevOps practices: Less error-prone, with lower risk of friction along the road from model development to production.
  • Eases friction at every point in the MLOps process through automation: Automating complex data science steps, including preprocessing and training, makes work reproducible and transparent.
  • Re-use of the model creation pipeline to produce hundreds of models: Pointing the training pipeline to new data allows ML models to be created continuously.
  • Allows easier and faster ML model deployment: Addresses the need to produce multiple models per alternative data source at higher frequency.
  • Leverages a central location for model tracking and versioning: Maintaining hundreds of models requires a central model store, which also increases lineage and governance control.
  • Easy, automated scaling of compute appropriate to the time and scale needed: Allows scaling with the increasing amount of data using high-performance compute and distributed systems, with cost-effective usage.
  • Extended monitoring, from infrastructure to model monitoring: Ensures reliable performance of models despite a multitude of factors such as model updates, data drift, and errors.
  • Common workflow and policy: A standardized, automated process allows the organization to address its requirements around Artificial Intelligence (AI) usage.
  • Handles complex (open-source) dependencies: Software engineering best practices such as virtualization, testing, and Continuous Integration (CI) and Continuous Delivery (CD) (together referred to as CI/CD) reduce the risk of (silent) failure.
  • Auditing and governance: Reproducibility through version control of all ML artifacts, and easy adoption of accountability and control frameworks such as the Responsible AI Framework.

Road to adoption via Iterative Maturity Model

Integration into existing process

As described above, product classification (and outlier/misclassification detection) are components of a much larger process with alternative data, and hence the integration needs to be performed in a decoupled, modular fashion, following best architectural design practices. The classification service/MLOps system is exposed through dedicated REST APIs to the rest of the processing system, which allows smooth integration. Updates and changes to this MLOps system must not impact the overall system so long as no API contracts are violated.
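A hedged sketch of what calling such a REST contract might look like from the processing side is shown below. The endpoint URL, payload schema, and field names are illustrative assumptions, not the actual API contract.

```python
# Hypothetical client-side sketch of calling a classification REST endpoint.
# The URL, payload schema, and field names are illustrative assumptions.
import json
from urllib import request

def build_payload(products: list) -> bytes:
    """Serialize a batch of product records into the (assumed) request body."""
    return json.dumps({"records": products}).encode("utf-8")

def classify_products(endpoint_url: str, products: list, token: str) -> dict:
    """POST a batch of product descriptions and return the predicted classes."""
    req = request.Request(
        endpoint_url,
        data=build_payload(products),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",  # assume the endpoint requires auth
        },
        method="POST",
    )
    with request.urlopen(req) as resp:
        return json.load(resp)

# Usage (requires a live endpoint, so shown without executing):
# classify_products("https://example.org/score",
#                   [{"description": "2% milk 4l jug"}], token="...")
```

Because the integration happens only through this contract, the MLOps system behind it can be retrained, redeployed, or rearchitected freely as long as the request and response schemas are preserved.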

Maturity steps

Based on business requirements and a feasibility study, an iterative approach was designed for MLOps adoption through four MLOps Maturity levels. These steps, similar to industry best practices such as those described by Google (MLOps: Continuous delivery and automation pipelines in machine learning) or Microsoft (Machine Learning operations maturity model), adopt increasing levels of automation and address a variety of risks. Value generation is possible from Level 1 onwards by addressing the most critical aspects of production ML.

Table 2: Maturity model for MLOps.
  • Level 0: Before MLOps adoption; Jupyter notebooks used for modeling and Luigi scripts for inference and orchestration.
  • Level 1: Automatic training through continuous training pipelines; ML artifact management and version control; basic data quality checks and monitoring.
  • Level 2: Automatic, scalable ML model inference on new data; continuous deployment to scalable endpoints.
  • Level 3: Monitoring with performance-based drift detection; shadow model deployments; automated retraining; responsible AI with model cards and standardized reporting.

Overview of the system architecture and capabilities

Before diving into each component and capability of the system, a visual of the overall architecture and execution workflows is provided. Pipelines (as code) are published to the production environment and then executed following MLOps best practices to ensure reproducibility. Metadata is kept on each aspect and provenance is well established.

Production workflow

First, the workflow that runs each time a new retailer dataset arrives is described. Figure 4 visualizes the steps, which can be summarized as a multi-step batch process:

  1. Initial state: A new retailer dataset arrives through the bronze-silver-gold process in the data warehouse.
  2. A Synapse pipeline is triggered which extracts the new unique product data to the data lake and calls our main REST endpoint.
  3. The main REST endpoint executes an orchestration pipeline which interacts with different (component) pipelines.
  4. First it calls the data validation pipeline which tests data expectations defined through great expectations (Great Expectations GitHub). Failure results in automatic data reports.
  5. The classifier pipeline follows, using the type of data and the retailer in question as parameters to identify the production-staged models in the registry, as well as the corresponding deployments under the managed endpoints.
  6. The deployments distribute the data across multiple nodes for processing in parallel.
  7. The resulting classifications are validated and checked for outliers/misclassifications via versioned methods. Outliers or misclassified product candidates are categorized.
  8. The orchestrator puts the classified data onto the lake which triggers the execution of a Synapse pipeline for the loading of this output data into the data warehouse.
  9. Classification results and outlier data are ingested back into the data warehouse.
  10. Classification results and outlier data are now available in the warehouse for downstream consumption (and most critically validation).
Figure 4: Schematic Production Processing Workflow
Description - Figure 4: Schematic Production Processing Workflow

See the production workflow summary above for a detailed step-by-step overview of this 10-step process.
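Step 4 of the workflow validates each batch against defined data expectations before any classification happens. As a rough illustration of the kinds of checks involved, here is a minimal sketch written with plain pandas; the column names and rules are assumptions for illustration, and the production system uses the Great Expectations library rather than hand-rolled checks.

```python
# Illustrative data-expectation checks for a retailer batch (column names and
# rules are assumptions; production uses the Great Expectations library).
import pandas as pd

REQUIRED_COLUMNS = {"product_id", "description", "price"}

def validate_batch(df: pd.DataFrame) -> list:
    """Return a list of failed expectations; an empty list means the batch passes."""
    if not REQUIRED_COLUMNS.issubset(df.columns):
        return ["missing required columns"]
    failures = []
    if df["description"].isna().any():
        failures.append("null product descriptions")
    if (df["price"] <= 0).any():
        failures.append("non-positive prices")
    return failures

batch = pd.DataFrame({
    "product_id": [1, 2],
    "description": ["2% milk 4l jug", None],
    "price": [6.49, -1.0],
})
print(validate_batch(batch))  # both expectations fail for this toy batch
```

As in the workflow above, a non-empty failure list would halt processing and trigger an automatic data report rather than letting a bad batch reach the classifier.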

Model creation workflow

The process to create the model is done separately from the production workflow. Figure 5 visualizes the steps:

  1. Modeling code is pushed to the development branch on GitLab, which triggers the CI pipeline to integrate the pipeline code in Azure Machine Learning (Azure ML).
  2. Inside the CI process, the code is tested following established procedures.
  3. Once approved through a merge request, the CD pipeline publishes the training pipeline to the workspace as a new version, where it becomes available for execution. Note it contains the training configuration for different retailers as required.
  4. The training pipeline can be executed by pointing it to a location on the data lake where the pre-determined training (or retraining) data is stored. This step can be scheduled to run when updated retraining data is created. If training is successful, the model is registered in the Azure ML model registry under a unique name and version.
  5. The definition of which model is in production is stored in a configuration file on GitLab; changing this file triggers a change in the deployment configuration. Models in MLflow (MLflow Tracking: An efficient way of tracking modeling experiments) are tagged accordingly in the model registry, and then deployed as a deployment under a corresponding managed endpoint.
  6. Depending on the needs of the data, one model or several may be used for each retailer. Shadow models are also leveraged where applicable to allow a robust update process for models after a phase of monitoring.
Figure 5: End-to-end: from training to deployment.
Description - Figure 5: End-to-end: from training to deployment.

See the model creation workflow summary above for a detailed step-by-step overview of this 6-step process.
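Step 5 mentions a configuration file on GitLab that defines which model is in production. A hypothetical sketch of what such a file might look like is shown below; the keys, names, and values are illustrative assumptions, since the real file's structure is not published.

```yaml
# Hypothetical deployment configuration; all names and values are illustrative.
production_models:
  retailer_a:
    model_name: product-classifier-retailer-a
    model_version: 12          # registered version tagged for production
    endpoint: clf-retailer-a   # managed endpoint hosting the deployment
    shadow:
      model_version: 13        # candidate model receiving mirrored traffic
  retailer_b:
    model_name: product-classifier-retailer-b
    model_version: 7
    endpoint: clf-retailer-b
```

Keeping this file under version control means every change of production model is itself a reviewable, auditable commit, and the CD pipeline can react to it by retagging models in the registry and updating the corresponding deployments.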

Detailed overview of MLOps capabilities

While the system architecture robustly automates all required production aspects and supports automated retraining, such retraining steps are necessary to deal with data drift.

Concept of data drift

In dynamically changing and non-stationary environments, the data distribution can change over time yielding the phenomenon of concept drift (João, et al. 2014). In the context of consumer price data, this could be reflected as new product categories being introduced to the market over time and changing consumer preferences. Degradation of model performance will be encountered if the change in data distribution alters the decision boundary between classes, known as real concept drift (Figure 6).

Figure 6: Types of drifts: circles represent instances, different colors represent different classes (João, et al. 2014).
Description - Figure 6: Types of drifts: circles represent instances, different colors represent different classes (João, et al. 2014).

A simplified example is provided to demonstrate drift. The original data (left) follow a specific distribution, and a model learns to fit a decision boundary to categorize the data into two classes. If the conditional distribution P(y|X) shifts (middle), retraining is needed to refit the data and learn a new decision boundary. However, there can also be virtual drift, whereby P(X) shifts but P(y|X) does not; in that case the existing decision boundary still works well.
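The distinction between real and virtual drift can be made concrete with a one-dimensional toy example (entirely synthetic, unrelated to price data): labels follow the rule y = 1 when x > 0, and a model has learned that boundary.

```python
# Toy illustration of real vs. virtual drift on one feature (synthetic data).
import numpy as np

rng = np.random.default_rng(0)
boundary = 0.0  # the decision boundary a model learned on the original data

def label(x):
    """Original concept: y = 1 when x > 0."""
    return x > 0

def label_after_real_drift(x):
    """Real drift: the concept itself moved, the true boundary is now x = 1."""
    return x > 1.0

x_new = rng.normal(loc=1.0, scale=1.0, size=10_000)  # P(X) has shifted

# Virtual drift: P(X) moved but P(y|X) did not -> old boundary still perfect.
acc_virtual = np.mean((x_new > boundary) == label(x_new))

# Real drift: P(y|X) changed -> the old boundary now misclassifies 0 < x <= 1.
acc_real = np.mean((x_new > boundary) == label_after_real_drift(x_new))

print(f"accuracy under virtual drift: {acc_virtual:.3f}")
print(f"accuracy under real drift:    {acc_real:.3f}")
```

Under virtual drift the old boundary remains fully accurate even though the inputs look different; under real drift its accuracy drops sharply, which is why only real drift necessarily forces retraining.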

Figure 7: Patterns of changes over time (outlier is not concept drift) (Gama et al. 2014).
Description - Figure 7: Patterns of changes over time (outlier is not concept drift) (Gama et al. 2014).

Five scenarios of distribution change are shown. The first is a sudden or abrupt shift, whereby the distribution shifts at one point in time and then stays at the new distribution. The second is incremental, whereby the shift happens continuously over time and eventually stabilizes at a new level. The third is gradual: instead of shifting continuously, the distribution shifts rapidly and then shifts back, alternating like this for a while before settling at the new level. Fourth, the shift can be recurring, whereby the distribution shifts to a new level, stays there for only a while, and then returns to the original level. Finally, there may simply be an outlier rather than drift.

It is important that the implemented MLOps Production system have tools to both detect and address concept drift as it is encountered over time.

Drift detection

As consumer price data is not known to shift suddenly, performance-based drift detection was chosen (Spackman et al. 2023). As the name suggests, performance-based drift detection evaluates the performance of the predictive model to determine whether drift has occurred. This approach is appropriate when a high proportion of predictions undergoes manual validation to correct possible misclassifications, as is the case in our implementation. In our production systems, the performance of the predictive model is not known at the time of classification. To calculate the performance, some subset of the predicted instances must be flagged for quality assurance (QA). Once there is a set of quality-assured data, the model predictions can be compared with the true classes to calculate evaluation metrics that measure performance. For this evaluation to be trusted, the subset of predicted instances chosen for QA must be selected in an unbiased way. One such method is to randomly select a specific proportion of instances from each run of the predictive model. Figure 8 below shows our Model Performance Monitoring Dashboard, which displays F1, precision and recall values for distinct runs of the predictive model and can be used to determine whether drift has occurred.

Figure 8:  Model performance dashboard that shows the performance of the model over time
Description - Figure 8: Model performance dashboard that shows the performance of the model over time

The performance of models is tracked over time. When two models are deployed in production, a main model and a shadow model, their performance can be compared.
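The computation behind this monitoring can be sketched in a few lines of Python. The QA fraction, drift threshold and function names below are illustrative assumptions, not the production implementation:

```python
import random

def select_qa_sample(predictions, qa_fraction=0.05, seed=0):
    """Flag an unbiased random fraction of predicted instances for QA."""
    rng = random.Random(seed)
    k = max(1, int(len(predictions) * qa_fraction))
    return rng.sample(predictions, k)

def f1_score(verified_pairs, positive_class):
    """Compute F1 for one class from (true, predicted) pairs of QA-verified items."""
    tp = sum(1 for t, p in verified_pairs if t == positive_class and p == positive_class)
    fp = sum(1 for t, p in verified_pairs if t != positive_class and p == positive_class)
    fn = sum(1 for t, p in verified_pairs if t == positive_class and p != positive_class)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

def drift_flag(f1_by_run, threshold=0.90):
    """Flag possible drift when the latest run's F1 falls below the threshold."""
    return f1_by_run[-1] < threshold
```

Each run contributes one point to the dashboard; in practice, a sustained decline across runs, rather than a single low value, would prompt retraining.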

Model retraining

As a result of performance monitoring over time, data drift was noticed in production models; thus, a process is required for mitigating its impact in the MLOps system. One option for counteracting concept drift is to periodically refit the production model(s) using new data. In earlier stages of MLOps maturity, this is a cumbersome and time-consuming process; thus, operationalizing model retraining represents a key component of MLOps maturity.

Figure 9 demonstrates the flow of new products through the MLOps system. As new products are ingested into the production system, they are automatically classified. As mentioned previously, a fraction of the products is selected to have their classes manually verified and corrected by a trained annotator. Based on this manual review process, performance metrics for the classifier can be estimated. The products that have been manually verified additionally serve as new training data that can be used to retrain the classifier, which is then deployed to production.

Figure 9:  Classification, Review, and retraining process
Description - Figure 9: Classification, Review, and retraining process

As new products are classified, a proportion is randomly selected for manual review. Model performance is tracked on this random set, and retraining data is also created from it. Corrections are also applied to the final dataset and can be combined with other flagging methods to ensure that well-classified products are used for price statistics.

This review and refitting process remains consistent between MLOps maturity levels, the difference being the automation of the retraining pipeline. At the final maturity level, the training data update, model refitting and model deployment are fully automated, triggered either by the completion of the manual review step or on a fixed schedule.
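The retraining trigger at the final maturity level can be sketched as follows; the 90-day schedule and the function signature are illustrative assumptions, not the production configuration:

```python
from datetime import date, timedelta

def should_retrain(last_trained, today, review_completed, schedule_days=90):
    """Trigger retraining when the manual review step has completed,
    or when the fixed schedule has elapsed since the last training run."""
    if review_completed:
        return True
    return today - last_trained >= timedelta(days=schedule_days)
```

In a pipeline orchestrator, a check like this would gate the automated sequence of training data update, model refitting and deployment.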

Data validation and expectation checks

Changes in data quality are a significant problem for model performance and the quality of outputs in production. In fact, data has become the most important part of a data science project (the data-centric age), as model architectures and models are readily available. Hence, it is necessary to establish a sound data quality framework and methods. In the context of ML, it is important to align the quality checks with the expectations of the data scientist who builds the models. A separate data validation pipeline was introduced for this task, which performs checks with the open-source framework Great Expectations (GE) (Great Expectations GitHub).

GE offers a high degree of flexibility and scalability in the types of checks that can be applied, and it stores expectations in a way that enables easy version control. GE also produces automatic, easily readable data quality reports for domain experts and data scientists, accessed directly in the Azure ML user interface. As well, GE provides an extensive library of off-the-shelf data validation expectation checks for rapid deployment and reusability across varying datasets, and it offers multiple options for connecting to varying data sources (CSVs on data lakes or file servers, databases, etc.). On top of this, GE provides the ability to develop custom expectation suites and checks, which allows expectation checks to be tailored for fringe use cases that would not typically be covered by standard data validation methods.

Figure 10: A data validation report from Great Expectations
Description - Figure 10: A data validation report from Great Expectations

This shows the output of the Python package Great Expectations after it has validated a specific dataset. The page has a summary of the validation at the top right, and a more detailed table of the checks performed, with each expectation matched against what was actually observed in the dataset. Finally, the page has a set of actions that can expand various aspects of the page to give the data scientist greater detail on the checks completed.
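In GE, checks like these are declared in versioned expectation suites rather than written by hand. The stdlib sketch below only illustrates the underlying pattern, named expectations that validate a batch of records and report success, and is not the GE API itself:

```python
def expect_column_values_to_not_be_null(records, column):
    """Expectation-style check: every record has a non-null value in `column`."""
    failures = [r for r in records if r.get(column) is None]
    return {"success": not failures, "unexpected_count": len(failures)}

def expect_column_values_to_be_between(records, column, min_value, max_value):
    """Expectation-style check: values fall within an inclusive numeric range."""
    failures = [r for r in records
                if r.get(column) is None
                or not (min_value <= r[column] <= max_value)]
    return {"success": not failures, "unexpected_count": len(failures)}

def run_suite(records, suite):
    """Run a list of (check, kwargs) pairs and summarize, like a validation report."""
    results = [check(records, **kwargs) for check, kwargs in suite]
    return {"success": all(r["success"] for r in results), "results": results}
```

A suite like this, applied to each batch of incoming price data, yields the per-check success and unexpected-value counts that the GE report in Figure 10 displays.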

Version control

Proper use of version control (Version Control with Git for Analytics Professionals) is a key requirement for an MLOps system. It is an important part of the broader adoption of open-source best practices for production development by NSOs (Price and Marques 2023), and for the ML use case it includes not only code version control, but also version control of additional parts of the ML lifecycle, including data, experiments and models. For robust provenance and lineage, the versions of the different parts of the process must be tracked. For example, given a model in production, one would want to know the data the model was trained on, the training code used, etc.

  • Code Version Control: The MLOps system uses code quality control and CI/CD best practices, in particular GitOps, which defines the Git repository as the source of truth for the state of the system. Pipelines and deployments are similarly defined as code.
  • Data Version Control: Data version control allows a produced ML model to be traced back to the version of the data on which it was trained. In a production data process with frequent arrivals of new data, it is also essential to keep track of the data version. Azure ML datasets are used for both needs.
  • Model Version Control: Model version control can be facilitated through a central model store, which stores the model artifacts and performs model metadata management. The most prominent open-source tool for model management is MLflow. The Azure ML model store is used, as it allows interaction with MLflow.
  • Package Dependency Version Control: Package dependency version control is done through virtual environments and Docker images in which the dependencies are encapsulated. Azure Container Registry and Azure ML environments are used to handle the dependencies for data pipelines and ML models.
  • Pipeline Version Control: With daily execution of many pipelines, version control of new published pipelines is essential. Azure ML pipeline versioning is used to publish pipelines and manage them under pipeline endpoints.
  • Deployment Versioning: The version of the deployment is managed through dependencies on the model version.

Figure 11 (below) summarizes how these various components are combined to form a lineage graph for deployed models.

Figure 11: Lineage graph for deployed models
Description - Figure 11: Lineage graph for deployed models.

The diagram summarizes how lineage of models can be managed. Firstly, from the main branch, a new release is planned, and a release branch is created. From there, a new dataset is registered on which models will be trained. With this dataset, multiple models can be trained in development. The final model can be deployed to production and used on new data. If experiments are done to improve the model on the same data, this newer model can similarly be deployed, even if the data on which it was trained or the code in the main branch was never changed.
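A lineage record of this kind can be sketched as a small data structure; the field names are illustrative, not the actual Azure ML or MLflow schema:

```python
from dataclasses import dataclass, asdict

@dataclass(frozen=True)
class ModelLineage:
    """Links a deployed model version to the versioned artifacts behind it."""
    model_name: str
    model_version: int
    dataset_name: str
    dataset_version: int
    code_commit: str        # Git commit SHA of the training code
    environment_image: str  # container image tag with pinned dependencies

def deployment_record(lineage, endpoint):
    """Answer: which data, code and environment produced the model behind this endpoint?"""
    record = asdict(lineage)
    record["endpoint"] = endpoint
    return record
```

With such a record attached to every deployment, a newer model trained on the same registered dataset simply carries a higher model version while the dataset version and commit stay unchanged.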

Conclusion

This article summarized how MLOps can provide tremendous value to a critical statistical program at Statistics Canada, building maturity and robustness for a production process in a way that is aligned with Statistics Canada's Responsible Machine Learning framework (Responsible use of machine learning at Statistics Canada). The agency will continue to evaluate how MLOps can be applied to other use cases as well as continue to make investments to expand developed capabilities. For instance, the following capabilities are being explored: explainable AI dashboards as part of the model choice step in the model creation workflow, more robust data drift detection, more explicit shadow model deployment to support model evaluation (or just always run a backup model in production if required), as well as enhanced cost tracking to further optimize operational use.

Bibliography

Gama, João, Indrė Žliobaitė, Albert Bifet, Mykola Pechenizkiy, and Abdelhamid Bouchachia. 2014. "A survey on concept drift adaptation." ACM Computing Surveys (CSUR) 46, no. 4: 1-37.

Price, Matthew, and Diogo Marques. 2023. "Developing reproducible analytical pipelines for the transformation of consumer price statistics: rail fares, UK." Meeting of the Group of Experts on Consumer Price Indices. Geneva: UN.

Sculley, David, Gary Holt, Daniel Golovin, Eugene Davydov, Todd Phillips, Dietmar Ebner, Vinay Chaudhary, Michael Young, Jean-Francois Crespo, and Dan Dennison. 2015. "Hidden technical debt in machine learning systems." Advances in Neural Information Processing Systems 28.

Spackman, William, Greg DeVilliers, Christian Ritter, and Serge Goussev. 2023. "Identifying and mitigating misclassification: A case study of the Machine Learning lifecycle in price indices with web-scraped clothing data." Meeting of the Group of Experts on Consumer Price Indices. Geneva: UN.

Spackman, William, Serge Goussev, Mackenzie Wall, Greg DeVilliers, and David Chiumera. 2024. "Machine Learning is (not!) all you need: Impact of classification-induced error on price indices using scanner data." 18th Meeting of the Ottawa Group. Ottawa: UNECE.

UN Task Team on Scanner Data. 2023. "Classifying alternative data for consumer price statistics: Methods and best practices." Meeting of the Group of Experts on Consumer Price Indices. Geneva.

Retail Commodity Survey: CVs for Total Sales March 2024

Table summary
This table displays the results of Retail Commodity Survey: CVs for Total Sales (March 2024). The information is grouped by NAPCS-CANADA (appearing as row headers), and Month (appearing as column headers).
NAPCS-CANADA Month
202312 202401 202402 202403
Total commodities, retail trade commissions and miscellaneous services 0.50 0.70 0.66 0.59
Retail Services (except commissions) [561] 0.49 0.69 0.65 0.59
Food and beverages at retail [56111] 0.41 0.48 0.43 0.43
Cannabis products, at retail [56113] 0.00 0.00 0.00 0.00
Clothing at retail [56121] 1.24 0.76 0.85 0.98
Jewellery and watches, luggage and briefcases, at retail [56123] 3.49 2.01 2.46 2.03
Footwear at retail [56124] 0.99 1.26 1.08 1.30
Home furniture, furnishings, housewares, appliances and electronics, at retail [56131] 0.80 0.91 0.86 0.88
Sporting and leisure products (except publications, audio and video recordings, and game software), at retail [56141] 1.77 2.53 2.81 2.48
Publications at retail [56142] 5.29 5.34 7.39 7.17
Audio and video recordings, and game software, at retail [56143] 3.91 4.06 3.80 3.90
Motor vehicles at retail [56151] 1.85 2.48 2.24 1.85
Recreational vehicles at retail [56152] 5.15 5.24 4.89 4.64
Motor vehicle parts, accessories and supplies, at retail [56153] 1.35 2.50 1.89 1.66
Automotive and household fuels, at retail [56161] 1.71 1.65 1.54 1.68
Home health products at retail [56171] 3.06 3.32 3.27 3.38
Infant care, personal and beauty products, at retail [56172] 2.83 2.92 2.80 2.80
Hardware, tools, renovation and lawn and garden products, at retail [56181] 1.48 1.80 1.65 1.76
Miscellaneous products at retail [56191] 2.26 2.17 2.11 2.23
Retail trade commissions [562] 2.35 2.05 1.76 2.03

Advance indicators—Frequently asked questions

By Steve Matthews, Kyle Virgin and Ramdane Djoudad—Statistics Canada

This special-edition article provides nontechnical answers to questions related to the production, use and interpretation of advance indicators for Statistics Canada's Monthly Survey of Manufacturing, Monthly Wholesale Trade Survey and Monthly Retail Trade Survey. Organized as a set of frequently asked questions, this reference document complements the technical documentation on definitions, data sources and methods available for individual programs. It is composed of two sections. Section 1 reviews concepts and definitions that are central to the production of advance indicators, while Section 2 relates to the analysis and interpretation of these special statistical products.

Section 1: Context, definitions and terminology

1 What is an advance indicator?

Advance indicators are statistical estimates designed to provide early information on economic activities for a given reference period. For the surveys listed above, advance indicators are generated when information for a portion of respondents has been received but data collection is still underway. Advance indicators for monthly manufacturing, wholesale trade and retail trade are typically published 21 to 25 days after the end of a reference month, while preliminary indicators are published approximately one month later. For example, the monthly retail advance indicator for the January reference month would be published in February (21 to 25 days after), while the preliminary indicator for January would be published in March (one month later). Therefore, a February publication would showcase a preliminary indicator for December, as well as an advance indicator for January.

Chart 1 shows the average number of days to publish advance, preliminary and revised indicators for the three programs following the end of the reference month.

Chart 1 - Average day of publication, by program

Description - Average day of publication, by program
Average day of publication, by program
Program Advance Preliminary Revised
Manufacturing 25 45 76
Wholesale 25 46 77
Retail 21 52 82
2 What led Statistics Canada to publish advance indicators?

Statistics Canada first published advance indicators in 2020. This work was done primarily to provide users with more timely information, given the economic uncertainty that arose from the COVID-19 pandemic. Demand was high for advance indicators to monitor the economic impacts of COVID-19 in different areas of the Canadian economy and to provide early signals and information about the direction of trends. By using early respondent data, Statistics Canada was able to compile timely and reliable economic signals based on observed data.

Reference month of first release of advance indicator
Program Reference month of first release of advance indicator
Monthly Retail Trade Survey April 2020
Monthly Survey of Manufacturing May 2020
Monthly Wholesale Trade Survey August 2020
3 How does Statistics Canada produce advance indicators?

Statistics Canada uses a technique called flash estimation to produce advance indicators for selected survey programs. Flash estimation produces advance indicators using the same methods as preliminary indicators, but applies them to a more limited dataset at an earlier point in time. For example, to produce the advance retail trade indicator, only responses received by a predetermined point in the collection period are used. Once collection is complete, the same non-response treatments and weighting methods are applied to the full set of received data, which is then used to produce the preliminary indicator.
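The idea can be illustrated with a toy example in which a simple respondent reweighting stands in for the surveys' actual non-response treatments and weighting methods, which are more sophisticated:

```python
def flash_estimate(units, cutoff_day):
    """Flash estimation sketch: estimate total sales using only responses
    received by `cutoff_day`, reweighting respondents to represent
    non-respondents (a simplified stand-in for the surveys' actual
    non-response treatment and imputation).
    Each unit: {"weight": design weight, "sales": value or None,
                "received_day": day the response arrived, or None}."""
    responded = [u for u in units
                 if u["received_day"] is not None and u["received_day"] <= cutoff_day]
    total_weight = sum(u["weight"] for u in units)
    responded_weight = sum(u["weight"] for u in responded)
    if not responded_weight:
        raise ValueError("no responses received by cutoff")
    adjustment = total_weight / responded_weight  # spread weight of non-respondents
    return sum(u["weight"] * adjustment * u["sales"] for u in responded)
```

Running the same function at the collection cutoff yields the advance indicator; rerunning it once collection is complete yields the preliminary indicator from the fuller dataset.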

The amount of collected data incorporated into an advance indicator varies from month to month and across surveys. Table 1 shows that in 2023, the average response rates for advance indicators (rounded to the nearest percentage point) were 68% for the Monthly Survey of Manufacturing, 59% for the Monthly Wholesale Trade Survey and 45% for the Monthly Retail Trade Survey. These response rates are typically published along with each advance indicator to provide users with information on the quality of that month's figure.

Table 1 - Average response rates, 2023
Program Advance indicator Preliminary indicator Revised indicator
Manufacturing 68% 87% 94%
Wholesale trade 59% 69% 75%
Retail trade 45% 83% 88%

The monthly gross domestic product (GDP) by industry program uses the advance indicators discussed in this article to compile its own advance indicators of GDP. More information on estimates of monthly GDP can be found in Revisions to Canada's GDP.

Figure 1 illustrates the published month-to-month movements for sales in manufacturing, wholesale trade and retail trade throughout the 2023 reference year. The advance, preliminary and revised indicators are highly coherent, in terms of both the direction (increase or decrease) and the magnitude of the month-to-month change in sales.

Figure 1 - Comparison of month-to-month movements from advance, preliminary and revised indicators

Month-to-Month Movements for Sales in Manufacturing, 2023
Description - Month-to-Month Movements for Sales in Manufacturing, 2023
Month-to-month movements for sales in manufacturing, 2023
  Advance indicator Preliminary indicator Revised indicator
January 3.9 4.1 4.5
February -2.8 -3.6 -3.6
March 0.7 0.7 0.8
April -0.2 0.3 -0.1
May 0.8 1.2 1.2
June -2.1 -1.7 -2.0
July 0.7 1.6 1.6
August 1.0 0.7 1.0
September -0.1 0.4 0.7
October -2.7 -2.8 -2.9
November 1.2 1.2 1.5
December -0.6 -0.7 -1.1
Month-to-Month Movements for Sales in Wholesale, 2023
Description - Month-to-Month Movements for Sales in Wholesale, 2023
Month-to-month movements for sales in wholesale, 2023
  Advance indicator Preliminary indicator Revised indicator
January 3.0 2.4 2.6
February -1.6 -1.7 -1.4
March -0.4 -0.1 -1.1
April 1.6 -1.4 -1.4
May 3.5 3.3 2.5
June -4.4 -2.8 -1.4
July 1.4 0.2 0.0
August 2.6 2.3 1.8
September 0.0 0.4 -0.6
October -1.1 -0.5 -0.3
November 0.8 0.9 0.9
December 0.8 0.3 -0.3
Month-to-Month Movements for Sales in Retail, 2023
Description - Month-to-Month Movements for Sales in Retail, 2023
Month-to-month movements for sales in retail, 2023
  Advance indicator Preliminary indicator Revised indicator
January 0.7 1.4 1.6
February -0.6 -0.2 -0.2
March -1.4 -1.4 -1.5
April 0.2 1.1 1.0
May 0.5 0.2 0.1
June 0.0 0.1 0.1
July 0.4 0.3 0.4
August -0.3 -0.1 -0.1
September 0.0 0.6 0.5
October 0.8 0.7 0.5
November 0.0 -0.2 -0.0
December 0.8 0.9 0.9
4 Are there other approaches that can produce advance indicators?

Besides flash estimation, nowcasting is another method that can be used to produce advance indicators. In contrast to flash estimation, nowcasting encompasses more types of advance indicators that use either different input data or different compilation methods than preliminary indicators. For example, a nowcast may include estimators based entirely on models that use information from alternative sources available at the time when the model is applied to generate the nowcast. Similar to flash estimation, nowcasting models typically yield advance indicators that are less precise than preliminary indicators.

An important distinction exists between advance indicators produced at Statistics Canada and what are commonly referred to as forecasts. Typically, forecasting models are used to project data forward to describe future reference periods and as a consequence no information is available on the reference period of interest. The absence of observed data in forecasts increases the risk of inaccurate results because models rely on the assumption that historical trends and patterns will continue. In contrast, Statistics Canada incorporates some form of observed data in the production of advance indicators. Advance indicators released by Statistics Canada use direct observations as much as possible, such as data received from respondents or administrative data for a reference period of interest. The appropriate use of these data reduces the risk of large differences between advance indicators and preliminary indicators.

Section 2: Issues related to analysis and interpretation

1 What are the strengths and weaknesses of advance indicators?

Advance indicators provide timelier information to users; however, they are less precise than preliminary and revised indicators produced at a later date. Statistics Canada follows a multidimensional framework to assess data quality (Statistics Canada, 2019), including the dimensions of accuracy and timeliness. Statistical products typically aim to strike a balance between these dimensions to best meet the needs of data users. In this particular framework, advance indicators are intended to be timelier, with some compromise in accuracy. Because of this compromise in accuracy, the advance indicators are published at more aggregated levels of detail, such as the national level rather than the provincial level, compared with preliminary and revised indicators. They are also not released through the official Statistics Canada data repository but are only disseminated as part of articles in Statistics Canada's official release bulletin, The Daily.

2 How is the quality of advance indicators monitored?

Before this initiative, studies demonstrated that publishing advance indicators would provide a desirable balance of data timeliness and accuracy for users. In particular, the criteria used for accuracy accounted for both the direction and the magnitude of relative month-to-month movements. The direction is particularly important because it identifies turning points in a time series; the magnitude of movements is a key consideration because it estimates the pace of economic change. The historical performance of advance indicators produced with flash estimation was assessed, and these indicators outperformed forecasting and nowcasting methods with comparable timeliness. Furthermore, advance indicators can be generated approximately one month earlier than preliminary indicators.

Each month, Statistics Canada compares advance indicators with the preliminary indicators that follow for the same reference period to monitor the size of the differences between them, as well as their coherence in terms of the direction of the month-to-month movement.

Additionally, a comprehensive review of advance indicators and their past performance is conducted periodically. This review includes analysis of descriptive statistics over time, as well as any noteworthy differences observed for individual reference periods.

3 Why are advance indicators and preliminary indicators different?

Since advance indicators are derived using the same methods as preliminary indicators, they are subject to the same types of sampling and non-sampling errors, but with different sensitivities to these sources of error. The differences between advance and preliminary indicators can be attributed to the following sources:

  1. Responses received after the production of advance indicators: Imputation, designed to produce unbiased and accurate aggregate estimates, is used to estimate values for each non-responding unit when advance indicators are produced. If an individual unit does not respond in time for the advance indicator but provides a response before the preliminary indicator is produced, this causes a difference between the advance and preliminary indicators. Large differences can occur when imputed and reported values differ notably for individual units, or when relatively small differences accumulate across many units.
  2. Improvements to imputation from additional responses received: Even for units that do respond in time for the advance indicator, the imputed values themselves can differ between advance and preliminary indicators. Since imputation for the preliminary indicator is based on more complete information, this indicator should be viewed as an improved estimate for non-responding units versus the advance indicator. While these differences are typically small, they can accumulate to yield notable differences between the advance and preliminary indicators.
  3. Updates made to data after the advance indicator: After advance indicators are produced, respondent data undergo further revision and are updated as part of the data validation process, which can result in differences between the advance and preliminary indicators.
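The first of these sources can be illustrated with a toy example, in which mean imputation stands in for the surveys' actual imputation methods:

```python
def estimate_total(values):
    """Estimate a total where missing units (None) are imputed with the
    mean of the observed respondents (illustrative imputation only)."""
    observed = [v for v in values if v is not None]
    mean = sum(observed) / len(observed)
    return sum(v if v is not None else mean for v in values)

# At advance time one unit has not yet responded and is imputed (as 150.0);
# its late response of 120.0 then revises the estimate downward.
advance = estimate_total([100.0, 200.0, None])
preliminary = estimate_total([100.0, 200.0, 120.0])
```

Here the advance estimate of 450.0 is revised to a preliminary estimate of 420.0 once the late response replaces the imputed value.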
4 Are other approaches being considered to produce advance indicators?

Statistics Canada continually seeks to adopt leading-edge methods to maximize data quality. To support these efforts, Statistics Canada collaborates regularly with other national statistical offices, other statistical organizations and academia to identify, develop and evaluate methods that would assist in producing advance indicators. In particular, nowcasting approaches hold promise to further improve timeliness, if statistical models with suitable accuracy relying only on the information available can be developed.

5 Where can I find more information?

Eurostat (2017), Handbook on Rapid Estimates, 2017 Edition (PDF).

Statistics Canada (2019), Statistics Canada Quality Guidelines – Sixth Edition.

Audit of Staffing

November 2023
Project number: 80590-127

Table of contents

Executive summary

People are Statistics Canada's most valuable resource. Collectively, they drive the agency's mandate and priorities in producing statistics that help Canadians better understand their country. Staffing processes must be effective and efficient to ensure Statistics Canada hires the right people with the right competencies at the right time.

Staffing processes are a shared responsibility between the hiring manager and the staffing advisor. The hiring manager is responsible for making selection decisions that respect the public service values and ethics while acting in conformity with organizational goals. The staffing advisor is responsible for providing options, strategic advice, and guidance on the available staffing mechanisms and associated risks and considerations. The human resources (HR) assistant works alongside their assigned staffing advisor by being responsible for administrative components of the staffing process, such as creating and maintaining the staffing action file in GCdocs. Finally, Corporate Staffing is responsible for designing the staffing process; monitoring; and overseeing many of the fundamental components that support staffing advisors and HR assistants in executing staffing actions, such as developing tools and templates, developing training and guidance, and representing the agency in the event of a complaint.

The agency has experienced an extraordinary staffing landscape since the onset of the COVID-19 pandemic. Annual staffing actions processed by the agency grew from 6,473 in 2019/2020 to 13,947 in 2021/2022, before falling to 8,643 in 2022/2023. As hiring ramped up to meet the demands of new projects and programs, staffing processes were strained, leading to a backlog of staffing actions. This sudden increase took place during a period when HR leadership and teams were seized with efforts to transition the agency to full telework (followed by a shift to a hybrid model), multiple implementations of vaccine reporting, the hiring of thousands of contact tracers and the implementation of a rapid testing program for census employees. However, through significant work and effort, and without additional resources, the agency successfully recruited all required employees to work on the new projects and programs. This extraordinary staffing volume is not expected to be repeated in the foreseeable future.

Why is this important?

Staffing actions more than doubled over the last few years, leading to a persistent backlog and increasing the length of time to hire. An effective and efficient staffing process is critical to ensuring Statistics Canada has the HR required to achieve its mandate and priorities.

Further, according to the Public Service Commission's Appointment Framework, Statistics Canada is required to conduct a cyclical assessment at least once every five years. This assessment takes a broad look at the health of the organizational staffing system and identifies areas that need to be strengthened, as well as potential measures to address weaknesses. This Audit of Staffing contributes to fulfilling this requirement.

Overall conclusion

The agency's staffing processes have been stressed by a sudden increase in the volume of staffing transactions, combined with the departure of experienced staffing advisors. To optimize the effectiveness and efficiency of the staffing process as the agency moves towards more regular staffing levels, opportunities for improvement were identified in the areas of staffing advisor support and development, process design and documentation, staffing integrated systems, and the monitoring of and reporting on the process.

Key findings

Hiring managers' views on staffing

While the agency has been successful in recruiting employees to meet the increased demand in recent years, the administrative burden of staffing processes has resulted in some frustration among hiring managers. They find the process cumbersome and overly compliance driven, inefficient, and sometimes ineffective in finding the best candidate. While acknowledging the challenges faced by staffing advisors, hiring managers would like to better leverage HR expertise throughout the staffing process.

Staffing advisor and human resources function views on staffing

There has been significant turnover in the staffing team in recent years. Most staffing advisors are new to their role, yet do not feel adequately supported. Staffing processes are not well defined, are not optimized at the agency level, are inadequately supported by integrated information systems and undergo frequent change. Together, these factors add administrative burden for staffing advisors and make it difficult for them to fully meet hiring manager expectations.

Monitoring the efficiency and effectiveness of the staffing process

There is limited reporting on metrics related to the efficiency and effectiveness of staffing processes overall. This type of monitoring would support data-driven decision making and the ongoing improvement of the effectiveness and efficiency of the staffing process.

Conformance with professional standards

The audit was conducted in accordance with the Mandatory Procedures for Internal Auditing in the Government of Canada, which include the Institute of Internal Auditors' International Standards for the Professional Practice of Internal Auditing.

Sufficient and appropriate audit procedures have been conducted, and evidence has been gathered to support the accuracy of the findings and conclusions in this report and to provide an audit level of assurance. The findings and conclusions are based on a comparison of the conditions, as they existed at the time, against pre-established audit criteria. The findings and conclusions are applicable to the entity examined and for the scope and period covered by the audit.

Steven McRoberts
Chief Audit and Evaluation Executive

Introduction

Background

People are Statistics Canada's most valuable resource. Collectively, they drive the agency's mandate and priorities in producing statistics that help Canadians better understand their country. Staffing processes must be effective and efficient to ensure Statistics Canada hires the right people with the right competencies at the right time.

The agency has defined three specific objectives for the staffing process that can be summarized as follows:

  • Achieving operational objectives: The agency must build an agile and competent workforce through effective human resources (HR) planning activities and staffing and recruitment strategies adapted to its organizational context and business needs.
  • Compliance: The agency must ensure compliance with the Public Service Employment Act (PSEA), the Values and Ethics Code for the Public Sector, and requirements of internal and central agency policies.
  • Equity, diversity and inclusion: Hiring managers should always look for the best qualified person to fill a position. They must also put in place strategies to attract talent from different backgrounds who offer a wide variety of ideas and strengths that, together, lead to innovative outcomes.

Staffing processes are a shared responsibility between the hiring manager and the staffing advisor. The hiring manager is responsible for planning for their workforce; identifying needs; engaging meaningfully with HR throughout the process; providing information as required; and ultimately making selection decisions that respect public service values and ethics, legislation, and policies while acting in conformity with organizational goals. The staffing advisor is responsible for providing options, strategic advice, and guidance on the available staffing mechanisms and associated risks and considerations. The HR assistant works alongside their assigned staffing advisor by being responsible for administrative components of the staffing process, such as creating and maintaining the staffing action file in GCdocs. Finally, Corporate Staffing is responsible for designing the staffing process; monitoring; and overseeing many of the fundamental components that support staffing advisors and HR assistants in executing staffing actions, such as developing tools and templates, developing training and guidance, and representing the agency in the event of a complaint.

Staffing mechanisms within the federal public service are subject to the PSEA, the Treasury Board's Policy on People Management, and the Public Service Commission's (PSC) Appointment Framework. Notably, the PSC's Appointment Framework was launched in 2016 with the intention of streamlining requirements, reducing administrative burden, offering greater flexibility and accountability, encouraging agile and customized approaches to staffing and policies, and increasing focus on outcomes. Within Statistics Canada, hiring also adheres to the agency's Staffing Governance Framework, which took effect on May 15, 2022.

In 2020/2021, like all organizations, Statistics Canada responded to the COVID-19 pandemic quickly, pivoting operations to focus on mission-critical programs and, with an unprecedented need for data, delivering data-driven insights to Canadians at a time when they were needed most. Throughout this time, the agency experienced an extraordinary staffing landscape as annual staffing actions processed by the agency grew from 6,473 in 2019/2020 to 13,947 in 2021/2022, before falling to 8,643 in 2022/2023. This sudden increase took place during a period when HR leadership and teams were seized with efforts to transition to full telework (some 7,500 employees were moved to telework, a significant undertaking), followed by a shift to a hybrid model (including the use of personas, pulse surveys and an employee wellness survey to support employees' transition to a hybrid work environment); multiple implementations of vaccine reporting; the hiring of thousands of contact tracers (over 2,000 additional Statistical Survey Operations employees were hired in less than four months); and the implementation of a rapid testing program for census employees. As hiring ramped up to meet the demands of new projects and programs, the agency struggled to keep pace, leading to a persistent backlog of staffing actions. However, through significant work and effort, and without additional resources, the agency successfully recruited all required employees to work on the new projects and programs. This extraordinary staffing volume is not expected to be repeated in the foreseeable future.

Audit objective

The objective of this audit is to provide reasonable assurance on the effectiveness and efficiency of staffing processes and tools to facilitate appropriate and timely staffing of personnel.

Scope

The scope of the engagement included an examination of selected components of the agency's management control framework for staffing, including processes to

  • ensure management and the Workforce and Workplace Branch work in partnership to execute on staffing needs and priorities
  • enable all parties to understand and adhere to their respective staffing responsibilities
  • enable effective and timely staffing in accordance with management's needs.

Compliance with the PSC's Appointment Framework was excluded from the engagement scope, as the Workforce and Workplace Branch conducts annual self-assessments of staffing files to assess compliance and identify areas for improvement.

For the purposes of this engagement, the staffing process was considered to begin when the hiring manager identifies a need to fill a vacancy and to be complete once the letter of offer is issued to the selected candidate.

The scope period covered processes in place for the fiscal years 2021/2022 to 2022/2023.

Executive staffing was excluded from the engagement scope, as it follows different processes and requirements from other staffing actions.

Finally, the engagement scope also excluded the following elements, as they would be better assessed through separate, targeted engagements: the parts of the hiring process that take place after the selection of the successful candidate (including the security clearance process and pay-related processes); equity, diversity and inclusion initiatives; official languages measures; and financial management controls.

Approach and methodology

This audit was conducted in accordance with the Mandatory Procedures for Internal Auditing in the Government of Canada, which include the Institute of Internal Auditors' International Standards for the Professional Practice of Internal Auditing.

The audit work consisted of

  • examination and analysis of relevant internal documentation
  • questionnaires for hiring managers and staffing advisors
  • interviews and walkthroughs with hiring managers from across the agency, staffing operations team leads, staffing advisors, HR assistants and a team lead, along with representatives from Corporate Staffing and HR management.

Authority

The audit was conducted under the authority of the approved Statistics Canada Integrated Risk-based Audit and Evaluation Plan (2023/2024 to 2027/2028).

Findings, recommendations and management response

Hiring managers' views on staffing

While the agency has been successful in recruiting employees to meet the increased demand in recent years, the administrative burden of staffing processes has caused frustration among hiring managers. They find the process cumbersome and overly compliance driven, inefficient, and sometimes ineffective in finding the best candidate. While acknowledging the challenges faced by staffing advisors, hiring managers would like to better leverage HR expertise throughout the staffing process.

Staffing is a shared responsibility between hiring managers, staffing advisors and HR assistants. Staffing advisors and HR assistants support the administration of the process, while hiring managers are accountable for planning their staffing needs, making selection decisions, and ensuring compliance with staffing-related legislation and policies.

Staffing processes are cumbersome and overly compliance driven

While the agency's objectives for staffing do not explicitly include the concept of efficiency, achieving the three objectives in the least amount of time and at the lowest cost to the agency remains an important, yet unstated, goal. Further, these objectives are sometimes in tension with one another, requiring careful balancing to optimize results. Specifically, achieving operational objectives and ensuring compliance are often at odds, as attaining compliance adds burden to the process and can delay staffing actions, putting operational objectives at risk.

Hiring managers noted that staffing templates, established to demonstrate compliance, are burdensome and time-consuming to complete. They questioned whether the agency was striking the right balance between operational and compliance objectives, noting that often the feedback they received when completing the forms added marginal business value. Hiring managers also reported that staffing advisors each had their own interpretation of what was required and the point at which compliance requirements were satisfied.

Completing the Candidate Assessment Validation (CAV) form was identified as a common challenge in this regard. This form exists to fulfill compliance objectives and the PSEA requirement that the hiring decision for non-advertised appointments be documented. Staffing advisors and hiring managers alike noted that hiring managers spent significant time completing the document, often requiring multiple rounds of back and forth between the hiring manager and staffing advisor. The CAV template did not provide detailed guidance on what could represent sufficient documentation of the hiring decision, leaving room for varying interpretations of the point at which the documentation was acceptable.

Striving for perfection in achieving compliance can lead to inefficiencies and unnecessary complications. While it is essential to meet the requirements set by the PSC, aiming for perfection beyond the established threshold can result in prolonged staffing processes, increased administrative burden and cost, and delays in filling critical positions. An overemphasis on perfection can also divert resources and attention from other pressing matters, leading to an imbalance in priorities.

Certain staffing processes are slow and may not result in finding good candidates

Two-thirds of hiring managers noted that traditional staffing processes and assessments were not always effective in finding the right candidate. While candidates often looked good on paper and answered questions appropriately in the interview stage, their on-the-job performance was not always reflective of their assessment results. Certain types of staffing actions are very labour-intensive for hiring managers, with varying results in finding candidates with the specific skills they are looking for. Other challenges included processes receiving too few applicants and taking too long; when processes dragged on, applicants often accepted positions elsewhere.

Hiring managers also reported using multiple staffing actions for a single hire to expedite employee start dates with the agency. In such scenarios, once a candidate is selected, they are offered an acting appointment to allow time for an indeterminate appointment to be processed. While this practice effectively speeds hiring, it also increases the workload for both staffing advisors and hiring managers, as it creates two staffing actions in the place of one.

Siloed human resources functions create extra steps in the process for hiring managers

There are various HR functions that can be involved in a hiring process, including staffing and classification, along with other functions within the Corporate Strategy and Management Field (e.g., security, official languages and finance). Hiring managers reported taking on the facilitator role between the various HR functions, which were viewed as siloed. Several hiring managers noted that they were uncertain which group to reach out to for various HR issues, adding to the complexity and duration of the staffing process. Members of the staffing function expressed similar concerns.

Hiring managers want to better leverage strategic support from their staffing advisors

While acknowledging the challenges faced by staffing advisors, hiring managers indicated they would like staffing advisors to take on more responsibility throughout the staffing process. This would allow the agency to leverage the expertise of the staffing advisors and would align with the desired evolution of the staffing function. This change would include support such as providing strategic advice, assisting with screening large numbers of candidates and helping with the completion of mandatory compliance documentation. As the staffing function is not currently resourced or equipped to provide this level of support, some fields and divisions have created staffing "shadow shops" to perform these tasks. Using their knowledge of the hiring manager's business needs and their experience with the staffing process, these support teams act as a liaison between hiring managers and staffing advisors.

However, these shadow shops create challenges for staffing advisors. They disconnect the staffing advisor from the business, resulting in the advisor focusing mostly on the administrative aspects. Further, the guidance provided by the shadow shops may not align with that of the staffing advisor. While some of these shops have been disbanded or reduced, the demand for their services has not waned. Many staffing advisors acknowledged the need for the services these shadow shops provide. The team of staffing advisors wants to better support clients and acknowledged the challenges in the current environment, which include the high volume of staffing actions, the number and diverse needs of hiring managers they serve, and the administrative burdens caused by a manual and inefficient staffing process (discussed further in the Monitoring the efficiency and effectiveness of the staffing process section below).

The staffing process has not been designed to optimize efficiency and effectiveness

Many of the issues identified point towards a staffing process that has been designed based on accountabilities, rather than efficiency and effectiveness. Along these lines, the total cost of the staffing process is not known or monitored. While the cost of the staffing function itself is known, the considerable time invested by hiring managers in staffing is not tracked. Hiring manager time has thus become an unmeasured resource when designing the staffing process, and over time the process can drift towards being optimized at the staffing function level rather than the agency level. This results in tasks being assigned to hiring managers because they are accountable for them, not because they are the most efficient or effective people to perform them. In many cases, tasks for which hiring managers are accountable may be better or more efficiently performed by a centralized resource with the knowledge and volume of transactions to become proficient in the tasks.

For example, hiring managers are accountable for the selection decision. Consequently, they are expected to define the competencies required for a position, develop assessment tools for each competency, and evaluate the assessment tools for bias—all with limited support or training in these activities. Each of these activities can benefit from in-depth knowledge and experience and may be more effectively and efficiently performed, or guided, by a centralized resource with expertise in competencies and their assessment. Under an agency-level mindset, a more efficient and effective process might have a centralized resource maintain a bank of competencies, complete with definitions and bias-free assessment tools. Hiring managers could leverage these as required, rather than having to independently research competencies and develop assessment tools for their own personal use. While the agency does maintain a bank of tools for larger collective processes that are repeated on a periodic basis, similar support is not available for individual processes.

Staffing advisor and human resources function views on staffing

There has been significant turnover in the staffing team in recent years. Most staffing advisors are new to their role, yet do not feel adequately supported. Staffing processes are not well defined, are not optimized at the agency level, are inadequately supported by integrated information systems, and undergo frequent change. Together, these factors add administrative burden for staffing advisors and make it difficult for them to fully meet hiring manager expectations.

Staffing advisors, operating within the existing manually driven processes, are faced with many challenges while supporting hiring managers. Many staffing advisors are new to their role and would like more support and the opportunity to learn from more experienced advisors. A commonly identified theme was that staffing advisors have the best interests of the agency and hiring managers in mind and are dedicated to providing an effective service.

Most staffing advisors are new to their role

Staffing advisors play a key role in the staffing process. They are process experts, responsible for guiding hiring managers from need identification to the appointment of the chosen candidate. They provide strategic advice on staffing mechanisms, risks and compliance requirements. This advice demands a strong understanding of their clients' business and staffing needs; of how to reach, attract and assess high-quality candidates; and of how to navigate the complex web of regulations, policies and directives that govern federal government hiring.

Across the federal government, staffing advisors are in short supply and in high demand, making movement between departments a common challenge.

As noted, the agency has seen a sharp increase in the volume of staffing actions over the past few years. Concurrently, turnover rates among the agency's Personnel Administration (PE) category have also spiked. During this time, the agency experienced turnover rates of 25% to 33% among PEs, compared with 7% to 12% across the federal public service. In 2022/2023, the staffing function had a turnover rate of 36%, compared with a turnover rate of 23% for the agency.

Management reported difficulty in replacing departing experienced staffing advisors with similarly experienced advisors. Consequently, junior staffing advisors (PE-01 or PE-02 levels) were hired in their place. As of September 2023, the agency had 14 staffing advisors in total: 12 junior staffing advisors at the PE-01 or PE-02 level and 2 senior staffing advisors at the PE-03 level.  

Staffing advisors require more support, straining team leads working to fill the gap

Developing a staffing advisor takes time, training and mentoring. Advisors begin at the PE-01 level as a developmental HR advisor, typically for a period of one year or more, before progressing to the PE-02 level as an HR advisor, at which point they deliver operational services of limited scope. Junior staffing advisors are supported throughout this process by a more experienced staffing advisor at the PE-03 (HR specialist) or PE-04 (team lead) level. This advisor serves as a mentor, allowing the junior advisor to shadow them, reviewing their work before it is communicated to the hiring manager and acting as a sounding board when they encounter difficult situations. It is only once staffing advisors have attained the PE-03 HR specialist level that they may work fully independently and are considered to provide expert staffing advice.

Because of this high turnover among senior staffing advisors, junior staffing advisors have been assigned as lead advisors to work directly with hiring managers without an assigned mentor (but with supervision from their team lead). In some respects, they are performing the responsibilities of a senior staffing advisor without having the knowledge, experience or support to perform at that level. Junior staffing advisors reported taking on roles they do not feel sufficiently equipped or supported to execute. Many reported being uncomfortable providing advice to senior managers and executives without having experience with all types of staffing actions (particularly advertised processes) or fully understanding staffing processes.

With a shortage of experienced staffing advisors, team leads are struggling to fill the gap. The staffing team is presently composed of two team leads, each of whom is assigned seven to eight staffing advisors. Staffing management reported that team sizes of four to five were more typical, with a mix of junior and experienced advisors. Team leads reported being overwhelmed with their workload and having to work overtime to keep up while providing coaching, review and ongoing support to junior staffing advisors. Staffing advisors also noted that team leads seemed overburdened, making staffing advisors hesitant to approach them when they needed help. They stated this has led to delays in responding to hiring managers and has slowed the staffing process.

Frequent rotation of staffing advisors across client groups is causing frustration for hiring managers and staffing advisors

Adding to these challenges, the high turnover and volume of staffing actions have resulted in a frequent rotation of client groups among staffing advisors. Hiring managers and staffing advisors expressed frustration with this constant change and noted that greater stability was required. For staffing advisors, frequent rotation makes it difficult to establish strong working relationships with their clients and to understand their business and their needs. For hiring managers, a new staffing advisor means dedicating time to bring them up to speed on ongoing staffing actions, their business and their broader staffing needs. It also means adjusting to different ways of doing things, which can lead to challenges, as advisors may have different interpretations of what is required and how the staffing process should proceed.

The shortage of experienced staffing advisors able to provide expert advice, coupled with the frequent rotation of staffing advisors across client groups, has affected the quality of service staffing advisors can provide and the cohesion between hiring managers and staffing advisors. These factors also serve to compound the other challenges discussed throughout this report.

Hiring managers are not involving staffing advisors early in the process

HR management noted that a common challenge concerned staffing advisors not being informed of staffing needs and activities early enough in the process. Last-minute requests result in backdated transactions and a high volume of urgent staffing actions. These issues make it difficult for staffing advisors to plan their work; complicate pay processes; and create problems for employees, who risk being paid incorrectly, late or not at all, or experiencing delays or refusal of benefits. These issues also affect the relationship between hiring managers and staffing advisors. Hiring managers expressed a desire for more support from their staffing advisors. Likewise, staffing advisors expressed a desire to provide more strategic advice and support to hiring managers and to be seen as a trusted partner in their clients' staffing activities. However, not involving staffing advisors early in the process relegates them to simply processing paperwork, often under tight time pressures. This in turn can lead to staffing advisors being seen as an obstacle to hiring managers, as the staffing advisor is left scrambling to ensure all the administrative requirements necessary to process the staffing action are fulfilled.

The agency introduced a timeliness initiative in May 2023 to establish clear timelines that hiring managers must meet for HR actions to be processed in time. As the initiative was implemented during the audit, it was not possible to assess its effectiveness. That said, management hopes that it will increase rigour and discipline in staffing processes and lead to hiring managers including staffing advisors earlier in their staffing actions.

Staffing processes are not well defined, making it difficult to optimize the process

Well-designed and implemented processes and procedures contribute to an effective staffing function. They help ensure common and standardized approaches in the administration of staffing actions, establish clear expectations for the level of service provided by staffing advisors, and inform all participants of the requirements and timelines of a typical staffing action. They can also ensure the most efficient approach is taken on each staffing action by properly sequencing actions and ensuring that no steps are forgotten.

Staffing advisors and hiring managers noted that the process was not fully defined. The HR function has not developed clear, comprehensive and documented procedures outlining the specific steps required for each type of staffing process. That said, a comprehensive flowchart was developed for the non-advertised staffing process in coordination with Field 7's Operations and Integration Division but was not shared with staffing advisors or hiring managers as the project was discontinued because of fiscal constraints. This work is a good start and could be leveraged in documenting other types of staffing actions. Various checklists have also been developed to define the key documentation required for each type of staffing action, but hiring managers and junior staffing advisors reported that these were not sufficiently detailed to meet their needs as they do not include all the steps of the process, the expected timelines for each step or the purpose behind each step.

The lack of clear, standardized procedures has resulted in a process that varies depending on the staffing advisor and hiring manager involved.

As an example, a key step at the outset of a staffing action is the development of the statement of merit criteria (SoMC). This document outlines the experience, skills and competencies (Footnote 1) required for the position and forms the basis of the candidate assessment. Completing the SoMC is the responsibility of the hiring manager, with guidance from the staffing advisor. There was no common approach to determining the level of support that should be provided. Also, staffing advisors who reported sharing SoMCs from other departments were sharing them from their personal collection rather than a list curated and maintained by the agency and available to all other staffing advisors.

Non-integrated information systems create administrative burden for staffing advisors

Information management and information technology (IT) systems are required to facilitate efficient and effective staffing processes. These systems should support process flows, document management, and monitoring and reporting.

Statistics Canada employs several systems to support its staffing process. These systems include Orbit, the Staffing Activity Management System, a work-in-progress (WIP) spreadsheet shared among all staffing advisors, SharePoint for financial approvals, GCdocs for key documents, staffing advisor and HR assistant mailboxes for correspondence and documents in progress, and various spreadsheets and other unique tools created and used by staffing advisors and HR assistants to track their files. Notably, these systems are not integrated, affecting the availability and accuracy of data related to the staffing process. Many data points are duplicated across multiple systems, leading to conflicting information that needs to be reconciled when information is compiled for reporting purposes.

These disparate systems add significant burden for staffing advisors, create complications and frustration for hiring managers, cause delays to staffing processes, and take staffing advisor time away from providing strategic support to their clients. There are also multiple entry points for new staffing actions, meaning staffing advisors must monitor their emails, SharePoint and WIP to know what work they have underway. Reporting for senior management requires staffing advisors to manually compile and reconcile data from multiple systems. Hiring managers cannot see the status of their staffing actions, leading to numerous emails to staffing advisors requesting status updates that must also be manually compiled and reconciled by staffing advisors. Additionally, the systems do not support the process of completing the required forms, meaning that they must be shared by email between the hiring manager, their delegate and the staffing advisor during their development, leading to lost documents and version control issues.

These issues create frustration and undermine the hiring manager's trust in the staffing advisor in several ways. First, the lack of system-level support for the staffing process can make it hard for staffing advisors to locate files from previous staffing actions, giving the appearance of disorganization to hiring managers. Second, staffing advisors sometimes ask hiring managers to provide documents a second or third time for a given process, causing frustration and a loss of trust among hiring managers who understood that the staffing advisor was responsible for storing documentation for the staffing action.

Further, there is no formal documentation on which system is used to track what information, who should enter the information and when it should be done. Staffing advisors and their HR assistants work out roles and responsibilities between themselves. Consequently, some HR assistants may do more to support their staffing advisor while others may do less. These inconsistencies are manifested in the information available in the systems.

The HR team has undertaken a project to implement an integrated system to automate monitoring and improve the efficiency of the staffing process. However, this is not its first attempt to do so. Previous attempts have been set aside in favour of newer or more urgent priorities or have had their scope changed during implementation such that they did not deliver the integrated system that had been envisioned. Several concerns were raised regarding this current implementation that warrant consideration:

  • Senior HR management stated an expectation of implementation in the fall at minimal cost (attributable to having the selected software available to the agency at no additional cost). However, according to IT staff, a project of this scope will take months (at minimum) to implement, making a fall implementation unrealistic. They further noted that the development costs will be significant.
  • Two competing software alternatives are being considered. These alternatives were chosen for their low cost (the agency already owns licensing for them) and pre-existing in-house support. However, the system selection is being conducted while user needs are still being defined by Corporate Staffing, and staffing processes have not yet been documented.
  • The requisite resources from IT and Human Resources Business Intelligence (HRBI) have not been formally assigned to the project. It is being conducted "from the corner of the desk," and the HRBI team expressed concerns about competing priorities and deadlines that may affect the system implementation. The availability of the IT team to work on this project is also dependent on other priorities and resource constraints.

Frequent process changes with insufficient communication affect awareness and buy-in

Staffing advisors, Corporate Staffing and hiring managers reported a high pace of change in staffing priorities and processes. Staffing advisors are asked to communicate these changes to hiring managers and enforce the changes when hiring managers push back. However, staffing advisors said they were frequently not consulted on the changes or their implementation or told why the change was being made. As a result, there are missed opportunities to consult staffing advisors for their input and to prepare them to answer questions from hiring managers and build buy-in and awareness for the changes.

Monitoring the efficiency and effectiveness of the staffing process

There is limited reporting on metrics related to the efficiency and effectiveness of staffing processes overall. This type of monitoring would support data-driven decision making and the ongoing improvement of the effectiveness and efficiency of the staffing process.

Establishing clear performance targets and tracking the performance of the staffing process are important and can provide valuable data to support the improvement of its efficiency and effectiveness.

The agency has established timeliness service standards for each type of staffing action, but these standards are insufficient for assessing the efficiency of the overall process. They address only the time elapsed after all the documentation has been satisfactorily completed by the hiring manager. They therefore exclude the time spent planning the staffing action and soliciting, screening, interviewing and assessing candidates, as well as the time spent by the hiring manager documenting the selection decision. They also exclude any time spent going back and forth with the staffing advisor to bring the documentation to a satisfactory state.

Without clear targets and more complete monitoring of the time to hire (from need identification to start date), cost to hire (inclusive of hiring manager time) and time spent on the various steps of the process, it is difficult to know whether the staffing process is appropriately designed to meet agency needs or to understand where it may be falling short. Including hiring manager time—by tracking it through the Time Management System or elsewhere—in the cost of the process will be critical to understanding whether portions of the process would be better assigned to lower-cost resources or centralized resources with specialized expertise. These targets, particularly time to hire and cost to hire, should be negotiated with senior management to ensure that the staffing process is designed to appropriately balance cost, compliance objectives and business needs.
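As an illustration only (the report does not specify the agency's data model, and the field names and stage boundaries below are assumptions), the end-to-end time-to-hire monitoring described above could be sketched as:

```python
from datetime import date

# Hypothetical staffing-action record; field names are illustrative only.
# Each date marks the completion of a stage mentioned in the report.
action = {
    "need_identified": date(2024, 1, 8),        # hiring manager identifies the need
    "documentation_complete": date(2024, 3, 4), # all forms satisfactorily completed
    "offer_accepted": date(2024, 4, 15),
    "start_date": date(2024, 5, 6),
}

def stage_durations(a):
    """Days spent in each stage, plus the overall time to hire
    (need identification to start date)."""
    keys = list(a)  # dicts preserve insertion order in Python 3.7+
    stages = {
        f"{keys[i]} -> {keys[i + 1]}": (a[keys[i + 1]] - a[keys[i]]).days
        for i in range(len(keys) - 1)
    }
    stages["time_to_hire"] = (a["start_date"] - a["need_identified"]).days
    return stages

print(stage_durations(action))
```

Measuring each stage separately, rather than only the post-documentation window covered by the current service standards, is what would make it possible to see where time is actually spent.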

The effectiveness of the process is also not currently monitored. This can include monitoring the effectiveness of specific aspects of the staffing process (such as quality of staffing advisor advice and guidance) but should also include monitoring the quality of hires. Doing so could help identify which types of staffing processes are most successful, as well as areas where more work is required to improve effectiveness, such as better identifying the skills and competencies required for a job or designing new assessment tools that more accurately predict on-the-job performance.

Recommendations

Recommendation 1

It is recommended that the assistant chief statistician, Corporate Strategy and Management, ensure that

  1. a consistent approach to each type of staffing action be documented, communicated and implemented.
Management response

Management agrees with the recommendation.

A comprehensive review and analysis of staffing processes will be undertaken, prioritizing the staffing actions that account for the greatest staffing volume, and will include

  • working with partners in fields 6 and 7 to define and document each type of staffing process, including the steps within the process, the timelines associated with each step, and the roles and responsibilities of each individual involved
  • producing documentation and tools that support a standardized approach to staffing processes, with clear roles and responsibilities for each individual involved and associated timelines
  • refreshing the staffing page on the Internal Communications Network (ICN) to articulate staffing processes, steps and requirements, as well as roles and responsibilities for each individual involved in the process.
Deliverables and timeline

The director general, Workforce and Workplace Branch, will

  1. document process maps for each type of staffing action (approximately 20 in total), 10 of which account for 96% of the actions that make up the staffing transaction volume; the first 10 processes will be done in 2024, and the remaining lower-priority processes will be done in 2025, with the following schedule: 5 processes documented by the end of June 2024, 5 processes documented by the end of December 2024, 5 processes documented by the end of June 2025 and the remaining processes documented by December 2025
  2. launch tools spanning guides, infographics and training materials by the beginning of June 2024, with the release of process-specific tools aligning with timeframes in the previous deliverable
  3. refresh and update the staffing page on the ICN by the beginning of June 2024.

Recommendation 2

It is recommended that the assistant chief statistician, Corporate Strategy and Management, ensure that

  1. a plan to address gaps in the development, day-to-day support and retention of staffing advisors be developed and implemented.
Management response

Management agrees with the recommendation.

A thorough review will be conducted to understand and implement the measures required to better support the development and retention of staffing advisors.

To address the findings of the review, the following will be developed: standard procedures; supporting tools; and a development program for staffing advisors that encompasses a blend of formal training, informal training, coaching, and exposure to a range of staffing files and actions to support development and consistency in service delivery. This includes reviewing and updating the PE Recruitment and Development Program, support measures, and training roadmaps and career paths for staffing advisors.

Deliverables and timeline

The director general, Workforce and Workplace Branch, will

  1. document a review of planned measures to support the development and retention of staffing advisors by September 2024.

Recommendation 3

It is recommended that the assistant chief statistician, Corporate Strategy and Management, ensure that

  1. a business case be developed and executed for the implementation of an integrated system to enable the staffing function to efficiently carry out its responsibilities, reduce administrative burden and enable monitoring.
Management response

Management partially accepts the recommendation.

Because of austerity, it is not anticipated that any investment in an integrated system will be supported by central agencies over the next number of years. Management will

  • conduct an internal scan to determine integrated system needs, gaps and immediate options
  • conduct an external scan with the Office of the Chief Human Resources Officer to determine anticipated implementation dates for Next Generation Human Resources and Pay (NextGen HR and Pay) to determine potential implementation at Statistics Canada
  • as an interim measure, implement Workbench to improve tracking of actions and improve oversight of processes.
Deliverables and timeline

The director general, Workforce and Workplace Branch, will

  1. present a documented internal and external scan to define integrated system needs, gaps, and recommended short- and long-term solutions (including NextGen HR and Pay) for the approval of the assistant chief statistician, Corporate Strategy and Management, by March 2024
  2. complete Workbench system changes to improve tracking of actions and improve oversight of processes by June 2024. 

Recommendation 4

It is recommended that the assistant chief statistician, Corporate Strategy and Management, ensure that

  1. key performance indicators and targets for the efficiency and effectiveness of the staffing process be developed, implemented, monitored and reported to senior management.
Management response

Management agrees with the recommendation.

Management will establish a systematic approach to measure the performance and health of the staffing activity. This will include developing quantitative key performance indicators (KPIs) and targets to gauge the efficiency of staffing processes (including time elapsed and effort expended on staffing actions) and the effectiveness of the staffing process (including measures of hiring manager satisfaction).

Deliverables and timeline

The director general, Workforce and Workplace Branch, will

  1. present to the Operations Committee for review and approval a plan for the monitoring and reporting of staffing activity results, including the proposed monitoring cycle by June 2024
  2. present to the Operations Committee for review and approval specific KPIs and targets to be monitored by December 2024 (for the first 10 staffing processes documented, as outlined in the first deliverable)
  3. present reporting of results to the Operations Committee by December 2025 (and ongoing).

Recommendation 5

It is recommended that the assistant chief statistician, Corporate Strategy and Management, ensure that

  1. an agency-level plan to assess and improve the efficiency and effectiveness of the staffing process be developed, following the implementation of actions associated with recommendations 1 through 4, in a way that balances operational objectives with compliance requirements; senior management should be consulted in its development and approve the plan, and progress on its implementation should be reported to a tier 1 governance committee periodically.
Management response

Management agrees with the recommendation.

Following implementation of actions linked to recommendations 1 through 4, management will conduct an annual review of the staffing system to further identify efficiencies, review compliance requirements and measure effectiveness of the function, with input from senior management.

Deliverables and timeline

The director general, Workforce and Workplace Branch, will

  1. present to the Operations Committee for approval a review of the efficiency and effectiveness of the staffing function, along with a proposed action plan to address areas for improvement; the review will begin in June 2026, following the completion of process and efficiency reviews of all staffing processes, with a target presentation date of December 2026.

Appendices

Appendix A: Audit criteria

Each control objective below is listed with its core controls and criteria, followed by the policy instruments and sources against which it was assessed.

1. The planning of staffing processes is efficient and effective, and hiring managers, staffing advisors and human resources (HR) assistants are adequately supported.

1.1 The planning of staffing processes is efficient.

1.2 The planning of staffing processes is effective in defining the staffing need and establishing a staffing strategy that fulfills the identified need.

1.3 Hiring managers, staffing advisors and HR assistants are adequately supported throughout the planning process.

  • Statistics Canada's Staffing Governance Framework
  • Audit Criteria related to the Management Accountability Framework: A Tool for Internal Auditors
    • Human resource planning is aligned with strategic and business planning. (PPL-1)
    • The organization provides employees with the necessary training, tools, resources and information to support the discharge of their responsibilities. (PPL-4)
    • Suitable policies and procedures to support the development and management of human resources are established, maintained and communicated. (PPL-7)

2. The assessment of candidates is efficient and effective, and hiring managers, staffing advisors and HR assistants are adequately supported.

2.1 The assessment of candidates is efficient.

2.2 The assessment process is effective in determining candidate suitability.

2.3 Hiring managers, staffing advisors and HR assistants are adequately supported throughout the assessment process.

  • Statistics Canada's Staffing Governance Framework
  • Audit Criteria related to the Management Accountability Framework: A Tool for Internal Auditors
    • Human resource planning is aligned with strategic and business planning. (PPL-1)
    • The organization provides employees with the necessary training, tools, resources and information to support the discharge of their responsibilities. (PPL-4)
    • Suitable policies and procedures to support the development and management of human resources are established, maintained and communicated. (PPL-7)

3. The selection of candidates is efficient and effective, and hiring managers, staffing advisors and HR assistants are adequately supported.

3.1 The selection process is efficient.

3.2 The selection process is effective in facilitating the hiring manager's candidate selection decision.

3.3 Hiring managers, staffing advisors and HR assistants are adequately supported throughout the selection process.

  • Statistics Canada's Staffing Governance Framework
  • Audit Criteria related to the Management Accountability Framework: A Tool for Internal Auditors
    • Human resource planning is aligned with strategic and business planning. (PPL-1)
    • The organization provides employees with the necessary training, tools, resources and information to support the discharge of their responsibilities. (PPL-4)
    • Suitable policies and procedures to support the development and management of human resources are established, maintained and communicated. (PPL-7)

4. The staffing process is monitored to support its continuous improvement and ensure business needs are met.

4.1 Targets to monitor the efficiency and effectiveness of the staffing process are established, aligned with business needs and monitored.

4.2 Results of process monitoring are used to identify and address areas of weakness.

  • Statistics Canada's Staffing Governance Framework
  • Audit Criteria related to the Management Accountability Framework: A Tool for Internal Auditors
    • The oversight body / bodies request and receive sufficient, complete, timely and accurate information. (G-6)
    • Management has identified planned results linked to organizational objectives. (RP-1)
    • Management has identified appropriate performance measures linked to planned results. (RP-2)
    • Management monitors actual performance against planned results and adjusts course as needed. (RP-3)

Appendix B: Acronyms

CAV: Candidate Assessment Validation
HR: Human resources
HRBI: Human Resources Business Intelligence
ICN: Internal Communications Network
IT: Information technology
KPI: Key performance indicator
NextGen HR and Pay: Next Generation Human Resources and Pay
PE: Personnel Administration
PSC: Public Service Commission
PSEA: Public Service Employment Act
SoMC: Statement of Merit Criteria
WIP: Work-in-progress

National Indigenous History Month 2024... By the numbers

Economy

  • In 2022, Indigenous gross domestic income (GDI) reached $60.2 billion, up 9.8% from 2021. This represented 2.3% of Canada's total GDI. The fastest growth was in arts, entertainment, and recreation (+30.6%) and mining, quarrying, and oil and gas extraction (+21.1%). Combined, these sectors accounted for 6.8% of Indigenous GDI. (Indigenous peoples economic account, 2022)
  • About 90% of Indigenous-owned businesses are small (fewer than 10 employees), with 31.4% located in rural areas. (Survival Rate and Performance of Indigenous-owned Businesses)

Rising prices and well-being

  • In 2024, about 6 in 10 First Nations people living off reserve (59%), Métis (58%) and Inuit (62%) reported rising prices added to the amount of stress in their household and relationships during the past six months. (Impacts of rising prices on the well-being of Indigenous people, 2024)
  • In 2024, rising costs, such as for gasoline, ammunition or equipment, limited the ability to hunt, fish or trap for 17% of First Nations people living off reserve, 14% of Métis, and 32% of Inuit. Additionally, 61% of First Nations people living off reserve, 59% of Métis, and 64% of Inuit reported that rising prices had limited the amount of healthy and nutritious food they could buy during the past six months. (Impacts of rising prices on the well-being of Indigenous people, 2024)

Employment and education

  • In 2022, the number of jobs held by Indigenous people in Canada grew by 4.4% year over year to reach nearly 886,000. Nearly 1 in 22 jobs in Canada were held by Indigenous people. (Indigenous peoples economic account, 2022)
  • Trades most often chosen by Indigenous journeymen were electricians (15%), carpenters (10%) and welders (8%), whereas the trades most often chosen by Indigenous journeywomen were hairstylists (40%) and cooks (12%). (Indigenous journeypersons: Trends and socioeconomic characteristics, 2010 to 2020)
  • In 2021/2022, First Nations (72%), Métis (65%) and Inuit (69%) women were more likely to enter undergraduate degree programs than their male counterparts (First Nations men 28%; Métis men 35%; Inuit men 31%). Indigenous women, regardless of their Indigenous identity, were also more likely than non-Indigenous women (59%) to enter the same programs. (Highlights on Indigenous new entrants to postsecondary education)

Real Estate Rental and Leasing and Property Management: CVs for operating revenue - 2022

CVs for Operating Revenue - 2022
Coefficients of variation (CVs) for operating revenue, in percent, by geography. Columns, in order: Lessors of residential buildings and dwellings (except social housing projects); Non-residential leasing; Real estate property managers.
Canada 2.17 1.98 2.71
Newfoundland and Labrador 4.68 3.70 0.01
Prince Edward Island 1.61 1.48 5.43
Nova Scotia 0.93 2.18 3.43
New Brunswick 1.67 1.89 5.82
Quebec 1.72 4.24 6.23
Ontario 4.33 3.02 3.98
Manitoba 1.56 4.26 7.34
Saskatchewan 1.98 2.97 5.66
Alberta 6.43 3.65 6.68
British Columbia 7.24 7.05 6.23
Yukon 2.02 0.97 0.55
Northwest Territories 0.62 2.16 0.00
Nunavut 0.00 0.00 0.00

Monthly Survey of Food Services and Drinking Places: CVs for Total Sales by Geography - March 2024

CVs for total sales, in percent, by geography and month, March 2023 (202303) to March 2024 (202403). Columns, in order:
Geography 202303 202304 202305 202306 202307 202308 202309 202310 202311 202312 202401 202402 202403
Canada 0.22 0.11 0.10 0.09 0.17 0.11 0.11 0.14 0.19 0.13 0.26 0.21 0.17
Newfoundland and Labrador 0.64 0.56 0.34 0.33 0.54 0.35 0.41 0.53 0.53 0.54 0.52 0.85 0.72
Prince Edward Island 8.33 8.10 0.65 0.60 0.66 0.60 0.81 1.18 0.88 3.93 9.57 5.00 1.44
Nova Scotia 0.62 0.28 0.30 0.32 0.36 0.29 0.34 0.39 0.37 0.38 0.83 0.50 0.48
New Brunswick 0.68 0.49 0.35 0.34 0.56 0.27 0.41 0.49 0.49 0.51 0.49 0.74 0.61
Quebec 0.52 0.20 0.27 0.24 0.40 0.28 0.33 0.46 0.59 0.33 0.30 0.53 0.40
Ontario 0.45 0.21 0.14 0.15 0.34 0.20 0.18 0.20 0.32 0.21 0.51 0.36 0.31
Manitoba 0.70 0.38 0.33 0.28 0.42 0.31 0.30 0.64 0.45 0.70 0.49 0.57 0.59
Saskatchewan 0.34 0.33 0.28 0.30 0.38 0.40 0.38 0.70 1.06 0.50 0.48 0.65 1.03
Alberta 0.32 0.24 0.20 0.16 0.22 0.25 0.29 0.32 0.30 0.29 0.70 0.36 0.35
British Columbia 0.34 0.16 0.23 0.18 0.20 0.24 0.22 0.26 0.26 0.30 0.73 0.41 0.26
Yukon Territory 29.84 1.33 15.96 1.19 11.83 1.33 12.06 11.15 1.42 1.42 1.92 4.21 2.61
Northwest Territories 38.10 1.80 21.99 1.82 18.97 8.00 23.59 16.14 1.75 1.78 2.21 2.92 2.40
Nunavut 2.47 1.57 72.13 2.20 61.61 6.64 5.24 1.33 1.80 2.34 4.25 7.92 5.99
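The figures above are coefficients of variation (CVs), which express the standard error of a survey estimate as a percentage of the estimate itself; smaller values mean more precise estimates. As a reminder of the standard definition only (this is not code used in producing the tables, and the example numbers are hypothetical):

```python
def cv_percent(estimate, standard_error):
    """Coefficient of variation: standard error as a percentage of the estimate."""
    return 100.0 * standard_error / estimate

# Hypothetical example: a $500M sales estimate with a $1.1M standard error
print(round(cv_percent(500.0, 1.1), 2))  # 0.22
```

On this scale, a national CV of 0.22% indicates a very precise estimate, while the much larger CVs for the territories reflect their smaller sample sizes.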