History and context

Results

All (78) (0 to 10 of 78 results)

  • Articles and reports: 12-001-X202300200012
    Description: In recent decades, many different uses of auxiliary information have enriched survey sampling theory and practice. Jean-Claude Deville contributed significantly to this progress. My comments trace some of the steps on the way to one important theory for the use of auxiliary information: Estimation by calibration.
    Release date: 2024-01-03
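The abstract above refers to estimation by calibration. As a minimal illustrative sketch (not the paper's own method, and using a small hypothetical data set), linear calibration adjusts design weights d to w = d(1 + Xλ) so that the weighted totals of auxiliary variables X reproduce known population totals T:

```python
import numpy as np

# Minimal sketch of linear calibration weighting (illustrative only;
# the data, weights, and totals below are hypothetical).
rng = np.random.default_rng(0)
n = 8
d = np.full(n, 10.0)                                  # design weights
X = np.column_stack([np.ones(n),                      # auxiliary variables
                     rng.integers(1, 5, n).astype(float)])
T = np.array([100.0, 260.0])                          # known population totals

# Solve (X' diag(d) X) lam = T - X' d for the calibration adjustment,
# then form the calibrated weights w = d * (1 + X lam).
A = X.T @ (d[:, None] * X)
lam = np.linalg.solve(A, T - X.T @ d)
w = d * (1.0 + X @ lam)

# By construction, the calibrated weights reproduce the known totals exactly.
print(np.round(X.T @ w, 6))
```

Since X'w = X'd + (X' diag(d) X)λ = X'd + (T − X'd) = T, the constraint holds exactly; practical calibration methods differ mainly in the distance function used to keep w close to d.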

  • Articles and reports: 12-001-X202300200013
    Description: Jean-Claude Deville is one of the most prominent researchers in survey sampling theory and practice. His research on balanced sampling, indirect sampling and calibration in particular is internationally recognized and widely used in official statistics. He was also a pioneer in the field of functional data analysis. This discussion gives us the opportunity to recognize the immense work he accomplished, and to pay tribute to him. In the first part of this article, we briefly recall his contribution to functional principal component analysis. We also detail some recent extensions of his work at the intersection of functional data analysis and survey sampling. In the second part of this paper, we present some extensions of Jean-Claude’s work in indirect sampling. These extensions are motivated by concrete applications and illustrate Jean-Claude’s influence on our work as researchers.
    Release date: 2024-01-03

  • Articles and reports: 12-001-X202300200015
    Description: This article discusses and provides comments on Ardilly, Haziza, Lavallée and Tillé’s summary presentation of Jean-Claude Deville’s work on survey theory. It sheds light on the context, applications and uses of his findings, and shows how these have become ingrained in the role of statisticians, in which Jean-Claude was a trailblazer. It also discusses other aspects of his career and his creative inventions.
    Release date: 2024-01-03

  • Articles and reports: 12-001-X202300200017
    Description: Jean-Claude Deville, who passed away in October 2021, was one of the most influential researchers in the field of survey statistics over the past 40 years. This article traces some of his contributions that have had a profound impact on both survey theory and practice. This article will cover the topics of balanced sampling using the cube method, calibration, the weight-sharing method, the development of variance expressions of complex estimators using influence function and quota sampling.
    Release date: 2024-01-03

  • Articles and reports: 12-001-X202300100007
    Description: I provide an overview of the evolution of Statistical Disclosure Control (SDC) research over the last decades and how it has evolved to handle the data revolution with more formal definitions of privacy. I emphasize the many contributions by Chris Skinner in the research areas of SDC. I review his seminal research, starting in the 1990s with his work on the release of UK Census sample microdata. This led to a wide range of research on measuring the risk of re-identification in survey microdata through probabilistic models. I also focus on other aspects of Chris’ research in SDC. Chris was the recipient of the 2019 Waksberg Award and sadly never got a chance to present his Waksberg Lecture at the Statistics Canada International Methodology Symposium. This paper follows the outline that Chris had prepared for that lecture.
    Release date: 2023-06-30

  • Articles and reports: 12-001-X202000100001
    Description:

    For several decades, national statistical agencies around the world have been using probability surveys as their preferred tool to meet information needs about a population of interest. In the last few years, there has been a wind of change and other data sources are being increasingly explored. Five key factors are behind this trend: the decline in response rates in probability surveys, the high cost of data collection, the increased burden on respondents, the desire for access to “real-time” statistics, and the proliferation of non-probability data sources. Some people have even come to believe that probability surveys could gradually disappear. In this article, we review some approaches that can reduce, or even eliminate, the use of probability surveys, all the while preserving a valid statistical inference framework. All the approaches we consider use data from a non-probability source; data from a probability survey are also used in most cases. Some of these approaches rely on the validity of model assumptions, which contrasts with approaches based on the probability sampling design. These design-based approaches are generally not as efficient; yet, they are not subject to the risk of bias due to model misspecification.

    Release date: 2020-06-30

  • Articles and reports: 12-001-X201900100003
    Description:

    In this short article, I will attempt to provide some highlights of my chancy life as a statistician, in chronological order, spanning over sixty years, from 1954 to the present.

    Release date: 2019-05-07

  • Articles and reports: 89-20-00012018001
    Description:

    Historical works allow readers to peer into the past, not only to satisfy our curiosity about “the way things were,” but also to see how far we’ve come, and to learn from the past. For Statistics Canada, such works are also opportunities to commemorate the agency’s contributions to Canada and its people, and serve as a reminder that an institution such as this continues to evolve each and every day.

    On the occasion of Statistics Canada’s 100th anniversary in 2018, Standing on the shoulders of giants: History of Statistics Canada: 1970 to 2008 builds on the work of two significant publications on the history of the agency, picking up the story in 1970 and carrying it through the next 36 years, until 2008. One day, when enough time has passed to allow for sufficient objectivity, it will again be time to document the agency’s next chapter as it continues to tell Canada’s story in numbers.

    Release date: 2018-12-03

  • Articles and reports: 12-001-X201700254888
    Description:

    We discuss developments in sample survey theory and methods covering the past 100 years. Neyman’s 1934 landmark paper laid the theoretical foundations for the probability sampling approach to inference from survey samples. Classical sampling books by Cochran, Deming, Hansen, Hurwitz and Madow, Sukhatme, and Yates, which appeared in the early 1950s, expanded and elaborated the theory of probability sampling, emphasizing unbiasedness, model-free features, and designs that minimize variance for a fixed cost. During the period 1960-1970, theoretical foundations of inference from survey data received attention, with the model-dependent approach generating considerable discussion. The introduction of general purpose statistical software led to its use with survey data, which in turn led to the design of methods specifically for complex survey data. At the same time, weighting methods, such as regression estimation and calibration, became practical and design consistency replaced unbiasedness as the requirement for standard estimators. A bit later, computer-intensive resampling methods also became practical for large scale survey samples. Improved computer power led to more sophisticated imputation for missing data, use of more auxiliary data, some treatment of measurement errors in estimation, and more complex estimation procedures. A notable use of models was in the expanded use of small area estimation. Future directions in research and methods will be influenced by budgets, response rates, timeliness, improved data collection devices, and availability of auxiliary data, some of which will come from “Big Data”. Survey taking will be impacted by changing cultural behavior and by a changing physical-technical environment.

    Release date: 2017-12-21

  • Articles and reports: 12-001-X201700254894
    Description:

    This note by Danny Pfeffermann presents a discussion of the paper “Sample survey theory and methods: Past, present, and future directions” where J.N.K. Rao and Wayne A. Fuller share their views regarding the developments in sample survey theory and methods covering the past 100 years.

    Release date: 2017-12-21