Disclosure control and data dissemination


Results

All (14) (showing results 1 to 10 of 14)

  • Articles and reports: 11-522-X20050019433
    Description:

    Spatially explicit data pose a series of opportunities and challenges for all the actors involved in providing data for long-term preservation and secondary analysis: the data producer, the data archive, and the data user.

    Release date: 2007-03-02

  • Articles and reports: 11-522-X20050019434
    Description:

    Traditional methods for statistical disclosure limitation in tabular data are cell suppression, data rounding and data perturbation. Because the suppression mechanism is not describable in probabilistic terms, suppressed tables are not amenable to statistical methods such as imputation. Data quality characteristics of suppressed tables are consequently poor.

    Release date: 2007-03-02
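The methods named in the abstract above can be sketched briefly. A minimal illustration of cell suppression and data rounding on a frequency table; the threshold, rounding base, and counts are invented for illustration and are not taken from the paper:

```python
def suppress_small_cells(table, threshold=5):
    """Primary cell suppression: hide counts below the threshold."""
    return {cell: (None if count < threshold else count)
            for cell, count in table.items()}

def round_to_base(table, base=5):
    """Round each count to the nearest multiple of `base`."""
    return {cell: base * round(count / base) for cell, count in table.items()}

counts = {"A": 3, "B": 17, "C": 42}
print(suppress_small_cells(counts))  # {'A': None, 'B': 17, 'C': 42}
print(round_to_base(counts))         # {'A': 5, 'B': 15, 'C': 40}
```

As the abstract notes, the suppressed table is the harder case for downstream statistical work: a suppressed cell carries no probabilistic description, whereas a rounded cell is still a number.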

  • Articles and reports: 11-522-X20050019436
    Description:

    Regardless of the specifics of any given metadata scheme, there are common metadata constructs used to describe statistical data. This paper will give an overview of the different approaches taken to achieve the common goal of providing consistent information.

    Release date: 2007-03-02

  • Articles and reports: 11-522-X20050019437
    Description:

    The explanatory information accompanying statistical data is called metadata, and its presence is essential for the correct understanding and interpretation of the data. This paper will report on the experience of Statistics Canada in the conceptualization, naming and organization of variables on which data are produced.

    Release date: 2007-03-02

  • Articles and reports: 11-522-X20050019438
    Description:

    A variety of standards for documenting the contents of data files have evolved over time, each with its own constituency and users. The Data Documentation Initiative (DDI) is an effort to establish an international XML-based standard for the content, presentation, transport, and preservation of documentation for datasets in the social and behavioural sciences.

    Release date: 2007-03-02
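To give a flavour of the XML-based documentation the abstract above describes, here is a hypothetical fragment built with the Python standard library. The element names (`codeBook`, `dataDscr`, `var`, `labl`) follow the DDI Codebook vocabulary loosely; this is an illustration of variable-level metadata, not a schema-valid DDI instance:

```python
import xml.etree.ElementTree as ET

# Build a tiny DDI-style codebook describing one variable.
codebook = ET.Element("codeBook")
data_dscr = ET.SubElement(codebook, "dataDscr")       # data description section
var = ET.SubElement(data_dscr, "var", name="age")     # one documented variable
ET.SubElement(var, "labl").text = "Age of respondent" # its human-readable label

print(ET.tostring(codebook, encoding="unicode"))
# <codeBook><dataDscr><var name="age"><labl>Age of respondent</labl></var></dataDscr></codeBook>
```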

  • Articles and reports: 11-522-X20050019455
    Description:

    The Data Documentation Initiative (DDI) is an internationally developed standard for creating metadata. The Data Liberation Initiative (DLI), along with partner universities including the University of Guelph, is working towards the goal of creating metadata for all Statistics Canada surveys available to the DLI community.

    Release date: 2007-03-02

  • Articles and reports: 11-522-X20050019456
    Description:

    The metadata associated with microdata production of major Statistics Canada household and social surveys are often voluminous and daunting. There does not appear to be a systematic approach to disseminating the metadata of confidential microdata files across all surveys. This heterogeneity applies to content as well as method of dissemination. A pilot project was conducted within the RDC Program to evaluate one standard, the Data Documentation Initiative (DDI), that might support such a process.

    Release date: 2007-03-02

  • Articles and reports: 11-522-X20050019460
    Description:

    Users will analyse and interpret the time series of estimates in various ways, often involving estimates for several time periods. Despite the large sample sizes and the degree of overlap between the samples for some periods, the sampling errors can still substantially affect the estimates of movements, and of the functions of them used to interpret the series of estimates. We consider how to account for sampling errors in the interpretation of estimates from repeated surveys, and how to inform users and analysts of their possible impact.

    Release date: 2007-03-02
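The key point of the abstract above, that sample overlap changes the precision of estimated movements, follows from a standard result: for two period estimates with variances V1 and V2 and correlation rho induced by the overlap, Var(y2 - y1) = V1 + V2 - 2*rho*sqrt(V1*V2). A minimal sketch with invented numbers (not from the paper):

```python
import math

def movement_variance(v1, v2, rho):
    """Variance of the movement y2 - y1 between two correlated survey estimates.

    v1, v2: variances of the two period estimates
    rho:    correlation between the estimates induced by sample overlap
    """
    return v1 + v2 - 2 * rho * math.sqrt(v1 * v2)

# Positive correlation from overlapping samples makes the movement
# more precise than it would be with independent samples:
print(round(movement_variance(4.0, 4.0, 0.0), 6))  # 8.0 (independent samples)
print(round(movement_variance(4.0, 4.0, 0.7), 6))  # 2.4 (70% correlation)
```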

  • Articles and reports: 11-522-X20050019462
    Description:

    The traditional approach to presenting variance information to data users is to publish estimates of variance or related statistics, such as standard errors, coefficients of variation, confidence limits or simple grading systems. The paper examines potential sources of variance, such as sample design, sample allocation, sample selection and non-response, and considers what might best be done to reduce variance. Finally, the paper briefly assesses the financial costs to producers and users of reducing or not reducing variance, and how we might trade off the costs of producing more accurate statistics against the financial benefits of greater accuracy.

    Release date: 2007-03-02
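The variance-related statistics listed in the abstract above can be illustrated for the simplest case, the mean of a simple random sample. The data and the normal-approximation 95% limits are illustrative only:

```python
import math

def summary_stats(sample):
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)  # sample variance
    se = math.sqrt(var / n)                    # standard error of the mean
    cv = se / mean                             # coefficient of variation of the estimate
    ci = (mean - 1.96 * se, mean + 1.96 * se)  # approximate 95% confidence limits
    return mean, se, cv, ci

mean, se, cv, ci = summary_stats([12, 15, 11, 14, 13, 16, 12, 15])
print(f"mean={mean:.2f}, se={se:.2f}, cv={100 * cv:.1f}%")
```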

  • Articles and reports: 11-522-X20050019463
    Description:

    Statisticians are developing additional concepts for communicating errors associated with estimates. Many of these concepts are readily understood by statisticians but are even more difficult to explain to users than the traditional confidence interval. The proposed solution, when communicating with non-statisticians, is to improve the estimates so that the requirement for explaining the error is minimised. The user is then not confused by having too many numbers to understand.

    Release date: 2007-03-02