
    Longitudinal and International Study of Adults Research Paper Series

    Evaluation of the Canadian Household Panel Survey pilot content

    Results from the Canadian Household Panel Survey Pilot


    by Andrew Heisz 1

    Abstract

    In January 2006, a conference on longitudinal surveys hosted by Statistics Canada, the Social Sciences and Humanities Research Council of Canada and the Canadian Institutes of Health Research concluded that Canada lacks a longitudinal survey which collects information on multiple subjects such as family, human capital, income, labour and health, and follows respondents for a long period of time. Following this conference, Statistics Canada received funds from the Policy Research Data Gaps fund to support a pilot for a new Canadian Household Panel Survey (CHPS-Pilot). Consultations on the design and content were held with academic and policy experts in 2007 and 2008, and the pilot survey was conducted in the fall of 2008. The objectives of the pilot survey were to: (1) test a questionnaire and measure the quality of data collected; (2) evaluate several design features; and (3) test reactions to the survey from respondents and field workers. The pilot survey achieved a response rate of 76%, with a mean household interview time of 68 minutes. Several innovative design features were tested and found to be viable. Response to the survey, whether from respondents or interviewers, was generally positive. This paper highlights these and other results from the CHPS-Pilot.

    1. Introduction

    In the fall of 2008, Statistics Canada, in partnership with Human Resources and Skills Development Canada (HRSDC) and the Canadian academic community, fielded the Canadian Household Panel Survey Pilot (CHPS-Pilot). This paper describes the background of the project, the steps taken in developing the pilot survey, the methodological results, and the reactions to the pilot survey from respondents and field workers. A separate study will evaluate the questionnaire and the quality of the data collected in the pilot.

    2. Pilot development

    2.1 Background

    In January 2006, Statistics Canada, the Social Sciences and Humanities Research Council of Canada and the Canadian Institutes of Health Research hosted a conference, "Longitudinal Social and Health Surveys in an International Perspective"2. The objective of this conference was to take stock of longitudinal social and health surveys fielded in Canada and elsewhere. At that time, Statistics Canada's longitudinal social and health survey program was nearly 15 years old, and the conference provided an opportunity to discuss the successes and shortcomings of Statistics Canada's longitudinal portfolio. All of Statistics Canada's longitudinal surveys were discussed and comparisons were drawn to longitudinal surveys in other countries.

    Among other things, the conference identified an important data gap for Canada: Canada lacks a "general household panel survey". A general household panel survey is a multitopic longitudinal household survey with a sample representative of the population. Canada has longitudinal surveys that focus on specific topics, like the National Population Health Survey and the Survey of Labour and Income Dynamics (SLID). Canada also has longitudinal surveys that focus on particular sub-populations like the National Longitudinal Survey of Children and Youth, the Youth in Transition Survey, and the Longitudinal Survey of Immigrants to Canada. These surveys are valuable as they allow for in-depth research on a particular topic or sub-population. However, a general household panel survey would allow for research that stretches beyond traditional subject matter domains, enabling researchers to see how events in one domain may affect others, perhaps much later in life. It would allow researchers to investigate how lives evolve in social contexts, the most immediate social context being the family, but also larger social contexts such as friendships, neighborhoods and public services. The design of such surveys calls for interviewing all household members, allowing analysis of family dynamics and their interactions with other domains in ways that would be impossible for other surveys.

    A number of other recommendations came from the "Longitudinal Social and Health Surveys in an International Perspective" conference that relate to the design of the new household panel survey:

    • It should focus on four subject matter domains: (1) Labour and Income, (2) Family, (3) Human Capital Development, and (4) Health.
    • It should expand the number of topics that could be captured over the life of a panel by using rotating modules.
    • It should have an indefinite panel length.
    • It should not be used for the production of cross-sectional income estimates.
    • It should engage academic and policy experts, and keep them engaged as partners in the ongoing survey governance.
    • It should be flexible, with the ability to adapt to emerging research and policy needs.
    • It should be comparable in design and content to international general household panel surveys such as the German Socio-Economic Panel (GSOEP), the British Household Panel Survey (BHPS) and the Household Income and Labour Dynamics in Australia survey (HILDA).
    • It should be easy to use, and emphasize the importance of minimizing processing time, simplifying weighting and methodology, and reducing the learning time needed to understand the dataset3.

    Following the Montreal conference, the Policy Research Data Gaps fund provided three years of funding for the CHPS-Pilot.

    2.2 Governance and content development

    The CHPS-Pilot project was developed under a tripartite governance system in which Statistics Canada, HRSDC and the academic community were each represented. The project was directed by a steering committee made up of two Directors General from Statistics Canada, a Director General from HRSDC and two academics. The survey was managed at Statistics Canada by the Income Statistics Division.

    Content development took place from February 2007 through March 2008. It was driven by four academic expert groups, each responsible for one of the four major subject matter domains identified above. These four expert groups, comprising about 20 Canadian academics, advised on content needs and priorities, and discussed possible research uses for the new survey. Each group prepared a report indicating what data should be collected for its domain, including some indication of which items could be used as rotating content as opposed to content that would appear in each year. These reports were developed over several months, during which a number of meetings were held to exchange preliminary ideas. A draft of the pilot survey was then produced.

    Policy research experts from federal departments were consulted for comments on the draft survey. Researchers from HRSDC, the Canada Mortgage and Housing Corporation, and the Bank of Canada were consulted on data needs, resulting in numerous changes to the draft survey. Qualitative testing of the questionnaire, in the form of one-on-one interview testing, was undertaken from January through March 2008.

    Application development and data processing took place in the Special Surveys Division of Statistics Canada. As with all social surveys at Statistics Canada, methodology was the responsibility of the Household Surveys Methodology Division, while collection was carried out by the Collection Planning and Management Division.

    2.3 Survey Design

    The CHPS-Pilot survey was conducted between October 15th and December 31st, 2008. The name of the pilot, for the purposes of field work, was the Living in Canada Survey – Pilot. The survey design followed the model established by the HILDA, GSOEP and BHPS for general household panel surveys. Briefly:

    Target population: the target population was all Canadians living in households, excluding institutional and on-reserve populations. Households were sampled (the frame and sampling approach is discussed below), and all household members became permanent sample members, regardless of their age.

    Target respondent: the survey was to interview all household members aged 15 and over living in sampled households.

    Mode: the collection method was non-proxy computer-assisted personal interviews (CAPI), although interviewers were given the option to conduct the interview over the telephone using the CAPI application if a face-to-face meeting could not be scheduled.

    Sample size and frame: the sample comprised 1,200 dwellings selected from the Labour Force Survey (LFS) Rotate-out frame and 1,400 dwellings selected from the LFS Area frame. Roughly equal-sized samples were drawn from each of four provinces: Ontario, Quebec, Saskatchewan and New Brunswick. To minimize collection costs, the pilot survey had a small sample size and used a highly clustered design. As a result, the pilot survey was not expected to yield estimates of the population.

    Following rules: the following rules for the survey were not firmly established, as the intent was for a single-wave pilot only. However, they would have been similar to those used in the HILDA, BHPS or GSOEP surveys. In those surveys, all permanent sample members are followed indefinitely, cohabitants of permanent sample members are also interviewed, and children of permanent sample members themselves become permanent sample members.

    2.4 Pilot survey content

    The pilot survey was to develop a wave-1 questionnaire; the intent of a wave-1 questionnaire is to establish a foundation of data upon which later waves of the survey could build. Accordingly, the pilot survey included some questions of a retrospective nature, as well as questions designed to generate a baseline picture of human capital, labour, health and well-being.

     The survey was structured as a number of components.

    • The entry component created a roster of all household members and collected basic demographics about each of them. This component was answered by a person identified as knowledgeable about all household members (the "person most knowledgeable" or PMK).
    • A household component, a set of questions about the household or its members that one could reasonably expect the PMK to answer on behalf of all household members.
    • A member component, which was asked of each household member aged 15 and over. Household members aged 0 to 14 did not receive an interview.

    The household component asked questions on housing, childcare use, monthly expenditures on key items and total monthly household income. It also included questions on food, financial and housing security, as well as material deprivation.

    The member component included retrospective questions on marriages and common-law relationships, parenting history and fertility intentions, educational history, and jobless spell histories. It also contained questions on current labour market activities and characteristics of current jobs, as well as new questions targeted towards the self-employed, questions on skills used at work, and questions on employment expectations. Other questions asked about main activities in the event that the respondent was not working. Four modules were identified to gather information on work-to-retirement transitions: two were used in the pilot, with the intent that the other two would be used in the second wave of the survey. These questions were asked of respondents 45 years of age and older. Finally, the member component included questions on demographics and life satisfaction.
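    To make the collection flow concrete, the short sketch below walks through the three components for a single household. It is a purely illustrative sketch under our own assumptions: the function and field names are invented for this example and do not come from the pilot's collection application.

    from dataclasses import dataclass

    @dataclass
    class Member:
        name: str
        age: int

    def conduct_household_interview(members, pmk):
        """Illustrative walk-through of the entry, household and member components."""
        interview = {}

        # Entry component: the PMK provides a roster and basic demographics for everyone.
        interview["roster"] = [(m.name, m.age) for m in members]

        # Household component: answered once by the PMK on behalf of the household
        # (housing, child care, key monthly expenditures, total monthly income,
        # food/financial/housing security and material deprivation).
        interview["household_component"] = {"respondent": pmk.name, "answers": {}}

        # Member component: asked of each household member aged 15 and over;
        # members aged 0 to 14 appear on the roster but are not interviewed.
        interview["member_components"] = [
            {"respondent": m.name, "answers": {}} for m in members if m.age >= 15
        ]
        return interview

    # Example: two adults and one child yield one household component
    # plus two member components.
    household = [Member("A", 44), Member("B", 46), Member("C", 9)]
    print(conduct_household_interview(household, pmk=household[0]))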

    3. Pilot Results

    3.1 Pilot Objectives

    The objectives of the CHPS-Pilot survey were to: (1) test a questionnaire and measure the quality of data collected; (2) evaluate several design features; and (3) test reactions to the survey from respondents and field workers. As noted earlier, a paper describing the questionnaire and data quality is forthcoming. In terms of design features, the specific goals were to:

    1. Establish a response rate
    2. Gather information to help choose the best frame for this survey
    3. Examine the difficulty of completing interviews in larger households
    4. Examine the use of the telephone interview
    5. Get information on questionnaire and question length

    In this section we evaluate these design features as well as report on reactions to the survey.

    3.2 Response Rate

    It is important to have a high response rate for any survey, but for a longitudinal survey it is perhaps even more important, as a low response rate in the first wave may cast doubt on the representativeness of the survey for the life of the panel. Moreover, responses to the wave-1 survey permit the construction of auxiliary information to adjust for attrition in subsequent waves, which could extend the life of the panel.

    Canada, like other countries, has seen a decline in willingness to respond to the first wave of longitudinal surveys. For example, in SLID, a CATI panel survey in which a new first wave has been launched every three years since 1993, the wave-1 response rate has fallen from over 90% in the early panels of the survey to mid-70% rates in the late 2000s.

    Table 3.2-1 shows the response rate from the CHPS-Pilot survey. In future waves of the survey, one would target an individual-level response rate, but in the first wave, where the membership of the sample is unknown due to the household-level sample frame, the target response rate must be set at the household level. The target response rate for the CHPS-Pilot was for 80% of households to be at least partially interviewed (a partial interview is one where some but not all eligible members were interviewed). This target was established based upon recent response rates of other CAPI surveys at Statistics Canada. Altogether, 2,356 addresses were issued, yielding 2,122 households eligible for interview. The (complete plus partial) response rate was 76%, which was slightly lower than the target. Overall, 5,453 individuals were enumerated, with 3,205 eligible for interview. Fully 91% of those eligible for interview were interviewed.

    Table 3.2-1
    Response rates from the CHPS-Pilot

    Wave 1                                             Number       %
    Household outcomes
      Addresses issued                                  2,356     ...
      Out of scope                                        237     ...
      Multiples (additions to sample)                       3     ...
      Eligible households                               2,122    100%
      Refusal and non-contact                             501     24%
      Complete plus partial household coverage          1,621     76%
      Complete household coverage                       1,405     66%
    Individual outcomes
      Enumerated individuals                            5,453     ...
      Ineligible children (under 15)                    2,248     ...
      Enumerated adults                                 3,205    100%
      Refusals, non-contacts and partial interviews       298      9%
      Complete member interviews                        2,907     91%
    ... not applicable
    Source: Canadian Household Panel Survey Pilot (CHPS), 2007
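    As a quick check of the arithmetic, the percentages in table 3.2-1 can be recomputed directly from the counts; the short Python sketch below, provided for illustration only, reproduces the 76%, 66% and 91% figures quoted above.

    # Counts taken from table 3.2-1; percentages are rounded as in the table.
    eligible_households = 2122
    partial_or_complete_households = 1621
    complete_households = 1405

    enumerated_adults = 3205
    complete_member_interviews = 2907

    print(f"Complete plus partial household coverage: {partial_or_complete_households / eligible_households:.0%}")  # 76%
    print(f"Complete household coverage: {complete_households / eligible_households:.0%}")                          # 66%
    print(f"Complete member interviews: {complete_member_interviews / enumerated_adults:.0%}")                      # 91%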

    One way of evaluating the success of this response rate is to compare it with wave-1 response rates for other successful household panel surveys, to determine whether the CHPS-Pilot met the standards set by international surveys. Published results for the 1991 wave-1 BHPS (Lynn (2006)) and the 2001 wave-1 HILDA (Watson and Wooden (2002)) show that the BHPS achieved a 74% partial plus complete response rate, while HILDA achieved 66%, suggesting that the 76% CHPS-Pilot response rate was enough to match the successes of the international surveys. A possible area for improvement in the CHPS-Pilot was the share of completed member interviews, which reached 95% in the BHPS but only 91% in the CHPS-Pilot.

    3.3 Frames

    As noted above, the CHPS-Pilot used two frames to draw sample from – the LFS Rotate-out frame and the LFS Area frame. The objective of this approach was to evaluate which frame would yield a higher response rate. The Rotate-out frame included dwellings previously used in the September 2007 LFS. In most cases, respondents at these dwellings would have previously spent up to six months in the LFS, which could have a negative effect on response. However, the benefit to using the Rotate-out frame is that it contains a large amount of auxiliary information on respondents which could be used to improve wave-1 non-response adjustment. In contrast, respondents at dwellings in the Area-frame sample would not have had any prior experience with Statistics Canada surveys (aside from the Census), but the auxiliary information available on this frame is much more limited. It should be noted that no tracing of former LFS respondents selected from the Rotate-out frame was implemented in the pilot; rather, interviews were conducted with the current occupants of the dwellings. Ideally, we would have compared response rates from the two frames after attempting to trace Rotate-out frame members who had moved in the year since the September 2007 LFS.

    Table 3.3-1
    Results comparing LFS Area frame and LFS Rotate-out frame

                                        Area frame          Rotate-out frame
                                      Number      %         Number      %
    In-scope households                1,180    ...            942    ...
    Total non-response                   265     22            236     25
      No contact                          18      2             18      2
      Refused                            174     15            167     18
      Mental/physical limitations          9      1              6      1
      Language barrier                     9      1              4      0
      Other                               55      5             41      4
    ... not applicable
    Source: Canadian Household Panel Survey Pilot (CHPS), 2007

    Results from the two frames are shown in table 3.3-1. The refusal rate from the Area frame (15%) was lower than that from the Rotate-out frame (18%), but not by a large amount. Moreover, non-contact in the Rotate-out frame (2%) would, if anything, have been higher had tracing been implemented. However, the slightly higher response rate from the Area frame comes at the cost of losing the useful auxiliary information available on the Rotate-out frame. As a result, it is ambiguous which of the Area frame or the Rotate-out frame would have been better to use for the survey. The results of this test do suggest, however, that the effect of prior LFS response burden on the response rate would be minor were the Rotate-out frame chosen.
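    The refusal and non-response rates quoted above can be recovered from the counts in table 3.3-1; the illustrative snippet below simply recomputes them for the two frames.

    # Counts taken from table 3.3-1; the published percentages are rounded.
    frames = {
        "Area frame": {"in_scope": 1180, "non_response": 265, "refused": 174},
        "Rotate-out frame": {"in_scope": 942, "non_response": 236, "refused": 167},
    }

    for name, counts in frames.items():
        refusal_rate = counts["refused"] / counts["in_scope"]
        non_response_rate = counts["non_response"] / counts["in_scope"]
        print(f"{name}: refusal {refusal_rate:.0%}, total non-response {non_response_rate:.0%}")

    # Area frame: refusal 15%, total non-response 22%
    # Rotate-out frame: refusal 18%, total non-response 25%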

    3.4 Difficulty completing large households

    One important concern for the success of the survey was the response burden the survey design places on large households, and the corresponding difficulty of completing interviews with every respondent in large households. Table 3.4-1 shows statistics on household completion rates for households of various sizes. First, large households were quite rare: only 16.3% of households required three or more interviews, while 4.2% required four or more. However, large households were harder to complete. Fully 99% of one-respondent households were completed, compared to 85% of two-respondent households, 66% of three-respondent households, 76% of four-respondent households, and 56% of five-respondent households.

    Most households (52%) contained two target respondents. On average it took 4.6 attempts to interview both respondents in these households. Interestingly, the mean number of attempts required to complete a household with two respondents was no higher than for a household with one respondent, where the average was 4.5 attempts. However, more attempts, and hence more interviewer effort, were required for households with three or more respondents.

    Table 3.4-1
    Household completion rates and interview attempts by household size

    Household size                           1       2       3       4       5
    % of households                       31.2    52.2    12.1     3.3     0.9
    % complete                            98.8    85.3    66.2    76.1    56.0
    Number of attempts for a complete      4.5     4.6     6.4     6.9     7.3
    Source: Canadian Household Panel Survey Pilot (CHPS), 2007
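    The 16.3% and 4.2% figures quoted in the text follow directly from the household-size distribution in table 3.4-1, as the small check below illustrates.

    # Shares of households by number of target respondents, from table 3.4-1 (percent).
    share_of_households = {1: 31.2, 2: 52.2, 3: 12.1, 4: 3.3, 5: 0.9}

    three_or_more = sum(share for size, share in share_of_households.items() if size >= 3)
    four_or_more = sum(share for size, share in share_of_households.items() if size >= 4)

    print(f"Households requiring three or more interviews: {three_or_more:.1f}%")  # 16.3%
    print(f"Households requiring four or more interviews: {four_or_more:.1f}%")    # 4.2%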

    3.5 The telephone interview

    As noted earlier, the CHPS-Pilot was a CAPI interview. Interviewers were instructed to make their first contact with the household a personal contact, and wherever possible to interview each respondent in person. However, interviewers were given the freedom to complete interviews with difficult-to-reach respondents over the telephone. This was done in order to reduce response burden in large households, increase the share of completed households, and reduce collection costs. Moreover, cases referred to senior interviewers were most likely to have been conducted over the telephone.

    Table 3.5-1 shows the number of attempts, the number of contacts, and the number of complete cases for the two collection methods. An attempt is an effort to contact a household, a contact is an attempt that successfully reaches a household member, and a complete case is a contact that results in a case status being finalized, meaning that interviewers will not attempt to contact the household again.

    Table 3.5-1
    Comparison of Interview Attempts by Personal and Telephone Collection

                                      Collection method
                                  In person    Telephone
    Number of attempts                9,266        5,069
    Number of contacts                4,715        3,015
    Number of complete cases          1,164          442
    Source: Canadian Household Panel Survey Pilot (CHPS), 2007

    Fully 35% of attempts, 39% of contacts and 27% of complete cases were conducted over the telephone, which are relatively high rates for a survey conducted by personal interview. These rates are lower, but remain high, if attempts made by senior interviewers (which are usually made by telephone) are excluded from the calculations. This reveals the importance of telephone collection in achieving the response rate described above. It may be that telephone interviews should be integrated more fully into the survey plan, perhaps through the development of a more formal procedure for telephone usage, or through the use of CATI instruments (in addition to CAPI) and CATI call centers.
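    The telephone shares quoted above can be recomputed from table 3.5-1; the illustrative snippet below reproduces them to one decimal place (the completed-case share works out to roughly 27 to 28 percent, consistent with the rounded figure in the text).

    # Counts taken from table 3.5-1 (attempts include those made by senior interviewers).
    in_person = {"attempts": 9266, "contacts": 4715, "complete cases": 1164}
    telephone = {"attempts": 5069, "contacts": 3015, "complete cases": 442}

    for key in in_person:
        share = telephone[key] / (in_person[key] + telephone[key])
        print(f"Telephone share of {key}: {share:.1%}")

    # attempts 35.4%, contacts 39.0%, complete cases 27.5%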

    3.6 Interview and question length

    The expected interview length was possibly the most highly scrutinized metric during the design of the CHPS-Pilot, as there was concern that seeking to interview all household members aged 15 and over would lead to an excessive amount of time spent in some households. While there was some discussion of limiting the number of interviews to four per household, it was decided that we would instead limit the interview length through careful pruning of the questionnaire. This was done using audit trail information on interview time per question from other surveys, as well as through discussions with subject matter and collection experts. Where audit trail information was not available, ad hoc evaluations, such as timing of questions in mock-interview situations, were used.

    In the end, the average interview time was 12.6 minutes for the household component and 24.8 minutes for the member component. These interview times were quite close to what we had anticipated before collection. Considering only households where all target respondents were interviewed, the average interview time was 68 minutes (including entry and exit). Figure 3.6-1 shows interview lengths by number of target respondents (for completed cases only), indicating that the average interview length was 46 minutes in a single-respondent household, 71 minutes in a two-respondent household, 81 minutes in a three-respondent household, and 91 minutes or less in four-, five- or six-respondent households. While 91 minutes was seen as a long interview time, in the context of the CHPS-Pilot, where the interview time is shared among household members, it was seen as acceptable. However, figure 3.6-1 also reveals that in some households the interview time was much longer than the average. For instance, 5% of four-respondent households required more than 149 minutes of interview time to complete, indicating that measures to reduce the length of unusually long interviews are warranted.
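    A rough way to see how interview time scales with household size is to add the component averages reported above; the sketch below does this under the assumption of a fixed allowance for the entry and exit components (the 8-minute allowance is our own illustrative assumption, not a figure from the pilot).

    # Component averages from section 3.6; the entry/exit allowance is assumed.
    HOUSEHOLD_COMPONENT_MINUTES = 12.6
    MEMBER_COMPONENT_MINUTES = 24.8
    ENTRY_AND_EXIT_MINUTES = 8.0  # illustrative assumption only

    def expected_interview_minutes(n_respondents):
        """Simple additive estimate of total interview time for a household."""
        return (ENTRY_AND_EXIT_MINUTES
                + HOUSEHOLD_COMPONENT_MINUTES
                + n_respondents * MEMBER_COMPONENT_MINUTES)

    for n in range(1, 5):
        print(f"{n} respondent(s): about {expected_interview_minutes(n):.0f} minutes")

    # This gives roughly 45, 70, 95 and 120 minutes. The first two are close to the
    # reported averages (46 and 71 minutes); larger households were completed in
    # less time than this simple additive model suggests (81 and at most 91 minutes).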

    Figure 3.6-1
    Interview duration by number of respondents
    Source: Canadian Household Panel Survey Pilot (CHPS), 2007

    3.7 Reactions to the survey

    In total, 4 program managers, 10 senior interviewers, and 109 regular interviewers were involved in data collection for the CHPS-Pilot, and the training of field workers yielded many opportunities for survey developers to receive informal reactions to the survey. Moreover, there were weekly conference calls with the program managers during the field period to debrief survey managers on progress in the field as well as to raise any questions or concerns. Finally, a debriefing questionnaire was given to interviewers after collection to receive direct feedback on the survey.

    In general, interviewers were found to be very receptive to the survey. According to the results of the debriefing questionnaire, respondents were also receptive: 95% of the interviewers said respondents had a positive or neutral attitude towards the survey. A large number of interviewers (80%) felt that respondents clearly understood the purpose of the survey, and 75% of interviewers felt that conducting personal interviews did not cause problems. Several interviewers reported that the use of telephone interviewing allowed them to overcome scheduling difficulties.

    Nevertheless, interviewers reported back on a number of concerns with the survey design. Among the major concerns, almost half of the interviewers found the interview to be too long, and 61% felt that having multiple respondents or needing multiple visits caused a problem during collection. Scheduling of interviews and repeat visits for large households were two areas reported as being difficult.

    4. Conclusion

    Results from the CHPS-Pilot continue to be evaluated. In particular, the attention of the survey team at Statistics Canada and the academic experts has moved towards an evaluation of content appearing in the CHPS-Pilot. An evaluation of the content will appear in a separate report.

    The CHPS-Pilot demonstrated the feasibility of a general household panel survey in Canada, established a probable wave-1 response rate and a target interview length, and yielded useful information on frame use. These results and other expertise gained from the CHPS-Pilot will inform future work on longitudinal household survey development at Statistics Canada.

    References

    Picot, G., Berthelot, J.-M. and Webber M. (2006), Possible Future Directions for Longitudinal Surveys at Statistics Canada, Longitudinal Social and Health Surveys in an International Perspective Conference, Montreal, Canada.

    Watson, N. and Wooden M. (2002), The Household, Income and Labour Dynamics in Australia (HILDA) Survey: Wave 1 Survey Methodology, Hilda Project Technical Paper Series, No.1/02, May 2002, University of Melbourne, Melbourne, Australia.

    Lynn, P. (2006), Quality Profile: British Household Panel Survey, Version 2.0: Waves 1 to 13: 1991-2003, Institute for Social and Economic Research, University of Essex, Wivenhoe Park, Colchester, England.


    Notes

    1. Andrew Heisz, Income Statistics Division, Statistics Canada, 5th Floor, Jean Talon Building, 170 Tunney's Pasture Driveway, Ottawa, Canada, K1A 0T6, andrew.heisz@statcan.gc.ca
    2. Most papers presented at this conference are available at www.ciqss.umontreal.ca/Longit/index.html.
    3. Please see Picot, Berthelot and Webber (2006) for more details.