Questionnaire design

Scope and purpose

A questionnaire is a set of questions designed to gather information from a respondent. It is the interface between the respondent and the researcher, and may be interviewer-administered or respondent-completed, using different methods of data collection.

Questionnaires play a central role in the data collection process and influence the image of a statistical agency. They have a major impact on respondent behaviour, interviewer performance, collection cost and respondent relations, and therefore on data quality.

A well-designed questionnaire should collect data that correspond to the survey's Statement of Objectives while taking into account the statistical requirements of data users, administrative and data processing requirements, and the nature and characteristics of the respondent population. Good questionnaires impose a low response burden and remain both respondent- and interviewer-friendly. The question design and wording must encourage respondents to complete the questionnaire as accurately as possible.

To this end, the questionnaire must focus on the topic of the survey, be as brief as possible, flow smoothly from one question to the next and facilitate respondents' recall. Moreover, well-designed questionnaires should facilitate the coding and capture of data. They should minimize the amount of editing and imputation required, and lead to an overall reduction in the cost and time associated with data collection and processing. For more information, refer to the "Policy on the Review and Testing of Questionnaires" (Statistics Canada, 2002a).


Informing respondents

  • It is the policy of Statistics Canada to provide all respondents with information about the purpose of the survey (including the expected uses and users of the statistics to be produced), the authority under which the survey is taken, the collection registration details, the mandatory or voluntary nature of the survey, confidentiality protection, any record linkage plans, and the identity of the parties to any agreements to share the information provided by respondents. For more information, refer to the "Policy on Informing Survey Respondents" (Statistics Canada, 1998).


  • Consulting with data users during the questionnaire design process allows for clear understanding of how the data are to be used.  It is important to undertake a review of existing subject matter literature and surveys, nationally and internationally, before designing a new questionnaire.  This should allow for a well-designed questionnaire that meets the users' needs.

Content and wording

  • The opening questions should be applicable to all respondents, be easy and interesting to complete, and establish that the respondent is a member of the survey population.

  • Use words and concepts in questionnaires that have the same meanings for both respondents and questionnaire designers, and, in the case of businesses, choose questions, time reference periods, and response categories that are compatible with the establishment's recordkeeping practices.

  • Choose question design and wording that encourage respondents to complete the questionnaire as accurately as possible. The questionnaire must focus on the topic of the survey, be as brief as possible, flow smoothly from one question to the next, facilitate respondents' recall and direct them to the appropriate information source (Converse and Presser, 1986; Fowler, 1995).


  • To the extent possible, harmonize concepts and wording with those already in use. When appropriate, reuse questions from other surveys. 

  • Verify French and English versions of the questionnaire for consistency.

  • All members of the project team should be involved in the review of the questionnaire since each can comment from a different perspective.  Team members can provide insight into whether the proposed questionnaire is conducive to good quality survey data, straightforward programming (in computer assisted environments), and efficient post-collection data processing.  More specifically, team members can evaluate the complexity of the questions and flow of the questionnaire, the impact of question structures on respondent behaviour and the detail of the questions relative to the sample size and analytical plan.


  • Design self-completed questionnaires to be attractive and easy to complete. To this end, give a positive first impression in the cover letter and front cover, and make the questionnaire appear professional and businesslike. If it is to be interviewer-administered, make the questionnaire interviewer-friendly.

  • To minimize the possibility of reporting errors, ensure that the instructions to respondents and/or interviewers are short, clear, and easy to find. Provide definitions at the beginning of the questionnaire or in specific questions, as required. Ensure that time reference periods and units of response are clear to the respondent, use boldface print to emphasize important items, specify "include" or "exclude" in the questions themselves (not in separate instructions), and ensure that response categories are mutually exclusive and exhaustive.

  • With respect to the questionnaire layout, provide titles or headings for each section of the questionnaire, and include instructions and answer spaces that facilitate accurate answering of the questions. Use colour, shading, illustrations and symbols to attract attention and guide respondents or interviewers to the parts of the questionnaire that are to be read and to indicate where answers are to be placed. At the end of the questionnaire, provide space for additional comments by respondents and include an expression of appreciation to the respondent (Converse and Presser, 1986; Fowler, 1995).
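One of the guidelines above calls for response categories that are mutually exclusive and exhaustive. For numeric categories, this property can be checked mechanically. The sketch below is purely illustrative; the age bands and the `check_categories` helper are invented for the example and are not part of any Statistics Canada tool:

```python
# Hypothetical age-band response categories, as (low, high) bounds,
# inclusive of low and exclusive of high. These bands are invented
# for illustration; real categories come from the questionnaire.
bands = [(0, 15), (15, 25), (25, 45), (45, 65), (65, 120)]

def check_categories(bands, domain_low, domain_high):
    """Return True if the bands are mutually exclusive (no overlaps)
    and exhaustive (no gaps) over [domain_low, domain_high)."""
    ordered = sorted(bands)
    # Exhaustive at the edges: the first band starts at the domain's
    # lower bound and the last band ends at its upper bound.
    if ordered[0][0] != domain_low or ordered[-1][1] != domain_high:
        return False
    # No overlaps and no interior gaps: each band must start exactly
    # where the previous one ends.
    return all(prev[1] == cur[0] for prev, cur in zip(ordered, ordered[1:]))

print(check_categories(bands, 0, 120))                  # well-formed bands
print(check_categories([(0, 15), (20, 120)], 0, 120))   # gap between 15 and 20
print(check_categories([(0, 30), (25, 120)], 0, 120))   # overlap at 25-30
```

The same idea extends to categorical answers, where the analogous check is that every plausible response maps to exactly one category.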

Data collection

  • Carefully consider and evaluate different modes of data collection when designing a questionnaire. Be aware of the pros and cons of newer methods such as electronic or Internet data reporting. The collection mode has implications for the amount of detail, the complexity and the number of questions that can be asked of a respondent, as well as for the sensitivity of the subject matter that can be addressed.

  • Build awareness among survey designers and data analysts that the mode of collection has an influence on the quality and measurement of the information collected.

  • Apply the design rules appropriate to each data collection mode when designing the questionnaire. The use of open versus closed questions, "mark one" versus "mark all that apply" responses, rankings and ratings, as well as question and response order, can all significantly affect respondent behaviour (De Leeuw, 2005; Dillman and Christian, 2003).

Testing and evaluation

  • Choose among a wide range of methods to test and evaluate the questionnaire. These could include qualitative methods such as focus groups or cognitive tests, as well as pretests or pilot tests. The suitability and intensity of their use depend on various factors and circumstances, including the type and size of the survey, the survey's content, the use of previous survey questions or standard questions, whether the collection is ongoing, the method of data collection, the project schedule, the budget and the availability of resources. Multiple rounds of testing may be necessary, which will affect the cost and the project schedule (Couper, Lessler, Martin, Martin, Presser, Rothgeb and Singer, 2004).

Quality indicators

Main quality elements:  accuracy, relevance, coherence.

  • Measurement error is the difference between the measured value and the true value. It consists of bias (systematic error, introduced when the measuring instrument is inaccurate, that remains constant across survey replications) and variance (random fluctuations between measurements that, over repeated samples, cancel each other out). Sources of measurement error include the survey instrument, the mode of data collection, the respondent's information system, the respondent and the interviewer.

  • A description of the processes developed to reduce measurement error associated with the survey instrument and to optimize the comparability of the data being collected should be made available. This will indicate to users the accuracy and reliability of the measures as well as the coherence of the data being collected with other statistical information. These processes might include questionnaire development, pilot studies, questionnaire testing, interviewer training, etc.

  • The accuracy and coherence of the data may be influenced in many ways. For example:

    • by the sensitive nature of the information sought

    • if the words and concepts used do not have the same meaning for both respondents and survey takers

    • if concepts and wording are not harmonized with terms already in use, particularly for business surveys

If so, an assessment of the direction and magnitude of the bias in these items should be made to evaluate the impact on data quality.

  • Making questionnaires available to users will help them assess the relevance and coherence of the data, considering their own needs and the other data sources available to them.

  • Users should be kept informed of any modification to the questionnaire over time and an assessment of the impact of such changes on data comparability should be performed.

  • Documentation on the data quality of the survey should also include any problems with question wording, response burden, refusal rates or other relevant information.
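The decomposition of measurement error into bias and variance described above can be illustrated with a small simulation. The true value, bias and noise level below are arbitrary numbers chosen only for the sketch:

```python
import random

random.seed(1)

TRUE_VALUE = 100.0   # the quantity being measured (arbitrary)
BIAS = 2.5           # systematic error of the instrument (arbitrary)
NOISE_SD = 5.0       # spread of the random error component (arbitrary)

def measure():
    """One measurement: the true value, shifted by a constant bias,
    plus a random fluctuation."""
    return TRUE_VALUE + BIAS + random.gauss(0.0, NOISE_SD)

# Across many replications the random fluctuations average out,
# but the systematic bias does not.
replications = [measure() for _ in range(100_000)]
mean = sum(replications) / len(replications)

print(f"mean of replications: {mean:.2f}")               # near 102.5, not 100
print(f"remaining error:      {mean - TRUE_VALUE:.2f}")  # near the bias of 2.5
```

This is why replication alone cannot detect bias: it must be assessed against an external standard, such as a record check or a gold-standard re-interview.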


Converse, J.M. and S. Presser. 1986. Survey Questions: Handcrafting the Standardized Questionnaire. Sage University Paper Series on Quantitative Applications in the Social Sciences, 07-063. Thousand Oaks, California. Sage Publications. 80 p.

Couper, M.P., J.T. Lessler, E.A. Martin, J. Martin, S. Presser, J.M. Rothgeb and E. Singer. 2004. Methods for Testing and Evaluating Survey Questionnaires. Hoboken, New Jersey. John Wiley and Sons, Inc.

De Leeuw, Edith D. 2005. "To mix or not to mix data collection modes in surveys." Journal of Official Statistics. Vol. 21, no. 2. p. 233-255.

Dillman, Don A. 2000. Mail and Internet Surveys: The Tailored Design Method. 2nd ed. Toronto. John Wiley and Sons, Inc.

Dillman, Don A. and Leah M. Christian. 2003. "Survey Mode as a source of instability in responses across surveys."  Presented at the Workshop on stability of methods for collecting, analyzing and managing panel data, American Academy of Arts and Science, Cambridge, Massachusetts. March 2003.

Fowler, F.J. Jr. 1995. Improving Survey Questions: Design and Evaluation. Applied Social Research Methods Series, 38. Thousand Oaks, California. Sage Publications. 200 p.

Statistics Canada. 1998. "Policy on Informing Survey Respondents." Statistics Canada Policy Manual. Section 1.1.  Last updated March 4, 2009.
