Original articles
Comparison of Open and Closed Questionnaire Formats in Obtaining Demographic Information From Canadian General Internists

https://doi.org/10.1016/S0895-4356(99)00106-7

Abstract

The objective of this study was to compare the impact of closed- versus open-ended question formats on the completeness and accuracy of demographic data collected in a mailed survey questionnaire. We surveyed general internists in five Canadian provinces to determine their career satisfaction. We randomized respondents to receive versions of the questionnaire in which 16 demographic questions were presented in a closed-ended or open-ended format. Two questions required respondents to make a relatively simple computation (ensuring that three or four categories of response added to 100%). The response rate was 1007/1192 physicians (84.5%). The proportion of respondents with no missing data for all 16 questions was 44.7% for the open-ended and 67.0% for the closed-ended format (P < 0.001). The odds of having missing items remained higher for open-ended response options after adjusting for a number of respondent characteristics (odds ratio 2.67, 95% confidence interval 2.01 to 3.55). For the two questions requiring computations, focused on professional activity and income, there were more missing data (P = 0.02 and 0.02, respectively) but fewer inaccurate responses (P = 0.009 and 0.20, respectively) for the open-ended compared with the closed-ended format. Investigators can achieve more complete responses to demographic items using closed-format response options, but at the risk of increased inaccuracy in responses to questions requiring computation.
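
To make the scale of the completeness difference concrete, the following sketch reconstructs the 2×2 comparison from the percentages reported above. It is illustrative only, not the authors' analysis code: the per-arm sample sizes are assumptions (a roughly even split of the 1007 respondents between the two randomized arms is assumed), the odds ratio computed here is crude rather than adjusted, and the sum-to-100% helper simply mirrors the accuracy criterion described for the two computation questions, with an assumed tolerance.

```python
# Illustrative sketch only (not the authors' analysis code).
# Assumes the 1007 respondents split roughly evenly between the two
# randomized format arms; the true arm sizes are not given in this excerpt.
from scipy.stats import chi2_contingency

n_open, n_closed = 503, 504                  # assumed arm sizes
complete_open = round(0.447 * n_open)        # 44.7% with no missing items (open-ended)
complete_closed = round(0.670 * n_closed)    # 67.0% with no missing items (closed-ended)

# Rows: open-ended, closed-ended; columns: any missing item, no missing items
table = [
    [n_open - complete_open, complete_open],
    [n_closed - complete_closed, complete_closed],
]
chi2, p, dof, expected = chi2_contingency(table)

# Crude odds ratio of having any missing item (open vs. closed); the 2.67
# reported in the abstract is adjusted for respondent characteristics,
# so it will differ somewhat from this unadjusted value.
crude_or = (table[0][0] / table[0][1]) / (table[1][0] / table[1][1])
print(f"chi-square = {chi2:.1f}, P = {p:.1e}, crude OR = {crude_or:.2f}")


def allocation_sums_to_100(percentages, tol=1.0):
    """Accuracy check described for the two computation questions: the
    reported category percentages should add to 100%. The tolerance is
    an assumption made here for illustration."""
    return abs(sum(percentages) - 100.0) <= tol


print(allocation_sums_to_100([60, 25, 15]))   # True  -> counted as accurate
print(allocation_sums_to_100([50, 30, 30]))   # False -> counted as inaccurate
```

Under these assumed arm sizes the crude odds ratio works out to roughly 2.5, in the same range as the adjusted 2.67 reported above.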

Introduction

Comprehensive data collection through survey questionnaires requires both maximizing questionnaire response rates and ensuring completeness and accuracy among those responding. The advantages and disadvantages of different methods of questionnaire administration (i.e., face-to-face interview, telephone survey, and mailed self-administered completion) have been well described [1,2]. In two randomized trials of health survey administration comparing these three methods, Hochstim [3] and Siemiatycki [4] both detected very small differences in response rates, but found that face-to-face interviews generated higher costs. Siemiatycki also found that sensitive questions about income and Medicare number were more readily answered by mailed self-administered questionnaire than by telephone or face-to-face interview. In addition, when adult asthmatics were randomized to receive self-administered versus interviewer-administered questionnaires, we found that the proportion of endorsed quality-of-life symptoms was higher with the former than with the latter [5].

Investigators have also examined the effect of item formatting on the completeness and accuracy of survey data. Schuman and Presser [6] reported on a public opinion survey asking about the most important problem facing the United States. They found that different formats yielded different answers: with the open-ended format, respondents were more likely to complain about the political leadership and less likely to comment on violence than with the closed-ended format. In another study, prompting workers with closed-ended questions resulted in identification of occupational exposure to an average of 11.0 potentially hazardous materials per respondent, whereas open-ended questions resulted in reporting of 3.1 materials [7].

Some have suggested that, to reduce the time and effort spent on questionnaire development and to facilitate direct comparisons across questionnaires, investigators should use standardized demographic questions in creating new survey instruments [8]. Demographic information describes survey respondents, allows comparisons with nonrespondents, and facilitates assessment of the generalizability of questionnaire results. In addition, demographic data permit comparisons among important respondent subgroups and adjustment for differences among them. We conducted an investigation designed to examine the relative completeness and accuracy of demographic data from open-ended versus closed-ended formats in a survey of career satisfaction among general internists in five Canadian provinces.
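
The design described above hinged on randomly allocating questionnaire versions to the mailed sample. As a minimal illustration only, the sketch below shows one way a balanced 1:1 allocation to the two formats could be generated; the function name, the fixed seed, and the use of simple (rather than stratified or blocked) randomization are assumptions, since the paper's actual procedure is not described in this excerpt.

```python
# Minimal sketch of a 1:1 random allocation of a mailing list to the two
# questionnaire versions; the study's actual randomization procedure is
# not described in this excerpt.
import random


def allocate_versions(physician_ids, seed=0):
    """Return {physician_id: 'open' or 'closed'} with a balanced 1:1 split."""
    rng = random.Random(seed)
    ids = list(physician_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {pid: ("open" if i < half else "closed") for i, pid in enumerate(ids)}


assignments = allocate_versions(range(1192))
print(sum(v == "open" for v in assignments.values()), "open-ended versions")
```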

Section snippets

Questionnaire Development

We developed this questionnaire based on previous literature about job satisfaction, augmented with focus groups and semistructured interviews with general internists and residents in several community and university-affiliated hospitals in Canada [9]. Items fell into four domains: 1) clinical responsibilities, 2) teaching, 3) research, and 4) interpersonal issues. We pretested the questionnaire and assessed its clinical sensibility in six domains: purpose and framework, comprehensibility,

Respondents

The overall response rate was 84.5% (1007/1192). The demographic information we collected during abbreviated telephone interviews with nonrespondents showed that these physicians were similar to respondents with respect to year of graduation, year of completion of internal medicine training, and the proportion who were general internists [9].

Table 1 displays selected demographic, medical education, and specialty characteristics of physicians responding to the identical format section of the questionnaire. When

Discussion

Demographic information is usually considered easily collectable in questionnaires. Social desirability bias, for example, is unlikely to have much impact on responses to questions that are not value laden. Missing information, however, may still be important, and may limit the ability to generalize survey results or make inferences about subgroups within the larger dataset.

Our findings, based on the responses of nearly 1000 physicians, are similar to those reported by Schuman and Presser [6],

Acknowledgements

This study was funded by Physicians' Services Incorporated and the Professional Association of Interns and Residents of Ontario. Dr. Cook is a Career Scientist of the Ontario Ministry of Health.

References (12)

  • D.J. Cook et al. Interviewer versus self-administered questionnaires in developing a disease-specific health-related quality of life instrument for asthma. J Clin Epidemiol (1993)
  • E. Babbie. The Practice of Social Research. 7th Ed. Belmont, CA: Wadsworth Publishing Company; ...
  • C.A. Woodward et al. Guide to Questionnaire Construction and Question Writing (1986)
  • J.R. Hochstim. A critical comparison of three strategies of collecting data from households. J Am Stat Assoc (1967)
  • J. Siemiatycki. A comparison of mail, telephone, and home interview strategies for household health surveys. Am J Public Health (1979)
  • H. Schuman et al. Questions and Answers in Attitude Surveys: Experiments on Question Form, Wording, and Context (1981)
