The role of the bifactor model in resolving dimensionality issues in health outcomes measures

  • Original Paper
  • Quality of Life Research

Abstract

Objectives

We propose the application of a bifactor model for exploring the dimensional structure of an item response matrix, and for handling multidimensionality.

Background

We argue that a bifactor analysis can complement traditional dimensionality investigations by (a) providing an evaluation of the distortion that may occur when unidimensional models are fit to multidimensional data, (b) allowing researchers to examine the utility of forming subscales, and (c) providing an alternative to non-hierarchical multidimensional models for scaling individual differences.
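
For readers unfamiliar with the structure, a minimal statement of a bifactor measurement model (notation ours, not drawn from the article) is that each item j loads on a single general factor and on at most one group factor s, with all factors mutually orthogonal:

\[
x_{j} = \lambda_{jG}\,\theta_{G} + \lambda_{js}\,\theta_{s} + \varepsilon_{j},
\qquad
\operatorname{Cov}(\theta_{G}, \theta_{s}) = \operatorname{Cov}(\theta_{s}, \theta_{s'}) = 0 .
\]

The general factor absorbs what all items share, while the group factors capture residual covariation within clusters of similar items.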

Method

To demonstrate our arguments, we use responses (N = 1,000 Medicaid recipients) to 16 items in the Consumer Assessment of Healthcare Providers and Systems (CAHPS© 2.0) survey.

Analyses

Exploratory and confirmatory factor analytic and item response theory models (unidimensional, multidimensional, and bifactor) were estimated.
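
As an illustration of one common route to an exploratory bifactor solution (not necessarily the estimation procedure used in this study), the Schmid-Leiman transformation re-expresses a higher-order factor solution as orthogonal general- and group-factor loadings. A minimal numpy sketch, with hypothetical inputs and variable names:

import numpy as np

def schmid_leiman(first_order_loadings, second_order_loadings):
    # first_order_loadings : (n_items, n_group_factors) pattern matrix
    # second_order_loadings: (n_group_factors,) loadings of the group
    #                        factors on a single second-order factor
    L1 = np.asarray(first_order_loadings, dtype=float)
    g = np.asarray(second_order_loadings, dtype=float)
    # General-factor loadings: first-order loadings weighted by the
    # second-order loadings.
    general = (L1 @ g.reshape(-1, 1)).ravel()
    # Residualized group-factor loadings: first-order loadings scaled by
    # the square roots of the second-order uniquenesses (1 - g**2).
    group = L1 * np.sqrt(np.clip(1.0 - g**2, 0.0, None))
    return general, group

# Toy example: 6 items, 2 group factors, strong second-order loadings
L1 = np.array([[0.70, 0.00], [0.60, 0.00], [0.65, 0.00],
               [0.00, 0.70], [0.00, 0.60], [0.00, 0.65]])
g = np.array([0.80, 0.75])
general, group = schmid_leiman(L1, g)

Confirmatory bifactor models, by contrast, estimate general and group loadings directly under the orthogonality constraints noted above.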

Results

CAHPS© items are consistent with both unidimensional and multidimensional solutions. However, the bifactor model revealed that the overwhelming majority of common variance was due to a general factor. After controlling for the general factor, subscales provided little measurement precision.
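
One way to quantify how much common variance is due to the general factor is the explained common variance (ECV) index, the ratio of squared general-factor loadings to the sum of all squared factor loadings. A minimal sketch of that computation (our illustration; the abstract does not report this particular index):

import numpy as np

def explained_common_variance(general_loadings, group_loadings):
    # ECV: share of common variance attributable to the general factor.
    gen = np.sum(np.asarray(general_loadings, dtype=float) ** 2)
    grp = np.sum(np.asarray(group_loadings, dtype=float) ** 2)
    return gen / (gen + grp)

# Toy example: strong general loadings, weak residual group loadings
ecv = explained_common_variance([0.70, 0.65, 0.60, 0.70],
                                [0.20, 0.25, 0.15, 0.20])
print(round(ecv, 2))  # approaches 1.0 as the general factor dominates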

Conclusion

The bifactor model provides a valuable tool for exploring dimensionality-related questions. In the Discussion, we describe contexts where a bifactor analysis is most productively used, and we contrast the bifactor model with multidimensional IRT (MIRT) models. We also describe implications of bifactor models for IRT applications and note some limitations.



Acknowledgments

This paper was supported by grant number 5 U18 HS-00924 from the Agency for Healthcare Research and Quality. Dr. Hays was also supported in part by the UCLA/DREW Project EXPORT, National Institutes of Health, National Center on Minority Health and Health Disparities (P20-MD00148-01), and the UCLA Center for Health Improvement in Minority Elders/Resource Centers for Minority Aging Research, National Institutes of Health, National Institute on Aging (AG-02-004).

Corresponding author

Correspondence to Steven P. Reise.


Cite this article

Reise, S.P., Morizot, J. & Hays, R.D. The role of the bifactor model in resolving dimensionality issues in health outcomes measures. Qual Life Res 16 (Suppl 1), 19–31 (2007). https://doi.org/10.1007/s11136-007-9183-7
