Evaluating informatics applications—some alternative approaches: theory, social interactionism, and call for methodological pluralism

https://doi.org/10.1016/S1386-5056(01)00184-8

Abstract

A review of evaluation literature concerning clinical decision support systems (CDSSs) indicates that randomized controlled clinical trials (RCTs) are the ‘gold standard’ for evaluation. While this approach is excellent for studying system or clinical performance, it is not well suited to answering questions about whether systems will be used or how they will be used. Because lack of use of CDSSs has been of concern for some years, other evaluation research designs are needed to address those issues. This paper critiques RCT and experimental evaluation approaches and presents alternative approaches that address questions outside the scope of the usual RCT and experimental designs. A wide range of literature is summarized to illustrate the value of evaluations that take into account social, organizational, professional, and other contextual considerations. Many of these studies go beyond the usual measures of system performance or physicians’ behavior by focusing on the ‘fit’ of a system with other aspects of professional and organizational life. Because little explicit theory informs many evaluations, the paper then reviews CDSS evaluations informed by social science theories. Lastly, it proposes social interactionism as a theoretical social science base for evaluation. An example of such an approach is given: a CDSS in psychiatry evaluated using Kaplan's 4Cs, which focus on communication, control, care, and context. Although the example is a CDSS, the evaluation approach is also useful for clinical guideline implementation and other medical informatics applications. Similarly, although the discussion is about social interactionism, the more important point is the need to broaden evaluation through a variety of methods and approaches that investigate social, cultural, organizational, cognitive, and other contextual concerns. Methodological pluralism and a variety of research questions can increase understanding of the many influences on informatics applications development and deployment.

Introduction

Informatics researchers and practitioners have been developing systems to support clinical care and decision making for half a century. Information technologies are now a standard fixture in clinical settings. Nevertheless, there is general consensus that a variety of systems for clinical decision support are little used, even though their potential benefits have been demonstrated repeatedly [1]. A review of evaluations of clinical decision support systems (CDSSs) indicates that these studies are done in a way that precludes findings useful for understanding why CDSSs may or may not be effective. This omission could result in less informed decisions about these technologies and, by extension, other medical informatics applications. The review reports that most studies use an experimental or randomized controlled clinical trial (RCT) design to assess system performance or to focus on changes in clinical performance that could affect patient care. Few studies involve field tests of the CDSS, and almost none use a naturalistic design in clinical settings with real patients. Most focus on physicians rather than on other clinical users or patients. In addition, there is little theoretical discussion, although papers are permeated by a rationalist perspective that excludes contextual issues related to how and why systems are used. Further, CDSS evaluation studies appear to be insulated from evaluations of other medical informatics applications [1].

Although RCTs and other experimental designs are excellent for assessing system performance or specific changes in clinical practice behaviors, some authors have pointed out that these methods are not well suited for studying other research questions. Consequently, other approaches have been developed: simulation, usability testing, cognitive studies, record and playback techniques, network analysis, ethnography, and social interactionism among them. However, when these are used under controlled conditions, it can be difficult to investigate a variety of human, contextual, and cultural factors that affect system acceptance in actual use. Further, a focus on pre-specified outcome measures precludes examining processes of actual system use during daily activities [2].

This paper builds on that literature review [1] to argue for expanding evaluation approaches so as to increase understanding of the many influences on informatics applications development and deployment, and thereby improve these processes and their connection with patient care. The paper draws on a wide range of literature identified through a combination of manual and automated literature searches, as described in [1]. Although this literature comes from fields that often remain separated, here it is linked to illustrate the need for evaluation designs that go beyond RCTs and experiments, that focus on a variety of individuals and on organizational concerns rather than primarily on physicians’ behavior, and that are informed by evaluations in many areas of medical and health informatics. To further these aims, the paper discusses evaluations that take into account social, organizational, professional, and other contextual considerations. It then proposes: (1) a theoretical base for evaluation; and (2) methodological pluralism that both incorporates multiple methods and addresses a variety of evaluation issues. Specifically, the paper suggests a social interactionist approach that draws on social science theory, incorporates multiple methods, and addresses a variety of evaluation issues. This approach is illustrated by an example of a CDSS evaluation in psychiatry based on Kaplan's 4Cs, which focus on communication, control, care, and context [3], [4]. Although the discussion concerns CDSSs, a social interactionist approach is also useful when evaluating clinical practice guideline implementation or other medical informatics applications.

Section snippets

Limitations of RCT/experimental evaluation approaches

Although RCTs are the ‘gold standard’ of clinical research, this hierarchy of research designs is not adhered to in other scientific disciplines and recently has been questioned in medicine [5], [6]. With respect to CDSSs, as early as 1987, Lundsgaarde reported that evaluations of expert systems often ignored context, such as culture, organization, and work life. He observed that this lack might help explain why so few systems have moved from laboratory to clinic [7]. A few years later, Forsythe

The need for an alternative

Thus, it appears that the standards of RCTs and other experimental approaches, while excellent for measuring clinical behavior changes or system performance, may result in an impoverished understanding of issues pertaining to system use. Whether or not an informatics system works depends upon social and cognitive processes as well as on technological ones [27]. Because RCT and experimental evaluation approaches do not include social, organizational, cultural, cognitive, or contextual

‘Fit’

The CDSS literature seems not to be informed by studies of other systems, such as hospital information systems (HISs), computer-based patient records (CPRs), or ancillary care systems (e.g. laboratory, radiology) that, like studies of guideline compliance, could provide useful insights into issues that may be relevant to acceptance and use of CDSSs [1]. One area that has received some attention when considering acceptance or rejection of CDSSs, or any clinical information system, may be

Social interactionist theory

Situated action or cognition, actor-network theory, sociotechnical theory, constructionist approaches, and Classic Diffusion Theory each provide theoretical bases that could inform informatics studies, as could ethnography, cognitive studies, and theories of artifacts as embedded models of psychological theories. These theoretical bases have in common that they are, in some way or another, social interactionist approaches [3], [17], [93], [94], and a number of the researchers cited have social

Example

Drawing on this social interactionist tradition, Kaplan's 4Cs evaluation framework is based both on a social interactionist theoretical perspective and on empirical evaluation research studies of a variety of medical and health informatics applications. It can be used for studying and influencing processes of design, implementation, adoption, and use in natural settings where a clinical application is introduced. The framework calls attention to four of the many interrelated areas when

Conclusion

The long history of difficulties in achieving clinical use of some kinds of clinical informatics applications, such as CDSSs and clinical guidelines, suggests a need for deeper understanding of reasons for this recurring pattern. When staff revolt or boycott new hospital information systems where they are required to do order entry [49], [61], [98]; when users sabotage systems [99]; when despite years of effort it has been notoriously difficult to get physicians to use clinical guidelines; when

Acknowledgements

I am grateful to Dr Richard Spivack of the US National Institute of Standards and Technology for his invaluable assistance in the automated literature search for literature reviewed here.

References (99)

  • C. Safran et al., Electronic communication and collaboration in a health care practice, Artif. Intell. Med. (1998)
  • D.R. Dixon, The behavioral side of information technology, Int. J. Med. Inform. (1999)
  • F. Grémy et al., Information systems evaluation and subjectivity, Int. J. Med. Inform. (1999)
  • J. Aarts et al., Organizational issues in health informatics: a model approach, Int. J. Med. Inform. (1998)
  • G. Southon et al., Lessons from a failed information systems initiative: issues for complex organisations, Int. J. Med. Inform. (1999)
  • M. Demeester, Cultural aspects of information technology implementation, Int. J. Med. Inform. (1999)
  • B. Kaplan, Evaluating informatics applications—clinical decision support systems literature review, Int. J. Med. Inform. ...
  • A.W. Kushniruk, V.L. Patel, J.J. Cimino, Usability testing in medical informatics: cognitive approaches to evaluation ...
  • B. Kaplan, Organizational evaluation of medical information systems. In: C.P. Friedman, J.C. Wyatt (Eds.), Evaluation ...
  • B. Kaplan, Addressing organizational issues into the evaluation of medical systems, J. Am. Med. Inform. Assoc. (1997)
  • J. Concato et al., Randomized, controlled trials, observational studies, and the hierarchy of research designs, New Engl. J. Med. (2000)
  • K. Benson et al., Comparison of observational studies and randomized controlled trials, New Engl. J. Med. (2000)
  • D. Forsythe, B. Buchanan, Broadening our approach to evaluating medical expert systems. In: P.D. Clayton (Ed.), ...
  • H.A. Heathfield et al., Philosophies for the design and development of clinical decision-support systems, Meth. Inform. Med. (1993)
  • H. Heathfield et al., Evaluating information technology in health care: barriers and challenges, Br. Med. J. (1998)
  • V.L. Patel et al., Medical informatics and the science of cognition, J. Am. Med. Inform. Assoc. (1998)
  • F. Lau et al., Building a virtual network in a community health research training program, J. Am. Med. Inform. Assoc. (2000)
  • J.G. Anderson, C.E. Aydin, S.J. Jay (Eds.), Evaluating health care information systems: approaches and applications ...
  • J.G. Anderson, C.E. Aydin, B. Kaplan, An analytical framework for measuring the effectiveness/impacts of computer-based ...
  • B. Kaplan, The medical computing ‘lag’: perceptions of barriers to the application of computers to medicine, Int. J. Technol. Assess. Health Care (1987)
  • W.M. Tierney et al., A plea for controlled trials in medical informatics, J. Am. Med. Inform. Assoc. (1994)
  • H.A. Heathfield et al., Current evaluations of information technology in health care are often inadequate, Br. Med. J. (1996)
  • I. Sim, P. Gorman, R.A. Greenes, R.B. Haynes, B. Kaplan, H. Lehmann, P.C. Tang, Clinical decision support systems for ...
  • H.A. Heathfield, V. Peel, P. Hudson, S. Kay, et al., Evaluating large scale health information systems: from practice ...
  • A.G. Randolph et al., Users’ guides to the medical literature: XVIII. How to use an article evaluating the clinical impact of a computer-based clinical decision support system, J. Am. Med. Assoc. (1999)
  • J.G. Anderson, C.E. Aydin, S.J. Jay (Eds.), Evaluating health care information systems: methods and applications, Sage ...
  • C.P. Friedman, J.C. Wyatt, Evaluation methods in medical informatics, Springer, New York, ...
  • E.M.S. van Gennip, J.L. Talmon (Eds.), Assessment and evaluation of information technologies, Studies in Health ...
  • F. Grémy, M. Bonnin, Evaluation of automatic health information systems. What and how? In: E.M.S. van Gennip, J.L. ...
  • T. Jørgensen, Measuring effects. In: E.M.S. van Gennip, J.L. Talmon (Eds.), Assessment and evaluation of information ...
  • B. Kaplan et al., Towards an informatics research agenda: key people and organizational issues, J. Am. Med. Inform. Assoc. (2001)
  • J.S. Ash, P.N. Gorman, W.R. Hersh, Physician order entry in US hospitals, Proc. AMIA Symp. 1998, ...
  • M.D. Cabana et al., Why don't physicians follow clinical practice guidelines?: a framework for improvement, J. Am. Med. Assoc. (1999)
  • D.A. Davis et al., Translating guidelines into practice: a systematic review of theoretic concepts, practical experience and research evidence in the adoption of clinical practice guidelines, Can. Med. Assoc. J. (1997)
  • B.S. Mittman et al., Implementing clinical practice guidelines: social influence strategies and practitioner behavior change, QRB (1992)
  • R.G. Woolf et al., Potential benefits, limitations, and harms of clinical guidelines, Br. Med. J. (1999)
  • B. Kaplan, Information technology and three studies of clinical work, ACM SIGBIO Newsl. (1995)
  • C. Sicotte et al., The computer-based patient record: a strategic issue in process innovation, J. Med. Syst. (1998)
  • B. Kaplan, The influence of medical values and practices on medical computer applications, Proceedings, MEDCOMP'82: The ...