Education/original research
Assessing Clinical Reasoning in Pediatric Emergency Medicine: Validity Evidence for a Script Concordance Test

https://doi.org/10.1016/j.annemergmed.2008.07.024

Study objective

Clinical reasoning is a crucial skill for all residents to acquire during their training. During most patient encounters in pediatric emergency medicine, physicians and trainees are challenged by diagnostic, investigative, and treatment uncertainties. The Script Concordance Test may provide a means to assess reasoning skills in the context of uncertainty in the practice of pediatric emergency medicine. We gathered validity evidence for the use of a pediatric emergency medicine Script Concordance Test to evaluate residents' reasoning skills.

Methods

A 1-hour test containing 60 questions nested in 38 cases was administered to 53 residents at the end of their pediatric emergency medicine rotation at 1 academic institution. Twelve experienced pediatricians formed the reference panel whose answers were used to establish the scoring key.
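In the SCT literature, a reference panel's answers are typically converted into a scoring key with the aggregate scoring method: an examinee earns partial credit on each question in proportion to the number of panel members who chose the same response, relative to the item's modal panel response. The sketch below illustrates that general method only; it is not the authors' scoring code, and the 5-point scale, the panel answers, and the function names are assumptions made for the example.

    from collections import Counter

    def sct_item_credits(panel_responses):
        """Map each Likert response for ONE item to a partial credit,
        proportional to how many panel members chose it, relative to the
        modal (most frequent) panel response."""
        counts = Counter(panel_responses)
        modal = max(counts.values())
        return {resp: n / modal for resp, n in counts.items()}

    def score_test(panel, examinee):
        """Aggregate scoring of an SCT.

        panel:    list of lists; panel[i] holds the panel's answers to item i
        examinee: list; examinee[i] is the examinee's answer to item i
        Returns the score as a percentage of the maximum (1 point per item)."""
        credits = [sct_item_credits(p).get(a, 0.0) for p, a in zip(panel, examinee)]
        return 100.0 * sum(credits) / len(credits)

    # Illustrative data: 3 items, a 12-member panel, answers on a -2..+2 scale
    panel = [
        [0, 0, 0, 1, 1, 0, -1, 0, 1, 0, 0, 1],            # modal answer 0 (7 of 12)
        [2, 1, 2, 2, 1, 2, 2, 2, 1, 2, 2, 2],              # modal answer 2 (9 of 12)
        [-2, -1, -1, -2, -1, -1, 0, -1, -1, -2, -1, -1],   # modal answer -1 (8 of 12)
    ]
    examinee = [1, 2, 2]        # earns 4/7, 9/9, and 0 points respectively
    print(round(score_test(panel, examinee), 1))           # 52.4

Answers not chosen by any panel member earn no credit, so a maximum score requires matching the modal panel answer on every item; the test therefore measures concordance with experienced clinicians rather than agreement with a single "correct" answer.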

Results

An optimized version of the test, retaining only questions with positive item discrimination, contained 50 questions nested in 30 cases. Scores ranged from 48% to 82%, with a mean score of 69.9% (SD=11.5). The reliability of the optimized test (Cronbach's α) was 0.77. Performance on the test increased with the residents' level of experience. The residents considered the Script Concordance Test true to real-life clinical problems and had enough time to complete it.
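As a hedged illustration of how such psychometric figures are obtained from an examinees-by-items score matrix (not the authors' analysis code): Cronbach's α follows the standard formula, and item discrimination is assumed here to be the corrected item-total correlation, since the exact index used in the study is not specified in this excerpt. The function names and simulated data are illustrative only.

    import numpy as np

    def cronbach_alpha(scores):
        """Cronbach's alpha for an examinees x items score matrix."""
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]                                # number of items
        item_var = scores.var(axis=0, ddof=1).sum()        # sum of item variances
        total_var = scores.sum(axis=1).var(ddof=1)         # variance of total scores
        return k / (k - 1) * (1 - item_var / total_var)

    def corrected_item_total(scores):
        """Discrimination per item: correlation of the item with the total
        score computed from the remaining items."""
        scores = np.asarray(scores, dtype=float)
        total = scores.sum(axis=1)
        return np.array([
            np.corrcoef(scores[:, j], total - scores[:, j])[0, 1]
            for j in range(scores.shape[1])
        ])

    def optimize(scores):
        """Drop items with non-positive discrimination, then recompute alpha."""
        scores = np.asarray(scores, dtype=float)
        keep = corrected_item_total(scores) > 0
        return scores[:, keep], cronbach_alpha(scores[:, keep])

    # Call pattern on simulated data shaped like this study (53 examinees, 60 items)
    rng = np.random.default_rng(0)
    simulated = rng.uniform(0.0, 1.0, size=(53, 60))
    kept_items, alpha = optimize(simulated)

Dropping items with non-positive discrimination shortens the form (here, from 60 to 50 questions) and generally raises internal-consistency reliability, at some cost in content coverage.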

Conclusion

This pediatric emergency medicine Script Concordance Test was reliable and useful for assessing the progression of clinical reasoning during residency training.

Introduction

Clinical reasoning is a crucial skill for all future physicians to acquire during their training. During most patient encounters in pediatric emergency medicine, physicians and trainees are challenged by diagnostic, investigative, and treatment uncertainties. The clinical supervision of medical students and residents often takes place without direct observation of the trainee's history-taking and physical examination skills. After a brief period of reflection and charting, the trainee reports the findings to an attending physician. From these brief reporting encounters, the attending physician judges the clinical competence of each trainee and records the judgments on clinical rating forms. These forms often represent the sole means used to assess clinical reasoning.1 Although clinical ratings are easy to administer, unobtrusive, and low cost, this assessment system results in many “above-average” ratings, often based on subjective impressions, that do not discriminate well among trainees.2

In an effort to improve the validity of trainee assessment, both the Accreditation Council for Graduate Medical Education in the United States and the Royal College of Physicians and Surgeons of Canada have asked residency programs to better assess and certify the key competencies residents must acquire to become qualified physicians, including clinical reasoning. One recommendation of the Accreditation Council for Graduate Medical Education advocates the use of more than 1 assessment tool to better evaluate different aspects of competence.

An assessment tool recently developed to assess clinical reasoning, the Script Concordance Test,3 was implemented in a pediatric emergency medicine program to assess residents at the end of their rotation. The purpose of the present study was to gather validity evidence for the use of Script Concordance Tests in pediatric emergency medicine.

Section snippets

Study Design

An instrument validation study was conducted. All residents in a pediatric emergency medicine clinical rotation during a 7-month period were asked to complete a Script Concordance Test. The test was administered in a supervised setting during the last week of each 4-week pediatric emergency medicine rotation. The Script Concordance Test was completed individually within a maximum of 60 minutes, after standardized instructions were given by a research assistant acting as supervisor. Residents were

Results

During the 7-month study period, 53 of 55 eligible residents consented to participate (96%): 38 women (72%) and 15 men (28%). Thirty-four residents were from family medicine (64%), 10 from pediatrics (19%), 3 from emergency medicine (6%), and 6 from other programs (11%; 5 from radiology and 1 from dermatology). Twenty-one residents (40%) were in their first postgraduate year of training, 21 were PGY-2s (40%), and the 11 others (20%) were in their senior years.

The optimization process of

Limitations

This study used a nonrandom group of residents, a convenience sample.13 Although this approach is less rigorous than stratified random sampling, the study setting did not allow for random sampling. However, the group of residents in this study can be considered representative of residents in academic pediatric emergency medicine services in Canada and the United States. For example, the proportion of emergency medicine residents in this study (16%) is similar to that of residents in a study reported

Discussion

The results from this study contribute positively to a growing body of literature on the Script Concordance Test approach. Previous studies addressed issues of validity, reliability, feasibility, and applicability in different clinical disciplines and contexts.3, 5, 8, 15 The development of a Script Concordance Test for pediatric emergency medicine and the positive validity evidence gathered in the present study provide additional support for the use of the Script Concordance Test.

For any

References (19)

  • F. Caire et al. Auto-évaluation des internes en neurochirurgie par tests de concordance de script (TCS): processus d'élaboration des tests [Self-assessment of neurosurgery residents with script concordance tests (SCT): the test development process]. Neurochirurgie (2004)
  • G.W. Bandiera et al. Predictive validity of the Global Assessment Form used in a final-year undergraduate rotation in emergency medicine. Acad Emerg Med (2002)
  • J.D. Gray. Global rating scales in residency education. Acad Med (1996)
  • B. Charlin. Standardized Assessment of Ill-defined Clinical Problems: the Script Concordance Test [PhD thesis] (2002)
  • B. Charlin et al. Scripts and clinical reasoning. Med Educ (2007)
  • B. Charlin et al. Standardized assessment of reasoning in contexts of uncertainty: the script concordance approach. Eval Health Prof (2004)
  • G. Page et al. Developing key-feature problems and examinations to assess clinical decision-making skills. Acad Med (1995)
  • B. Charlin et al. The Script Concordance Test: a tool to assess the reflective clinician. Teach Learn Med (2000)
  • R. Gagnon et al. Assessment in context of uncertainty: how many members are needed on the panel of reference of a script concordance test? Med Educ (2005)

There are more references available in the full text version of this article.

Cited by (55)

  • Usefulness of SCT in detecting clinical reasoning deficits among pediatric professionals

    2021, Progress in Pediatric Cardiology
    Citation Excerpt:

    So far, the SCT has been applied in many specialties of adult medicine, including Cardiology [16–18], in order to assess the condition of clinical reasoning in those professionals. However, there are still not many studies within the pediatric field [13,19–23] and there is none including contents on Pediatric Cardiology (PC), in spite of the fact that this pediatric discipline includes potentially severe pathologies and represents, according to different Spanish studies (there are no international studies which take this variable into account), about 1% of the total amount of reasons for consultation [1,24]. For this reason, we consider it is relevant to create an SCT for PC complying with the criteria of validity, reliability, and acceptability described in the bibliography, and whose specific objectives are:

  • Pediatric Hospitalists’ Performance and Perceptions of Script Concordance Testing for Self-Assessment

    2021, Academic Pediatrics
    Citation Excerpt:

    Many previous studies of SCT focus on the test's validity and reliability among different specialties3–7 and across the spectrum of learners from medical students20 to postgraduate trainees18 and physicians in practice.9 While studies have concluded that the SCT format is both reliable and valid,4–8 recent studies have questioned the test's psychometric properties.32,33 Lubarsky34 and others have raised concerns that test takers can “game” the test by avoiding answers at either extreme of the Likert scale.

  • Novel Transfer of Care Sign-out Assessment Tool in a Pediatric Emergency Department

    2018, Academic Pediatrics
    Citation Excerpt:

    Senior resident was defined as PGY2 or higher. PGY status is frequently used as a proxy for maturity in educational studies evaluating progression of skill and knowledge.24–26 Finally, group comparisons took into consideration the variable of continuity of care.


Provide feedback on this article at the journal's Web site, www.annemergmed.com.

Supervising editor: Peter C. Wyer, MD

Author contributions: B Carrière, RG, B Charlin, and GB designed the study. B Carrière and B Charlin supervised the data collection during the study period. RG and SD provided statistical advice, and RG analyzed the data. B Carrière wrote the article, with major contributions from B Charlin, SD, and GB for revision and content in the discussion. B Carrière takes responsibility for the paper as a whole.

Funding and support: By Annals policy, all authors are required to disclose any and all commercial, financial, and other relationships in any way related to the subject of this article that might create any potential conflict of interest. The authors have stated that no such relationships exist. See the Manuscript Submission Agreement in this issue for examples of specific conflicts covered by this statement.

Publication date: Available online August 22, 2008.

Reprints not available from the authors.
