INTRODUCTION

Clinical competence has been defined as the “habits of mind that allow the practitioner to be attentive, curious, self-aware and willing to recognize and correct errors”.1 This definition acknowledges not only the burgeoning amount of information that the modern clinician needs to manage, but also the metacognitive behavior of the physician, or the awareness of one’s knowledge. This attribute of mindfulness is also reflected in a core Accreditation Council for Graduate Medical Education (ACGME) competency, practice-based learning and improvement, which clarifies the expectation of lifelong learning for physicians.

The concept of “mindful practice” is rooted in reflection, an increasingly important component of medical education.2,3 Reflection in medicine has been defined as “the consideration of the larger context, the meaning, and the implications of an experience and action” and should be a strong theme as students are introduced to clinical experiences.3,4 Reflection enhances learning by encouraging conceptualization and inquiry and by reinforcing positive learning experiences, and it may improve diagnostic abilities.5 Reflection on critical incidents in clinical training has also been used to enhance the professional and humanistic qualities of learners.6,7 Sobral has shown that students’ level of reflection in learning is associated with the meaningfulness of the learning experience5 and is predictive of diagnostic thinking ability.8 Students arrive at their clinical experiences, however, with different levels of reflective habit.9 Unfortunately, students in clinical clerkships may find themselves in competitive environments that actually discourage reflection and critical self-assessment.

There is a need, then, for medical educators to create learning situations that model and encourage reflection in clinical medicine, but the methods for doing so are less well defined. Journaling has been an effective educational tool for developing self-awareness in the preclinical years,10 but it has rarely been explored in the context of busy clinical clerkships, when students are juggling multiple tasks related to clinical skills. One form of portfolio development, the patient log, has been used for many years to document the numbers and types of patient problems encountered by students in their clerkships. Patient logs have been facilitated by web-based databases and handheld technology. We hypothesized that a reflective cue embedded within the web-based patient log would be used by our students to document the status of “mindful practice.”

SETTING AND PROGRAM DESCRIPTION

The electronic patient log system used in this study, Patient Tracker™, was developed at the Johns Hopkins University School of Medicine’s Office of Academic Computing, and introduced into the required 4-week block Ambulatory Medicine clerkship. During this rotation, students are assigned to community-based offices, working one-on-one with a generalist preceptor. Students are expected to see 4 patients per half-day of practice. A meaningful patient encounter is defined as independent student contact with the patient, usually as the initial interview and examination. Knowledge objectives are assessed with an internally developed and validated written knowledge test, consisting of 50 multiple choice questions.

Patient Tracker™ permits students to use either a desktop computer or a personal digital assistant (PDA) for data entry. Data are deposited into a Microsoft Access™ database via an Avantgo server for Palm OS-based PDAs, and via Cold Fusion™ when a desktop computer or a Pocket PC PDA is used. The template for data entry was designed and then refined through 3 rounds of testing and evaluation by students.

Patient Tracker™ required a 1-time entry of the student’s name, office site, and year of training. For each patient, the age, gender, chief complaint, medication list, and problem list could be entered from a series of dropdown menus. The final field, titled “Learning Need,” was completed with open text. At each rotation’s orientation for the clerkship, the clerkship director (PT) encouraged students to use the “Learning Need” field to track what information they wished they had had before the patient encounter, what they learned from the encounter, or what they would like to research at a later time. The director emphasized that this field would not be used for assessment or grading, but was intended to achieve 1 of the objectives of the clerkship, i.e., to develop habits of reflection on one’s clinical work, a habit that has been associated with high-quality patient care. Students were also given target numbers of patient problems for the clerkship and, at the midpoint of the rotation, were encouraged via email to review their logs with their preceptors.

On review of the comments entered into the “Learning Need” field, we noted that they grouped into thematic categories, some factual and others more complex reflections, i.e., considerations of the larger context of the encounter. To evaluate students’ use of the field, a thematic analysis was done by first naming these categories according to the cognitive task in which the student appeared to be engaged. For instance, the most common entry was a listing of a new term or fact. Other repeated uses of the field (a) followed the framework of the clinical encounter, recording a physical examination finding, an interpretation of primary data, a differential diagnosis, or management of the problem; (b) recorded observations, such as those about the physician–patient relationship; or (c) recorded more deliberative entries, such as noting associations, generating questions, and clinical reasoning activities. The number of entries in each category was noted for each student. These counts were then analyzed to determine whether the pattern of log use was associated with year of training, gender, number of encounters, or performance on the clerkship knowledge test. The knowledge test was examined to explore whether deeper reflection was associated with higher knowledge achievement in the clerkship. The relationship between categories was examined by looking at correlations between the use of different categories. Descriptive statistics and Pearson correlation coefficients were generated with Stata® software. The protocol was exempted by the Institutional Review Board.
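The counting and correlation steps described above can be sketched as follows. This is an illustrative sketch only: the category names and sample entries are hypothetical, and the study’s actual analysis was performed in Stata®, not Python.

```python
from collections import Counter

def pearson(xs, ys):
    # Pearson correlation coefficient between two equal-length count vectors
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical coded entries: (student_id, category) pairs produced by the
# thematic coding of "Learning Need" comments
entries = [
    (1, "fact"), (1, "clinical_reasoning"), (1, "question"),
    (2, "fact"), (2, "fact"), (2, "association"),
    (3, "clinical_reasoning"), (3, "question"), (3, "association"),
]

# Per-student counts for each category
counts = {}
for student, category in entries:
    counts.setdefault(student, Counter())[category] += 1

# Correlate use of two categories across students
students = sorted(counts)
reasoning = [counts[s]["clinical_reasoning"] for s in students]
questions = [counts[s]["question"] for s in students]
r = pearson(reasoning, questions)
```

In this toy data set, the students who record clinical reasoning entries are exactly those who pose questions, so the correlation is perfect (r = 1.0); real category counts would, of course, yield intermediate values.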

RESULTS

Three rotations of the clerkship were examined, representing 60 students. One student made no entries into Patient Tracker™; 59 student logs were examined. Of these 59 students, 37 (63%) were Year 4 students and 22 (37%) were Year 3 students; 36 (61%) were male. These 59 students entered a total of 3,051 patient encounters, with a mean of 52 (range 10–151) encounters per student for the 4-week clerkship.

Eight students never used the “Learning Need” field. Of the encounters, 1,347 (44.1%) had a “Learning Need” entry; some encounters had more than 1 “Learning Need” entry. The mean number of “Learning Need” entries for the 51 students who used the field was 22.8 (range 1–65). Year 3 students recorded more “Learning Need” entries than Year 4 students (29.7 vs 18.7, p < .05). Of the total number of entries, 1,308 were matched to the thematic categories; Table 1 shows the categories chosen to group “Learning Need” comments, ordered by frequency, with examples for each category.

Table 1 Categories, Number and Examples of “Learning Need” Text Comments Entered by 59 Ambulatory Medicine Clerkship Students

The mean score on the written knowledge test was 70.4% ± 9.4 for this group of students. Year 4 students scored higher than Year 3 students on the written knowledge test (72.0% vs 67.6%); women scored higher than men (73.6% vs 68.3%). There was no relationship between the number of encounters, the number of “Learning Need” entries, or the frequency of use of any category and performance on the written knowledge test (data not shown).

We next examined whether students had a preference for the category of “Learning Need” entered into Patient Tracker™. We hypothesized that students who were entering clinical reasoning statements would also be making associations, posing questions, and noting differential diagnoses. There were significant correlations (r = 0.36–0.47, p < .001) among the numbers of entries in the categories Clinical Reasoning, Differential Diagnosis, Question-asking, and Association Noted. These 4 categories were then collapsed into a single “Diagnostic Thinking” category. Student entries in this category were correlated with interpretations of primary data, observations of physician–patient relationships, and management statements, but not with factual knowledge entries, suggesting that the use of categories differentiated students (Table 2).
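The collapsing step described above amounts to summing each student’s counts across the 4 correlated categories to form a single composite count. A minimal sketch, with hypothetical student identifiers and counts (not the study’s data):

```python
# Hypothetical per-student counts for the four correlated categories
per_student = {
    "student_a": {"clinical_reasoning": 3, "differential_dx": 2,
                  "question": 4, "association": 1},
    "student_b": {"clinical_reasoning": 0, "differential_dx": 1,
                  "question": 0, "association": 0},
}

# Categories collapsed into the composite "Diagnostic Thinking" count
DIAGNOSTIC_THINKING = ("clinical_reasoning", "differential_dx",
                       "question", "association")

diagnostic_thinking = {
    student: sum(cats[c] for c in DIAGNOSTIC_THINKING)
    for student, cats in per_student.items()
}
```

The composite count per student (here 10 for student_a and 1 for student_b) would then be correlated against the remaining categories, as reported in Table 2.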

Table 2 Correlations of “Diagnostic Thinking” comments with other Categories of Comments in the Learning Need Text Field

DISCUSSION

We report here our initial experience with a brief cue to reflection built into an electronic patient log system during a community-based ambulatory clerkship. Although there is extensive literature describing the use of electronic logs to document the breadth of patient exposure in a clerkship, we found no previously published examples of using the log to stimulate additional learning from each encounter for individual students.11–17 On the contrary, 1 study of 3 disciplinary clerkships found that logs were rarely used to improve the structure of student learning in a clerkship.18 It was unclear as we initiated this feature of the log whether students would use it. Having seen substantial use by our students (44% of encounters in this cohort), we attempted to better understand its use through the thematic analysis described here.

Reflection in medical education is frequently approached with journaling and portfolio exercises, and it often focuses on critical incidents that occur during clinical experiences.6,19–22 Studies of journaling have been troubled, however, by student feedback that these exercises are time-consuming and ineffective for learning.7,19,21 When such reflective activities become part of assessment, there is the additional concern that they detract from clinical learning in favor of learning how to complete the portfolio correctly.22

Our goal was to create a metacognitive cue that was user-friendly for the learner and encouraged reflection on every encounter. Our results indicated that students displayed a range of approaches to this cue, suggesting different stages of reflective thinking, from recognizing a state of uncertainty to reflecting on the implications of a clinical decision. Students who generated clinical reasoning issues were more likely to note associations and communication issues, and less likely to list simple factual learning needs, suggesting that they were drawing a deeper learning experience from their encounters. These themes seem to parallel current understanding of the development of medical expertise, i.e., a progression from a structure of medical knowledge as lists of facts to elaborated memory structures, or the embedding of medical knowledge into meaningful “illness scripts”.23

The method of collecting these insights into student reflections was limited by students’ motivation to use the “Learning Need” field and by how individual students interpreted it. Eight students chose not to use the field at all. Because the field was not used for grading purposes, however, we assume that students who did use it did so to meet their own needs, and that the hierarchy of content in this field may reflect the maturation of our students’ medical knowledge. An additional limitation of the method was the low likelihood of deep reflection, given the tendency to record brief comments in real time. Moving this tool toward deeper reflection would most likely require facilitated discussion, as exemplified in the method of the inpatient “exit rounds”.9

We believe the next step will be to validate these observations with additional studies. Sobral8 used a Reflection in Learning Scale (RLS) to identify students with higher self-regulatory thinking and found that the RLS predicted cognitive achievement and diagnostic reasoning ability in later years of medical school. Showing that students who preferentially use the clinical reasoning and association categories also score higher on the RLS or on Bordage’s Diagnostic Thinking Inventory24 would further validate the use of this tool as a tracking mechanism for students’ reflective practices.

A second use of this tool will be to facilitate guided teaching by our community-based preceptors. Students value community-based preceptors who regularly engage them in self-reflection, but reflection is underutilized as a teaching method in medical education.2,25 The call for more facilitated reflection in ambulatory education has always been counterbalanced by the reality of teaching and practicing in ambulatory settings, where busy preceptors more often rely on non-reflective thought processes.4,26 The ability to see a student’s perceived “Learning Need” log will assist preceptors in providing learning opportunities appropriate to that student’s needs and in engaging students in self-reflection. We now require that our students print out their “Learning Need” logs as part of their mid-clerkship review with clinic preceptors.

In summary, we report the novel use of an electronic patient log system to promote reflective thinking in a clinical clerkship. Reflective medical practice has been shown to be a multidimensional construct.27 The use of the electronic cue we report here is unlikely to capture all of the potential dimensions of reflective practice. We believe, however, it is an effective prompt to 1 aspect, “deliberate practice,” defined as “the effortful attempt to learn from experience”.27