Introduction

The chest radiograph is a ubiquitous clinical tool in the medical profession, yet medical students often receive inadequate formal instruction in interpreting these images before entering their clinical training. A survey of both domestic and international medical schools found that 30% of programs do not teach radiology in the preclinical curriculum; furthermore, while most medical schools teach the principles of basic chest radiography during the internal medicine clerkship, 24% of the schools surveyed did not provide such instruction.[1] Other studies have demonstrated that medical students are deficient at identifying abnormalities on chest radiographs showing common pathologies, such as tuberculosis, emphysema, and pneumothorax.[2][3] Research has demonstrated that both preclinical and clinical medical students gain valuable knowledge from short courses that explicitly address the reading of radiographs.[4][5] While some clinical professors still use plain films with a projector to teach these small courses, Zou et al. found that medical students prefer to learn radiology with the aid of PowerPoint slides and from professors who question them in the Socratic manner emulated in this study.[6]

There is a need for further testing and validation of such courses in the literature so that medical schools can evaluate their merits and decide whether to include them in their curricula. The purpose of this study was to examine the effectiveness of 3 faculty-led radiology small-group learning sessions for second-year medical students at the University of Michigan. We sought to quantify the knowledge gained by participating students to evaluate whether to incorporate this radiology small-group format into the new curriculum at our institution. We hypothesized that students taking this brief radiography course would significantly improve their knowledge of chest radiography, as evidenced by higher quiz scores.

Participant Recruitment

Second-year medical students were recruited to participate voluntarily via e-mail, and 16 of the 179 students contacted signed up. Prior to the primer, these students had received no lectures devoted entirely to reading chest radiographs, only informal instruction interspersed throughout other lectures. To be included in the analysis, students had to attend at least 2 of the 3 sessions; any combination of 2 sessions satisfied this requirement, whether the first and second, the first and third, or the second and third.

Structure of the Primer

A single faculty member (KAK) was responsible for the conception and administrative duties of the course, while a separate faculty member (EAS) developed and taught it. The course consisted of 3 one-hour sessions. The first 2 sessions were principally didactic, with ample time for discussion and questions. Lectures were delivered as PowerPoint presentations with multiple case examples to illustrate important concepts. The first session provided a basic introduction to reading a chest radiograph, including a relevant review of anatomy and the pathologic concept of the silhouette sign. The second session applied this basic framework to the interpretation of common pathologies, including pulmonary contusion, pulmonary edema, pneumothorax, pleural effusion, mediastinal masses, and iatrogenic complications of emergency procedures.

While the first 2 sessions followed a traditional didactic lecture format, the third session placed students in the “hot seat,” inviting each to diagnose the pathologies taught earlier in the series. Each student was presented with a radiograph on a projected PowerPoint slide and asked to identify the imaging abnormalities and develop a corresponding differential diagnosis. After every student had at least one opportunity to work through and verbally diagnose a case in front of peers, the faculty moderator explained the correct interpretation of each presented radiograph to reinforce and solidify student learning.

Evaluation

An online examination platform, Radiology.ExamWeb.com, was used to construct a 16-question quiz to evaluate students’ general knowledge of chest radiography. Questions were selected from a database of questions formatted in the style of the National Board of Medical Examiners and the American Board of Radiology. The questions in this database were submitted by members of the Alliance of Medical Student Educators in Radiology and were peer-reviewed by a panel of subspecialty editors. Further information about the testing platform can be obtained from a previous publication by Lewis et al.[7]

Students were asked to complete the quiz 2 days before the first meeting of the primer. They had unlimited time to complete it and were asked not to use outside resources; the quiz was not proctored because it was administered online. Students received their scores at the end of the prequiz but were not shown which answers were correct or incorrect.

At the conclusion of the course, participating students were asked to complete the same online quiz. Students were provided with the PowerPoint lectures used in the course and were encouraged to review them before taking the postquiz, but they were not permitted to use them during the quiz. Only students who attended at least 2 primer sessions and completed both the prequiz and postquiz were included in the statistical analyses. The prequiz and postquiz datasets were tested for normality using the Shapiro-Wilk test, and a Wilcoxon signed-rank test was performed on the paired prequiz and postquiz scores to determine whether students significantly improved. All statistical analyses were performed with the Statistical Package for the Social Sciences (SPSS), with the significance threshold set a priori at p ≤ .05.
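For illustration, the analysis described above can be reproduced in a few lines of Python with SciPy (the original analysis was run in SPSS). This is only a sketch: the score arrays below are hypothetical placeholders standing in for the 10 paired percentage scores, not the study data.

    # Sketch of the pre/post analysis using SciPy in place of SPSS.
    # The paired scores below are illustrative placeholders, not the study data.
    from scipy import stats

    prequiz  = [31.3, 37.5, 43.8, 43.8, 50.0, 50.0, 50.0, 56.3, 56.3, 62.5]
    postquiz = [50.0, 56.3, 56.3, 62.5, 62.5, 68.8, 68.8, 68.8, 75.0, 75.0]

    # Shapiro-Wilk test of normality for each set of scores
    print(stats.shapiro(prequiz))
    print(stats.shapiro(postquiz))

    # Nonparametric paired comparison (Wilcoxon signed-rank test), alpha = .05
    w, p = stats.wilcoxon(prequiz, postquiz)
    print(f"Wilcoxon signed-rank: W = {w}, p = {p:.3f}")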

Results

In total, 16 of the 179 students who received the e-mail signed up to participate in the course. Of those 16, 10 met the study’s inclusion requirements. Descriptive and inferential statistics for this cohort are presented in Table 1; scores were significantly higher on the postquiz than on the prequiz (p = .015).

Table 1. Descriptive and Inferential Statistics for Students Who Completed Both Pre- and Postquiz

                             Prequiz (n = 10)    Postquiz (n = 10)
Mean score (%)               47.1                63.9
Standard deviation (%)       12.7                11.6

Wilcoxon signed-rank test: p = .015

In a subgroup analysis comparing students who attended the third primer session (n = 5) with those who did not (n = 5), third-session attendees had an average postquiz score of 69.0%, whereas students absent from the third session had an average postquiz score of 58.8%. The corresponding prequiz averages were 48.8% for third-session attendees and 45.4% for those who were absent. These results are listed in Table 2.

Table 2. Pre- and Postquiz Scores for Students Who Attended the 3rd Primer Session Versus Those Who Did Not

                             Attended 3rd Session (n = 5)    Did Not Attend 3rd Session (n = 5)
Prequiz, mean % (SD)         48.8 (13.7)                     45.4 (12.9)
Postquiz, mean % (SD)        69.0 (8.5)                      58.8 (12.9)

Discussion

The statistically significant increase in scores from the prequiz to the postquiz reflects the knowledge gained from this chest radiograph primer. The quiz contained concepts directly addressed in the course as well as common pathologies that were not specifically addressed. The significant increase in postquiz scores suggests that students who took the course were not only prepared to answer questions on topics they had just learned but were also able to extend this knowledge to new concepts.

The higher postquiz average among students who attended the third session, in which students were asked to verbally diagnose radiographic pathology, could suggest that the “hot seat” format is a valuable learning tool. However, this comparison is confounded: 3 of the students who attended the third session also attended the first 2 sessions and therefore had more total lecture time than the students who missed the third session. The “hot seat” learning environment is similar to that used on the wards during the clinical years and is colloquially referred to as “pimping,” though some authors have questioned the appropriateness of this term.[8] The literature on this technique’s effectiveness in promoting student learning is mixed, with some authors defending its merits[9][10][11] and others calling its practice into question.[12] To test the effectiveness of the “hot seat” didactic style more rigorously, a follow-up study with a larger sample size could vary the third session, randomizing half of the students to the “hot seat” format and the other half to traditional didactics delivered with PowerPoint slides covering the same material. This would allow a more direct comparison between the 2 teaching approaches.
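As a rough illustration of the proposed allocation, a simple 1:1 randomization could be generated as sketched below; the roster size and seed are arbitrary assumptions, and a blocked or stratified scheme may be preferable in practice.

    # Sketch of a simple 1:1 randomization for the proposed follow-up study.
    # The roster size and seed are arbitrary assumptions for illustration.
    import random

    students = [f"student_{i:02d}" for i in range(1, 41)]  # hypothetical roster of 40 students
    random.seed(42)          # fixed seed so the allocation can be reproduced
    random.shuffle(students)

    half = len(students) // 2
    hot_seat_arm = students[:half]   # third session delivered in the "hot seat" format
    didactic_arm = students[half:]   # third session delivered as a traditional PowerPoint lecture

    print(len(hot_seat_arm), len(didactic_arm))  # 20 20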

Limitations of our study include the variable length of time between the conclusion of the primer and completion of the postquiz: students completed the postquiz an average of 13.1 days (range, 3-22 days) after the course ended. Moreover, the postquiz contained the same questions as the prequiz. Most students completed the postquiz roughly 1 month after taking the prequiz, so it is possible that they remembered specific correct answers from the prequiz administration. In future iterations of this course, we will consider developing a separate postquiz of comparable difficulty to eliminate the potential confounding introduced by administering the same quiz twice.

The results of this study support including this primer in the new medical curriculum currently under development at the University of Michigan. The new curriculum will provide a natural setting for future, more extensive study of the format, and this primer can serve as a guide for other medical schools undergoing curricular reform.

Conclusion

Medical students who participate in radiology small-group learning sessions can significantly improve their general knowledge of chest radiography pathology. Socratic-style questioning by a faculty member is an engaging way to teach chest radiography, but further studies are needed to compare the efficacy of the “hot seat” format with traditional didactics. Regardless, small-group learning sessions in chest radiography are a valuable addition to the preclinical medical school curriculum.

References

    1. O’Brien KE, Cannarozzi ML, Torre DM, Mechaber AJ, Durning SJ. Training and assessment of ECG interpretation skills: results from the 2005 CDIM survey. Teach Learn Med. 2009;21(2):111-115.

    2. Jeffrey D, Goddard P, Callaway M, Greenwood R. Chest radiograph interpretation by medical students. Clin Radiol. 2003;58(6):478-481.

    3. Feigin DS, Smirniotopoulos JG, Neher TJ. Retention of radiographic anatomy of the chest by 4th-year medical students. Acad Radiol. 2002;9(1):82-88.

    4. Salajegheh A, Jahangiri A, Dolan-Evans E, Pakneshan S. A combination of traditional learning and e-learning can be more effective on radiological interpretation skills in medical students: a pre- and post-intervention study. BMC Med Educ. 2016;16(1).

    5. Dawes TJW, Vowler SL, Allen CMC, Dixon AK. Training improves medical student performance in image interpretation. Br J Radiol. 2004;77(921):775-776.

    6. Zou L, King A, Soman S, et al. Medical students’ preferences in radiology education: a comparison between the Socratic and didactic methods utilizing PowerPoint features in radiology education. Acad Radiol. 2011;18(2):253-256.

    7. Lewis PJ, Chen JY, Lin DJ, McNulty NJ. Radiology ExamWeb. Acad Radiol. 2013;20(3):290-296.

    8. Martin GC, Wells DM. Nothing artful about the term “pimping.” Med Educ. 2014;48(10):1028.

    9. Healy J, Yoo P. In defense of “pimping.” J Surg Educ. 2015;72(1):176-177.

    10. Schaik KDV. Pimping Socrates. JAMA. 2014;311(14):1401.

    11. Detsky AS. The art of pimping. JAMA. 2009;301(13):1379.

    12. Kost A, Chen FM. Socrates was not a pimp. Acad Med. 2015;90(1):20.