In our institution, the annual evaluation of residents consists of three parts: a self-completed logbook (clinical and scientific workload, attendance at radiology and multidisciplinary meetings); a summary of the supervising faculty's evaluation of their knowledge, skills, and attitudes; and our MCQ test addressing all radiology subspecialties. The MCQ test is administered yearly to give each resident insight into his/her learning curve throughout the 5 years of residency. The results of the PGY4 and PGY5 MCQs are validated by the National Accreditation Board and integrated into the qualification process, in the absence of a national board examination.
First, the current study demonstrated that the scores obtained by the residents varied according to their level of training in radiology, with a non-exponential improvement throughout the 5 years of residency. This learning gain curve, which appears to decelerate over time, is similar to that observed by Ravesloot [28].
Second, the scores obtained on the MCQresident were statistically significantly higher than on the MCQteacher for all residents, independently of their post-graduate year. The most likely explanation is that the MCQresident were less difficult than the MCQteacher and contained more non-functional distractors (NFD). We cannot exclude the hypothesis that the residents deliberately lowered the difficulty level and included NFD because they knew that their MCQs would be used for their annual evaluation. More likely, however, these characteristics are inherent to the degree of qualification of the MCQ writer [33].
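For orientation, these two item characteristics can be quantified with standard psychometric definitions (a conventional sketch; the exact formulas and thresholds applied in a given study may differ). The difficulty (facility) index of an item is

\[
P = \frac{n_{\mathrm{correct}}}{N},
\]

where \(n_{\mathrm{correct}}\) is the number of examinees answering the item correctly and \(N\) is the total number of examinees, so a higher \(P\) denotes an easier item. A distractor is commonly classified as non-functional when fewer than 5% of examinees select it; each NFD effectively reduces the number of plausible options and thereby raises the score expected from guessing alone.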
Third, the observation that the discrimination index of the MCQresident was higher than that of the MCQteacher warrants further assessment, as this feature is important when composing high-quality MCQs. A likely explanation is that the MCQresident were written by PGY3–5, not PGY1–2, residents. The PGY3–5 residents, who overall should obtain the highest scores, therefore outperformed the PGY1–2 residents by a wider margin on their peers' MCQs than on the MCQteacher, artificially inflating the discrimination index of the MCQresident.
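As a point of reference (one common definition among several; the computation used in this study is not restated here), the discrimination index contrasts the performance of high and low scorers on each item:

\[
D = \frac{n_U - n_L}{n},
\]

where \(n_U\) and \(n_L\) are the numbers of examinees in the upper and lower scoring groups (often the top and bottom 27% of the overall ranking) who answer the item correctly, and \(n\) is the size of each group. Because \(D\) grows with the gap between strong and weak examinees, any factor that selectively widens the PGY3–5 versus PGY1–2 gap on the MCQresident inflates \(D\) without reflecting intrinsic item quality.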
Finally, the analysis of the MCQs according to Bloom's taxonomy demonstrated that the MCQresident focused more on recall, whereas the MCQteacher more often required the capacity to analyze and apply knowledge. This finding illustrates the difficulty of writing high-quality MCQs that test problem solving, a skill that requires more experience [29,30,31]. Although Bloom's taxonomy is a hierarchical model, its lowest levels should not be disregarded as unimportant or unworthy of teaching [34]. Indeed, while lesion detection may be considered a (low) knowledge-level task (pattern recognition), it is generally agreed that most errors are detection errors rather than characterization errors. Furthermore, the distinction between the categories can be seen as artificial, since any given cognitive task may entail a number of processes. Any attempt to neatly categorize cognitive processes into clean, cut-and-dried classifications undermines the holistic, highly connected, and interrelated nature of cognition, a criticism directed at taxonomies of mental processes in general [35].
The effects of this collaborative approach to MCQ writing are controversial, although it at least contributes to creating questions that can support formative or summative evaluations [36]. Aflalo found no statistically significant improvement in achievement when comparing examination grades before and after question generation in a group of 133 students [37]. Although students were able to write complex MCQs, they found some aspects of the writing process burdensome and tended not to trust the quality of each other's MCQs [10, 19]. Dedicated software such as PeerWise, a freely and globally available online platform, allows students to write, share, answer, rate, and discuss peer-written MCQs. Studies have demonstrated that students who use PeerWise perform significantly better in end-of-course summative assessments than non-users [16, 17, 38].
The effect of MCQ format on residents' scores was not assessed, as questions with videos were eliminated because of technical problems on certain personal computers. While taking the exam, residents were also unable to scroll through images. The computer-based Clinically Oriented Reasoning Evaluation (CORE) format, which replaced the oral examination in the EDIR and uses a DICOM viewer to simulate the daily work of radiologists, is most likely a better way to evaluate radiology residents [39].
The current study highlighted differences between the MCQresident and the MCQteacher that will be explained to current and future radiology residents in order to increase the quality of their MCQs. In addition, we plan to share this collaborative approach with other training centers to build a broader pool of MCQs, which would reduce the influence of question writers on the examination they subsequently take.
Our study had several limitations. First, it was a single-center study with a limited number of MCQs from residents and from teachers. Second, both sets of MCQs were selected by two radiologists to create a series covering all fields of diagnostic and interventional radiology. To minimize selection bias, items were selected based on the characteristics indicated by the residents and the teachers, not by reading the MCQs themselves. In addition, questions of high clinical importance addressing frequently encountered situations were favored. Finally, our results were influenced by the fact that PGY3–5 residents composed the MCQresident while PGY1–5 residents took the examination. Residents were also aware that their MCQs would be used in the annual evaluation.
In conclusion, the current study demonstrated that the educational characteristics of the MCQresident differ from those of the MCQteacher in many ways. Clearly identifying these differences enabled us to define points of attention for MCQ-writing guidance, in order to achieve higher-quality examinations in collaboration with the teaching staff.