
Do oncologists prefer subspecialty radiology reports? A quality care study

Abstract

Background

The main objective was to assess whether CT reports written by radiologists subspecialized in oncologic imaging respond better to oncological referrals than reports written by general radiologists. The secondary objective was to assess differences in ratings between a senior and a junior oncologist. Two hundred radiological reports pertaining to oncological patients were retrospectively selected, of which 100 were written by subspecialized radiologists and 100 by general radiologists. The senior and junior oncologists each rated all CT reports on a Likert scale from 1 to 5 (1 = very poor, 5 = excellent) for the following items: anatomical details; interpretation of findings; need for further explanations; appropriateness of conclusions; and overall satisfaction. Comparisons between ratings assigned to reports from generalist radiologists and subspecialty radiologists were performed using the Mann–Whitney U test. Agreement between the two oncologists was assessed with Gwet's coefficient.

Results

Ratings were significantly higher for subspecialty radiologists' reports for all five items rated by the junior oncologist and for three of the five items rated by the senior oncologist (p < 0.01); mean values pooled from both oncologists were higher for subspecialty reports across all items (p < 0.001). Agreement between the senior and junior oncologist was moderate to substantial for reports from general radiologists (0.5986–0.6788) and substantial to almost perfect for reports from subspecialty radiologists (0.6958–0.8358).

Conclusions

According to a senior and a junior oncologist, CT reports written by radiologists subspecialized in oncologic imaging are clearer, more accurate, and more appropriate in their interpretation and conclusions than reports written by general radiologists. Likewise, the oncologists' overall satisfaction with reports from subspecialized radiologists is higher.

Key points

  • Radiologists subspecialized in oncologic imaging better report anatomical details and terminology.

  • Subspecialized radiologists better interpret imaging findings and draw appropriate conclusions.

  • Oncologists prefer reports prepared by subspecialized radiologists over those prepared by general radiologists.

Background

Traditionally, radiologists have served as general consultants to physicians of various disciplines. They diagnose, inform treatment decisions, and guide management across many medical conditions, including oncological diseases. With increasing subspecialization in medicine, it is increasingly difficult, if not impossible, for general radiologists to keep pace with the rapidly changing knowledge in the different fields of radiology. Such a changing environment increases the risk that radiology loses value if radiologists do not focus on selected areas of imaging [1]. Indeed, the American College of Radiology (ACR) encourages radiologists to provide actionable and meaningful reports that add value for the ordering physician, in order to help the transition from volume-based to value-based imaging care [2].

Many articles in the literature have demonstrated the importance of a second interpretation by subspecialty radiologists [3,4,5]. A subspecialty interpretation is generally sought to answer specific clinical questions, which may relate to patient symptoms, differential diagnosis, planning of surgical interventions, therapy selection, or prediction of treatment response. For example, in a pilot study of patients presenting with biliary tract disease, non-emergent gastrointestinal bleeding, or an abdominal mass, the authors reported that diagnostic evaluation by a consultant subspecialty radiologist resulted in a 64% reduction in time to diagnosis and a 32% reduction in the number of studies performed [6]. Chalian et al. reviewed over 2,000 second-opinion subspecialty consultations in musculoskeletal radiology and noted clinically relevant differences in interpretation in 26.2% of cases, with discrepant interpretations in 36.3% of oncologic cases [3]. Based on the final pathologic diagnosis, they reported that second-opinion consultations were correct in 82.0% of the cases in which discrepancies were likely to change patient management [3].

Similarly, several tertiary oncology care centers have examined the quantitative effect of radiology consultations on the management of cancer patients. One center reported the addition of significant new information in 49% of patients reviewed at its division of oncology radiology meetings, resulting in major management changes in 37% [7]. Another center reported a change in interpretation in 41% of patients, with statistically significant changes in tumor node metastases (TNM) staging in 34% of patients; these changes altered treatment plans in 98% and affected prognosis in 95% [8].

To the best of our knowledge, no previous study has evaluated radiological reports written by radiologists subspecialized in oncologic imaging for their consistency and meaningfulness with respect to the clinical indication. Therefore, the main objective of our retrospective study was to assess whether radiological reports of chest and abdomen CT scans written by radiologists subspecialized in oncologic imaging respond better to the clinical oncological indications than reports written by general radiologists.

The secondary objective was to assess differences in ratings between a senior and junior oncologist.

Methods

The present study was considered a quality care control study by our Ethics Committee and did not fall under the Swiss law of human research. As such, specific approval and informed consent were waived.

Radiological report selection

We retrospectively included consecutive reports of chest and abdomen computed tomography (CT) examinations requested by the Oncology Institute of our hospital network. The radiological reports were retrieved from our radiology information system (RIS) and divided into two groups: one consisting of reports by subspecialized radiologists (from January 2019 onwards) and one of reports by general radiologists (from December 2018 backwards). The cutoff of January 2019 was chosen because it corresponded to a re-organization of the Department of Radiology of our institution, with a subsequent grouping of radiologists according to subspecialties. Radiologists were considered subspecialized in oncologic imaging, and could join the oncologic imaging group, if they had at least 5 years of experience in oncologic imaging and demonstrated regular participation in educational events dedicated to oncologic imaging. General radiologists had 0–20 years of experience and reported radiological examinations according to modality.

We excluded reports written before December 2018 by radiologists who later, in 2019, joined the oncologic imaging group; the search was then extended to additional reports until the same number of reports was reached for subspecialized and general radiologists.

Data records

All the radiological reports in our department are prepared according to the general ACR guidelines [2]. They include the following sections: indication (where the clinical information and clinical question are noted); technical data (including procedure and materials, such as the quantity and injection rate of contrast medium); findings (including the description of findings and their interpretation); and conclusions (reporting the overall impressions of the examination). The reports were extracted from the RIS, copied and pasted into a separate word-processing file, and completely anonymized (not coded) by a radiology resident. First name, last name, date of birth, and gender were removed from each report. To minimize the possibility of recognizing the reporting radiologist, the text format was standardized (e.g., capital letters in the title, bold font for headings) and the technical data were removed, because some radiologists use a personal template. The selected reports were initially colored red or blue to distinguish those reported by subspecialists (red) from those reported by generalists (blue), then mixed, numbered consecutively, and finally colored black. In a separate Excel spreadsheet, we recorded which reports came from subspecialists and which from generalists.
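For illustration, the blinding step described above can be reproduced programmatically. The following Python sketch is hypothetical (the study used a word-processing file and manual color coding): it shuffles anonymized report files from the two groups, renumbers them consecutively, and keeps the group of origin only in a separate key file, mirroring the Excel spreadsheet. All paths and filenames are illustrative assumptions.

```python
# Hypothetical reconstruction of the blinding step; paths are illustrative.
import csv
import random
from pathlib import Path

# Anonymized plain-text reports, one file per report, grouped by author type
subspecialty = sorted(Path("reports/subspecialty").glob("*.txt"))
generalist = sorted(Path("reports/generalist").glob("*.txt"))

pool = [(p, "subspecialty") for p in subspecialty] + \
       [(p, "generalist") for p in generalist]
random.Random(42).shuffle(pool)  # fixed seed so the mixing is reproducible

Path("blinded").mkdir(exist_ok=True)
with open("blinding_key.csv", "w", newline="") as key_file:
    writer = csv.writer(key_file)
    writer.writerow(["blinded_id", "group"])  # the readers never see this file
    for i, (path, group) in enumerate(pool, start=1):
        # Consecutively numbered copies carry no trace of the group of origin
        Path(f"blinded/report_{i:03d}.txt").write_text(path.read_text())
        writer.writerow([i, group])
```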

Clinical rating of the radiological reports

One senior oncologist (V.E.), with 10 years of experience in oncology after board certification, and one junior oncologist (M.D.G.), with 3 years of experience in oncology after board certification, rated the reports. Based on the clinical indication, the two oncologists rated the appropriateness of the following information in the radiological report: the accuracy of the report, the interpretation of the findings, the clarity of the report, and the appropriateness of the conclusions. In addition, the overall subjective satisfaction of the oncologist was rated.

All the above-mentioned information was rated on an equidistant Likert scale from 1 to 5, where 1 indicated a very poor rating; 2, poor; 3, fair; 4, good; and 5, excellent.

For accuracy of the report, the correct use of anatomical details and terminology was considered. For interpretation of findings, the evaluation of the imaging findings in the body of the report was considered. For clarity of the report, the need for further explanations after reading the report was considered. For appropriateness of conclusions, the appropriateness of the answer to the clinical request was considered: the conclusion had to include a comparison with a previous examination if available, a diagnosis or a differential diagnosis when possible, and a follow-up recommendation with the right timing and modality if needed. Furthermore, the readers assigned an overall subjective satisfaction rating from 1 to 5, where 1 = very unsatisfied; 2 = unsatisfied; 3 = neutral; 4 = satisfied; and 5 = very satisfied.

An overall rating score was determined for each item as the mean of the ratings given by the two oncologists.
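As a minimal sketch of the rating scheme described above, the record below stores the two oncologists' 1–5 Likert scores for the five items and computes the overall per-item score as the mean of the two ratings. The field and item names are illustrative assumptions, not taken from the study's data records.

```python
# Illustrative data structure for one report's ratings; names are assumptions.
from dataclasses import dataclass

ITEMS = ("anatomical_details", "interpretation_of_findings",
         "need_for_explanations", "appropriateness_of_conclusions",
         "overall_satisfaction")

@dataclass
class ReportRatings:
    report_id: int
    subspecialty: bool  # True if written by a subspecialized radiologist
    senior: dict        # item -> Likert score, 1 (very poor) to 5 (excellent)
    junior: dict        # item -> Likert score, 1 (very poor) to 5 (excellent)

    def overall(self) -> dict:
        """Overall score per item: mean of the two oncologists' ratings."""
        return {item: (self.senior[item] + self.junior[item]) / 2
                for item in ITEMS}

report = ReportRatings(report_id=1, subspecialty=True,
                       senior=dict.fromkeys(ITEMS, 5),
                       junior=dict.fromkeys(ITEMS, 4))
print(report.overall())  # each item -> 4.5
```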

Statistical analysis

Descriptive statistics were reported as mean ± standard deviation. Comparisons between generalist and subspecialty radiologists' reports were performed using the Mann–Whitney U test. Interrater agreement was determined using Gwet's coefficient [9] instead of Cohen's kappa, to avoid the interrater agreement paradox [10]. Levels of agreement were defined using the classification of Landis and Koch as follows: below 0.00, poor; 0.01–0.20, slight; 0.21–0.40, fair; 0.41–0.60, moderate; 0.61–0.80, substantial; and 0.81–1.00, almost perfect [11].
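To make the analysis concrete, the sketch below shows how the two statistics could be computed in Python: the Mann–Whitney U test via scipy, and Gwet's first-order agreement coefficient (AC1) implemented directly from its definition, since scipy does not provide it; a helper maps the coefficient onto the Landis–Koch labels. Whether the authors used unweighted AC1 or a weighted variant is not stated, so the unweighted version here is an assumption, and the example ratings are made up.

```python
# Sketch of the two statistical procedures; example ratings are hypothetical.
import numpy as np
from scipy.stats import mannwhitneyu

def gwet_ac1(rater_a, rater_b, categories=(1, 2, 3, 4, 5)):
    """Unweighted Gwet's AC1 for two raters over a fixed set of categories."""
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    n, q = len(a), len(categories)
    pa = np.mean(a == b)  # observed proportion of exact agreement
    # Average proportion of items assigned to category k across both raters
    pik = np.array([((a == k).sum() + (b == k).sum()) / (2 * n)
                    for k in categories])
    pe = (pik * (1 - pik)).sum() / (q - 1)  # chance agreement under AC1
    return (pa - pe) / (1 - pe)

def landis_koch(coefficient):
    """Map an agreement coefficient onto the Landis-Koch labels [11]."""
    if coefficient <= 0:
        return "poor"
    for cutoff, label in [(0.20, "slight"), (0.40, "fair"),
                          (0.60, "moderate"), (0.80, "substantial")]:
        if coefficient <= cutoff:
            return label
    return "almost perfect"

# Hypothetical Likert ratings of one oncologist for the two report groups
generalist_scores = [4, 3, 4, 5, 3, 4, 4, 3]
subspecialty_scores = [5, 4, 5, 5, 4, 5, 4, 5]
u, p = mannwhitneyu(generalist_scores, subspecialty_scores,
                    alternative="two-sided")
print(f"Mann-Whitney U = {u:.1f}, p = {p:.4f}")

# Hypothetical paired ratings of the two oncologists on the same reports
ac1 = gwet_ac1([4, 5, 3, 4, 5, 4], [4, 5, 4, 4, 5, 4])
print(f"Gwet's AC1 = {ac1:.4f} ({landis_koch(ac1)})")
```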

A post hoc power analysis was carried out to check whether nonsignificant results were due to a lack of statistical power (G*Power 3.1 for Macintosh, Heinrich-Heine University, Düsseldorf, Germany).
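The study used G*Power 3.1 for this check. As an illustrative stand-in rather than the authors' exact procedure, the sketch below approximates the power of a two-sided Mann–Whitney comparison of two groups of 100 reports via the t-test power with the classical 3/π asymptotic-relative-efficiency correction; the effect size is hypothetical.

```python
# Illustrative post hoc power approximation; the effect size is hypothetical.
from math import pi
from statsmodels.stats.power import TTestIndPower

d = 0.25                       # hypothetical Cohen's d (small-to-moderate)
n_per_group = 100              # reports per group in this study
n_eff = n_per_group * 3 / pi   # 3/pi A.R.E. correction for the rank test
power = TTestIndPower().power(effect_size=d, nobs1=n_eff, ratio=1.0,
                              alpha=0.05, alternative="two-sided")
print(f"Approximate power: {power:.2f}")  # ~0.41 under these assumptions
```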

Statistical significance was set at 5% (p < 0.05). Analyses were performed using STATA16 for Mac (StataCorp, College Station, TX, USA).

Results

Two hundred radiological reports were included (100 written by subspecialized radiologists and 100 by generalist radiologists). There were three subspecialized radiologists and more than ten generalist radiologists.

As shown in Fig. 1, both clinicians were more satisfied with the subspecialty reports for anatomical details, interpretation of findings, and appropriateness of conclusions. The ratings of the senior oncologist showed no difference between subspecialist and generalist reports in terms of the need for further explanations and overall satisfaction, whereas for these two items the junior oncologist was more satisfied with subspecialty radiologists' reports. A post hoc power analysis showed that the lack of significance for these two items for the senior oncologist (need for further explanations and overall satisfaction) could be attributed to the limited sample size (see Additional file 1: Table S1).

Fig. 1 Comparisons between ratings assigned to reports written by generalist radiologists and subspecialty radiologists

Table 1 summarizes the mean scores and standard deviations of the two oncologists' ratings for the five variables and shows that the ratings were significantly higher when the reports were written by subspecialty radiologists, for all variables (anatomical details/terminology, p < 0.0001; interpretation of findings, p < 0.0001; need for further explanations, p < 0.001; appropriateness of conclusions, p < 0.001; overall satisfaction of the oncologist, p < 0.001).

Table 1 Ratings of the 2 oncologists for the reports of the generalist and subspecialty radiologists

Table 2 shows that almost perfect agreement between the clinicians emerged on subspecialty reports regarding anatomical details/terminology (AC = 0.8358) and the appropriateness of conclusions (AC = 0.8216). Substantial agreement was achieved on subspecialty reports for interpretation of findings (AC = 0.6958), need for further explanations (AC = 0.7012), and overall satisfaction with the report (AC = 0.7078). Moderate to substantial agreement was observed on generalist radiologists' reports for the need for further explanations (AC = 0.5986), anatomical details/terminology (AC = 0.6788), interpretation of findings (AC = 0.6762), appropriateness of conclusions (AC = 0.6731), and overall satisfaction with the report (AC = 0.6647).

Table 2 Agreement between senior and junior oncologists (Gwet’s coefficient)

Discussion

In this study, we have demonstrated that, according to the oncologists, reports prepared by radiologists subspecialized in oncologic imaging responded better to the clinical indications in terms of anatomical details and interpretation of findings, needed fewer explanations, and were more appropriate in their conclusions than reports written by generalist radiologists. Moreover, the overall subjective satisfaction of the oncologists was significantly higher for the reports written by subspecialized radiologists, and the agreement between oncologists increased when the reports were written by subspecialty radiologists rather than general radiologists.

Meaningful radiology reports are essential to help referring physicians in their decision-making process and to increase the quality of care for patients.

Unfortunately, there is no standard method to evaluate the quality of radiology reports. On the one hand, many previous studies have underlined the importance of imaging interpretation by assessing the impact on clinical decisions or changes in treatment after a second opinion given by subspecialized radiologists [12,13,14]. On the other hand, it has been stated that the radiology report must be not only accurate but also meaningful [15, 16]. Indeed, according to a Canadian survey of 200 referring physicians, clarity and meaningfulness were the most valued qualities of radiology reports [17]. In the context of radiology reporting, the radiologist's message may not be accurately conveyed to the referring physician, and at other times the referring physician may misunderstand the radiologist's report [18].

In our study, both oncologists found that the anatomical details were significantly better explained, and the terminology more accurate, in the reports made by subspecialized radiologists compared with those made by generalist radiologists. Anatomical details and correct terminology are important in oncologic patients at staging, in order to evaluate the possibility of surgical excision, as well as at follow-up, when infiltration of adjacent structures may explain symptoms, may carry a risk of complications, and may require further treatment [19, 20]. This kind of information is frequently addressed by radiologists in multidisciplinary meetings, where specific questions are posed to decide the most appropriate treatment path [4, 5]. Typical anatomical details needed for oncological patients include, for instance, the site of retroperitoneal lymph nodes relative to the renal vessels in patients with gynecological malignancy who are candidates for retroperitoneal lymphadenectomy, the relationship between tumoral tissue and the mesenteric vessels in pancreatic cancer, or that between the malignant tissue and the main pulmonary artery in lung cancer. Our study demonstrates that some of this information may already be mentioned in the original report.

According to both oncologists, imaging findings were significantly better interpreted by subspecialized radiologists than by general radiologists. Indeed, while for a radiologist the description of a hepatic lesion based on its CT density can be self-evident, this is not the case for the referring oncologist, who needs to know whether it is a benign or malignant lesion and, when malignant, whether it is more likely a primary or a secondary lesion. Likewise, the mention of the presence of a lymph node should be complemented by its interpretation as positive or negative [21], and the description of a ground-glass opacity should be accompanied by a level of suspicion for malignancy [22].

A significant difference in the need for further explanations was recorded only for the junior oncologist. We can speculate that, owing to his longer experience, the senior oncologist needs fewer explanations from the radiologists than the junior oncologist does. These results are all the more interesting because they indirectly indicate that when the imaging report is written by a subspecialized radiologist, the number of phone calls to the radiology department can be reduced. Consequently, as demonstrated by Zabel et al. [23], we may hypothesize that along with the higher value of the radiological report, another advantage of a subspecialty-based organization is a more efficient workflow. This can be important in centers recruiting patients into clinical trials, where a dedicated radiologist is sometimes requested: if the reports are always written by dedicated radiologists, a second reading could be avoided.

The conclusions represent the final impressions and are one of the most challenging parts of the report. According to the ACR guidelines, the conclusions should contain a specific diagnosis, or a differential diagnosis when appropriate, and suggestions for follow-up or additional diagnostic studies to clarify or confirm the impression [2]. In our series, both the oncologists indicated that the conclusions reported by the subspecialized radiologists were significantly more appropriate than those made by generalists.

Of note, with the exception of the variable "need for further explanations," rated by the junior oncologist with a mean of 3.89, the mean ratings of both oncologists were all above 4 on the scale from 1 to 5. Likewise, the senior oncologist did not find a statistically significant difference in overall satisfaction regardless of who wrote the report. This indicates that the quality and usefulness of the reports provided by general radiologists were already fairly high. However, the process of upgrading the quality and meaningfulness of radiological reports should aim for the best, not merely the fairly high. Furthermore, not all oncological cases are discussed in multidisciplinary meetings (some oncologists, for example, work in private practice); therefore, the highest quality and usefulness of radiological reports by subspecialized radiologists might also help the decision-making process outside a dedicated oncological hospital.

Worthy of mention is the fact that the inter-reader reliability for all five variables was higher for the radiological reports written by subspecialty radiologists compared to those written by general radiologists.

This study has some limitations. First, the rating of the reports was made without reviewing the images; therefore, the accuracy of the content of the reports was not evaluated. However, this topic has been evaluated elsewhere in the literature and was beyond the scope of this study. Second, the subspecialized radiologists did not have specific board certifications, because oncologic imaging is not among the subspecialties currently requiring additional training and examination. Nevertheless, they joined the oncologic imaging group based on their experience and their participation in specific educational events. Third, we did not perform a preliminary power analysis to determine the appropriate number of reports to include. However, we performed a post hoc analysis demonstrating that the non-significant results (p > 0.05) were related to the limited sample size, as clearly stated in the results. Finally, the rating of the radiology reports was based on the clinical indication mentioned at the beginning of the report, and we are aware that the clinical history given to the radiologist can sometimes lack detail. However, we did not investigate this further, in order to reproduce a real-life situation as closely as possible, in which the radiologist looks at the images and responds to what is asked in the clinical indication.

In conclusion, our study demonstrated that radiology reports of chest and abdomen CT scans written by radiologists subspecialized in oncologic imaging respond better to the clinical indication than those written by general radiologists, and that the agreement between oncologists increased when the reports were prepared by subspecialty radiologists. Radiology reports written by radiologists subspecialized in oncologic imaging are indeed clearer, more accurate, more appropriate in their interpretation and conclusions, and more reliable than radiology reports written by general radiologists.

Availability of data and materials

Not applicable.

Abbreviations

ACR: American College of Radiology

CT: Computed tomography

RIS: Radiology information system

TNM: Tumor node metastases

References

  1. Levin DC, Rao VM, Berlin J (2013) Ensuring the future of radiology: how to respond to the threats. J Am Coll Radiol 10(9):647–651


  2. American College of Radiology (ACR). https://www.acr.org/Practice-Management-Quality-Informatics/Imaging-3. Accessed 07 Feb 2021

  3. Chalian M, Del Grande F, Thakkar RS, Jalali SF, Chhabra A, Carrino JA (2016) Second-opinion subspecialty consultations in musculoskeletal radiology. AJR Am J Roentgenol 206:1217–1221


  4. Aletti GD, Garbi A, Messori P et al (2017) Multidisciplinary approach in the management of advanced ovarian cancer patients: a personalized approach. Results from a specialized ovarian cancer unit. Gynecol Oncol 144(3):468–473


  5. Petrella F, Radice D, Guarize J et al (2021) The impact of multidisciplinary team meetings on patient management in oncologic thoracic surgery: a single-center experience. Cancers (Basel) 13(2):228. https://doi.org/10.3390/cancers13020228


  6. Baker SR (1982) The operation of a radiology consultation service in an acute care hospital. JAMA 248:2152–2154


  7. Brook OR, Hakmon T, Brook A, Dudnik E, Kuten A, Engel A (2011) The effect of a Radiology Conference consultation on cancer patients management. Ann Oncol 22:1204–1208


  8. Loevner LA, Sonners AI, Schulman BJ et al (2002) Reinterpretation of cross-sectional images in patients with head and neck cancer in the setting of a multidisciplinary cancer center. AJNR Am J Neuroradiol 23:1622–1626


  9. Gwet KL (2016) Testing the difference of correlated agreement coefficients for statistical significance. Educ Psychol Meas 76:609–637


  10. Cicchetti DV, Feinstein AR (1990) High agreement but low kappa: II. Resolving the paradoxes. J Clin Epidemiol 43:551–558


  11. Landis JR, Koch GG (1977) The measurement of observer agreement for categorical data. Biometrics 33:159–174


  12. Wechsler RJ, Spettell CM, Kurtz AB et al (1996) Effects of training and experience in interpretation of emergency body CT scans. Radiology 199:717–720


  13. Briggs GM, Flynn PA, Worthington M, Rennie I, McKinstry CS (2008) The role of specialist neuroradiology second-opinion reporting: is there added value? Clin Radiol 63:791–795


  14. Ruma J, Klein KA, Chong S et al (2011) Cross-sectional examination interpretation discrepancies between on-call diagnostic radiology residents and subspecialty faculty radiologists: analysis by imaging modality and subspecialty. J Am Coll Radiol 8:409–414


  15. Fischer HW (1983) Better communication between the referring physician and the radiologist (editorial). Radiology 146:845


  16. Lee R, Cohen MD, Jennings GS (2006) A new method of evaluating the quality of radiology reports. Acad Radiol 13(2):241–248


  17. Lafortune M, Broton G, Baudouin JL (1988) The radiological report: what is useful for the referring physician? Can Assoc Radiol J 39:140–143


  18. Berlin L (2000) Pitfalls of the vague radiology report. AJR Am J Roentgenol 174(6):1511–1518


  19. Casiraghi M, Maisonneuve P, Piperno G et al (2017) Salvage surgery after definitive chemoradiotherapy for non-small cell lung cancer. Semin Thorac Cardiovasc Surg 29(2):233–241


  20. Petrella F, Rizzo S, Radice D et al (2011) Predicting prolonged air leak after standard pulmonary lobectomy: computed tomography assessment and risk factors stratification. Surgeon 9(2):72–77


  21. Rizzo S, Radice D, Femia M et al (2018) Metastatic and non-metastatic lymph nodes: quantification and different distribution of iodine uptake assessed by dual-energy CT. Eur Radiol 28(2):760–769


  22. Gao JW, Rizzo S, Ma LH et al (2017) Pulmonary ground-glass opacity: computed tomography features, histopathology and molecular pathology. Transl Lung Cancer Res 6(1):68–75


  23. Zabel AOJ, Leschka S, Wildermuth S, Hodler J, Dietrich TJ (2020) Subspecialized radiological reporting reduces radiology report turnaround time. Insights Imaging 11(1):114. https://doi.org/10.1186/s13244-020-00917-z



Acknowledgements

Statistical analysis was performed by Marilù Garo. Language editing was performed by Susan West.

Funding

None.

Author information

Authors and Affiliations

Authors

Contributions

SR and FDG were involved in conceptualization, data curation, methodology, supervision, visualization, original draft and editing, resources; GMN helped in methodology, data curation; MDG, VE, and AS contributed to methodology, data curation, original draft editing. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Stefania Rizzo.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1: Table S1. Effect size and power of the post hoc analysis.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.


About this article


Cite this article

Rizzo, S., Del Grande, M., Espeli, V. et al. Do oncologists prefer subspecialty radiology reports? A quality care study. Insights Imaging 12, 64 (2021). https://doi.org/10.1186/s13244-021-01007-4

