Editorial
Artificial Intelligence: The Good, the Bad and the Ugly
Authors: Aamer Chughtai
DOI: https://doi.org/10.37184/nrjp.3007-5181.1.19
Year: 2026
Volume: 2
Received: Oct 21, 2024
Revised: Dec 17, 2024
Accepted: Dec 24, 2024
Corresponding Author: Aamer Chughtai (chughta@ccf.org)
All articles are published under the Creative Commons Attribution License
Aamer Chughtai1*
1Cleveland Clinic, Cleveland, OH, USA
The last decade has seen an explosion of applications of artificial intelligence (AI) tools in almost all fields of medicine, and particularly in diagnostic radiology. The understanding of these tools, however, is unfortunately not as widespread as one would expect. This relative lack of understanding has created a 'fear of the unknown' in many end users of imaging technology. Notwithstanding this fear, there is no question that the integration of AI in clinical and imaging fields is already well underway. For example, its use is being investigated in the early detection of lung cancers and prediction of invasiveness of adenocarcinomas in situ [1, 2], automated segmentation and atheroma evaluation in coronary arteries [3], and deep learning networks for segmentation of pathologic nuclei in breast cancers, to name a few [4].
A chest X‐ray (CXR) remains the most common first‐line imaging tool for thoracic assessment worldwide. Even with experienced radiologists and technological advancements in chest radiography, reported error rates for CXR interpretation have remained high due to factors such as image quality, workload, and fatigue. The reduction of interpretive errors is therefore one of the major forces driving interest in machine learning‐driven diagnostic tools to facilitate CXR interpretation [5]. For example, applications are being developed that use artificial neural networks (ANNs) to extract features and assist in detecting pneumonia and lung cancers on plain chest radiographs, which can sometimes be difficult to identify [6, 7]. AI tools can assess an image for all target findings without being affected by modifying factors such as clinical presentation, complex anatomy, or suboptimal acquisition parameters. AI tools are also unaffected by variation in radiologists' experience and are therefore highly consistent.
Although AI tools have improved the efficiency and throughput of diagnostic imaging studies, radiologists' knowledge of AI terminology, principles, and applications is necessary for their optimal use. A survey by Rainey et al. demonstrated a lack of knowledge, skills, and confidence among radiologists, medical staff, and students, which hinders realization of the full potential of these tools [8]. This will inevitably improve with time as these tools become more widely available.
That is all well and good. Now for the bad and the ugly. Two major issues are associated with the use of AI tools, particularly in diagnostic imaging. Firstly, there is the fear that AI will replace radiologists or even eliminate the discipline of diagnostic radiology. While machine learning will continue to be developed and to evolve over the next several years, Chan et al. suggest that this may not happen any time soon, not even in our or our current radiology trainees' lifetimes [9]. Despite all the advances, there is almost no possibility that radiologists will be replaced by AI. An automated image interpretation system will only use the information provided to it, which, as all radiologists know, is often minimal, misleading, or absent altogether. An AI tool will not pick up a phone and call the referring clinician to discuss the clinical scenario, the question posed, and management strategies, as we normally do. This is unlikely to be replicated by machine learning.
Secondly, the effect of AI tools in diagnostic imaging on the training of radiology residents raises many questions. A scenario in which the subtle and not-so-subtle abnormalities on a plain radiograph are picked up and auto-generated into a report for the radiology trainee to sign may have grave implications for training. We learn from interpreting imaging studies in light of clinical information, experience gained from previous interpretations, formal and informal teaching, and reading the literature. We gain experience from reading studies day in and day out, using our natural neural networks to build a so-called 'image memory', and use it to make meaningful image interpretations, discuss results with clinical teams, and teach junior residents. If imaging studies are auto-interpreted and a report is generated, the trainee is left to simply hit the sign button, without paying much attention to the image itself. And then there is the question of false positive and false negative interpretations by AI. Continual refinement of AI tools is therefore necessary to improve sensitivity and specificity, and care must be taken not to detrimentally affect trainee learning opportunities.
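The trade-off between false negatives and false positives mentioned above is captured by the sensitivity and specificity of an AI interpretation tool. As a minimal illustration (the counts below are hypothetical, not drawn from any cited study), these two metrics can be computed from a confusion matrix as follows:

```python
# Illustrative sketch only: sensitivity and specificity of a hypothetical
# AI chest-radiograph reader, computed from confusion-matrix counts.

def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    """Return (sensitivity, specificity) for binary interpretations.

    Sensitivity = TP / (TP + FN): fraction of truly abnormal studies flagged.
    Specificity = TN / (TN + FP): fraction of truly normal studies cleared.
    """
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical reads: 90 detected abnormalities, 10 missed findings
# (false negatives), 950 correctly cleared normals, 50 false alarms.
sens, spec = sensitivity_specificity(tp=90, fn=10, tn=950, fp=50)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
# sensitivity=0.90, specificity=0.95
```

A tool refined to raise sensitivity (fewer missed findings) will typically flag more borderline studies and lower its specificity, which is why continual refinement, rather than a one-time validation, is needed.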
I believe establishing a robust search pattern is the key to becoming a safe and confident radiologist. Trainees must learn to compose an effective, unambiguous report that concisely describes the findings and concludes with impressions that are actionable and address the clinical question. The development of these faculties may be negatively affected by the routine use of auto-interpretation and reporting. I am not sure how that will play out on a wider scale or what impact it may have on the quality of trainees in the coming years, but I venture to say it may not be good.
As I write this, work on AI in medicine continues. Pick up a radiology journal and the majority of articles describe, in some shape or form, an AI application in one field of diagnostic imaging or another, and that is surely the best part.
REFERENCES
1. Wang Y, Zhou C, Ying L, Chan HP, Lee E, Chughtai A, et al. Enhancing early lung cancer diagnosis: predicting lung nodule progression in follow-up low-dose CT scan with deep generative model. Cancers (Basel) 2024; 16(12): 2229. DOI: https://doi.org/10.3390/cancers16122229
2. Wang Y, Zhou C, Ying L, Lee E, Chan HP, Chughtai A, et al. Leveraging serial low-dose CT scans in radiomics-based reinforcement learning to improve early diagnosis of lung cancer at baseline screening. Radiol Cardiothorac Imaging 2024; 6(3): e230196. DOI: https://doi.org/10.1148/ryct.230196
3. Hadjiiski L, Zhou C, Chan HP, Chughtai A, Agarwal P, Kuriakose J, et al. Coronary CT angiography (cCTA): automated registration of coronary arterial trees from multiple phases. Phys Med Biol 2014; 59(16): 4661-80. DOI: https://doi.org/10.1088/0031-9155/59/16/4661
4. Zhou C, Chan HP, Hadjiiski LM, Chughtai A. Recursive training strategy for a deep learning network for segmentation of pathology nuclei with incomplete annotation. IEEE Access 2022; 10: 49337-46. DOI: https://doi.org/10.1109/access.2022.3172958
5. White CS, Salis AI, Meyer CA. Missed lung cancer on chest radiography and computed tomography: imaging and medicolegal issues. J Thorac Imaging. 1999; 14(1): 63-8. DOI: https://doi.org/10.1097/00005382-199901000-00006
6. Gupta A, Padsala M, Saikia P. Detection of pneumonia from chest X-ray images using transfer learning on deep CNN. 4th International Conference for Emerging Technology (INCET), Belgaum, India 2023; pp. 1-6. DOI: https://doi.org/10.1109/INCET57972.2023.10170454
7. Humayun M, Sujatha R, Almuayqil SN, Jhanjhi NZ. A transfer learning approach with a convolutional neural network for the classification of lung carcinoma. Healthcare (Basel) 2022; 10(6): 1058. DOI: https://doi.org/10.3390/healthcare10061058
8. Rainey C, O'Regan T, Matthew J, Skelton E, Woznitza N, Chu KY, et al. Beauty is in the AI of the beholder: are we ready for the clinical integration of artificial intelligence in radiography? An exploratory analysis of perceived AI knowledge, skills, confidence, and education perspectives of UK radiographers. Front Digit Health 2021; 3: 739327. DOI: https://doi.org/10.3389/fdgth.2021.739327
9. Chan S, Siegel EL. Will machine learning end the viability of radiology as a thriving medical specialty? Br J Radiol 2019; 92(1094): 20180416. DOI: https://doi.org/10.1259/bjr.20180416