%0 Journal Article
%@ 0018-9294
%A Yung, Ka-Wai
%A Sivaraj, Jayaram
%A De Coppi, Paolo
%A Stoyanov, Danail
%A Loukogeorgakis, Stavros
%A Mazomenos, Evangelos B
%D 2024
%F discovery:10193306
%I Institute of Electrical and Electronics Engineers (IEEE)
%J IEEE Transactions on Biomedical Engineering
%K Necrotizing Enterocolitis, Fine Grained Visual Classification, Abdominal X-ray
%N 11
%P 3160-3169
%T Diagnosing Necrotising Enterocolitis Via Fine-Grained Visual Classification
%U https://discovery.ucl.ac.uk/id/eprint/10193306/
%V 71
%X Necrotizing Enterocolitis (NEC) is a devastating condition affecting prematurely born neonates. Reviewing Abdominal X-rays (AXRs) is a key step in NEC diagnosis, staging and treatment decision-making, but poses significant challenges due to the subtle, difficult-to-identify radiological signs of the disease. In this paper, we propose AIDNEC - AI Diagnosis of NECrotizing enterocolitis, a deep learning method to automatically detect and stratify the severity (surgical or medical) of NEC from no pathology in AXRs. The model is trainable end-to-end and integrates a Detection Transformer and Graph Convolution modules for localizing discriminative areas in AXRs, used to formulate subtle local embeddings. These are then combined with global image features to perform Fine-Grained Visual Classification (FGVC). We evaluate AIDNEC on our GOSH NEC dataset of 1153 images from 334 patients, achieving 79.7% accuracy in classifying NEC against No Pathology. AIDNEC outperforms the backbone by 2.6%, FGVC models by 2.5% and CheXNet by 4.2%, with statistically significant (two-tailed p < 0.05) improvements, while providing meaningful discriminative regions to support the classification decision. Additional validation on the publicly available Chest X-ray14 dataset yields comparable performance to state-of-the-art methods, illustrating AIDNEC's robustness in a different X-ray classification task. Dataset and source code will be released in our institutional database.
%Z This version is the author accepted manuscript. For information on re-use, please refer to the publisher's terms and conditions.