SARS-Net: COVID-19 detection from chest x-rays by combining graph convolutional network and convolutional neural network.

Authors
  • Kumar, Aayush (1)
  • Tripathi, Ayush R (1)
  • Satapathy, Suresh Chandra (1)
  • Zhang, Yu-Dong (2)
  • (1) School of Computer Engineering, Kalinga Institute of Industrial Technology (Deemed to be University), Bhubaneswar, Odisha 751024, India
  • (2) Department of Informatics, University of Leicester, Leicester LE1 7RH, UK
Type
Published Article
Journal
Pattern Recognition
Publisher
Elsevier
Publication Date
Feb 01, 2022
Volume
122
Pages
108255
Identifiers
DOI: 10.1016/j.patcog.2021.108255
PMID: 34456369
Source
Medline
Language
English
License
Unknown

Abstract

COVID-19 has emerged as one of the deadliest pandemics humanity has ever faced. Screening tests are currently the most reliable and accurate way to detect severe acute respiratory syndrome coronavirus in a patient, with RT-PCR testing the most widely used. Various researchers and early studies suggested that visual indicators (abnormalities) in a patient's chest X-ray (CXR) or computed tomography (CT) imaging are a valuable characteristic of COVID-19 that can be leveraged to detect the virus across a large population. Motivated by the many open-source contributions made to tackle the COVID-19 pandemic, we introduce SARS-Net, a CADx system that combines Graph Convolutional Networks and Convolutional Neural Networks to detect abnormalities in a patient's CXR images indicating the presence of COVID-19 infection. In this paper, we introduce and evaluate the performance of this custom deep learning architecture, SARS-Net, for classifying chest X-ray images for COVID-19 diagnosis. Quantitative analysis shows that the proposed model achieves higher accuracy than previously reported state-of-the-art methods: it achieved an accuracy of 97.60% and a sensitivity of 92.90% on the validation set. © 2021 Elsevier Ltd. All rights reserved.
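
Illustrative sketch (not the authors' method): the record only states that SARS-Net combines a CNN with graph convolutions; the exact architecture is not described here. The minimal PyTorch sketch below shows one common way to wire such a combination, under stated assumptions — the small convolutional backbone, the fully connected grid graph over feature-map cells, the single GCN layer, and the mean-pooled readout are all hypothetical choices for illustration only.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleGCNLayer(nn.Module):
    """One graph-convolution step: H' = ReLU(A_hat @ H @ W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, h, a_hat):
        # h: (batch, nodes, in_dim); a_hat: (nodes, nodes), pre-normalized adjacency
        return F.relu(torch.einsum("ij,bjf->bif", a_hat, self.linear(h)))

class CnnGcnClassifier(nn.Module):
    """Hypothetical CNN + GCN classifier for grayscale CXR images (2-class output)."""
    def __init__(self, num_classes=2, grid=7):
        super().__init__()
        # Toy CNN backbone; SARS-Net's actual backbone is not specified in this record.
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(grid),          # -> (batch, 64, grid, grid)
        )
        n_nodes = grid * grid
        # Assumed graph: fully connected over the grid cells, row-normalized.
        a = torch.ones(n_nodes, n_nodes)
        self.register_buffer("a_hat", a / a.sum(dim=1, keepdim=True))
        self.gcn = SimpleGCNLayer(64, 64)
        self.head = nn.Linear(64, num_classes)

    def forward(self, x):
        feat = self.backbone(x)                  # (batch, 64, grid, grid)
        nodes = feat.flatten(2).transpose(1, 2)  # (batch, grid*grid, 64) node features
        nodes = self.gcn(nodes, self.a_hat)      # message passing over the grid graph
        return self.head(nodes.mean(dim=1))      # pooled graph readout -> class logits

if __name__ == "__main__":
    model = CnnGcnClassifier()
    logits = model(torch.randn(2, 1, 224, 224))  # two CXR-sized grayscale inputs
    print(logits.shape)                          # torch.Size([2, 2])

The design choice sketched here — treating CNN feature-map cells as graph nodes and running message passing over them before a pooled classification head — is one standard way to fuse convolutional and graph-convolutional processing; the paper itself should be consulted for SARS-Net's actual graph construction and training details.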
