Publication

Representation Learning with Information Theory to Detect COVID-19 and its Severity

Book Contribution - Chapter

Successful data representation is a fundamental factor in machine-learning-based medical imaging analysis, and Deep Learning (DL) has taken an essential role in robust representation learning. However, deep models can quickly overfit intricate patterns and thus fail to generalize to unseen data; hence the importance of strategies that help deep models discover useful priors and learn the intrinsic properties of the data. Our model, which we call a dual role network (DRN), uses a dependency-maximization approach based on Least-Squares Mutual Information (LSMI). LSMI leverages dependency measures to ensure representation invariance and local smoothness. While prior works have used information-theoretic dependency measures such as mutual information, these are known to be computationally expensive because of the density-estimation step. In contrast, our proposed DRN with the LSMI formulation does not require density estimation and can be used as an alternative to approximate mutual information. Experiments on the CT-based COVID-19 Detection and COVID-19 Severity Detection Challenges of the 2nd COV19D competition [24] demonstrate the effectiveness of our method compared to the competition's baseline method.
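To illustrate why LSMI avoids density estimation, the generic least-squares mutual information estimator can be sketched as follows: the density ratio r(x, y) = p(x, y) / (p(x)p(y)) is fitted directly with Gaussian basis functions via a closed-form least-squares solve, and the squared-loss mutual information is then read off from the fitted ratio. This is a minimal, hypothetical NumPy sketch of the standard LSMI procedure, not the chapter's actual DRN code; the kernel centres, bandwidth `sigma`, and regularizer `lam` are illustrative assumptions.

```python
import numpy as np

def lsmi(x, y, n_basis=50, sigma=1.0, lam=1e-3, seed=0):
    """Sketch of the least-squares mutual information (LSMI) estimator.

    Fits the density ratio r(x, y) = p(x, y) / (p(x) p(y)) with Gaussian
    basis functions and a closed-form ridge-regularized least-squares
    solution; no explicit density estimation is performed.
    """
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    n = len(x)

    # Basis centres: a random subset of the paired samples (an assumption).
    rng = np.random.default_rng(seed)
    idx = rng.choice(n, size=min(n_basis, n), replace=False)
    ux, uy = x[idx], y[idx]

    # Gaussian kernel matrices, shape (n, b).
    Kx = np.exp(-((x[:, None, :] - ux[None, :, :]) ** 2).sum(-1)
                / (2 * sigma ** 2))
    Ky = np.exp(-((y[:, None, :] - uy[None, :, :]) ** 2).sum(-1)
                / (2 * sigma ** 2))

    # H averages over all cross pairs (product of marginals);
    # h averages over the paired samples (the joint distribution).
    H = (Kx.T @ Kx) * (Ky.T @ Ky) / n ** 2
    h = (Kx * Ky).mean(axis=0)

    # Closed-form ridge solution for the ratio-model coefficients.
    alpha = np.linalg.solve(H + lam * np.eye(H.shape[0]), h)

    # Squared-loss mutual information estimate: (1/2) E_joint[r] - 1/2.
    return 0.5 * h @ alpha - 0.5
```

On strongly dependent pairs (e.g. y = x + small noise) this estimate is clearly larger than on independent samples, which is the dependency signal a dependency-maximization objective would drive upward.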
Book: Lecture Notes in Computer Science
Series: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 13807
Pages: 605-620
Number of pages: 16
ISBN: 978-3-031-25081-1
Publication year: 2023
Keywords: Representation learning, mutual information, COVID-19 detection
  • DOI: https://doi.org/10.1007/978-3-031-25082-8_41
  • Scopus Id: 85150997814
  • ORCID: /0000-0002-1774-2970/work/135087026
  • ORCID: /0000-0001-9300-5860/work/135083561
  • ORCID: /0000-0002-7707-1136/work/135094229
  • ORCID: /0000-0002-9442-612X/work/135095461
  • ORCID: /0000-0001-5714-3254/work/135096633
  • ORCID: /0000-0003-4970-6517/work/135098640
  • ORCID: /0000-0001-5127-2573/work/135099520
Accessibility: Closed