Fusing learned representations from Riesz Filters and Deep CNN for lung tissue classification.
Details
Serval ID
serval:BIB_F78834B85B25
Type
Article: article from journal or magazine.
Collection
Publications
Institution
Title
Fusing learned representations from Riesz Filters and Deep CNN for lung tissue classification.
Journal
Medical image analysis
ISSN
1361-8423 (Electronic)
ISSN-L
1361-8415
Publication state
Published
Issued date
08/2019
Peer-reviewed
Yes
Volume
56
Pages
172-183
Language
English
Notes
Publication types: Journal Article ; Research Support, Non-U.S. Gov't
Publication Status: ppublish
Abstract
A novel method to detect and classify several classes of diseased and healthy lung tissue in CT (Computed Tomography), based on the fusion of Riesz and deep learning features, is presented. First, discriminative parametric lung tissue texture signatures are learned from Riesz representations using a one-versus-one approach. The signatures are generated for four diseased tissue types and a healthy tissue class, all of which frequently appear in the publicly available Interstitial Lung Diseases (ILD) dataset used in this article. Because the Riesz wavelets are steerable, they can easily be made invariant to local image rotations, a property that is desirable when analyzing lung tissue micro-architectures in CT images. Second, features from deep Convolutional Neural Networks (CNN) are computed by fine-tuning the Inception V3 architecture using an augmented version of the same ILD dataset. Because CNN features are both deep and non-parametric, they can accurately model virtually any pattern that is useful for tissue discrimination, and they are the de facto standard for many medical imaging tasks. However, invariance to local image rotations is not explicitly implemented and can only be approximated with rotation-based data augmentation. This motivates the fusion of Riesz and deep CNN features, as the two techniques are very complementary. The two learned representations are combined in a joint softmax model for final classification, where early and late feature fusion schemes are compared. The experimental results show that a late fusion of the independent probabilities leads to significant improvements in classification performance when compared to each of the separate feature representations and also compared to an ensemble of deep learning approaches.
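The late fusion of independent class probabilities described above can be sketched in a few lines. This is a minimal illustration only, assuming a simple weighted average of per-class probabilities; the paper's actual joint softmax model may differ, and all function and variable names here are hypothetical.

```python
import numpy as np

def late_fusion(p_riesz, p_cnn, w=0.5):
    """Fuse per-class probabilities from two classifiers by weighted
    averaging, then renormalize. Hypothetical sketch of late fusion."""
    p_riesz = np.asarray(p_riesz, dtype=float)
    p_cnn = np.asarray(p_cnn, dtype=float)
    fused = w * p_riesz + (1.0 - w) * p_cnn
    return fused / fused.sum(axis=-1, keepdims=True)

# Five tissue classes: healthy plus four diseased patterns (per the abstract).
p_riesz = [0.10, 0.60, 0.10, 0.10, 0.10]  # output of the Riesz texture classifier
p_cnn   = [0.05, 0.35, 0.40, 0.10, 0.10]  # output of the fine-tuned Inception V3
fused = late_fusion(p_riesz, p_cnn)
pred = int(np.argmax(fused))  # index of the predicted tissue class
```

In this toy example the two classifiers disagree on the top class, and the fused distribution resolves the conflict; the weight `w` could in practice be tuned on a validation split.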
Keywords
Datasets as Topic, Deep Learning, Humans, Lung Diseases/classification, Lung Diseases/diagnostic imaging, Radiographic Image Interpretation, Computer-Assisted/methods, Reproducibility of Results, Tomography, X-Ray Computed, Classification, Deep learning, ILD, Texture signatures
Pubmed
Web of science
Create date
29/08/2023 7:44
Last modification date
10/10/2023 13:41