Universitätsbibliothek Heidelberg
Status: Bibliographic record

Availability
Location: ---
Copies: ---
heiBIB
 Online resource
Authors: Kleesiek, Jens Philipp [author]
 Kersjes, Benedikt [author]
 Ueltzhöffer, Kai [author]
 Murray, Jacob [author]
 Rother, Carsten [author]
 Köthe, Ullrich [author]
 Schlemmer, Heinz-Peter [author]
Title: Discovering digital tumor signatures - using latent code representations to manipulate and classify liver lesions
Statement of responsibility: Jens Kleesiek, Benedikt Kersjes, Kai Ueltzhöffer, Jacob M. Murray, Carsten Rother, Ullrich Köthe and Heinz-Peter Schlemmer
E-year: 2021
Year: 22 June 2021
Extent: 13 pages
Footnotes: Viewed on 08.09.2021
Source title: Contained in: Cancers
Source place: Basel : MDPI, 2009
Source year: 2021
Source volume/issue: 13(2021), 13, Article ID 3108, pages 1-13
Source ISSN: 2072-6694
Abstract: Simple Summary: We use a generative deep learning paradigm for the identification of digital signatures in radiological imaging data. The model is trained on a small in-house data set and evaluated on publicly available data. Apart from using the learned signatures for the characterization of lesions, in analogy to radiomics features, we also demonstrate that by manipulating them we can create realistic synthetic CT image patches. This generation of synthetic data can be carried out at user-defined spatial locations. Moreover, the discrimination of liver lesions from normal liver tissue can be achieved with high accuracy, sensitivity, and specificity. Modern generative deep learning (DL) architectures allow for unsupervised learning of latent representations that can be exploited in several downstream tasks. Within the field of oncological medical imaging, we term these latent representations "digital tumor signatures" and hypothesize that they can be used, in analogy to radiomics features, to differentiate between lesions and normal liver tissue. Moreover, we conjecture that they can be used for the generation of synthetic data, specifically for the artificial insertion and removal of liver tumor lesions at user-defined spatial locations in CT images. Our approach utilizes an implicit autoencoder, an unsupervised model architecture that combines an autoencoder and two generative adversarial network (GAN)-like components. The model was trained on liver patches from 25 or 57 in-house abdominal CT scans, depending on the experiment, demonstrating that only minimal data is required for synthetic image generation. The model was evaluated on a publicly available data set of 131 scans. We show that a PCA embedding of the latent representation captures the structure of the data, providing the foundation for the targeted insertion and removal of tumor lesions. To assess the quality of the synthetic images, we conducted two experiments with five radiologists.
In experiment 1, only one rater and the ensemble rater were marginally above chance level in distinguishing real from synthetic data; in the second experiment, no rater was above chance level. To illustrate that the "digital signatures" can also be used to differentiate lesions from normal tissue, we employed several machine learning methods. The best-performing method, a LinearSVM, obtained 95% (97%) accuracy, 94% (95%) sensitivity, and 97% (99%) specificity, depending on whether all data or only normal-appearing patches were used for training of the implicit autoencoder. Overall, we demonstrate that the proposed unsupervised learning paradigm can be utilized for the removal and insertion of liver lesions at user-defined spatial locations and that the digital signatures can be used to discriminate between lesions and normal liver tissue in abdominal CT scans.
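The downstream classification pipeline described in the abstract (latent "digital tumor signatures" → PCA embedding → LinearSVM) can be sketched as follows. This is only a minimal illustration: the implicit autoencoder, its CT training data, and the latent dimensionality are not reproduced here, so randomly generated stand-in latent codes and all names and dimensions below are assumptions, not the authors' implementation.

```python
# Sketch of the latent-code pipeline: PCA embedding of latent representations
# plus a LinearSVM lesion-vs-normal classifier. The latent codes are synthetic
# stand-ins (assumption); in the paper they come from a trained implicit
# autoencoder applied to liver CT patches.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Stand-in latent codes: 200 "normal liver" and 200 "lesion" patches,
# 64-dimensional (latent size is an assumption).
latent_dim = 64
normal = rng.normal(loc=0.0, scale=1.0, size=(200, latent_dim))
lesion = rng.normal(loc=1.5, scale=1.0, size=(200, latent_dim))
X = np.vstack([normal, lesion])
y = np.array([0] * 200 + [1] * 200)  # 0 = normal tissue, 1 = lesion

# Low-dimensional PCA embedding of the latent representation, analogous to
# the embedding the paper uses to expose the structure of the data.
X_emb = PCA(n_components=2).fit_transform(X)

# LinearSVM on the latent codes, the paper's best-performing classifier family.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)
clf = LinearSVC(dual=False).fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
print(f"PCA embedding shape: {X_emb.shape}, test accuracy: {acc:.2f}")
```

Because the stand-in classes are well separated in latent space, the sketch reaches high accuracy; with real autoencoder codes, performance depends on how cleanly lesions separate from normal tissue in the learned representation.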
DOI: 10.3390/cancers13133108
URL: Please note: This is a bibliographic record. Full-text access for members of the university is available here only if there is a subscription for the corresponding journal/edited volume, or if the title is Open Access.

Full text ; publisher: https://doi.org/10.3390/cancers13133108
 Full text: https://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcAuth=DynamicDOIArticle&SrcApp=WOS&KeyAID=10.3390%2 ...
 DOI: https://doi.org/10.3390/cancers13133108
Medium: Online resource
Language: eng
Subject keywords: classification
 diagnosis
 latent code
 machine learning
 synthetic image generation
 unsupervised learning
K10plus-PPN: 176968252X
Links: → Journal

Permanent link to this title (bookmarkable): https://katalog.ub.uni-heidelberg.de/titel/68777662