| Online resource |
Author(s): | Li, Hao [author] |
| Ghamisi, Pedram [author] |
| Rasti, Behnood [author] |
| Wu, Zhaoyan [author] |
| Shapiro, Aurelie [author] |
| Schultz, Michael [author] |
| Zipf, Alexander [author] |
Title: | A multi-sensor fusion framework based on coupled residual convolutional neural networks |
Statement of responsibility: | Hao Li, Pedram Ghamisi, Behnood Rasti, Zhaoyan Wu, Aurelie Shapiro, Michael Schultz and Alexander Zipf |
E-year: | 2020 |
Year: | 26 June 2020 |
Extent: | 21 pages |
Footnotes: | Viewed on 7 September 2020 |
Source title: | Contained in: Remote Sensing |
Source place: | Basel : MDPI, 2009 |
Source year: | 2020 |
Source volume/issue: | 12(2020,12), article number 2067, 21 pages |
Source ISSN: | 2072-4292 |
Abstract: | Multi-sensor remote sensing image classification has been considerably improved by deep learning feature extraction and classification networks. In this paper, we propose a novel multi-sensor fusion framework for the fusion of diverse remote sensing data sources. The novelty of this paper is grounded in three important design innovations: (1) a unique adaptation of the coupled residual networks to address multi-sensor data classification; (2) a smart auxiliary training via adjusting the loss function to address classifications with limited samples; and (3) a unique design of the residual blocks to reduce the computational complexity while preserving the discriminative characteristics of multi-sensor features. The proposed classification framework is evaluated using three different remote sensing datasets: the urban Houston University datasets (including Houston 2013 and the training portion of Houston 2018) and the rural Trento dataset. The proposed framework achieves high overall accuracies of 93.57%, 81.20%, and 98.81% on the Houston 2013, the training portion of Houston 2018, and Trento datasets, respectively. Additionally, the experimental results demonstrate considerable improvements in classification accuracies compared with the existing state-of-the-art methods. |
DOI: | doi:10.3390/rs12122067 |
URL: | Please note: This is a bibliographic entry. Full-text access for members of the university is only available here if there is a subscription for the corresponding journal/edited volume or if it is an open-access title.
Full text ; publisher: https://doi.org/10.3390/rs12122067 |
| Full text: https://www.mdpi.com/2072-4292/12/12/2067 |
| DOI: https://doi.org/10.3390/rs12122067 |
Carrier: | Online resource |
Language: | eng |
Subject keywords: | auxiliary loss function |
| convolutional neural networks (CNNs) |
| data fusion |
| deep learning |
| hyperspectral image classification |
| multi-sensor fusion |
| residual learning |
K10plus-PPN: | 1728955556 |
Links: | → Journal |
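The abstract above outlines three design elements: coupled residual CNN branches for the different sensors, auxiliary supervision folded into the loss function, and lightweight residual blocks. The PyTorch sketch below illustrates that general pattern only; it is not the authors' implementation, and every class name, layer width, channel count, patch size, and the aux_weight value are illustrative assumptions.

```python
# Hypothetical illustration only: two coupled residual CNN branches whose features are
# fused for classification, trained with a main loss plus auxiliary per-branch losses.
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    """A lightweight residual block: two 3x3 convolutions with an identity skip connection."""

    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)  # skip connection preserves the input features


class CoupledResidualFusionNet(nn.Module):
    """Two sensor-specific branches (e.g., hyperspectral and LiDAR patches) fused before the classifier."""

    def __init__(self, in_ch_a, in_ch_b, num_classes, width=64):
        super().__init__()
        self.branch_a = nn.Sequential(
            nn.Conv2d(in_ch_a, width, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            ResidualBlock(width),
        )
        self.branch_b = nn.Sequential(
            nn.Conv2d(in_ch_b, width, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            ResidualBlock(width),
        )
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.head_a = nn.Linear(width, num_classes)           # auxiliary head for branch A
        self.head_b = nn.Linear(width, num_classes)           # auxiliary head for branch B
        self.head_fused = nn.Linear(2 * width, num_classes)   # main head on fused features

    def forward(self, x_a, x_b):
        feat_a = self.pool(self.branch_a(x_a)).flatten(1)
        feat_b = self.pool(self.branch_b(x_b)).flatten(1)
        fused = torch.cat([feat_a, feat_b], dim=1)
        return self.head_fused(fused), self.head_a(feat_a), self.head_b(feat_b)


# One illustrative training step with dummy patch data (144 hyperspectral bands, 1 LiDAR band,
# 11x11 patches, 15 classes); these sizes are assumptions, not values from the paper.
model = CoupledResidualFusionNet(in_ch_a=144, in_ch_b=1, num_classes=15)
criterion = nn.CrossEntropyLoss()
x_a = torch.randn(8, 144, 11, 11)
x_b = torch.randn(8, 1, 11, 11)
labels = torch.randint(0, 15, (8,))

logits_fused, logits_a, logits_b = model(x_a, x_b)
aux_weight = 0.3  # illustrative weighting of the auxiliary terms
loss = criterion(logits_fused, labels) + aux_weight * (
    criterion(logits_a, labels) + criterion(logits_b, labels)
)
loss.backward()
```

In this sketch each branch also receives its own auxiliary cross-entropy term, which mirrors, in spirit, the abstract's idea of adjusting the loss function to support training with limited samples.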