Deep&Dense Convolutional Neural Network for Hyperspectral Image Classification
Deep neural networks (DNNs) have emerged as a relevant tool for the classification of remotely sensed hyperspectral images (HSIs), with convolutional neural networks (CNNs) being the current state-of-the-art in many classification tasks. However, deep CNNs present several limitations in the context...
Detailed description

Author(s): Mercedes E. Paoletti [author], Juan M. Haut [author], Javier Plaza [author], Antonio Plaza [author]
Format: E-Article
Language: English
Published: 2018
Subjects:
Published in: Remote Sensing - MDPI AG, 2009, 10(2018), 9, p 1454
Published in: volume:10 ; year:2018 ; number:9, p 1454
Links:
DOI / URN: 10.3390/rs10091454
Catalog ID: DOAJ01854844X
LEADER 01000caa a22002652 4500
001    DOAJ01854844X
003    DE-627
005    20230310101041.0
007    cr uuu---uuuuu
008    230226s2018 xx |||||o 00| ||eng c
024 7  |a 10.3390/rs10091454 |2 doi
035    |a (DE-627)DOAJ01854844X
035    |a (DE-599)DOAJ9222da9febaf4ffeb0ea14be3564074b
040    |a DE-627 |b ger |c DE-627 |e rakwb
041    |a eng
100 0  |a Mercedes E. Paoletti |e verfasserin |4 aut
245 10 |a Deep&Dense Convolutional Neural Network for Hyperspectral Image Classification
264  1 |c 2018
336    |a Text |b txt |2 rdacontent
337    |a Computermedien |b c |2 rdamedia
338    |a Online-Ressource |b cr |2 rdacarrier
520    |a Deep neural networks (DNNs) have emerged as a relevant tool for the classification of remotely sensed hyperspectral images (HSIs), with convolutional neural networks (CNNs) being the current state of the art in many classification tasks. However, deep CNNs present several limitations in the context of supervised HSI classification. Although deep models are able to extract better and more abstract features, the number of parameters that must be fine-tuned requires a large amount of training data (using small learning rates) in order to avoid the overfitting and vanishing gradient problems. The acquisition of labeled data is expensive and time-consuming, and small learning rates force gradient descent to take many small steps to converge, slowing down the runtime of the model. To mitigate these issues, this paper introduces a new deep CNN framework for spectral-spatial classification of HSIs. The proposed framework introduces shortcut connections between layers, in which the feature maps of lower layers are used as inputs to the current layer, which in turn feeds its own output to all subsequent layers. This combination of spectral-spatial features across layers enhances the generalization ability of the network on HSIs. Experimental results with four well-known HSI datasets reveal that the proposed deep&dense CNN model provides competitive advantages in terms of classification accuracy when compared to other state-of-the-art methods for HSI classification.
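The dense connectivity the abstract describes, where each layer receives the feature maps of all earlier layers as input and passes its own output to every later layer, can be sketched in a few lines. The `dense_forward` helper and the toy lambda "layers" below are illustrative assumptions for exposition, not the paper's actual architecture or code:

```python
# Toy sketch of dense connectivity: each "layer" consumes the
# concatenation of the input and every previous layer's feature maps.
# (Real dense CNNs concatenate tensors along the channel axis; here
# feature maps are just flat lists of numbers.)

def dense_forward(x, layers):
    """x: list of feature values; layers: callables mapping a
    concatenated feature list to a new feature list."""
    features = [x]                      # running collection of feature maps
    for layer in layers:
        concat = [v for fmap in features for v in fmap]  # "channel" concat
        features.append(layer(concat))  # new map sees all earlier maps
    # the final representation aggregates every feature map produced
    return [v for fmap in features for v in fmap]

# toy "convolutions" (hypothetical): scale-and-sum, to show how the
# input to each layer grows as earlier feature maps accumulate
layer1 = lambda f: [sum(f)]        # sees the 2 input values
layer2 = lambda f: [sum(f) * 0.5]  # sees input + layer1's map (3 values)

out = dense_forward([1.0, 2.0], [layer1, layer2])
```

The point of the sketch is the reuse pattern: because every layer's output stays reachable by all later layers, gradients have short paths back to early layers, which is what lets such networks train with limited labeled HSI data.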
650  4 |a hyperspectral images (HSIs)
650  4 |a convolutional neural networks (CNNs)
650  4 |a dense convolutions
653  0 |a Science
653  0 |a Q
700 0  |a Juan M. Haut |e verfasserin |4 aut
700 0  |a Javier Plaza |e verfasserin |4 aut
700 0  |a Antonio Plaza |e verfasserin |4 aut
773 08 |i In |t Remote Sensing |d MDPI AG, 2009 |g 10(2018), 9, p 1454 |w (DE-627)608937916 |w (DE-600)2513863-7 |x 20724292 |7 nnns
773 18 |g volume:10 |g year:2018 |g number:9, p 1454
856 40 |u https://doi.org/10.3390/rs10091454 |z kostenfrei
856 40 |u https://doaj.org/article/9222da9febaf4ffeb0ea14be3564074b |z kostenfrei
856 40 |u http://www.mdpi.com/2072-4292/10/9/1454 |z kostenfrei
856 42 |u https://doaj.org/toc/2072-4292 |y Journal toc |z kostenfrei
912    |a GBV_USEFLAG_A
912    |a SYSFLAG_A
912    |a GBV_DOAJ
912    |a GBV_ILN_20
912    |a GBV_ILN_22
912    |a GBV_ILN_23
912    |a GBV_ILN_24
912    |a GBV_ILN_39
912    |a GBV_ILN_40
912    |a GBV_ILN_60
912    |a GBV_ILN_62
912    |a GBV_ILN_63
912    |a GBV_ILN_65
912    |a GBV_ILN_69
912    |a GBV_ILN_70
912    |a GBV_ILN_73
912    |a GBV_ILN_95
912    |a GBV_ILN_105
912    |a GBV_ILN_110
912    |a GBV_ILN_151
912    |a GBV_ILN_161
912    |a GBV_ILN_170
912    |a GBV_ILN_206
912    |a GBV_ILN_213
912    |a GBV_ILN_230
912    |a GBV_ILN_285
912    |a GBV_ILN_293
912    |a GBV_ILN_370
912    |a GBV_ILN_602
912    |a GBV_ILN_2005
912    |a GBV_ILN_2009
912    |a GBV_ILN_2011
912    |a GBV_ILN_2014
912    |a GBV_ILN_2055
912    |a GBV_ILN_2108
912    |a GBV_ILN_2111
912    |a GBV_ILN_2119
912    |a GBV_ILN_4012
912    |a GBV_ILN_4037
912    |a GBV_ILN_4112
912    |a GBV_ILN_4125
912    |a GBV_ILN_4126
912    |a GBV_ILN_4249
912    |a GBV_ILN_4305
912    |a GBV_ILN_4306
912    |a GBV_ILN_4307
912    |a GBV_ILN_4313
912    |a GBV_ILN_4322
912    |a GBV_ILN_4323
912    |a GBV_ILN_4324
912    |a GBV_ILN_4325
912    |a GBV_ILN_4335
912    |a GBV_ILN_4338
912    |a GBV_ILN_4367
912    |a GBV_ILN_4392
912    |a GBV_ILN_4700
951    |a AR
952    |d 10 |j 2018 |e 9, p 1454
author_variant |
m e p mep j m h jmh j p jp a p ap |
matchkey_str |
article:20724292:2018----::epescnouinlerlewrfryeseta |
hierarchy_sort_str |
2018 |
publishDate |
2018 |
language |
English |
source |
In Remote Sensing 10(2018), 9, p 1454 volume:10 year:2018 number:9, p 1454 |
format_phy_str_mv |
Article |
institution |
findex.gbv.de |
topic_facet |
hyperspectral images (HSIs) convolutional neural networks (CNNs) dense convolutions Science Q |
isfreeaccess_bool |
true |
container_title |
Remote Sensing |
authorswithroles_txt_mv |
Mercedes E. Paoletti @@aut@@ Juan M. Haut @@aut@@ Javier Plaza @@aut@@ Antonio Plaza @@aut@@ |
publishDateDaySort_date |
2018-01-01T00:00:00Z |
hierarchy_top_id |
608937916 |
id |
DOAJ01854844X |
language_de |
englisch |
|
author |
Mercedes E. Paoletti |
spellingShingle |
Mercedes E. Paoletti misc hyperspectral images (HSIs) misc convolutional neural networks (CNNs) misc dense convolutions misc Science misc Q Deep&Dense Convolutional Neural Network for Hyperspectral Image Classification |
authorStr |
Mercedes E. Paoletti |
ppnlink_with_tag_str_mv |
@@773@@(DE-627)608937916 |
format |
electronic Article |
delete_txt_mv |
keep |
author_role |
aut aut aut aut |
collection |
DOAJ |
remote_str |
true |
illustrated |
Not Illustrated |
issn |
20724292 |
topic_title |
Deep&Dense Convolutional Neural Network for Hyperspectral Image Classification hyperspectral images (HSIs) convolutional neural networks (CNNs) dense convolutions |
topic |
misc hyperspectral images (HSIs) misc convolutional neural networks (CNNs) misc dense convolutions misc Science misc Q |
topic_unstemmed |
misc hyperspectral images (HSIs) misc convolutional neural networks (CNNs) misc dense convolutions misc Science misc Q |
topic_browse |
misc hyperspectral images (HSIs) misc convolutional neural networks (CNNs) misc dense convolutions misc Science misc Q |
format_facet |
Elektronische Aufsätze Aufsätze Elektronische Ressource |
format_main_str_mv |
Text Zeitschrift/Artikel |
carriertype_str_mv |
cr |
hierarchy_parent_title |
Remote Sensing |
hierarchy_parent_id |
608937916 |
hierarchy_top_title |
Remote Sensing |
isfreeaccess_txt |
true |
familylinks_str_mv |
(DE-627)608937916 (DE-600)2513863-7 |
title |
Deep&Dense Convolutional Neural Network for Hyperspectral Image Classification |
ctrlnum |
(DE-627)DOAJ01854844X (DE-599)DOAJ9222da9febaf4ffeb0ea14be3564074b |
title_full |
Deep&Dense Convolutional Neural Network for Hyperspectral Image Classification |
author_sort |
Mercedes E. Paoletti |
journal |
Remote Sensing |
journalStr |
Remote Sensing |
lang_code |
eng |
isOA_bool |
true |
recordtype |
marc |
publishDateSort |
2018 |
contenttype_str_mv |
txt |
author_browse |
Mercedes E. Paoletti Juan M. Haut Javier Plaza Antonio Plaza |
container_volume |
10 |
format_se |
Elektronische Aufsätze |
author-letter |
Mercedes E. Paoletti |
doi_str_mv |
10.3390/rs10091454 |
author2-role |
verfasserin |
title_sort |
deep&dense convolutional neural network for hyperspectral image classification |
title_auth |
Deep&Dense Convolutional Neural Network for Hyperspectral Image Classification |
abstract |
Deep neural networks (DNNs) have emerged as a relevant tool for the classification of remotely sensed hyperspectral images (HSIs), with convolutional neural networks (CNNs) being the current state-of-the-art in many classification tasks. However, deep CNNs present several limitations in the context of supervised HSI classification. Although deep models are able to extract better and more abstract features, the number of parameters that must be fine-tuned requires a large amount of training data (using small learning rates) in order to avoid the overfitting and vanishing gradient problems. The acquisition of labeled data is expensive and time-consuming, and small learning rates force gradient descent to take many small steps to converge, slowing down the runtime of the model. To mitigate these issues, this paper introduces a new deep CNN framework for spectral-spatial classification of HSIs. Our newly proposed framework introduces shortcut connections between layers, in which the feature maps of earlier layers are used as inputs to the current layer, which in turn feeds its own output to the subsequent layers. This combination of various spectral-spatial features across layers allows us to enhance the generalization ability of the network with HSIs. Our experimental results with four well-known HSI datasets reveal that the proposed deep&dense CNN model is able to provide competitive advantages in terms of classification accuracy when compared to other state-of-the-art methods for HSI classification.
abstractGer |
Deep neural networks (DNNs) have emerged as a relevant tool for the classification of remotely sensed hyperspectral images (HSIs), with convolutional neural networks (CNNs) being the current state-of-the-art in many classification tasks. However, deep CNNs present several limitations in the context of supervised HSI classification. Although deep models are able to extract better and more abstract features, the number of parameters that must be fine-tuned requires a large amount of training data (using small learning rates) in order to avoid the overfitting and vanishing gradient problems. The acquisition of labeled data is expensive and time-consuming, and small learning rates force gradient descent to take many small steps to converge, slowing down the runtime of the model. To mitigate these issues, this paper introduces a new deep CNN framework for spectral-spatial classification of HSIs. Our newly proposed framework introduces shortcut connections between layers, in which the feature maps of earlier layers are used as inputs to the current layer, which in turn feeds its own output to the subsequent layers. This combination of various spectral-spatial features across layers allows us to enhance the generalization ability of the network with HSIs. Our experimental results with four well-known HSI datasets reveal that the proposed deep&dense CNN model is able to provide competitive advantages in terms of classification accuracy when compared to other state-of-the-art methods for HSI classification.
abstract_unstemmed |
Deep neural networks (DNNs) have emerged as a relevant tool for the classification of remotely sensed hyperspectral images (HSIs), with convolutional neural networks (CNNs) being the current state-of-the-art in many classification tasks. However, deep CNNs present several limitations in the context of supervised HSI classification. Although deep models are able to extract better and more abstract features, the number of parameters that must be fine-tuned requires a large amount of training data (using small learning rates) in order to avoid the overfitting and vanishing gradient problems. The acquisition of labeled data is expensive and time-consuming, and small learning rates force gradient descent to take many small steps to converge, slowing down the runtime of the model. To mitigate these issues, this paper introduces a new deep CNN framework for spectral-spatial classification of HSIs. Our newly proposed framework introduces shortcut connections between layers, in which the feature maps of earlier layers are used as inputs to the current layer, which in turn feeds its own output to the subsequent layers. This combination of various spectral-spatial features across layers allows us to enhance the generalization ability of the network with HSIs. Our experimental results with four well-known HSI datasets reveal that the proposed deep&dense CNN model is able to provide competitive advantages in terms of classification accuracy when compared to other state-of-the-art methods for HSI classification.
collection_details |
GBV_USEFLAG_A SYSFLAG_A GBV_DOAJ GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_65 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_95 GBV_ILN_105 GBV_ILN_110 GBV_ILN_151 GBV_ILN_161 GBV_ILN_170 GBV_ILN_206 GBV_ILN_213 GBV_ILN_230 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_2005 GBV_ILN_2009 GBV_ILN_2011 GBV_ILN_2014 GBV_ILN_2055 GBV_ILN_2108 GBV_ILN_2111 GBV_ILN_2119 GBV_ILN_4012 GBV_ILN_4037 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4126 GBV_ILN_4249 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4335 GBV_ILN_4338 GBV_ILN_4367 GBV_ILN_4392 GBV_ILN_4700 |
container_issue |
9, p 1454 |
title_short |
Deep&Dense Convolutional Neural Network for Hyperspectral Image Classification |
url |
https://doi.org/10.3390/rs10091454 https://doaj.org/article/9222da9febaf4ffeb0ea14be3564074b http://www.mdpi.com/2072-4292/10/9/1454 https://doaj.org/toc/2072-4292 |
remote_bool |
true |
author2 |
Juan M. Haut Javier Plaza Antonio Plaza |
author2Str |
Juan M. Haut Javier Plaza Antonio Plaza |
ppnlink |
608937916 |
mediatype_str_mv |
c |
isOA_txt |
true |
hochschulschrift_bool |
false |
doi_str |
10.3390/rs10091454 |
up_date |
2024-07-03T18:31:52.733Z |
_version_ |
1803583763498663936 |
fullrecord_marcxml |
<?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>01000caa a22002652 4500</leader><controlfield tag="001">DOAJ01854844X</controlfield><controlfield tag="003">DE-627</controlfield><controlfield tag="005">20230310101041.0</controlfield><controlfield tag="007">cr uuu---uuuuu</controlfield><controlfield tag="008">230226s2018 xx |||||o 00| ||eng c</controlfield><datafield tag="024" ind1="7" ind2=" "><subfield code="a">10.3390/rs10091454</subfield><subfield code="2">doi</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-627)DOAJ01854844X</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-599)DOAJ9222da9febaf4ffeb0ea14be3564074b</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-627</subfield><subfield code="b">ger</subfield><subfield code="c">DE-627</subfield><subfield code="e">rakwb</subfield></datafield><datafield tag="041" ind1=" " ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="100" ind1="0" ind2=" "><subfield code="a">Mercedes E. 
Paoletti</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Deep&Dense Convolutional Neural Network for Hyperspectral Image Classification</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="c">2018</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="a">Text</subfield><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="a">Computermedien</subfield><subfield code="b">c</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="a">Online-Ressource</subfield><subfield code="b">cr</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield code="a">Deep neural networks (DNNs) have emerged as a relevant tool for the classification of remotely sensed hyperspectral images (HSIs), with convolutional neural networks (CNNs) being the current state-of-the-art in many classification tasks. However, deep CNNs present several limitations in the context of supervised HSI classification. Although deep models are able to extract better and more abstract features, the number of parameters that must be fine-tuned requires a large amount of training data (using small learning rates) in order to avoid the overfitting and vanishing gradient problems. The acquisition of labeled data is expensive and time-consuming, and small learning rates force gradient descent to take many small steps to converge, slowing down the runtime of the model. To mitigate these issues, this paper introduces a new deep CNN framework for spectral-spatial classification of HSIs. 
Our newly proposed framework introduces shortcut connections between layers, in which the feature maps of earlier layers are used as inputs to the current layer, which in turn feeds its own output to the subsequent layers. This combination of various spectral-spatial features across layers allows us to enhance the generalization ability of the network with HSIs. Our experimental results with four well-known HSI datasets reveal that the proposed deep&dense CNN model is able to provide competitive advantages in terms of classification accuracy when compared to other state-of-the-art methods for HSI classification.</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">hyperspectral images (HSIs)</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">convolutional neural networks (CNNs)</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">dense convolutions</subfield></datafield><datafield tag="653" ind1=" " ind2="0"><subfield code="a">Science</subfield></datafield><datafield tag="653" ind1=" " ind2="0"><subfield code="a">Q</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Juan M. 
Haut</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Javier Plaza</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Antonio Plaza</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="773" ind1="0" ind2="8"><subfield code="i">In</subfield><subfield code="t">Remote Sensing</subfield><subfield code="d">MDPI AG, 2009</subfield><subfield code="g">10(2018), 9, p 1454</subfield><subfield code="w">(DE-627)608937916</subfield><subfield code="w">(DE-600)2513863-7</subfield><subfield code="x">20724292</subfield><subfield code="7">nnns</subfield></datafield><datafield tag="773" ind1="1" ind2="8"><subfield code="g">volume:10</subfield><subfield code="g">year:2018</subfield><subfield code="g">number:9, p 1454</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://doi.org/10.3390/rs10091454</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://doaj.org/article/9222da9febaf4ffeb0ea14be3564074b</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">http://www.mdpi.com/2072-4292/10/9/1454</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="856" ind1="4" ind2="2"><subfield code="u">https://doaj.org/toc/2072-4292</subfield><subfield code="y">Journal toc</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_USEFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SYSFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_DOAJ</subfield></datafield><datafield tag="912" ind1=" " ind2=" 
"><subfield code="a">GBV_ILN_20</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_22</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_23</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_24</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_39</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_40</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_60</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_62</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_63</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_65</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_69</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_70</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_73</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_95</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_105</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_110</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_151</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_161</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_170</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_206</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_213</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_230</subfield></datafield><datafield tag="912" 
ind1=" " ind2=" "><subfield code="a">GBV_ILN_285</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_293</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_370</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_602</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2005</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2009</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2011</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2014</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2055</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2108</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2111</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2119</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4012</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4037</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4112</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4125</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4126</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4249</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4305</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4306</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4307</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_4313</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4322</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4323</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4324</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4325</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4335</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4338</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4367</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4392</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4700</subfield></datafield><datafield tag="951" ind1=" " ind2=" "><subfield code="a">AR</subfield></datafield><datafield tag="952" ind1=" " ind2=" "><subfield code="d">10</subfield><subfield code="j">2018</subfield><subfield code="e">9, p 1454</subfield></datafield></record></collection>
|
score |
7.3973246 |
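The dense (shortcut) connectivity described in the record's abstract — each layer receiving the concatenated feature maps of all preceding layers and passing its own output forward — can be illustrated with a minimal, hypothetical Python sketch. This is not the authors' implementation: feature maps are modeled as flat lists of numbers, `concat` stands in for channel-wise concatenation, and `dense_block` and the layer functions are illustrative names only.

```python
def concat(feature_maps):
    # Stand-in for channel-wise concatenation: merge all feature maps
    # (flat lists of floats) into a single flat list.
    merged = []
    for fm in feature_maps:
        merged.extend(fm)
    return merged


def dense_block(x, layers):
    # Dense connectivity: each layer is applied to the concatenation of
    # the input and every preceding layer's output; the block returns the
    # concatenation of all feature maps produced so far.
    features = [x]
    for layer in layers:
        out = layer(concat(features))
        features.append(out)
    return concat(features)


# Toy "layers": each maps a concatenated feature vector to a new feature map.
x = [1.0, 2.0]
layers = [lambda v: [sum(v)],          # produces [3.0] from [1.0, 2.0]
          lambda v: [float(len(v))]]   # produces [3.0] from [1.0, 2.0, 3.0]
result = dense_block(x, layers)        # → [1.0, 2.0, 3.0, 3.0]
```

The point of the sketch is the growth pattern: every layer sees all earlier feature maps, which is what lets the network combine spectral-spatial features extracted at different depths.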