Comparison of Deep Learning Techniques for River Streamflow Forecasting
Recently, deep learning (DL) models, especially those based on long short-term memory (LSTM), have demonstrated a superior ability to solve sequential-data problems. This study evaluated six supervised DL models for streamflow forecasting: a feed-forward neural network (FFNN), a convolutional neural network (CNN), and four LSTM-based models. Two standard models with a single hidden layer, LSTM and the gated recurrent unit (GRU), are compared against two more complex models, the stacked LSTM (StackedLSTM) and the bidirectional LSTM (BiLSTM). The Red River basin, the largest river basin in northern Vietnam, was adopted as the case study because Hanoi, the capital of Vietnam, lies downstream on the Red River. The model inputs are observations from seven hydrological stations on the three main branches of the Red River system. The four LSTM-based models performed considerably better and more stably than the FFNN and CNN models. However, the added complexity of the StackedLSTM and BiLSTM models did not translate into better performance: neither outperformed the two standard models, LSTM and GRU. The findings show that LSTM-based models can produce accurate forecasts even in the presence of upstream dams and reservoirs. For the streamflow-forecasting problem, LSTM and GRU models with a simple one-hidden-layer architecture are sufficient to produce highly reliable forecasts while minimizing computation time.
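As a rough illustration of the gated recurrent unit (GRU) named in the abstract, the sketch below steps a single scalar GRU cell through a short input sequence. The gate equations follow the standard Cho et al. (2014) formulation; the weights and the input values are arbitrary illustrative numbers, not anything trained on the Red River data.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(x, h_prev, p):
    """One step of a single-unit GRU with scalar weights (illustrative only)."""
    z = sigmoid(p["Wz"] * x + p["Uz"] * h_prev + p["bz"])            # update gate
    r = sigmoid(p["Wr"] * x + p["Ur"] * h_prev + p["br"])            # reset gate
    h_cand = math.tanh(p["Wh"] * x + p["Uh"] * (r * h_prev) + p["bh"])  # candidate state
    return (1.0 - z) * h_prev + z * h_cand                           # blend old and new state

# Arbitrary, untrained parameters for the demonstration.
params = {k: 0.5 for k in ("Wz", "Uz", "bz", "Wr", "Ur", "br", "Wh", "Uh", "bh")}

h = 0.0
for x in [0.1, 0.4, 0.9]:  # e.g. a normalized discharge sequence (hypothetical values)
    h = gru_step(x, h, params)
print(round(h, 4))
```

The update gate z decides how much of the previous hidden state survives each step, which is the mechanism that lets GRU (like LSTM) carry information across long sequences; a plain FFNN has no such state. Library implementations (e.g. Keras) vectorize this over many units and may use a slightly different gate-ordering convention.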
Detailed description

Author(s): Xuan-Hien Le; Duc-Hai Nguyen; Sungho Jung; Minho Yeon; Giha Lee
Format: E-article
Language: English
Published: 2021
Keywords: Bidirectional LSTM; deep learning; gated recurrent unit; long short-term memory; streamflow forecasting
Published in: IEEE Access, IEEE, 2014-, 9(2021), pages 71805-71820
DOI: 10.1109/ACCESS.2021.3077703
Catalog ID: DOAJ062347926
LEADER 01000caa a22002652 4500
001 DOAJ062347926
003 DE-627
005 20230309020634.0
007 cr uuu---uuuuu
008 230228s2021 xx |||||o 00| ||eng c
024 7_ |a 10.1109/ACCESS.2021.3077703 |2 doi
035 __ |a (DE-627)DOAJ062347926
035 __ |a (DE-599)DOAJ9e85309437e74a9e9ff047694645bcd4
040 __ |a DE-627 |b ger |c DE-627 |e rakwb
041 __ |a eng
050 _0 |a TK1-9971
100 0_ |a Xuan-Hien Le |e verfasserin |4 aut
245 10 |a Comparison of Deep Learning Techniques for River Streamflow Forecasting
264 _1 |c 2021
336 __ |a Text |b txt |2 rdacontent
337 __ |a Computermedien |b c |2 rdamedia
338 __ |a Online-Ressource |b cr |2 rdacarrier
520 __ |a Recently, deep learning (DL) models, especially those based on long short-term memory (LSTM), have demonstrated a superior ability to solve sequential-data problems. This study evaluated six supervised DL models for streamflow forecasting: a feed-forward neural network (FFNN), a convolutional neural network (CNN), and four LSTM-based models. Two standard models with a single hidden layer, LSTM and the gated recurrent unit (GRU), are compared against two more complex models, the stacked LSTM (StackedLSTM) and the bidirectional LSTM (BiLSTM). The Red River basin, the largest river basin in northern Vietnam, was adopted as the case study because Hanoi, the capital of Vietnam, lies downstream on the Red River. The model inputs are observations from seven hydrological stations on the three main branches of the Red River system. The four LSTM-based models performed considerably better and more stably than the FFNN and CNN models. However, the added complexity of the StackedLSTM and BiLSTM models did not translate into better performance: neither outperformed the two standard models, LSTM and GRU. The findings show that LSTM-based models can produce accurate forecasts even in the presence of upstream dams and reservoirs. For the streamflow-forecasting problem, LSTM and GRU models with a simple one-hidden-layer architecture are sufficient to produce highly reliable forecasts while minimizing computation time.
650 _4 |a Bidirectional LSTM
650 _4 |a deep learning
650 _4 |a gated recurrent unit
650 _4 |a long short-term memory
650 _4 |a streamflow forecasting
653 _0 |a Electrical engineering. Electronics. Nuclear engineering
700 0_ |a Duc-Hai Nguyen |e verfasserin |4 aut
700 0_ |a Sungho Jung |e verfasserin |4 aut
700 0_ |a Minho Yeon |e verfasserin |4 aut
700 0_ |a Giha Lee |e verfasserin |4 aut
773 08 |i In |t IEEE Access |d IEEE, 2014 |g 9(2021), Seite 71805-71820 |w (DE-627)728440385 |w (DE-600)2687964-5 |x 21693536 |7 nnns
773 18 |g volume:9 |g year:2021 |g pages:71805-71820
856 40 |u https://doi.org/10.1109/ACCESS.2021.3077703 |z kostenfrei
856 40 |u https://doaj.org/article/9e85309437e74a9e9ff047694645bcd4 |z kostenfrei
856 40 |u https://ieeexplore.ieee.org/document/9423961/ |z kostenfrei
856 42 |u https://doaj.org/toc/2169-3536 |y Journal toc |z kostenfrei
912 __ |a GBV_USEFLAG_A
912 __ |a SYSFLAG_A
912 __ |a GBV_DOAJ
912 __ |a GBV_ILN_11
912 __ |a GBV_ILN_20
912 __ |a GBV_ILN_22
912 __ |a GBV_ILN_23
912 __ |a GBV_ILN_24
912 __ |a GBV_ILN_31
912 __ |a GBV_ILN_39
912 __ |a GBV_ILN_40
912 __ |a GBV_ILN_60
912 __ |a GBV_ILN_62
912 __ |a GBV_ILN_63
912 __ |a GBV_ILN_65
912 __ |a GBV_ILN_69
912 __ |a GBV_ILN_70
912 __ |a GBV_ILN_73
912 __ |a GBV_ILN_95
912 __ |a GBV_ILN_105
912 __ |a GBV_ILN_110
912 __ |a GBV_ILN_151
912 __ |a GBV_ILN_161
912 __ |a GBV_ILN_170
912 __ |a GBV_ILN_213
912 __ |a GBV_ILN_230
912 __ |a GBV_ILN_285
912 __ |a GBV_ILN_293
912 __ |a GBV_ILN_370
912 __ |a GBV_ILN_602
912 __ |a GBV_ILN_2014
912 __ |a GBV_ILN_4012
912 __ |a GBV_ILN_4037
912 __ |a GBV_ILN_4112
912 __ |a GBV_ILN_4125
912 __ |a GBV_ILN_4126
912 __ |a GBV_ILN_4249
912 __ |a GBV_ILN_4305
912 __ |a GBV_ILN_4306
912 __ |a GBV_ILN_4307
912 __ |a GBV_ILN_4313
912 __ |a GBV_ILN_4322
912 __ |a GBV_ILN_4323
912 __ |a GBV_ILN_4324
912 __ |a GBV_ILN_4325
912 __ |a GBV_ILN_4335
912 __ |a GBV_ILN_4338
912 __ |a GBV_ILN_4367
912 __ |a GBV_ILN_4700
951 __ |a AR
952 __ |d 9 |j 2021 |h 71805-71820
ppnlink_with_tag_str_mv |
@@773@@(DE-627)728440385 |
format |
electronic Article |
delete_txt_mv |
keep |
author_role |
aut aut aut aut aut |
collection |
DOAJ |
remote_str |
true |
callnumber-label |
TK1-9971 |
illustrated |
Not Illustrated |
issn |
21693536 |
topic_title |
TK1-9971 Comparison of Deep Learning Techniques for River Streamflow Forecasting Bidirectional LSTM deep learning gated recurrent unit long short-term memory streamflow forecasting |
topic |
misc TK1-9971 misc Bidirectional LSTM misc deep learning misc gated recurrent unit misc long short-term memory misc streamflow forecasting misc Electrical engineering. Electronics. Nuclear engineering |
topic_unstemmed |
misc TK1-9971 misc Bidirectional LSTM misc deep learning misc gated recurrent unit misc long short-term memory misc streamflow forecasting misc Electrical engineering. Electronics. Nuclear engineering |
topic_browse |
misc TK1-9971 misc Bidirectional LSTM misc deep learning misc gated recurrent unit misc long short-term memory misc streamflow forecasting misc Electrical engineering. Electronics. Nuclear engineering |
format_facet |
Elektronische Aufsätze Aufsätze Elektronische Ressource |
format_main_str_mv |
Text Zeitschrift/Artikel |
carriertype_str_mv |
cr |
hierarchy_parent_title |
IEEE Access |
hierarchy_parent_id |
728440385 |
hierarchy_top_title |
IEEE Access |
isfreeaccess_txt |
true |
familylinks_str_mv |
(DE-627)728440385 (DE-600)2687964-5 |
title |
Comparison of Deep Learning Techniques for River Streamflow Forecasting |
ctrlnum |
(DE-627)DOAJ062347926 (DE-599)DOAJ9e85309437e74a9e9ff047694645bcd4 |
title_full |
Comparison of Deep Learning Techniques for River Streamflow Forecasting |
author_sort |
Xuan-Hien Le |
journal |
IEEE Access |
journalStr |
IEEE Access |
callnumber-first-code |
T |
lang_code |
eng |
isOA_bool |
true |
recordtype |
marc |
publishDateSort |
2021 |
contenttype_str_mv |
txt |
container_start_page |
71805 |
author_browse |
Xuan-Hien Le Duc-Hai Nguyen Sungho Jung Minho Yeon Giha Lee |
container_volume |
9 |
class |
TK1-9971 |
format_se |
Elektronische Aufsätze |
author-letter |
Xuan-Hien Le |
doi_str_mv |
10.1109/ACCESS.2021.3077703 |
author2-role |
verfasserin |
title_sort |
comparison of deep learning techniques for river streamflow forecasting |
callnumber |
TK1-9971 |
title_auth |
Comparison of Deep Learning Techniques for River Streamflow Forecasting |
abstract |
Recently, deep learning (DL) models, especially those based on long short-term memory (LSTM), have demonstrated superior ability in modeling sequential data. This study evaluated six supervised-learning models for streamflow forecasting: a feed-forward neural network (FFNN), a convolutional neural network (CNN), and four LSTM-based models. Two standard models with a single hidden layer, LSTM and the gated recurrent unit (GRU), were compared against two more complex models, the stacked LSTM (StackedLSTM) and the bidirectional LSTM (BiLSTM). The Red River basin, the largest river basin in northern Vietnam, was adopted as the case study because of its geographic relevance: Hanoi, the capital of Vietnam, lies downstream on the Red River. The input data for these models are the observations from seven hydrological stations on the three main branches of the Red River system. The results indicate that the four LSTM-based models achieved considerably better and more stable performance than the FFNN and CNN models. However, the added complexity of the StackedLSTM and BiLSTM models did not yield a corresponding improvement: neither outperformed the two standard models, LSTM and GRU. The findings show that LSTM-based models can produce accurate forecasts even in the presence of upstream dams and reservoirs. For the streamflow-forecasting problem, LSTM and GRU models with a simple architecture (one hidden layer) are sufficient to produce highly reliable forecasts while minimizing computation time.
abstractGer |
Recently, deep learning (DL) models, especially those based on long short-term memory (LSTM), have demonstrated superior ability in modeling sequential data. This study evaluated six supervised-learning models for streamflow forecasting: a feed-forward neural network (FFNN), a convolutional neural network (CNN), and four LSTM-based models. Two standard models with a single hidden layer, LSTM and the gated recurrent unit (GRU), were compared against two more complex models, the stacked LSTM (StackedLSTM) and the bidirectional LSTM (BiLSTM). The Red River basin, the largest river basin in northern Vietnam, was adopted as the case study because of its geographic relevance: Hanoi, the capital of Vietnam, lies downstream on the Red River. The input data for these models are the observations from seven hydrological stations on the three main branches of the Red River system. The results indicate that the four LSTM-based models achieved considerably better and more stable performance than the FFNN and CNN models. However, the added complexity of the StackedLSTM and BiLSTM models did not yield a corresponding improvement: neither outperformed the two standard models, LSTM and GRU. The findings show that LSTM-based models can produce accurate forecasts even in the presence of upstream dams and reservoirs. For the streamflow-forecasting problem, LSTM and GRU models with a simple architecture (one hidden layer) are sufficient to produce highly reliable forecasts while minimizing computation time.
abstract_unstemmed |
Recently, deep learning (DL) models, especially those based on long short-term memory (LSTM), have demonstrated superior ability in modeling sequential data. This study evaluated six supervised-learning models for streamflow forecasting: a feed-forward neural network (FFNN), a convolutional neural network (CNN), and four LSTM-based models. Two standard models with a single hidden layer, LSTM and the gated recurrent unit (GRU), were compared against two more complex models, the stacked LSTM (StackedLSTM) and the bidirectional LSTM (BiLSTM). The Red River basin, the largest river basin in northern Vietnam, was adopted as the case study because of its geographic relevance: Hanoi, the capital of Vietnam, lies downstream on the Red River. The input data for these models are the observations from seven hydrological stations on the three main branches of the Red River system. The results indicate that the four LSTM-based models achieved considerably better and more stable performance than the FFNN and CNN models. However, the added complexity of the StackedLSTM and BiLSTM models did not yield a corresponding improvement: neither outperformed the two standard models, LSTM and GRU. The findings show that LSTM-based models can produce accurate forecasts even in the presence of upstream dams and reservoirs. For the streamflow-forecasting problem, LSTM and GRU models with a simple architecture (one hidden layer) are sufficient to produce highly reliable forecasts while minimizing computation time.
collection_details |
GBV_USEFLAG_A SYSFLAG_A GBV_DOAJ GBV_ILN_11 GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_31 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_65 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_95 GBV_ILN_105 GBV_ILN_110 GBV_ILN_151 GBV_ILN_161 GBV_ILN_170 GBV_ILN_213 GBV_ILN_230 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_2014 GBV_ILN_4012 GBV_ILN_4037 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4126 GBV_ILN_4249 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4335 GBV_ILN_4338 GBV_ILN_4367 GBV_ILN_4700 |
title_short |
Comparison of Deep Learning Techniques for River Streamflow Forecasting |
url |
https://doi.org/10.1109/ACCESS.2021.3077703 https://doaj.org/article/9e85309437e74a9e9ff047694645bcd4 https://ieeexplore.ieee.org/document/9423961/ https://doaj.org/toc/2169-3536 |
remote_bool |
true |
author2 |
Duc-Hai Nguyen Sungho Jung Minho Yeon Giha Lee |
author2Str |
Duc-Hai Nguyen Sungho Jung Minho Yeon Giha Lee |
ppnlink |
728440385 |
callnumber-subject |
TK - Electrical and Nuclear Engineering |
mediatype_str_mv |
c |
isOA_txt |
true |
hochschulschrift_bool |
false |
doi_str |
10.1109/ACCESS.2021.3077703 |
callnumber-a |
TK1-9971 |
up_date |
2024-07-04T01:18:21.903Z |
_version_ |
1803609337394888704 |
fullrecord_marcxml |
<?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>01000caa a22002652 4500</leader><controlfield tag="001">DOAJ062347926</controlfield><controlfield tag="003">DE-627</controlfield><controlfield tag="005">20230309020634.0</controlfield><controlfield tag="007">cr uuu---uuuuu</controlfield><controlfield tag="008">230228s2021 xx |||||o 00| ||eng c</controlfield><datafield tag="024" ind1="7" ind2=" "><subfield code="a">10.1109/ACCESS.2021.3077703</subfield><subfield code="2">doi</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-627)DOAJ062347926</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-599)DOAJ9e85309437e74a9e9ff047694645bcd4</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-627</subfield><subfield code="b">ger</subfield><subfield code="c">DE-627</subfield><subfield code="e">rakwb</subfield></datafield><datafield tag="041" ind1=" " ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="050" ind1=" " ind2="0"><subfield code="a">TK1-9971</subfield></datafield><datafield tag="100" ind1="0" ind2=" "><subfield code="a">Xuan-Hien Le</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Comparison of Deep Learning Techniques for River Streamflow Forecasting</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="c">2021</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="a">Text</subfield><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="a">Computermedien</subfield><subfield code="b">c</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="a">Online-Ressource</subfield><subfield code="b">cr</subfield><subfield 
code="2">rdacarrier</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield code="a">Recently, deep learning (DL) models, especially those based on long short-term memory (LSTM), have demonstrated their superior ability in resolving sequential data problems. This study investigated the performance of six models that belong to the supervised learning category to evaluate the performance of DL models in terms of streamflow forecasting. They include a feed-forward neural network (FFNN), a convolutional neural network (CNN), and four LSTM-based models. Two standard models with just one hidden layer—LSTM and gated recurrent unit (GRU)—are used against two more complex models—the stacked LSTM (StackedLSTM) model and the Bidirectional LSTM (BiLSTM) model. The Red River basin—the largest river basin in the north of Vietnam—was adopted as a case study because of its geographic relevance since Hanoi city—the capital of Vietnam—is located downstream of the Red River. Besides, the input data of these models are the observed data at seven hydrological stations on the three main river branches of the Red River system. This study indicates that the four LSTM-based models exhibited considerably better performance and maintained stability than the FFNN and CNN models. However, the complexity of the StackedLSTM and BiLSTM models is not accompanied by performance improvement because the results of the comparison illustrate that their respective performance is not higher than the two standard models—LSTM and GRU. The findings of this study present that LSTM-based models can reach impressive forecasts even in the presence of upstream dams and reservoirs. 
For the streamflow-forecasting problem, the LSTM and GRU models with a simple architecture (one hidden layer) are sufficient to produce highly reliable forecasts while minimizing the computation time.</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Bidirectional LSTM</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">deep learning</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">gated recurrent unit</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">long short-term memory</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">streamflow forecasting</subfield></datafield><datafield tag="653" ind1=" " ind2="0"><subfield code="a">Electrical engineering. Electronics. Nuclear engineering</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Duc-Hai Nguyen</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Sungho Jung</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Minho Yeon</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Giha Lee</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="773" ind1="0" ind2="8"><subfield code="i">In</subfield><subfield code="t">IEEE Access</subfield><subfield code="d">IEEE, 2014</subfield><subfield code="g">9(2021), Seite 71805-71820</subfield><subfield code="w">(DE-627)728440385</subfield><subfield code="w">(DE-600)2687964-5</subfield><subfield code="x">21693536</subfield><subfield code="7">nnns</subfield></datafield><datafield tag="773" ind1="1" ind2="8"><subfield code="g">volume:9</subfield><subfield 
code="g">year:2021</subfield><subfield code="g">pages:71805-71820</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://doi.org/10.1109/ACCESS.2021.3077703</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://doaj.org/article/9e85309437e74a9e9ff047694645bcd4</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://ieeexplore.ieee.org/document/9423961/</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="856" ind1="4" ind2="2"><subfield code="u">https://doaj.org/toc/2169-3536</subfield><subfield code="y">Journal toc</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_USEFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SYSFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_DOAJ</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_11</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_20</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_22</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_23</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_24</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_31</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_39</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_40</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_60</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_62</subfield></datafield><datafield tag="912" ind1=" 
" ind2=" "><subfield code="a">GBV_ILN_63</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_65</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_69</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_70</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_73</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_95</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_105</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_110</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_151</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_161</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_170</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_213</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_230</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_285</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_293</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_370</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_602</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2014</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4012</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4037</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4112</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_4125</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4126</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4249</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4305</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4306</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4307</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4313</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4322</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4323</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4324</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4325</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4335</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4338</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4367</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4700</subfield></datafield><datafield tag="951" ind1=" " ind2=" "><subfield code="a">AR</subfield></datafield><datafield tag="952" ind1=" " ind2=" "><subfield code="d">9</subfield><subfield code="j">2021</subfield><subfield code="h">71805-71820</subfield></datafield></record></collection>
|
score |
7.3976336 |