Radial Basis Gated Unit-Recurrent Neural Network (RBGU-RNN) Algorithm
Abstract: The Radial Basis Gated Unit-Recurrent Neural Network (RBGU-RNN) algorithm is a new recurrent-neural-network architecture that incorporates a Radial Basis Gated Unit (RBGU) into the Long Short-Term Memory (LSTM) network. This unit gives the RBGU-RNN two advantages over the existing LSTM network. First, because the RBGU is a pure activation unit that performs no weighted operations, unlike a classical neuron unit, it does not propagate (duplicate) error the way the LSTM does. Second, because the unit sits at the start of the network's processing workflow, it standardizes the data before they reach the weighted units, which a plain LSTM does not do. This study provides a theoretical and experimental comparison of the LSTM and the RBGU-RNN. Using a real-world call data record, specifically a survey of end-user cellular network data traffic, we built a cellular traffic prediction model. We start with an ARIMA model, which lets us choose the number of time steps for the RBGU-RNN prediction model, that is, the number of past observations needed to predict the next point in the time series. The results show that the RBGU-RNN predicts cellular data traffic accurately and generalizes better than the LSTM. The R-squared statistics (coefficients of determination) show that on the training set the LSTM model explains 58.31% of user traffic consumption, while the RBGU-RNN model explains 96.86%. Likewise, on the test set, the LSTM model explains 61.24% of user traffic consumption and the RBGU-RNN 95.20%. The RBGU-RNN also exhibits more efficient gradient descent than the standard LSTM, as shown by the Mean Squared Error (MSE), Mean Absolute Percentage Error (MAPE), and Maximum Absolute Error (MAXAE) curves over the number of iterations.
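The two quantitative ideas in the abstract, a weight-free radial basis activation that bounds and standardizes inputs before the recurrent cell, and the R-squared statistic used to compare the models, can be sketched as follows. This is an illustrative sketch only: the Gaussian form of the unit, the fixed centers, and the function names are assumptions of this example, not the paper's exact formulation.

```python
import numpy as np

def rbgu(x, centers, gamma=1.0):
    """Hypothetical Radial Basis Gated Unit: a pure activation layer.

    Maps each scalar input to Gaussian radial-basis responses around
    fixed centers. It has no trainable weights, so it performs no
    weighted operations before the recurrent cell; the paper's exact
    unit may differ from this Gaussian sketch.
    """
    d = x[:, None] - centers[None, :]        # shape (len(x), len(centers))
    return np.exp(-gamma * d ** 2)           # every response lies in (0, 1]

def r_squared(y_true, y_pred):
    """Coefficient of determination, the statistic quoted in the abstract."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Toy check: RBGU output is bounded in (0, 1], so downstream weighted
# units receive standardized inputs regardless of the raw traffic scale.
traffic = np.array([0.5, 120.0, 3400.0])     # raw traffic volumes (made up)
gated = rbgu(traffic, centers=np.array([0.0, 100.0, 1000.0]), gamma=1e-6)
assert gated.max() <= 1.0 and gated.min() > 0.0
```

An R² of 0.9686 versus 0.5831, as reported for the training set, would mean the RBGU-RNN's predictions leave far less residual variance than the LSTM's.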
Detailed description

Author: Rollin, Ndom Francis [author]
Format: E-article
Language: English
Published: 2023
Subject headings: Neural networks; Feed forward algorithm; Recurrent neural networks; Gated recurrent unit; Long short term memory
Note: © The Author(s), under exclusive licence to Springer Nature Singapore Pte Ltd 2023. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
Parent work: Contained in: SN Computer Science - Singapore : Springer Singapore, 2020, 5(2023), 1, dated 6 Dec.
Parent work: volume:5 ; year:2023 ; number:1 ; day:06 ; month:12
DOI / URN: 10.1007/s42979-023-02376-x
Catalog ID: SPR05400585X
LEADER 01000naa a22002652 4500
001 SPR05400585X
003 DE-627
005 20231207064648.0
007 cr uuu---uuuuu
008 231207s2023 xx |||||o 00| ||eng c
024 7 |a 10.1007/s42979-023-02376-x |2 doi
035 |a (DE-627)SPR05400585X
035 |a (SPR)s42979-023-02376-x-e
040 |a DE-627 |b ger |c DE-627 |e rakwb
041 |a eng
100 1 |a Rollin, Ndom Francis |e verfasserin |0 (orcid)0000-0003-1229-837X |4 aut
245 1 0 |a Radial Basis Gated Unit-Recurrent Neural Network (RBGU-RNN) Algorithm
264 1 |c 2023
336 |a Text |b txt |2 rdacontent
337 |a Computermedien |b c |2 rdamedia
338 |a Online-Ressource |b cr |2 rdacarrier
500 |a © The Author(s), under exclusive licence to Springer Nature Singapore Pte Ltd 2023. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
520 |a Abstract: The Radial Basis Gated Unit-Recurrent Neural Network (RBGU-RNN) algorithm is a new recurrent-neural-network architecture that incorporates a Radial Basis Gated Unit (RBGU) into the Long Short-Term Memory (LSTM) network. This unit gives the RBGU-RNN two advantages over the existing LSTM network. First, because the RBGU is a pure activation unit that performs no weighted operations, unlike a classical neuron unit, it does not propagate (duplicate) error the way the LSTM does. Second, because the unit sits at the start of the network's processing workflow, it standardizes the data before they reach the weighted units, which a plain LSTM does not do. This study provides a theoretical and experimental comparison of the LSTM and the RBGU-RNN. Using a real-world call data record, specifically a survey of end-user cellular network data traffic, we built a cellular traffic prediction model. We start with an ARIMA model, which lets us choose the number of time steps for the RBGU-RNN prediction model, that is, the number of past observations needed to predict the next point in the time series. The results show that the RBGU-RNN predicts cellular data traffic accurately and generalizes better than the LSTM. The R-squared statistics (coefficients of determination) show that on the training set the LSTM model explains 58.31% of user traffic consumption, while the RBGU-RNN model explains 96.86%. Likewise, on the test set, the LSTM model explains 61.24% of user traffic consumption and the RBGU-RNN 95.20%. The RBGU-RNN also exhibits more efficient gradient descent than the standard LSTM, as shown by the Mean Squared Error (MSE), Mean Absolute Percentage Error (MAPE), and Maximum Absolute Error (MAXAE) curves over the number of iterations.
650 4 |a Neural networks |7 (dpeaa)DE-He213
650 4 |a Feed forward algorithm |7 (dpeaa)DE-He213
650 4 |a Recurrent neural networks |7 (dpeaa)DE-He213
650 4 |a Gated recurrent unit |7 (dpeaa)DE-He213
650 4 |a Long short term memory |7 (dpeaa)DE-He213
700 1 |a Giquel, Sassa |4 aut
700 1 |a Chantal, Mveh-Abia |4 aut
700 1 |a Raoul, Ayissi |4 aut
700 1 |a Remy, Etoua |4 aut
700 1 |a Yves, Emvudu |4 aut
773 0 8 |i Enthalten in |t SN Computer Science |d Singapore : Springer Singapore, 2020 |g 5(2023), 1 vom: 06. Dez. |w (DE-627)1668832976 |w (DE-600)2977367-2 |x 2661-8907 |7 nnns
773 1 8 |g volume:5 |g year:2023 |g number:1 |g day:06 |g month:12
856 4 0 |u https://dx.doi.org/10.1007/s42979-023-02376-x |z lizenzpflichtig |3 Volltext
912 |a GBV_USEFLAG_A
912 |a SYSFLAG_A
912 |a GBV_SPRINGER
912 |a GBV_ILN_11
912 |a GBV_ILN_20
912 |a GBV_ILN_22
912 |a GBV_ILN_23
912 |a GBV_ILN_24
912 |a GBV_ILN_31
912 |a GBV_ILN_32
912 |a GBV_ILN_39
912 |a GBV_ILN_40
912 |a GBV_ILN_60
912 |a GBV_ILN_62
912 |a GBV_ILN_63
912 |a GBV_ILN_65
912 |a GBV_ILN_69
912 |a GBV_ILN_70
912 |a GBV_ILN_73
912 |a GBV_ILN_74
912 |a GBV_ILN_90
912 |a GBV_ILN_95
912 |a GBV_ILN_100
912 |a GBV_ILN_105
912 |a GBV_ILN_110
912 |a GBV_ILN_138
912 |a GBV_ILN_150
912 |a GBV_ILN_151
912 |a GBV_ILN_152
912 |a GBV_ILN_161
912 |a GBV_ILN_170
912 |a GBV_ILN_171
912 |a GBV_ILN_187
912 |a GBV_ILN_213
912 |a GBV_ILN_224
912 |a GBV_ILN_230
912 |a GBV_ILN_250
912 |a GBV_ILN_281
912 |a GBV_ILN_285
912 |a GBV_ILN_293
912 |a GBV_ILN_370
912 |a GBV_ILN_602
912 |a GBV_ILN_636
912 |a GBV_ILN_702
912 |a GBV_ILN_2001
912 |a GBV_ILN_2003
912 |a GBV_ILN_2004
912 |a GBV_ILN_2005
912 |a GBV_ILN_2006
912 |a GBV_ILN_2007
912 |a GBV_ILN_2008
912 |a GBV_ILN_2009
912 |a GBV_ILN_2010
912 |a GBV_ILN_2011
912 |a GBV_ILN_2014
912 |a GBV_ILN_2015
912 |a GBV_ILN_2020
912 |a GBV_ILN_2021
912 |a GBV_ILN_2025
912 |a GBV_ILN_2026
912 |a GBV_ILN_2027
912 |a GBV_ILN_2031
912 |a GBV_ILN_2034
912 |a GBV_ILN_2037
912 |a GBV_ILN_2038
912 |a GBV_ILN_2039
912 |a GBV_ILN_2044
912 |a GBV_ILN_2048
912 |a GBV_ILN_2049
912 |a GBV_ILN_2050
912 |a GBV_ILN_2055
912 |a GBV_ILN_2056
912 |a GBV_ILN_2057
912 |a GBV_ILN_2059
912 |a GBV_ILN_2061
912 |a GBV_ILN_2064
912 |a GBV_ILN_2065
912 |a GBV_ILN_2068
912 |a GBV_ILN_2088
912 |a GBV_ILN_2093
912 |a GBV_ILN_2106
912 |a GBV_ILN_2107
912 |a GBV_ILN_2108
912 |a GBV_ILN_2110
912 |a GBV_ILN_2111
912 |a GBV_ILN_2112
912 |a GBV_ILN_2113
912 |a GBV_ILN_2118
912 |a GBV_ILN_2122
912 |a GBV_ILN_2129
912 |a GBV_ILN_2143
912 |a GBV_ILN_2144
912 |a GBV_ILN_2147
912 |a GBV_ILN_2148
912 |a GBV_ILN_2152
912 |a GBV_ILN_2153
912 |a GBV_ILN_2190
912 |a GBV_ILN_2232
912 |a GBV_ILN_2336
912 |a GBV_ILN_2446
912 |a GBV_ILN_2470
912 |a GBV_ILN_2472
912 |a GBV_ILN_2507
912 |a GBV_ILN_2522
912 |a GBV_ILN_4035
912 |a GBV_ILN_4037
912 |a GBV_ILN_4046
912 |a GBV_ILN_4112
912 |a GBV_ILN_4125
912 |a GBV_ILN_4126
912 |a GBV_ILN_4242
912 |a GBV_ILN_4246
912 |a GBV_ILN_4249
912 |a GBV_ILN_4251
912 |a GBV_ILN_4305
912 |a GBV_ILN_4306
912 |a GBV_ILN_4307
912 |a GBV_ILN_4313
912 |a GBV_ILN_4322
912 |a GBV_ILN_4323
912 |a GBV_ILN_4324
912 |a GBV_ILN_4325
912 |a GBV_ILN_4326
912 |a GBV_ILN_4328
912 |a GBV_ILN_4333
912 |a GBV_ILN_4334
912 |a GBV_ILN_4335
912 |a GBV_ILN_4336
912 |a GBV_ILN_4338
912 |a GBV_ILN_4393
912 |a GBV_ILN_4700
951 |a AR
952 |d 5 |j 2023 |e 1 |b 06 |c 12
format_phy_str_mv |
Article |
institution |
findex.gbv.de |
topic_facet |
Neural networks Feed forward algorithm Recurrent neural networks Gated recurrent unit Long short term memory |
isfreeaccess_bool |
false |
container_title |
SN Computer Science |
authorswithroles_txt_mv |
Rollin, Ndom Francis @@aut@@ Giquel, Sassa @@aut@@ Chantal, Mveh-Abia @@aut@@ Raoul, Ayissi @@aut@@ Remy, Etoua @@aut@@ Yves, Emvudu @@aut@@ |
publishDateDaySort_date |
2023-12-06T00:00:00Z |
hierarchy_top_id |
1668832976 |
id |
SPR05400585X |
language_de |
englisch |
fullrecord |
<?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>01000naa a22002652 4500</leader><controlfield tag="001">SPR05400585X</controlfield><controlfield tag="003">DE-627</controlfield><controlfield tag="005">20231207064648.0</controlfield><controlfield tag="007">cr uuu---uuuuu</controlfield><controlfield tag="008">231207s2023 xx |||||o 00| ||eng c</controlfield><datafield tag="024" ind1="7" ind2=" "><subfield code="a">10.1007/s42979-023-02376-x</subfield><subfield code="2">doi</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-627)SPR05400585X</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(SPR)s42979-023-02376-x-e</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-627</subfield><subfield code="b">ger</subfield><subfield code="c">DE-627</subfield><subfield code="e">rakwb</subfield></datafield><datafield tag="041" ind1=" " ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">Rollin, Ndom Francis</subfield><subfield code="e">verfasserin</subfield><subfield code="0">(orcid)0000-0003-1229-837X</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Radial Basis Gated Unit-Recurrent Neural Network (RBGU-RNN) Algorithm</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="c">2023</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="a">Text</subfield><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="a">Computermedien</subfield><subfield code="b">c</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="a">Online-Ressource</subfield><subfield code="b">cr</subfield><subfield 
code="2">rdacarrier</subfield></datafield><datafield tag="500" ind1=" " ind2=" "><subfield code="a">© The Author(s), under exclusive licence to Springer Nature Singapore Pte Ltd 2023. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield code="a">Abstract The Radial Basis Gated Unit-Recurrent Neural Network (RBGU-RNN) algorithm is a new recurrent-neural-network-based architecture that embeds a Radial Basis Gated Unit within the Long Short-Term Memory (LSTM) network architecture. This unit gives the RBGU-RNN an advantage over the existing LSTM network. First, since the RBGU is purely an activation unit that performs no weighted operations, unlike a classical neuron unit, it has the advantage of not propagating (duplicating) error as the LSTM does. Second, because this unit sits at the beginning of the network's processing workflow, it standardizes the data set before the data reach the weighted units, which a plain LSTM does not do. This study therefore provides a theoretical and experimental comparison of the LSTM and the RBGU-RNN. Using a real-world call data record, specifically a survey of end-user cell-network data traffic, we built a cellular traffic prediction model. We start with an ARIMA model, which lets us choose the number of time steps needed to build the RBGU-RNN prediction model, that is, the number of time steps needed to predict the next value in the time series. The results show that the RBGU-RNN predicts cellular data traffic accurately and generalizes better than the LSTM.
The R-squared (coefficient of determination) statistics show that the LSTM model explains 58.31% of user traffic consumption, while the RBGU-RNN model explains 96.86% on the training set. Likewise, on the test set, the LSTM model explains 61.24% of user traffic consumption and the RBGU-RNN explains 95.20%. The RBGU-RNN also achieves more efficient gradient descent than the standard LSTM, as shown by analysing the Mean Squared Error (MSE), Mean Absolute Percentage Error (MAPE) and Maximum Absolute Error (MAXAE) curves over the number of iterations.</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Neural networks</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Feed forward algorithm</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Recurrent neural networks</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Gated recurrent unit</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Long short term memory</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Giquel, Sassa</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Chantal, Mveh-Abia</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Raoul, Ayissi</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Remy, Etoua</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield
code="a">Yves, Emvudu</subfield><subfield code="4">aut</subfield></datafield><datafield tag="773" ind1="0" ind2="8"><subfield code="i">Enthalten in</subfield><subfield code="t">SN Computer Science</subfield><subfield code="d">Singapore : Springer Singapore, 2020</subfield><subfield code="g">5(2023), 1 vom: 06. Dez.</subfield><subfield code="w">(DE-627)1668832976</subfield><subfield code="w">(DE-600)2977367-2</subfield><subfield code="x">2661-8907</subfield><subfield code="7">nnns</subfield></datafield><datafield tag="773" ind1="1" ind2="8"><subfield code="g">volume:5</subfield><subfield code="g">year:2023</subfield><subfield code="g">number:1</subfield><subfield code="g">day:06</subfield><subfield code="g">month:12</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://dx.doi.org/10.1007/s42979-023-02376-x</subfield><subfield code="z">lizenzpflichtig</subfield><subfield code="3">Volltext</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_USEFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SYSFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_SPRINGER</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_11</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_20</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_22</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_23</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_24</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_31</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_32</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_39</subfield></datafield><datafield tag="912" ind1=" " ind2=" 
"><subfield code="a">GBV_ILN_40</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_60</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_62</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_63</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_65</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_69</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_70</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_73</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_74</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_90</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_95</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_100</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_105</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_110</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_138</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_150</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_151</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_152</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_161</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_170</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_171</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_187</subfield></datafield><datafield tag="912" 
ind1=" " ind2=" "><subfield code="a">GBV_ILN_213</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_224</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_230</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_250</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_281</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_285</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_293</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_370</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_602</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_636</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_702</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2001</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2003</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2004</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2005</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2006</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2007</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2008</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2009</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2010</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2011</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_2014</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2015</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2020</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2021</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2025</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2026</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2027</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2031</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2034</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2037</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2038</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2039</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2044</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2048</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2049</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2050</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2055</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2056</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2057</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2059</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2061</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_2064</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2065</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2068</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2088</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2093</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2106</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2107</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2108</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2110</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2111</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2112</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2113</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2118</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2122</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2129</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2143</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2144</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2147</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2148</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2152</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2153</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_2190</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2232</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2336</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2446</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2470</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2472</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2507</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2522</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4035</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4037</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4046</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4112</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4125</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4126</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4242</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4246</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4249</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4251</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4305</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4306</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4307</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_4313</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4322</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4323</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4324</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4325</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4326</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4328</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4333</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4334</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4335</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4336</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4338</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4393</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4700</subfield></datafield><datafield tag="951" ind1=" " ind2=" "><subfield code="a">AR</subfield></datafield><datafield tag="952" ind1=" " ind2=" "><subfield code="d">5</subfield><subfield code="j">2023</subfield><subfield code="e">1</subfield><subfield code="b">06</subfield><subfield code="c">12</subfield></datafield></record></collection>
|
author |
Rollin, Ndom Francis |
spellingShingle |
Rollin, Ndom Francis misc Neural networks misc Feed forward algorithm misc Recurrent neural networks misc Gated recurrent unit misc Long short term memory Radial Basis Gated Unit-Recurrent Neural Network (RBGU-RNN) Algorithm |
authorStr |
Rollin, Ndom Francis |
ppnlink_with_tag_str_mv |
@@773@@(DE-627)1668832976 |
format |
electronic Article |
delete_txt_mv |
keep |
author_role |
aut aut aut aut aut aut |
collection |
springer |
remote_str |
true |
illustrated |
Not Illustrated |
issn |
2661-8907 |
topic_title |
Radial Basis Gated Unit-Recurrent Neural Network (RBGU-RNN) Algorithm Neural networks (dpeaa)DE-He213 Feed forward algorithm (dpeaa)DE-He213 Recurrent neural networks (dpeaa)DE-He213 Gated recurrent unit (dpeaa)DE-He213 Long short term memory (dpeaa)DE-He213 |
topic |
misc Neural networks misc Feed forward algorithm misc Recurrent neural networks misc Gated recurrent unit misc Long short term memory |
topic_unstemmed |
misc Neural networks misc Feed forward algorithm misc Recurrent neural networks misc Gated recurrent unit misc Long short term memory |
topic_browse |
misc Neural networks misc Feed forward algorithm misc Recurrent neural networks misc Gated recurrent unit misc Long short term memory |
format_facet |
Elektronische Aufsätze Aufsätze Elektronische Ressource |
format_main_str_mv |
Text Zeitschrift/Artikel |
carriertype_str_mv |
cr |
hierarchy_parent_title |
SN Computer Science |
hierarchy_parent_id |
1668832976 |
hierarchy_top_title |
SN Computer Science |
isfreeaccess_txt |
false |
familylinks_str_mv |
(DE-627)1668832976 (DE-600)2977367-2 |
title |
Radial Basis Gated Unit-Recurrent Neural Network (RBGU-RNN) Algorithm |
ctrlnum |
(DE-627)SPR05400585X (SPR)s42979-023-02376-x-e |
title_full |
Radial Basis Gated Unit-Recurrent Neural Network (RBGU-RNN) Algorithm |
author_sort |
Rollin, Ndom Francis |
journal |
SN Computer Science |
journalStr |
SN Computer Science |
lang_code |
eng |
isOA_bool |
false |
recordtype |
marc |
publishDateSort |
2023 |
contenttype_str_mv |
txt |
author_browse |
Rollin, Ndom Francis Giquel, Sassa Chantal, Mveh-Abia Raoul, Ayissi Remy, Etoua Yves, Emvudu |
container_volume |
5 |
format_se |
Elektronische Aufsätze |
author-letter |
Rollin, Ndom Francis |
doi_str_mv |
10.1007/s42979-023-02376-x |
normlink |
(ORCID)0000-0003-1229-837X |
normlink_prefix_str_mv |
(orcid)0000-0003-1229-837X |
title_sort |
radial basis gated unit-recurrent neural network (rbgu-rnn) algorithm |
title_auth |
Radial Basis Gated Unit-Recurrent Neural Network (RBGU-RNN) Algorithm |
abstract |
Abstract The Radial Basis Gated Unit-Recurrent Neural Network (RBGU-RNN) algorithm is a new recurrent-neural-network-based architecture that embeds a Radial Basis Gated Unit within the Long Short-Term Memory (LSTM) network architecture. This unit gives the RBGU-RNN an advantage over the existing LSTM network. First, since the RBGU is purely an activation unit that performs no weighted operations, unlike a classical neuron unit, it has the advantage of not propagating (duplicating) error as the LSTM does. Second, because this unit sits at the beginning of the network's processing workflow, it standardizes the data set before the data reach the weighted units, which a plain LSTM does not do. This study therefore provides a theoretical and experimental comparison of the LSTM and the RBGU-RNN. Using a real-world call data record, specifically a survey of end-user cell-network data traffic, we built a cellular traffic prediction model. We start with an ARIMA model, which lets us choose the number of time steps needed to build the RBGU-RNN prediction model, that is, the number of time steps needed to predict the next value in the time series. The results show that the RBGU-RNN predicts cellular data traffic accurately and generalizes better than the LSTM. The R-squared (coefficient of determination) statistics show that the LSTM model explains 58.31% of user traffic consumption, while the RBGU-RNN model explains 96.86% on the training set. Likewise, on the test set, the LSTM model explains 61.24% of user traffic consumption and the RBGU-RNN explains 95.20%.
The RBGU-RNN also achieves more efficient gradient descent than the standard LSTM, as shown by analysing the Mean Squared Error (MSE), Mean Absolute Percentage Error (MAPE) and Maximum Absolute Error (MAXAE) curves over the number of iterations. © The Author(s), under exclusive licence to Springer Nature Singapore Pte Ltd 2023. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
abstractGer |
Abstract The Radial Basis Gated Unit-Recurrent Neural Network (RBGU-RNN) algorithm is a new recurrent-neural-network-based architecture that embeds a Radial Basis Gated Unit within the Long Short-Term Memory (LSTM) network architecture. This unit gives the RBGU-RNN an advantage over the existing LSTM network. First, since the RBGU is purely an activation unit that performs no weighted operations, unlike a classical neuron unit, it has the advantage of not propagating (duplicating) error as the LSTM does. Second, because this unit sits at the beginning of the network's processing workflow, it standardizes the data set before the data reach the weighted units, which a plain LSTM does not do. This study therefore provides a theoretical and experimental comparison of the LSTM and the RBGU-RNN. Using a real-world call data record, specifically a survey of end-user cell-network data traffic, we built a cellular traffic prediction model. We start with an ARIMA model, which lets us choose the number of time steps needed to build the RBGU-RNN prediction model, that is, the number of time steps needed to predict the next value in the time series. The results show that the RBGU-RNN predicts cellular data traffic accurately and generalizes better than the LSTM. The R-squared (coefficient of determination) statistics show that the LSTM model explains 58.31% of user traffic consumption, while the RBGU-RNN model explains 96.86% on the training set. Likewise, on the test set, the LSTM model explains 61.24% of user traffic consumption and the RBGU-RNN explains 95.20%.
The RBGU-RNN also achieves more efficient gradient descent than the standard LSTM, as shown by analysing the Mean Squared Error (MSE), Mean Absolute Percentage Error (MAPE) and Maximum Absolute Error (MAXAE) curves over the number of iterations. © The Author(s), under exclusive licence to Springer Nature Singapore Pte Ltd 2023. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
abstract_unstemmed |
Abstract The Radial Basis Gated Unit-Recurrent Neural Network (RBGU-RNN) algorithm is a new recurrent-neural-network-based architecture that embeds a Radial Basis Gated Unit within the Long Short-Term Memory (LSTM) network architecture. This unit gives the RBGU-RNN an advantage over the existing LSTM network. First, since the RBGU is purely an activation unit that performs no weighted operations, unlike a classical neuron unit, it has the advantage of not propagating (duplicating) error as the LSTM does. Second, because this unit sits at the beginning of the network's processing workflow, it standardizes the data set before the data reach the weighted units, which a plain LSTM does not do. This study therefore provides a theoretical and experimental comparison of the LSTM and the RBGU-RNN. Using a real-world call data record, specifically a survey of end-user cell-network data traffic, we built a cellular traffic prediction model. We start with an ARIMA model, which lets us choose the number of time steps needed to build the RBGU-RNN prediction model, that is, the number of time steps needed to predict the next value in the time series. The results show that the RBGU-RNN predicts cellular data traffic accurately and generalizes better than the LSTM. The R-squared (coefficient of determination) statistics show that the LSTM model explains 58.31% of user traffic consumption, while the RBGU-RNN model explains 96.86% on the training set. Likewise, on the test set, the LSTM model explains 61.24% of user traffic consumption and the RBGU-RNN explains 95.20%.
The RBGU-RNN also achieves more efficient gradient descent than the standard LSTM, as shown by analysing the Mean Squared Error (MSE), Mean Absolute Percentage Error (MAPE) and Maximum Absolute Error (MAXAE) curves over the number of iterations. © The Author(s), under exclusive licence to Springer Nature Singapore Pte Ltd 2023. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
collection_details |
GBV_USEFLAG_A SYSFLAG_A GBV_SPRINGER GBV_ILN_11 GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_31 GBV_ILN_32 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_65 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_74 GBV_ILN_90 GBV_ILN_95 GBV_ILN_100 GBV_ILN_105 GBV_ILN_110 GBV_ILN_138 GBV_ILN_150 GBV_ILN_151 GBV_ILN_152 GBV_ILN_161 GBV_ILN_170 GBV_ILN_171 GBV_ILN_187 GBV_ILN_213 GBV_ILN_224 GBV_ILN_230 GBV_ILN_250 GBV_ILN_281 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_636 GBV_ILN_702 GBV_ILN_2001 GBV_ILN_2003 GBV_ILN_2004 GBV_ILN_2005 GBV_ILN_2006 GBV_ILN_2007 GBV_ILN_2008 GBV_ILN_2009 GBV_ILN_2010 GBV_ILN_2011 GBV_ILN_2014 GBV_ILN_2015 GBV_ILN_2020 GBV_ILN_2021 GBV_ILN_2025 GBV_ILN_2026 GBV_ILN_2027 GBV_ILN_2031 GBV_ILN_2034 GBV_ILN_2037 GBV_ILN_2038 GBV_ILN_2039 GBV_ILN_2044 GBV_ILN_2048 GBV_ILN_2049 GBV_ILN_2050 GBV_ILN_2055 GBV_ILN_2056 GBV_ILN_2057 GBV_ILN_2059 GBV_ILN_2061 GBV_ILN_2064 GBV_ILN_2065 GBV_ILN_2068 GBV_ILN_2088 GBV_ILN_2093 GBV_ILN_2106 GBV_ILN_2107 GBV_ILN_2108 GBV_ILN_2110 GBV_ILN_2111 GBV_ILN_2112 GBV_ILN_2113 GBV_ILN_2118 GBV_ILN_2122 GBV_ILN_2129 GBV_ILN_2143 GBV_ILN_2144 GBV_ILN_2147 GBV_ILN_2148 GBV_ILN_2152 GBV_ILN_2153 GBV_ILN_2190 GBV_ILN_2232 GBV_ILN_2336 GBV_ILN_2446 GBV_ILN_2470 GBV_ILN_2472 GBV_ILN_2507 GBV_ILN_2522 GBV_ILN_4035 GBV_ILN_4037 GBV_ILN_4046 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4126 GBV_ILN_4242 GBV_ILN_4246 GBV_ILN_4249 GBV_ILN_4251 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4326 GBV_ILN_4328 GBV_ILN_4333 GBV_ILN_4334 GBV_ILN_4335 GBV_ILN_4336 GBV_ILN_4338 GBV_ILN_4393 GBV_ILN_4700 |
container_issue |
1 |
title_short |
Radial Basis Gated Unit-Recurrent Neural Network (RBGU-RNN) Algorithm |
url |
https://dx.doi.org/10.1007/s42979-023-02376-x |
remote_bool |
true |
author2 |
Giquel, Sassa Chantal, Mveh-Abia Raoul, Ayissi Remy, Etoua Yves, Emvudu |
author2Str |
Giquel, Sassa Chantal, Mveh-Abia Raoul, Ayissi Remy, Etoua Yves, Emvudu |
ppnlink |
1668832976 |
mediatype_str_mv |
c |
isOA_txt |
false |
hochschulschrift_bool |
false |
doi_str |
10.1007/s42979-023-02376-x |
up_date |
2024-07-03T23:24:56.602Z |
_version_ |
1803602201524830208 |
fullrecord_marcxml |
<?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>01000naa a22002652 4500</leader><controlfield tag="001">SPR05400585X</controlfield><controlfield tag="003">DE-627</controlfield><controlfield tag="005">20231207064648.0</controlfield><controlfield tag="007">cr uuu---uuuuu</controlfield><controlfield tag="008">231207s2023 xx |||||o 00| ||eng c</controlfield><datafield tag="024" ind1="7" ind2=" "><subfield code="a">10.1007/s42979-023-02376-x</subfield><subfield code="2">doi</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-627)SPR05400585X</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(SPR)s42979-023-02376-x-e</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-627</subfield><subfield code="b">ger</subfield><subfield code="c">DE-627</subfield><subfield code="e">rakwb</subfield></datafield><datafield tag="041" ind1=" " ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">Rollin, Ndom Francis</subfield><subfield code="e">verfasserin</subfield><subfield code="0">(orcid)0000-0003-1229-837X</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Radial Basis Gated Unit-Recurrent Neural Network (RBGU-RNN) Algorithm</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="c">2023</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="a">Text</subfield><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="a">Computermedien</subfield><subfield code="b">c</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="a">Online-Ressource</subfield><subfield code="b">cr</subfield><subfield 
code="2">rdacarrier</subfield></datafield><datafield tag="500" ind1=" " ind2=" "><subfield code="a">© The Author(s), under exclusive licence to Springer Nature Singapore Pte Ltd 2023. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield code="a">Abstract Radial Basis Gated Unit-Recurrent Neural Network (RBGU-RNN) algorithm is a new architecture-based on recurrent neural network which combines a Radial Basis Gated Unit within the Long Short Term Memory (LSTM) network architecture. This unit then gives an advantage to RBGU-RNN over the existing LSTM network. Firstly, given that the RBGU is just an activation unit and which do not perform any weighted operations as it should in a classical neuron unit, it has an advantage for not propagating (duplicating) error as compared to the LSTM. Secondly, due to the fact that this unit is located at the beginning of the network treatment workflow, it provides standardization to the data set, before they are run into the weighted units, which is not the case of a simple LSTM. This study then provided a theoretical and experimental comparison of the LSTM and RBGU-RNN. Indeed, using a real world call data record, precisely a survey on the end user cell network data traffic, we built up a cellular traffic prediction model. We start with ARIMA model which permit us to choose the number of time steps needed to build the RBGU-RNN prediction model that is the number of time steps needed to predict the next individual in the time series. The results show that RBGU-RNN accurately predict cellular data traffic with great success in generalization than LSTM. 
The R-squared statistics or determination coefficients show that 58.31% of user traffic consumption can be explained by LSTM model, while 96.86% of the user traffic consumption can be done by RBGU-RNN model in the training set. Likewise, in the test set, we found that 61.24% of user traffic consumption can also be explained by LSTM model and 95.20% can be done by RBGU-RNN. Also, the RBGU-RNN has more efficient gradient descent than the standard LSTM by analysing and experimenting the graphs given by the Mean Squared Error (MSE), the Mean Absolute Percentage Error (MAPE) and the Maximum Absolute Error (MAXAE) functions over the number of iteration.</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Neural networks</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Feed forward algorithm</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Recurrent neural networks</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Gated recurrent unit</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Long short term memory</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Giquel, Sassa</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Chantal, Mveh-Abia</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Raoul, Ayissi</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Remy, Etoua</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield
code="a">Yves, Emvudu</subfield><subfield code="4">aut</subfield></datafield><datafield tag="773" ind1="0" ind2="8"><subfield code="i">Enthalten in</subfield><subfield code="t">SN Computer Science</subfield><subfield code="d">Singapore : Springer Singapore, 2020</subfield><subfield code="g">5(2023), 1 vom: 06. Dez.</subfield><subfield code="w">(DE-627)1668832976</subfield><subfield code="w">(DE-600)2977367-2</subfield><subfield code="x">2661-8907</subfield><subfield code="7">nnns</subfield></datafield><datafield tag="773" ind1="1" ind2="8"><subfield code="g">volume:5</subfield><subfield code="g">year:2023</subfield><subfield code="g">number:1</subfield><subfield code="g">day:06</subfield><subfield code="g">month:12</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://dx.doi.org/10.1007/s42979-023-02376-x</subfield><subfield code="z">lizenzpflichtig</subfield><subfield code="3">Volltext</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_USEFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SYSFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_SPRINGER</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_11</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_20</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_22</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_23</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_24</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_31</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_32</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_39</subfield></datafield><datafield tag="912" ind1=" " ind2=" 
"><subfield code="a">GBV_ILN_40</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_60</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_62</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_63</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_65</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_69</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_70</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_73</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_74</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_90</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_95</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_100</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_105</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_110</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_138</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_150</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_151</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_152</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_161</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_170</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_171</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_187</subfield></datafield><datafield tag="912" 
ind1=" " ind2=" "><subfield code="a">GBV_ILN_213</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_224</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_230</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_250</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_281</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_285</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_293</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_370</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_602</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_636</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_702</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2001</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2003</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2004</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2005</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2006</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2007</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2008</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2009</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2010</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2011</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_2014</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2015</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2020</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2021</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2025</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2026</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2027</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2031</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2034</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2037</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2038</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2039</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2044</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2048</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2049</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2050</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2055</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2056</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2057</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2059</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2061</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_2064</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2065</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2068</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2088</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2093</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2106</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2107</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2108</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2110</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2111</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2112</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2113</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2118</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2122</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2129</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2143</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2144</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2147</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2148</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2152</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2153</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_2190</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2232</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2336</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2446</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2470</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2472</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2507</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2522</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4035</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4037</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4046</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4112</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4125</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4126</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4242</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4246</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4249</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4251</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4305</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4306</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4307</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_4313</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4322</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4323</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4324</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4325</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4326</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4328</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4333</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4334</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4335</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4336</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4338</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4393</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4700</subfield></datafield><datafield tag="951" ind1=" " ind2=" "><subfield code="a">AR</subfield></datafield><datafield tag="952" ind1=" " ind2=" "><subfield code="d">5</subfield><subfield code="j">2023</subfield><subfield code="e">1</subfield><subfield code="b">06</subfield><subfield code="c">12</subfield></datafield></record></collection>
|
score |
7.399089 |