Machine Learning Regularization Methods in High-Dimensional Monetary and Financial VARs
Vector autoregressions (VARs) and their multiple variants are standard models in economic and financial research due to their power for forecasting, data analysis and inference. These properties are a consequence of their capability to include multiple variables and lags, which, however, leads to an exponential growth in the number of parameters to be estimated. This means that high-dimensional models with multiple variables and lags are difficult to estimate, leading to omitted variables, information biases and a loss of potential forecasting power. Traditionally, the existing literature has resorted to factor analysis and, especially, to Bayesian methods to overcome this situation. This paper explores so-called machine learning regularization methods as an alternative to traditional methods of forecasting and impulse response analysis. We find that regularization structures, which allow for high-dimensional models, perform better than standard Bayesian methods in nowcasting and forecasting. Moreover, impulse response analysis is robust and consistent with economic theory and evidence, and across the different regularization structures. Specifically, regarding the best regularization structure, an elementwise machine learning structure performs better in nowcasting and in computational efficiency, whilst a componentwise structure performs better in forecasting and cross-validation methods.
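The componentwise regularization the abstract mentions can be illustrated as equation-by-equation LASSO estimation of a VAR. The following sketch is not taken from the paper; it is a minimal toy example, assuming scikit-learn's LassoCV as the penalized estimator and simulated data in place of the monetary and financial series the authors use:

```python
# Illustrative sketch (not the authors' implementation): a VAR(p) estimated
# equation by equation with a cross-validated LASSO penalty, one of the
# regularization structures discussed in the abstract.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)

# Toy data: simulate a small stationary VAR(1) with K = 5 variables.
K, p, T = 5, 2, 300
A = 0.4 * np.eye(K)
y = np.zeros((T, K))
for t in range(1, T):
    y[t] = y[t - 1] @ A.T + 0.1 * rng.standard_normal(K)

# Stack the lagged regressors [y_{t-1}, ..., y_{t-p}] row by row.
X = np.hstack([y[p - l - 1 : T - l - 1] for l in range(p)])
Y = y[p:]

# Componentwise LASSO: each equation k gets its own penalty, chosen by CV;
# coefficients on irrelevant lags are shrunk toward exactly zero.
B = np.vstack([LassoCV(cv=5).fit(X, Y[:, k]).coef_ for k in range(K)])
print(B.shape)  # one row of shrunken coefficients per equation: (K, K*p)
```

Fitting a VAR(2) to VAR(1) data lets the penalty demonstrate sparsity: the second-lag coefficient block is largely zeroed out, which is the dimension-reduction effect that makes these structures viable in high dimensions.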
Detailed description

Author(s): Javier Sánchez García (author); Salvador Cruz Rambaud (author)
Format: E-Article
Language: English
Published: 2022
Subject headings: VAR; machine learning; LASSO (Least Absolute Shrinkage and Selection Operator); regularization methods; sparsity; monetary economics
Published in: Mathematics - MDPI AG, 2013, 10(2022), 6, p 877
Published in: volume:10 ; year:2022 ; number:6, p 877
Links:
DOI / URN: 10.3390/math10060877
Catalog ID: DOAJ064424944
LEADER 01000caa a22002652 4500
001 DOAJ064424944
003 DE-627
005 20240414135438.0
007 cr uuu---uuuuu
008 230228s2022 xx |||||o 00| ||eng c
024 7  |a 10.3390/math10060877 |2 doi
035    |a (DE-627)DOAJ064424944
035    |a (DE-599)DOAJ238f90ca91fa4c90ac8bfe52347dc893
040    |a DE-627 |b ger |c DE-627 |e rakwb
041    |a eng
050  0 |a QA1-939
100 0  |a Javier Sánchez García |e verfasserin |4 aut
245 1 0 |a Machine Learning Regularization Methods in High-Dimensional Monetary and Financial VARs
264  1 |c 2022
336    |a Text |b txt |2 rdacontent
337    |a Computermedien |b c |2 rdamedia
338    |a Online-Ressource |b cr |2 rdacarrier
520    |a Vector autoregressions (VARs) and their multiple variants are standard models in economic and financial research due to their power for forecasting, data analysis and inference. These properties are a consequence of their capability to include multiple variables and lags, which, however, leads to an exponential growth in the number of parameters to be estimated. This means that high-dimensional models with multiple variables and lags are difficult to estimate, leading to omitted variables, information biases and a loss of potential forecasting power. Traditionally, the existing literature has resorted to factor analysis and, especially, to Bayesian methods to overcome this situation. This paper explores so-called machine learning regularization methods as an alternative to traditional methods of forecasting and impulse response analysis. We find that regularization structures, which allow for high-dimensional models, perform better than standard Bayesian methods in nowcasting and forecasting. Moreover, impulse response analysis is robust and consistent with economic theory and evidence, and across the different regularization structures. Specifically, regarding the best regularization structure, an elementwise machine learning structure performs better in nowcasting and in computational efficiency, whilst a componentwise structure performs better in forecasting and cross-validation methods.
650  4 |a VAR
650  4 |a machine learning
650  4 |a LASSO (Least Absolute Shrinkage and Selection Operator)
650  4 |a regularization methods
650  4 |a sparsity
650  4 |a monetary economics
653  0 |a Mathematics
700 0  |a Salvador Cruz Rambaud |e verfasserin |4 aut
773 0 8 |i In |t Mathematics |d MDPI AG, 2013 |g 10(2022), 6, p 877 |w (DE-627)737287764 |w (DE-600)2704244-3 |x 22277390 |7 nnns
773 1 8 |g volume:10 |g year:2022 |g number:6, p 877
856 4 0 |u https://doi.org/10.3390/math10060877 |z kostenfrei
856 4 0 |u https://doaj.org/article/238f90ca91fa4c90ac8bfe52347dc893 |z kostenfrei
856 4 0 |u https://www.mdpi.com/2227-7390/10/6/877 |z kostenfrei
856 4 2 |u https://doaj.org/toc/2227-7390 |y Journal toc |z kostenfrei
912    |a GBV_USEFLAG_A
912    |a SYSFLAG_A
912    |a GBV_DOAJ
912    |a GBV_ILN_20
912    |a GBV_ILN_22
912    |a GBV_ILN_23
912    |a GBV_ILN_24
912    |a GBV_ILN_39
912    |a GBV_ILN_40
912    |a GBV_ILN_60
912    |a GBV_ILN_62
912    |a GBV_ILN_63
912    |a GBV_ILN_65
912    |a GBV_ILN_69
912    |a GBV_ILN_70
912    |a GBV_ILN_73
912    |a GBV_ILN_95
912    |a GBV_ILN_105
912    |a GBV_ILN_110
912    |a GBV_ILN_151
912    |a GBV_ILN_161
912    |a GBV_ILN_170
912    |a GBV_ILN_213
912    |a GBV_ILN_230
912    |a GBV_ILN_285
912    |a GBV_ILN_293
912    |a GBV_ILN_370
912    |a GBV_ILN_602
912    |a GBV_ILN_2005
912    |a GBV_ILN_2009
912    |a GBV_ILN_2014
912    |a GBV_ILN_2055
912    |a GBV_ILN_2111
912    |a GBV_ILN_4012
912    |a GBV_ILN_4037
912    |a GBV_ILN_4112
912    |a GBV_ILN_4125
912    |a GBV_ILN_4126
912    |a GBV_ILN_4249
912    |a GBV_ILN_4305
912    |a GBV_ILN_4306
912    |a GBV_ILN_4307
912    |a GBV_ILN_4313
912    |a GBV_ILN_4322
912    |a GBV_ILN_4323
912    |a GBV_ILN_4324
912    |a GBV_ILN_4325
912    |a GBV_ILN_4326
912    |a GBV_ILN_4335
912    |a GBV_ILN_4338
912    |a GBV_ILN_4367
912    |a GBV_ILN_4700
951    |a AR
952    |d 10 |j 2022 |e 6, p 877
container_volume |
10 |
class |
QA1-939 |
format_se |
Elektronische Aufsätze |
author-letter |
Javier Sánchez García |
doi_str_mv |
10.3390/math10060877 |
author2-role |
verfasserin |
title_sort |
machine learning regularization methods in high-dimensional monetary and financial vars |
callnumber |
QA1-939 |
title_auth |
Machine Learning Regularization Methods in High-Dimensional Monetary and Financial VARs |
abstract |
Vector autoregressions (VARs) and their multiple variants are standard models in economic and financial research due to their power for forecasting, data analysis and inference. These properties are a consequence of their capability to include multiple variables and lags, which, however, leads to an exponential growth in the number of parameters to be estimated. This means that high-dimensional models with multiple variables and lags are difficult to estimate, leading to omitted variables, information biases and a loss of potential forecasting power. Traditionally, the existing literature has resorted to factor analysis and, especially, to Bayesian methods to overcome this situation. This paper explores the so-called machine learning regularization methods as an alternative to traditional methods of forecasting and impulse response analysis. We find that regularization structures, which allow for high-dimensional models, perform better than standard Bayesian methods in nowcasting and forecasting. Moreover, impulse response analysis is robust and consistent with economic theory and evidence, as well as across the different regularization structures. Specifically, regarding the best regularization structure, an elementwise machine learning structure performs better in nowcasting and in computational efficiency, whilst a componentwise structure performs better in forecasting and cross-validation methods.
abstractGer |
Vector autoregressions (VARs) and their multiple variants are standard models in economic and financial research due to their power for forecasting, data analysis and inference. These properties are a consequence of their capability to include multiple variables and lags, which, however, leads to an exponential growth in the number of parameters to be estimated. This means that high-dimensional models with multiple variables and lags are difficult to estimate, leading to omitted variables, information biases and a loss of potential forecasting power. Traditionally, the existing literature has resorted to factor analysis and, especially, to Bayesian methods to overcome this situation. This paper explores the so-called machine learning regularization methods as an alternative to traditional methods of forecasting and impulse response analysis. We find that regularization structures, which allow for high-dimensional models, perform better than standard Bayesian methods in nowcasting and forecasting. Moreover, impulse response analysis is robust and consistent with economic theory and evidence, as well as across the different regularization structures. Specifically, regarding the best regularization structure, an elementwise machine learning structure performs better in nowcasting and in computational efficiency, whilst a componentwise structure performs better in forecasting and cross-validation methods.
abstract_unstemmed |
Vector autoregressions (VARs) and their multiple variants are standard models in economic and financial research due to their power for forecasting, data analysis and inference. These properties are a consequence of their capability to include multiple variables and lags, which, however, leads to an exponential growth in the number of parameters to be estimated. This means that high-dimensional models with multiple variables and lags are difficult to estimate, leading to omitted variables, information biases and a loss of potential forecasting power. Traditionally, the existing literature has resorted to factor analysis and, especially, to Bayesian methods to overcome this situation. This paper explores the so-called machine learning regularization methods as an alternative to traditional methods of forecasting and impulse response analysis. We find that regularization structures, which allow for high-dimensional models, perform better than standard Bayesian methods in nowcasting and forecasting. Moreover, impulse response analysis is robust and consistent with economic theory and evidence, as well as across the different regularization structures. Specifically, regarding the best regularization structure, an elementwise machine learning structure performs better in nowcasting and in computational efficiency, whilst a componentwise structure performs better in forecasting and cross-validation methods.
collection_details |
GBV_USEFLAG_A SYSFLAG_A GBV_DOAJ GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_65 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_95 GBV_ILN_105 GBV_ILN_110 GBV_ILN_151 GBV_ILN_161 GBV_ILN_170 GBV_ILN_213 GBV_ILN_230 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_2005 GBV_ILN_2009 GBV_ILN_2014 GBV_ILN_2055 GBV_ILN_2111 GBV_ILN_4012 GBV_ILN_4037 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4126 GBV_ILN_4249 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4326 GBV_ILN_4335 GBV_ILN_4338 GBV_ILN_4367 GBV_ILN_4700 |
container_issue |
6, p 877 |
title_short |
Machine Learning Regularization Methods in High-Dimensional Monetary and Financial VARs |
url |
https://doi.org/10.3390/math10060877 https://doaj.org/article/238f90ca91fa4c90ac8bfe52347dc893 https://www.mdpi.com/2227-7390/10/6/877 https://doaj.org/toc/2227-7390 |
remote_bool |
true |
author2 |
Salvador Cruz Rambaud |
author2Str |
Salvador Cruz Rambaud |
ppnlink |
737287764 |
callnumber-subject |
QA - Mathematics |
mediatype_str_mv |
c |
isOA_txt |
true |
hochschulschrift_bool |
false |
doi_str |
10.3390/math10060877 |
callnumber-a |
QA1-939 |
up_date |
2024-07-03T22:49:34.581Z |
_version_ |
1803599976463335425 |
fullrecord_marcxml |
<?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>01000caa a22002652 4500</leader><controlfield tag="001">DOAJ064424944</controlfield><controlfield tag="003">DE-627</controlfield><controlfield tag="005">20240414135438.0</controlfield><controlfield tag="007">cr uuu---uuuuu</controlfield><controlfield tag="008">230228s2022 xx |||||o 00| ||eng c</controlfield><datafield tag="024" ind1="7" ind2=" "><subfield code="a">10.3390/math10060877</subfield><subfield code="2">doi</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-627)DOAJ064424944</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-599)DOAJ238f90ca91fa4c90ac8bfe52347dc893</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-627</subfield><subfield code="b">ger</subfield><subfield code="c">DE-627</subfield><subfield code="e">rakwb</subfield></datafield><datafield tag="041" ind1=" " ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="050" ind1=" " ind2="0"><subfield code="a">QA1-939</subfield></datafield><datafield tag="100" ind1="0" ind2=" "><subfield code="a">Javier Sánchez García</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Machine Learning Regularization Methods in High-Dimensional Monetary and Financial VARs</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="c">2022</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="a">Text</subfield><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="a">Computermedien</subfield><subfield code="b">c</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="a">Online-Ressource</subfield><subfield 
code="b">cr</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield code="a">Vector autoregressions (VARs) and their multiple variants are standard models in economic and financial research due to their power for forecasting, data analysis and inference. These properties are a consequence of their capability to include multiple variables and lags, which, however, leads to an exponential growth in the number of parameters to be estimated. This means that high-dimensional models with multiple variables and lags are difficult to estimate, leading to omitted variables, information biases and a loss of potential forecasting power. Traditionally, the existing literature has resorted to factor analysis and, especially, to Bayesian methods to overcome this situation. This paper explores the so-called machine learning regularization methods as an alternative to traditional methods of forecasting and impulse response analysis. We find that regularization structures, which allow for high-dimensional models, perform better than standard Bayesian methods in nowcasting and forecasting. Moreover, impulse response analysis is robust and consistent with economic theory and evidence, as well as across the different regularization structures. 
Specifically, regarding the best regularization structure, an elementwise machine learning structure performs better in nowcasting and in computational efficiency, whilst a componentwise structure performs better in forecasting and cross-validation methods.</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">VAR</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">machine learning</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">LASSO (Least Absolute Shrinkage and Selection Operator)</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">regularization methods</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">sparsity</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">monetary economics</subfield></datafield><datafield tag="653" ind1=" " ind2="0"><subfield code="a">Mathematics</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Salvador Cruz Rambaud</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="773" ind1="0" ind2="8"><subfield code="i">In</subfield><subfield code="t">Mathematics</subfield><subfield code="d">MDPI AG, 2013</subfield><subfield code="g">10(2022), 6, p 877</subfield><subfield code="w">(DE-627)737287764</subfield><subfield code="w">(DE-600)2704244-3</subfield><subfield code="x">22277390</subfield><subfield code="7">nnns</subfield></datafield><datafield tag="773" ind1="1" ind2="8"><subfield code="g">volume:10</subfield><subfield code="g">year:2022</subfield><subfield code="g">number:6, p 877</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://doi.org/10.3390/math10060877</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield 
code="u">https://doaj.org/article/238f90ca91fa4c90ac8bfe52347dc893</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://www.mdpi.com/2227-7390/10/6/877</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="856" ind1="4" ind2="2"><subfield code="u">https://doaj.org/toc/2227-7390</subfield><subfield code="y">Journal toc</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_USEFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SYSFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_DOAJ</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_20</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_22</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_23</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_24</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_39</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_40</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_60</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_62</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_63</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_65</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_69</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_70</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_73</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_95</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_105</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_110</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_151</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_161</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_170</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_213</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_230</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_285</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_293</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_370</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_602</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2005</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2009</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2014</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2055</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2111</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4012</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4037</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4112</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4125</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4126</subfield></datafield><datafield 
tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4249</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4305</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4306</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4307</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4313</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4322</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4323</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4324</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4325</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4326</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4335</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4338</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4367</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4700</subfield></datafield><datafield tag="951" ind1=" " ind2=" "><subfield code="a">AR</subfield></datafield><datafield tag="952" ind1=" " ind2=" "><subfield code="d">10</subfield><subfield code="j">2022</subfield><subfield code="e">6, p 877</subfield></datafield></record></collection>
|