Automatic Modulation Classification Based on Hierarchical Recurrent Neural Networks With Grouped Auxiliary Memory
As a valuable topic in wireless communication systems, automatic modulation classification has been studied for many years. In recent years, recurrent neural networks (RNNs), such as long short-term memory (LSTM), have been used in this area and have achieved good results. However, these models often suffer from the vanishing gradient problem as the temporal and spatial depth increase, which diminishes their ability to latch long-term memories. In this paper, we propose a new hierarchical RNN architecture with grouped auxiliary memory to better capture long-term dependencies. The proposed model is compared with LSTM and gated recurrent unit (GRU) networks on the RadioML 2016.10a dataset, which is widely used as a benchmark in modulation classification. The results show that the proposed network yields higher average classification accuracy under signal-to-noise ratio (SNR) conditions ranging from 0 dB to 20 dB, even with far fewer parameters. Its performance advantage is also confirmed on a dataset with variable-length signals.
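The abstract states the motivation (auxiliary memory written in groups so long-term information survives many recurrent updates) but not the mechanism. As a purely illustrative, hypothetical sketch — none of the dimensions, weights, or update rules below come from the paper — an RNN whose hidden state is complemented by grouped auxiliary memory slots might look like:

```python
import math

# Illustrative toy only: a plain recurrent update plus a bank of auxiliary
# memory "groups". Each group is snapshotted only every `period` steps, so a
# group's contents persist across many recurrent updates instead of being
# overwritten at every step. All weights (0.5, 0.25) and sizes are arbitrary.

def tanh_vec(v):
    return [math.tanh(x) for x in v]

def rnn_with_grouped_memory(xs, hidden=4, groups=2, period=3):
    h = [0.0] * hidden                      # recurrent hidden state
    mem = [[0.0] * hidden for _ in range(groups)]  # auxiliary memory groups
    for t, x in enumerate(xs):
        # read: average the auxiliary groups back into the update
        read = [sum(m[i] for m in mem) / groups for i in range(hidden)]
        # vanilla recurrent step with the auxiliary read mixed in
        h = tanh_vec([0.5 * x + 0.5 * h[i] + 0.25 * read[i]
                      for i in range(hidden)])
        # grouped write: refresh exactly one group, and only every `period` steps
        if t % period == 0:
            g = (t // period) % groups
            mem[g] = h[:]                   # snapshot current state into that group
    return h, mem
```

Running `rnn_with_grouped_memory([1.0, -0.5, 0.3, 0.8, -0.2])` returns the final hidden state plus the two memory groups; because each group is rewritten only every third step, earlier snapshots survive intermediate updates — the intuition, though certainly not the implementation, behind the paper's design.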
Detailed Description

Authors: Ke Zang [author]; Zhenguo Ma [author]
Format: Electronic article
Language: English
Published: 2020
Keywords: Automatic modulation classification (AMC); recurrent neural networks (RNNs)
Contained in: IEEE Access - IEEE, 2014, 8(2020), pages 213052-213061
Contained in: volume:8; year:2020; pages:213052-213061
DOI / URN: 10.1109/ACCESS.2020.3039543
Catalog ID: DOAJ062562576
LEADER 01000caa a22002652 4500
001 DOAJ062562576
003 DE-627
005 20230309021822.0
007 cr uuu---uuuuu
008 230228s2020 xx |||||o 00| ||eng c
024 7  |a 10.1109/ACCESS.2020.3039543 |2 doi
035    |a (DE-627)DOAJ062562576
035    |a (DE-599)DOAJ8aa308d30133449686f3cda521a9c18d
040    |a DE-627 |b ger |c DE-627 |e rakwb
041    |a eng
050  0 |a TK1-9971
100 0  |a Ke Zang |e verfasserin |4 aut
245 10 |a Automatic Modulation Classification Based on Hierarchical Recurrent Neural Networks With Grouped Auxiliary Memory
264  1 |c 2020
336    |a Text |b txt |2 rdacontent
337    |a Computermedien |b c |2 rdamedia
338    |a Online-Ressource |b cr |2 rdacarrier
520    |a As a valuable topic in wireless communication systems, automatic modulation classification has been studied for many years. In recent years, recurrent neural networks (RNNs), such as long short-term memory (LSTM), have been used in this area and have achieved good results. However, these models often suffer from the vanishing gradient problem as the temporal and spatial depth increase, which diminishes their ability to latch long-term memories. In this paper, we propose a new hierarchical RNN architecture with grouped auxiliary memory to better capture long-term dependencies. The proposed model is compared with LSTM and gated recurrent unit (GRU) networks on the RadioML 2016.10a dataset, which is widely used as a benchmark in modulation classification. The results show that the proposed network yields higher average classification accuracy under signal-to-noise ratio (SNR) conditions ranging from 0 dB to 20 dB, even with far fewer parameters. Its performance advantage is also confirmed on a dataset with variable-length signals.
650  4 |a Automatic modulation classification (AMC)
650  4 |a recurrent neural networks (RNNs)
650  4 |a hierarchical recurrent structure
650  4 |a long-term memory
653  0 |a Electrical engineering. Electronics. Nuclear engineering
700 0  |a Zhenguo Ma |e verfasserin |4 aut
773 08 |i In |t IEEE Access |d IEEE, 2014 |g 8(2020), Seite 213052-213061 |w (DE-627)728440385 |w (DE-600)2687964-5 |x 21693536 |7 nnns
773 18 |g volume:8 |g year:2020 |g pages:213052-213061
856 40 |u https://doi.org/10.1109/ACCESS.2020.3039543 |z kostenfrei
856 40 |u https://doaj.org/article/8aa308d30133449686f3cda521a9c18d |z kostenfrei
856 40 |u https://ieeexplore.ieee.org/document/9265251/ |z kostenfrei
856 42 |u https://doaj.org/toc/2169-3536 |y Journal toc |z kostenfrei
912    |a GBV_USEFLAG_A
912    |a SYSFLAG_A
912    |a GBV_DOAJ
912    |a GBV_ILN_11
912    |a GBV_ILN_20
912    |a GBV_ILN_22
912    |a GBV_ILN_23
912    |a GBV_ILN_24
912    |a GBV_ILN_31
912    |a GBV_ILN_39
912    |a GBV_ILN_40
912    |a GBV_ILN_60
912    |a GBV_ILN_62
912    |a GBV_ILN_63
912    |a GBV_ILN_65
912    |a GBV_ILN_69
912    |a GBV_ILN_70
912    |a GBV_ILN_73
912    |a GBV_ILN_95
912    |a GBV_ILN_105
912    |a GBV_ILN_110
912    |a GBV_ILN_151
912    |a GBV_ILN_161
912    |a GBV_ILN_170
912    |a GBV_ILN_213
912    |a GBV_ILN_230
912    |a GBV_ILN_285
912    |a GBV_ILN_293
912    |a GBV_ILN_370
912    |a GBV_ILN_602
912    |a GBV_ILN_2014
912    |a GBV_ILN_4012
912    |a GBV_ILN_4037
912    |a GBV_ILN_4112
912    |a GBV_ILN_4125
912    |a GBV_ILN_4126
912    |a GBV_ILN_4249
912    |a GBV_ILN_4305
912    |a GBV_ILN_4306
912    |a GBV_ILN_4307
912    |a GBV_ILN_4313
912    |a GBV_ILN_4322
912    |a GBV_ILN_4323
912    |a GBV_ILN_4324
912    |a GBV_ILN_4325
912    |a GBV_ILN_4335
912    |a GBV_ILN_4338
912    |a GBV_ILN_4367
912    |a GBV_ILN_4700
951    |a AR
952    |d 8 |j 2020 |h 213052-213061
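The MARC fields above follow a simple display convention: a three-digit tag, optional indicator digits, then `|code value` subfields (control fields below 010 carry raw data instead). A minimal standard-library sketch — the function name and regex are mine, not part of any MARC library — can pull structured data back out of that display:

```python
import re

def parse_marc_line(line):
    """Parse one 'TAG [indicators] |a value |2 value' display line into
    (tag, subfields-dict), or (tag, raw-data) for control fields < 010."""
    tag, _, rest = line.strip().partition(" ")
    if tag.isdigit() and int(tag) < 10:       # control field: raw data, may contain '|'
        return tag, rest.strip()
    subfields = {}
    for code, value in re.findall(r"\|(\w)\s*([^|]*)", rest):
        subfields.setdefault(code, []).append(value.strip())
    return tag, subfields

# e.g. extract the DOI from field 024
tag, sub = parse_marc_line("024 7  |a 10.1109/ACCESS.2020.3039543 |2 doi")
```

Here `sub["a"]` holds the DOI and `sub["2"]` its scheme; repeated subfield codes (such as the two `|w` links in field 773) accumulate into the same list, which is why values are stored as lists rather than single strings.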
abstract |
As a valuable topic in wireless communication systems, automatic modulation classification has been studied for many years. In recent years, recurrent neural networks (RNNs), such as long short-term memory (LSTM), have been used in this area and have achieved good results. However, these models often suffer from the vanishing gradient problem when the temporal and spatial depths increase, which diminishes their ability to latch long-term memories. In this paper, we propose a new hierarchical RNN architecture with grouped auxiliary memory to better capture long-term dependencies. The proposed model is compared with LSTM and gated recurrent unit (GRU) on the RadioML 2016.10a dataset, which is widely used as a benchmark in modulation classification. The results show that the proposed network yields a higher average classification accuracy under varying signal-to-noise ratio (SNR) conditions ranging from 0 dB to 20 dB, even with far fewer parameters. The performance superiority is also confirmed using a dataset of variable-length signals.
abstractGer |
As a valuable topic in wireless communication systems, automatic modulation classification has been studied for many years. In recent years, recurrent neural networks (RNNs), such as long short-term memory (LSTM), have been used in this area and have achieved good results. However, these models often suffer from the vanishing gradient problem when the temporal and spatial depths increase, which diminishes their ability to latch long-term memories. In this paper, we propose a new hierarchical RNN architecture with grouped auxiliary memory to better capture long-term dependencies. The proposed model is compared with LSTM and gated recurrent unit (GRU) on the RadioML 2016.10a dataset, which is widely used as a benchmark in modulation classification. The results show that the proposed network yields a higher average classification accuracy under varying signal-to-noise ratio (SNR) conditions ranging from 0 dB to 20 dB, even with far fewer parameters. The performance superiority is also confirmed using a dataset of variable-length signals.
abstract_unstemmed |
As a valuable topic in wireless communication systems, automatic modulation classification has been studied for many years. In recent years, recurrent neural networks (RNNs), such as long short-term memory (LSTM), have been used in this area and have achieved good results. However, these models often suffer from the vanishing gradient problem when the temporal and spatial depths increase, which diminishes their ability to latch long-term memories. In this paper, we propose a new hierarchical RNN architecture with grouped auxiliary memory to better capture long-term dependencies. The proposed model is compared with LSTM and gated recurrent unit (GRU) on the RadioML 2016.10a dataset, which is widely used as a benchmark in modulation classification. The results show that the proposed network yields a higher average classification accuracy under varying signal-to-noise ratio (SNR) conditions ranging from 0 dB to 20 dB, even with far fewer parameters. The performance superiority is also confirmed using a dataset of variable-length signals.
collection_details |
GBV_USEFLAG_A SYSFLAG_A GBV_DOAJ GBV_ILN_11 GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_31 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_65 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_95 GBV_ILN_105 GBV_ILN_110 GBV_ILN_151 GBV_ILN_161 GBV_ILN_170 GBV_ILN_213 GBV_ILN_230 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_2014 GBV_ILN_4012 GBV_ILN_4037 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4126 GBV_ILN_4249 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4335 GBV_ILN_4338 GBV_ILN_4367 GBV_ILN_4700 |
title_short |
Automatic Modulation Classification Based on Hierarchical Recurrent Neural Networks With Grouped Auxiliary Memory |
url |
https://doi.org/10.1109/ACCESS.2020.3039543 https://doaj.org/article/8aa308d30133449686f3cda521a9c18d https://ieeexplore.ieee.org/document/9265251/ https://doaj.org/toc/2169-3536 |
remote_bool |
true |
author2 |
Zhenguo Ma |
author2Str |
Zhenguo Ma |
ppnlink |
728440385 |
callnumber-subject |
TK - Electrical and Nuclear Engineering |
mediatype_str_mv |
c |
isOA_txt |
true |
hochschulschrift_bool |
false |
doi_str |
10.1109/ACCESS.2020.3039543 |
callnumber-a |
TK1-9971 |
up_date |
2024-07-04T02:07:13.754Z |
_version_ |
1803612411665580032 |
fullrecord_marcxml |
<?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>01000caa a22002652 4500</leader><controlfield tag="001">DOAJ062562576</controlfield><controlfield tag="003">DE-627</controlfield><controlfield tag="005">20230309021822.0</controlfield><controlfield tag="007">cr uuu---uuuuu</controlfield><controlfield tag="008">230228s2020 xx |||||o 00| ||eng c</controlfield><datafield tag="024" ind1="7" ind2=" "><subfield code="a">10.1109/ACCESS.2020.3039543</subfield><subfield code="2">doi</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-627)DOAJ062562576</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-599)DOAJ8aa308d30133449686f3cda521a9c18d</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-627</subfield><subfield code="b">ger</subfield><subfield code="c">DE-627</subfield><subfield code="e">rakwb</subfield></datafield><datafield tag="041" ind1=" " ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="050" ind1=" " ind2="0"><subfield code="a">TK1-9971</subfield></datafield><datafield tag="100" ind1="0" ind2=" "><subfield code="a">Ke Zang</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Automatic Modulation Classification Based on Hierarchical Recurrent Neural Networks With Grouped Auxiliary Memory</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="c">2020</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="a">Text</subfield><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="a">Computermedien</subfield><subfield code="b">c</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield 
code="a">Online-Ressource</subfield><subfield code="b">cr</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield code="a">As a valuable topic in wireless communication systems, automatic modulation classification has been studied for many years. In recent years, recurrent neural networks (RNNs), such as long short-term memory (LSTM), have been used in this area and have achieved good results. However, these models often suffer from the vanishing gradient problem when the temporal and spatial depths increase, which diminishes their ability to latch long-term memories. In this paper, we propose a new hierarchical RNN architecture with grouped auxiliary memory to better capture long-term dependencies. The proposed model is compared with LSTM and gated recurrent unit (GRU) on the RadioML 2016.10a dataset, which is widely used as a benchmark in modulation classification. The results show that the proposed network yields a higher average classification accuracy under varying signal-to-noise ratio (SNR) conditions ranging from 0 dB to 20 dB, even with far fewer parameters. The performance superiority is also confirmed using a dataset of variable-length signals.</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Automatic modulation classification (AMC)</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">recurrent neural networks (RNNs)</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">hierarchical recurrent structure</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">long-term memory</subfield></datafield><datafield tag="653" ind1=" " ind2="0"><subfield code="a">Electrical engineering. Electronics. 
Nuclear engineering</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Zhenguo Ma</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="773" ind1="0" ind2="8"><subfield code="i">In</subfield><subfield code="t">IEEE Access</subfield><subfield code="d">IEEE, 2014</subfield><subfield code="g">8(2020), Seite 213052-213061</subfield><subfield code="w">(DE-627)728440385</subfield><subfield code="w">(DE-600)2687964-5</subfield><subfield code="x">21693536</subfield><subfield code="7">nnns</subfield></datafield><datafield tag="773" ind1="1" ind2="8"><subfield code="g">volume:8</subfield><subfield code="g">year:2020</subfield><subfield code="g">pages:213052-213061</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://doi.org/10.1109/ACCESS.2020.3039543</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://doaj.org/article/8aa308d30133449686f3cda521a9c18d</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://ieeexplore.ieee.org/document/9265251/</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="856" ind1="4" ind2="2"><subfield code="u">https://doaj.org/toc/2169-3536</subfield><subfield code="y">Journal toc</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_USEFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SYSFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_DOAJ</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_11</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_20</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_22</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_23</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_24</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_31</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_39</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_40</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_60</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_62</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_63</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_65</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_69</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_70</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_73</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_95</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_105</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_110</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_151</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_161</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_170</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_213</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_230</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_285</subfield></datafield><datafield tag="912" ind1=" " ind2=" 
"><subfield code="a">GBV_ILN_293</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_370</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_602</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2014</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4012</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4037</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4112</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4125</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4126</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4249</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4305</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4306</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4307</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4313</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4322</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4323</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4324</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4325</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4335</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4338</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4367</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_4700</subfield></datafield><datafield tag="951" ind1=" " ind2=" "><subfield code="a">AR</subfield></datafield><datafield tag="952" ind1=" " ind2=" "><subfield code="d">8</subfield><subfield code="j">2020</subfield><subfield code="h">213052-213061</subfield></datafield></record></collection>
|
score |
7.400527 |