Image Fusion Method Based on Structure-Based Saliency Map and FDST-PCNN Framework
Image fusion has become an active and promising research topic in image processing. It provides an effective way to combine several source images to form a composite image with more detailed information than any one of the source images. An FDST-PCNN framework, which integrates the finite discrete shearlet transform (FDST) with a pulse-coupled neural network (PCNN), is proposed to enhance fusion effects...
Detailed description
Author(s): Jiang Qian [author], Liu Yadong [author], Dai Jindun [author], Fu Xiaofei [author], Jiang Xiuchen [author]
Format: E-article
Language: English
Published: 2019
Subject headings:
Published in: IEEE Access - IEEE, 2014, 7(2019), pages 83484-83494
Published in: volume:7 ; year:2019 ; pages:83484-83494
Links:
DOI / URN: 10.1109/ACCESS.2019.2924033
Catalog ID: DOAJ084943432
LEADER | 01000naa a22002652 4500 | ||
001 | DOAJ084943432 | ||
003 | DE-627 | ||
005 | 20230311032610.0 | ||
007 | cr uuu---uuuuu | ||
008 | 230311s2019 xx |||||o 00| ||eng c | ||
024 | 7 | |a 10.1109/ACCESS.2019.2924033 |2 doi | |
035 | |a (DE-627)DOAJ084943432 | ||
035 | |a (DE-599)DOAJ909afd9280e9441cad41b48614a8cca1 | ||
040 | |a DE-627 |b ger |c DE-627 |e rakwb | ||
041 | |a eng | ||
050 | 0 | |a TK1-9971 | |
100 | 0 | |a Jiang Qian |e verfasserin |4 aut | |
245 | 1 | 0 | |a Image Fusion Method Based on Structure-Based Saliency Map and FDST-PCNN Framework |
264 | 1 | |c 2019 | |
336 | |a Text |b txt |2 rdacontent | ||
337 | |a Computermedien |b c |2 rdamedia | ||
338 | |a Online-Ressource |b cr |2 rdacarrier | ||
520 | |a Image fusion has become an active and promising research topic in image processing. It provides an effective way to combine several source images to form a composite image with more detailed information than any one of the source images. An FDST-PCNN framework, which integrates the finite discrete shearlet transform (FDST) with a pulse-coupled neural network (PCNN), is proposed to enhance fusion effects. We first propose a structure-based saliency (SBS) map to enhance the clear and important features in one image. The SBS map combines depth information with saliency information and can be a good representation of the most essential information of the source images. After multi-scale decomposition by the FDST, the SBS map of the source images and the modified spatial frequency of the subbands are both utilized to tune the PCNN neuron response and determine the fused coefficients in each subband. The experimental results on multi-focus and multi-sensor images verify the effectiveness of the proposed fusion method. Compared with other PCNN-based fusion methods, the proposed method achieves significant improvement in preserving detailed edge information and improving overall visual performance. | ||
650 | 4 | |a Image fusion | |
650 | 4 | |a multi-focus and multi-sensor | |
650 | 4 | |a multi-scale decomposition | |
650 | 4 | |a saliency map | |
650 | 4 | |a PCNN | |
653 | 0 | |a Electrical engineering. Electronics. Nuclear engineering | |
700 | 0 | |a Liu Yadong |e verfasserin |4 aut | |
700 | 0 | |a Dai Jindun |e verfasserin |4 aut | |
700 | 0 | |a Fu Xiaofei |e verfasserin |4 aut | |
700 | 0 | |a Jiang Xiuchen |e verfasserin |4 aut | |
773 | 0 | 8 | |i In |t IEEE Access |d IEEE, 2014 |g 7(2019), Seite 83484-83494 |w (DE-627)728440385 |w (DE-600)2687964-5 |x 21693536 |7 nnns |
773 | 1 | 8 | |g volume:7 |g year:2019 |g pages:83484-83494 |
856 | 4 | 0 | |u https://doi.org/10.1109/ACCESS.2019.2924033 |z kostenfrei |
856 | 4 | 0 | |u https://doaj.org/article/909afd9280e9441cad41b48614a8cca1 |z kostenfrei |
856 | 4 | 0 | |u https://ieeexplore.ieee.org/document/8742596/ |z kostenfrei |
856 | 4 | 2 | |u https://doaj.org/toc/2169-3536 |y Journal toc |z kostenfrei |
912 | |a GBV_USEFLAG_A | ||
912 | |a SYSFLAG_A | ||
912 | |a GBV_DOAJ | ||
912 | |a GBV_ILN_11 | ||
912 | |a GBV_ILN_20 | ||
912 | |a GBV_ILN_22 | ||
912 | |a GBV_ILN_23 | ||
912 | |a GBV_ILN_24 | ||
912 | |a GBV_ILN_31 | ||
912 | |a GBV_ILN_39 | ||
912 | |a GBV_ILN_40 | ||
912 | |a GBV_ILN_60 | ||
912 | |a GBV_ILN_62 | ||
912 | |a GBV_ILN_63 | ||
912 | |a GBV_ILN_65 | ||
912 | |a GBV_ILN_69 | ||
912 | |a GBV_ILN_70 | ||
912 | |a GBV_ILN_73 | ||
912 | |a GBV_ILN_95 | ||
912 | |a GBV_ILN_105 | ||
912 | |a GBV_ILN_110 | ||
912 | |a GBV_ILN_151 | ||
912 | |a GBV_ILN_161 | ||
912 | |a GBV_ILN_170 | ||
912 | |a GBV_ILN_213 | ||
912 | |a GBV_ILN_230 | ||
912 | |a GBV_ILN_285 | ||
912 | |a GBV_ILN_293 | ||
912 | |a GBV_ILN_370 | ||
912 | |a GBV_ILN_602 | ||
912 | |a GBV_ILN_2014 | ||
912 | |a GBV_ILN_4012 | ||
912 | |a GBV_ILN_4037 | ||
912 | |a GBV_ILN_4112 | ||
912 | |a GBV_ILN_4125 | ||
912 | |a GBV_ILN_4126 | ||
912 | |a GBV_ILN_4249 | ||
912 | |a GBV_ILN_4305 | ||
912 | |a GBV_ILN_4306 | ||
912 | |a GBV_ILN_4307 | ||
912 | |a GBV_ILN_4313 | ||
912 | |a GBV_ILN_4322 | ||
912 | |a GBV_ILN_4323 | ||
912 | |a GBV_ILN_4324 | ||
912 | |a GBV_ILN_4325 | ||
912 | |a GBV_ILN_4335 | ||
912 | |a GBV_ILN_4338 | ||
912 | |a GBV_ILN_4367 | ||
912 | |a GBV_ILN_4700 | ||
951 | |a AR | ||
952 | |d 7 |j 2019 |h 83484-83494 |
callnumber-first |
T - Technology |
author |
Jiang Qian |
spellingShingle |
Jiang Qian misc TK1-9971 misc Image fusion misc multi-focus and multi-sensor misc multi-scale decomposition misc saliency map misc PCNN misc Electrical engineering. Electronics. Nuclear engineering Image Fusion Method Based on Structure-Based Saliency Map and FDST-PCNN Framework |
authorStr |
Jiang Qian |
ppnlink_with_tag_str_mv |
@@773@@(DE-627)728440385 |
format |
electronic Article |
delete_txt_mv |
keep |
author_role |
aut aut aut aut aut |
collection |
DOAJ |
remote_str |
true |
callnumber-label |
TK1-9971 |
illustrated |
Not Illustrated |
issn |
21693536 |
topic_title |
TK1-9971 Image Fusion Method Based on Structure-Based Saliency Map and FDST-PCNN Framework Image fusion multi-focus and multi-sensor multi-scale decomposition saliency map PCNN |
topic |
misc TK1-9971 misc Image fusion misc multi-focus and multi-sensor misc multi-scale decomposition misc saliency map misc PCNN misc Electrical engineering. Electronics. Nuclear engineering |
topic_unstemmed |
misc TK1-9971 misc Image fusion misc multi-focus and multi-sensor misc multi-scale decomposition misc saliency map misc PCNN misc Electrical engineering. Electronics. Nuclear engineering |
topic_browse |
misc TK1-9971 misc Image fusion misc multi-focus and multi-sensor misc multi-scale decomposition misc saliency map misc PCNN misc Electrical engineering. Electronics. Nuclear engineering |
format_facet |
Elektronische Aufsätze Aufsätze Elektronische Ressource |
format_main_str_mv |
Text Zeitschrift/Artikel |
carriertype_str_mv |
cr |
hierarchy_parent_title |
IEEE Access |
hierarchy_parent_id |
728440385 |
hierarchy_top_title |
IEEE Access |
isfreeaccess_txt |
true |
familylinks_str_mv |
(DE-627)728440385 (DE-600)2687964-5 |
title |
Image Fusion Method Based on Structure-Based Saliency Map and FDST-PCNN Framework |
ctrlnum |
(DE-627)DOAJ084943432 (DE-599)DOAJ909afd9280e9441cad41b48614a8cca1 |
title_full |
Image Fusion Method Based on Structure-Based Saliency Map and FDST-PCNN Framework |
author_sort |
Jiang Qian |
journal |
IEEE Access |
journalStr |
IEEE Access |
callnumber-first-code |
T |
lang_code |
eng |
isOA_bool |
true |
recordtype |
marc |
publishDateSort |
2019 |
contenttype_str_mv |
txt |
container_start_page |
83484 |
author_browse |
Jiang Qian Liu Yadong Dai Jindun Fu Xiaofei Jiang Xiuchen |
container_volume |
7 |
class |
TK1-9971 |
format_se |
Elektronische Aufsätze |
author-letter |
Jiang Qian |
doi_str_mv |
10.1109/ACCESS.2019.2924033 |
author2-role |
verfasserin |
title_sort |
image fusion method based on structure-based saliency map and fdst-pcnn framework |
callnumber |
TK1-9971 |
title_auth |
Image Fusion Method Based on Structure-Based Saliency Map and FDST-PCNN Framework |
abstract |
Image fusion has become an active and promising research topic in image processing. It provides an effective way to combine several source images into a composite image with more detailed information than any one of the source images. An FDST-PCNN framework, which integrates the finite discrete shearlet transform (FDST) with a pulse-coupled neural network (PCNN), is proposed to enhance fusion effects. We first propose a structure-based saliency (SBS) map to enhance the clear and important features in an image. The SBS map combines depth information with saliency information and can serve as a good representation of the most essential information in the source images. After multi-scale decomposition by the FDST, the SBS map of the source images and the modified spatial frequency of the subbands are both used to tune the PCNN neuron response and determine the fused coefficients in each subband. Experimental results on multi-focus and multi-sensor images verify the effectiveness of the proposed fusion method. Compared with other PCNN-based fusion methods, the proposed method achieves significant improvements in preserving detailed edge information and in overall visual performance.
abstractGer |
Image fusion has become an active and promising research topic in image processing. It provides an effective way to combine several source images into a composite image with more detailed information than any one of the source images. An FDST-PCNN framework, which integrates the finite discrete shearlet transform (FDST) with a pulse-coupled neural network (PCNN), is proposed to enhance fusion effects. We first propose a structure-based saliency (SBS) map to enhance the clear and important features in an image. The SBS map combines depth information with saliency information and can serve as a good representation of the most essential information in the source images. After multi-scale decomposition by the FDST, the SBS map of the source images and the modified spatial frequency of the subbands are both used to tune the PCNN neuron response and determine the fused coefficients in each subband. Experimental results on multi-focus and multi-sensor images verify the effectiveness of the proposed fusion method. Compared with other PCNN-based fusion methods, the proposed method achieves significant improvements in preserving detailed edge information and in overall visual performance.
abstract_unstemmed |
Image fusion has become an active and promising research topic in image processing. It provides an effective way to combine several source images into a composite image with more detailed information than any one of the source images. An FDST-PCNN framework, which integrates the finite discrete shearlet transform (FDST) with a pulse-coupled neural network (PCNN), is proposed to enhance fusion effects. We first propose a structure-based saliency (SBS) map to enhance the clear and important features in an image. The SBS map combines depth information with saliency information and can serve as a good representation of the most essential information in the source images. After multi-scale decomposition by the FDST, the SBS map of the source images and the modified spatial frequency of the subbands are both used to tune the PCNN neuron response and determine the fused coefficients in each subband. Experimental results on multi-focus and multi-sensor images verify the effectiveness of the proposed fusion method. Compared with other PCNN-based fusion methods, the proposed method achieves significant improvements in preserving detailed edge information and in overall visual performance.
collection_details |
GBV_USEFLAG_A SYSFLAG_A GBV_DOAJ GBV_ILN_11 GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_31 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_65 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_95 GBV_ILN_105 GBV_ILN_110 GBV_ILN_151 GBV_ILN_161 GBV_ILN_170 GBV_ILN_213 GBV_ILN_230 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_2014 GBV_ILN_4012 GBV_ILN_4037 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4126 GBV_ILN_4249 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4335 GBV_ILN_4338 GBV_ILN_4367 GBV_ILN_4700 |
title_short |
Image Fusion Method Based on Structure-Based Saliency Map and FDST-PCNN Framework |
url |
https://doi.org/10.1109/ACCESS.2019.2924033 https://doaj.org/article/909afd9280e9441cad41b48614a8cca1 https://ieeexplore.ieee.org/document/8742596/ https://doaj.org/toc/2169-3536 |
remote_bool |
true |
author2 |
Liu Yadong Dai Jindun Fu Xiaofei Jiang Xiuchen |
author2Str |
Liu Yadong Dai Jindun Fu Xiaofei Jiang Xiuchen |
ppnlink |
728440385 |
callnumber-subject |
TK - Electrical and Nuclear Engineering |
mediatype_str_mv |
c |
isOA_txt |
true |
hochschulschrift_bool |
false |
doi_str |
10.1109/ACCESS.2019.2924033 |
callnumber-a |
TK1-9971 |
up_date |
2024-07-04T01:12:22.185Z |
_version_ |
1803608960209518592 |
fullrecord_marcxml |
<?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>01000naa a22002652 4500</leader><controlfield tag="001">DOAJ084943432</controlfield><controlfield tag="003">DE-627</controlfield><controlfield tag="005">20230311032610.0</controlfield><controlfield tag="007">cr uuu---uuuuu</controlfield><controlfield tag="008">230311s2019 xx |||||o 00| ||eng c</controlfield><datafield tag="024" ind1="7" ind2=" "><subfield code="a">10.1109/ACCESS.2019.2924033</subfield><subfield code="2">doi</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-627)DOAJ084943432</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-599)DOAJ909afd9280e9441cad41b48614a8cca1</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-627</subfield><subfield code="b">ger</subfield><subfield code="c">DE-627</subfield><subfield code="e">rakwb</subfield></datafield><datafield tag="041" ind1=" " ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="050" ind1=" " ind2="0"><subfield code="a">TK1-9971</subfield></datafield><datafield tag="100" ind1="0" ind2=" "><subfield code="a">Jiang Qian</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Image Fusion Method Based on Structure-Based Saliency Map and FDST-PCNN Framework</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="c">2019</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="a">Text</subfield><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="a">Computermedien</subfield><subfield code="b">c</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="a">Online-Ressource</subfield><subfield 
code="b">cr</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield code="a">Image fusion has become an active and promising research topic in image processing. It provides an effective way to combine several source images into a composite image with more detailed information than any one of the source images. An FDST-PCNN framework, which integrates the finite discrete shearlet transform (FDST) with a pulse-coupled neural network (PCNN), is proposed to enhance fusion effects. We first propose a structure-based saliency (SBS) map to enhance the clear and important features in an image. The SBS map combines depth information with saliency information and can serve as a good representation of the most essential information in the source images. After multi-scale decomposition by the FDST, the SBS map of the source images and the modified spatial frequency of the subbands are both used to tune the PCNN neuron response and determine the fused coefficients in each subband. Experimental results on multi-focus and multi-sensor images verify the effectiveness of the proposed fusion method. Compared with other PCNN-based fusion methods, the proposed method achieves significant improvements in preserving detailed edge information and in overall visual performance.</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Image fusion</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">multi-focus and multi-sensor</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">multi-scale decomposition</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">saliency map</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">PCNN</subfield></datafield><datafield tag="653" ind1=" " ind2="0"><subfield code="a">Electrical engineering. Electronics. 
Nuclear engineering</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Liu Yadong</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Dai Jindun</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Fu Xiaofei</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="0" ind2=" "><subfield code="a">Jiang Xiuchen</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="773" ind1="0" ind2="8"><subfield code="i">In</subfield><subfield code="t">IEEE Access</subfield><subfield code="d">IEEE, 2014</subfield><subfield code="g">7(2019), Seite 83484-83494</subfield><subfield code="w">(DE-627)728440385</subfield><subfield code="w">(DE-600)2687964-5</subfield><subfield code="x">21693536</subfield><subfield code="7">nnns</subfield></datafield><datafield tag="773" ind1="1" ind2="8"><subfield code="g">volume:7</subfield><subfield code="g">year:2019</subfield><subfield code="g">pages:83484-83494</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://doi.org/10.1109/ACCESS.2019.2924033</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://doaj.org/article/909afd9280e9441cad41b48614a8cca1</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://ieeexplore.ieee.org/document/8742596/</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="856" ind1="4" ind2="2"><subfield code="u">https://doaj.org/toc/2169-3536</subfield><subfield code="y">Journal toc</subfield><subfield code="z">kostenfrei</subfield></datafield><datafield tag="912" ind1=" " 
ind2=" "><subfield code="a">GBV_USEFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SYSFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_DOAJ</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_11</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_20</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_22</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_23</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_24</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_31</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_39</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_40</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_60</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_62</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_63</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_65</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_69</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_70</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_73</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_95</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_105</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_110</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_151</subfield></datafield><datafield tag="912" 
ind1=" " ind2=" "><subfield code="a">GBV_ILN_161</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_170</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_213</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_230</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_285</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_293</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_370</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_602</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2014</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4012</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4037</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4112</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4125</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4126</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4249</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4305</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4306</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4307</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4313</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4322</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4323</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_4324</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4325</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4335</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4338</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4367</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4700</subfield></datafield><datafield tag="951" ind1=" " ind2=" "><subfield code="a">AR</subfield></datafield><datafield tag="952" ind1=" " ind2=" "><subfield code="d">7</subfield><subfield code="j">2019</subfield><subfield code="h">83484-83494</subfield></datafield></record></collection>
|
score |
7.400833 |