Recognition of a real-time signer-independent static Farsi sign language based on fourier coefficients amplitude
Abstract A sign language is a language that uses manual communication and body language to convey meaning, as opposed to acoustically conveyed sound patterns. This can involve simultaneously combining hand shapes, orientation and movement of the hands, arms or body, and facial expressions to fluidly express a speaker's thoughts.
Detailed description

Author: Zare, Ali Asghar [author]
Format: E-Article
Language: English
Published: 2016
Subject headings: Wrist cropping algorithm ; Static gesture recognition ; Hand detection ; Skin segmentation ; Cumulative angular function ; Neural network
Note: © Springer-Verlag Berlin Heidelberg 2016
Parent work: Contained in: International journal of machine learning and cybernetics - Heidelberg : Springer, 2010, 9(2016), 5, dated 29 Sept., pages 727-741
Parent work: volume:9 ; year:2016 ; number:5 ; day:29 ; month:09 ; pages:727-741
Links:
DOI / URN: 10.1007/s13042-016-0602-3
Catalog ID: SPR029604907
LEADER | 01000caa a22002652 4500 | ||
001 | SPR029604907 | ||
003 | DE-627 | ||
005 | 20230331104715.0 | ||
007 | cr uuu---uuuuu | ||
008 | 201007s2016 xx |||||o 00| ||eng c | ||
024 | 7 | |a 10.1007/s13042-016-0602-3 |2 doi | |
035 | |a (DE-627)SPR029604907 | ||
035 | |a (SPR)s13042-016-0602-3-e | ||
040 | |a DE-627 |b ger |c DE-627 |e rakwb | ||
041 | |a eng | ||
100 | 1 | |a Zare, Ali Asghar |e verfasserin |4 aut | |
245 | 1 | 0 | |a Recognition of a real-time signer-independent static Farsi sign language based on fourier coefficients amplitude |
264 | 1 | |c 2016 | |
336 | |a Text |b txt |2 rdacontent | ||
337 | |a Computermedien |b c |2 rdamedia | ||
338 | |a Online-Ressource |b cr |2 rdacarrier | ||
500 | |a © Springer-Verlag Berlin Heidelberg 2016 | ||
520 | |a Abstract A sign language is a language that uses manual communication and body language to convey meaning, as opposed to acoustically conveyed sound patterns. This can involve simultaneously combining hand shapes, orientation and movement of the hands, arms or body, and facial expressions to fluidly express a speaker's thoughts. Sign language is the primary means of communication for individuals with hearing and speech impairments. Given that more than a hundred million people around the world suffer from hearing loss and impaired speech, the need for an automatic sign language interpreter serving as an interface between deaf and hearing people is strongly felt. Accordingly, in this article we aimed to design such a computer vision-based translation interface. Farsi sign language recognition is one of the most challenging fields of study because of factors such as the wide range of similar gestures, hand orientations, complicated backgrounds, and ambient light variation. The proposed Farsi sign language recognition system is based on computer vision, is capable of real-time gesture processing, and is independent of the signer. Furthermore, the signer needs no glove or marker. After segmenting the hand from the video frames, the proposed algorithm extracts the boundary of the dominant hand and computes the normalized cumulative angle to represent that boundary, making the features invariant to translation and scale change at this stage. 
Afterward, the amplitudes of the Fourier coefficients are extracted as the preferred features in the frequency domain, which adds invariance to rotation. The frequency features are then fed to the inputs of a feed-forward multilayer perceptron neural network for gesture recognition. The proposed method makes the recognition system independent of the signer and robust to changes in the signer's distance from the camera, using features with strong invariance to translation, scale change, and rotation. Data classification is carried out by three classifiers: Bayes, K-NN, and a neural network, and their performance is compared. The training set comprised 250 samples: 10 gestures in 5 positions and orientations, performed by 5 individuals. Recognition results showed an outstanding recognition rate for the system. | ||
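The feature-extraction step the abstract describes (hand contour, centroid removal, Fourier transform, coefficient amplitudes) can be sketched as follows. This is a minimal illustration of the general Fourier-descriptor technique, not the authors' implementation; the function name, the number of coefficients kept, and the normalization choice are assumptions.

```python
import numpy as np

def fourier_amplitude_features(contour, n_coeffs=16):
    """Translation-, scale-, and rotation-invariant shape features from a
    closed boundary.  `contour` is an (N, 2) array of (x, y) boundary
    points; N must exceed n_coeffs.  Illustrative sketch only."""
    contour = np.asarray(contour, dtype=float)
    # Represent the boundary as a complex signal z = x + iy.
    z = contour[:, 0] + 1j * contour[:, 1]
    # Subtracting the centroid removes dependence on position (translation).
    z = z - z.mean()
    # Discrete Fourier transform of the boundary signal.
    coeffs = np.fft.fft(z)
    # Keeping only the amplitudes |F_k| discards phase, which removes
    # dependence on rotation and on the contour's starting point.
    amps = np.abs(coeffs)
    # Dividing by the first harmonic's amplitude removes scale dependence
    # (assumes a non-degenerate contour, so amps[1] != 0).
    return amps[1:n_coeffs + 1] / amps[1]
```

The resulting fixed-length vector is the kind of input the abstract feeds to the Bayes, K-NN, and multilayer perceptron classifiers: rotating, scaling, or shifting the hand contour leaves the feature vector unchanged.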
650 | 4 | |a Wrist cropping algorithm |7 (dpeaa)DE-He213 | |
650 | 4 | |a Static gesture recognition |7 (dpeaa)DE-He213 | |
650 | 4 | |a Hand detection |7 (dpeaa)DE-He213 | |
650 | 4 | |a Skin segmentation |7 (dpeaa)DE-He213 | |
650 | 4 | |a Cumulative angular function |7 (dpeaa)DE-He213 | |
650 | 4 | |a Neural network |7 (dpeaa)DE-He213 | |
700 | 1 | |a Zahiri, Seyed Hamid |4 aut | |
773 | 0 | 8 | |i Enthalten in |t International journal of machine learning and cybernetics |d Heidelberg : Springer, 2010 |g 9(2016), 5 vom: 29. Sept., Seite 727-741 |w (DE-627)635135132 |w (DE-600)2572473-3 |x 1868-808X |7 nnns |
773 | 1 | 8 | |g volume:9 |g year:2016 |g number:5 |g day:29 |g month:09 |g pages:727-741 |
856 | 4 | 0 | |u https://dx.doi.org/10.1007/s13042-016-0602-3 |z lizenzpflichtig |3 Volltext |
912 | |a GBV_USEFLAG_A | ||
912 | |a SYSFLAG_A | ||
912 | |a GBV_SPRINGER | ||
912 | |a GBV_ILN_11 | ||
912 | |a GBV_ILN_20 | ||
912 | |a GBV_ILN_22 | ||
912 | |a GBV_ILN_23 | ||
912 | |a GBV_ILN_24 | ||
912 | |a GBV_ILN_31 | ||
912 | |a GBV_ILN_32 | ||
912 | |a GBV_ILN_39 | ||
912 | |a GBV_ILN_40 | ||
912 | |a GBV_ILN_60 | ||
912 | |a GBV_ILN_62 | ||
912 | |a GBV_ILN_63 | ||
912 | |a GBV_ILN_65 | ||
912 | |a GBV_ILN_69 | ||
912 | |a GBV_ILN_70 | ||
912 | |a GBV_ILN_73 | ||
912 | |a GBV_ILN_74 | ||
912 | |a GBV_ILN_90 | ||
912 | |a GBV_ILN_95 | ||
912 | |a GBV_ILN_100 | ||
912 | |a GBV_ILN_101 | ||
912 | |a GBV_ILN_105 | ||
912 | |a GBV_ILN_110 | ||
912 | |a GBV_ILN_120 | ||
912 | |a GBV_ILN_138 | ||
912 | |a GBV_ILN_150 | ||
912 | |a GBV_ILN_151 | ||
912 | |a GBV_ILN_161 | ||
912 | |a GBV_ILN_170 | ||
912 | |a GBV_ILN_171 | ||
912 | |a GBV_ILN_187 | ||
912 | |a GBV_ILN_213 | ||
912 | |a GBV_ILN_224 | ||
912 | |a GBV_ILN_230 | ||
912 | |a GBV_ILN_250 | ||
912 | |a GBV_ILN_281 | ||
912 | |a GBV_ILN_285 | ||
912 | |a GBV_ILN_293 | ||
912 | |a GBV_ILN_370 | ||
912 | |a GBV_ILN_602 | ||
912 | |a GBV_ILN_636 | ||
912 | |a GBV_ILN_702 | ||
912 | |a GBV_ILN_2001 | ||
912 | |a GBV_ILN_2003 | ||
912 | |a GBV_ILN_2004 | ||
912 | |a GBV_ILN_2005 | ||
912 | |a GBV_ILN_2006 | ||
912 | |a GBV_ILN_2007 | ||
912 | |a GBV_ILN_2008 | ||
912 | |a GBV_ILN_2009 | ||
912 | |a GBV_ILN_2010 | ||
912 | |a GBV_ILN_2011 | ||
912 | |a GBV_ILN_2014 | ||
912 | |a GBV_ILN_2015 | ||
912 | |a GBV_ILN_2020 | ||
912 | |a GBV_ILN_2021 | ||
912 | |a GBV_ILN_2025 | ||
912 | |a GBV_ILN_2026 | ||
912 | |a GBV_ILN_2027 | ||
912 | |a GBV_ILN_2031 | ||
912 | |a GBV_ILN_2034 | ||
912 | |a GBV_ILN_2037 | ||
912 | |a GBV_ILN_2038 | ||
912 | |a GBV_ILN_2039 | ||
912 | |a GBV_ILN_2044 | ||
912 | |a GBV_ILN_2048 | ||
912 | |a GBV_ILN_2049 | ||
912 | |a GBV_ILN_2050 | ||
912 | |a GBV_ILN_2055 | ||
912 | |a GBV_ILN_2057 | ||
912 | |a GBV_ILN_2059 | ||
912 | |a GBV_ILN_2061 | ||
912 | |a GBV_ILN_2064 | ||
912 | |a GBV_ILN_2065 | ||
912 | |a GBV_ILN_2068 | ||
912 | |a GBV_ILN_2070 | ||
912 | |a GBV_ILN_2086 | ||
912 | |a GBV_ILN_2088 | ||
912 | |a GBV_ILN_2093 | ||
912 | |a GBV_ILN_2106 | ||
912 | |a GBV_ILN_2107 | ||
912 | |a GBV_ILN_2108 | ||
912 | |a GBV_ILN_2110 | ||
912 | |a GBV_ILN_2111 | ||
912 | |a GBV_ILN_2112 | ||
912 | |a GBV_ILN_2113 | ||
912 | |a GBV_ILN_2116 | ||
912 | |a GBV_ILN_2118 | ||
912 | |a GBV_ILN_2119 | ||
912 | |a GBV_ILN_2122 | ||
912 | |a GBV_ILN_2129 | ||
912 | |a GBV_ILN_2143 | ||
912 | |a GBV_ILN_2144 | ||
912 | |a GBV_ILN_2147 | ||
912 | |a GBV_ILN_2148 | ||
912 | |a GBV_ILN_2152 | ||
912 | |a GBV_ILN_2153 | ||
912 | |a GBV_ILN_2188 | ||
912 | |a GBV_ILN_2190 | ||
912 | |a GBV_ILN_2232 | ||
912 | |a GBV_ILN_2336 | ||
912 | |a GBV_ILN_2446 | ||
912 | |a GBV_ILN_2470 | ||
912 | |a GBV_ILN_2472 | ||
912 | |a GBV_ILN_2507 | ||
912 | |a GBV_ILN_2522 | ||
912 | |a GBV_ILN_2548 | ||
912 | |a GBV_ILN_4035 | ||
912 | |a GBV_ILN_4037 | ||
912 | |a GBV_ILN_4046 | ||
912 | |a GBV_ILN_4112 | ||
912 | |a GBV_ILN_4125 | ||
912 | |a GBV_ILN_4242 | ||
912 | |a GBV_ILN_4246 | ||
912 | |a GBV_ILN_4249 | ||
912 | |a GBV_ILN_4251 | ||
912 | |a GBV_ILN_4305 | ||
912 | |a GBV_ILN_4306 | ||
912 | |a GBV_ILN_4307 | ||
912 | |a GBV_ILN_4313 | ||
912 | |a GBV_ILN_4322 | ||
912 | |a GBV_ILN_4323 | ||
912 | |a GBV_ILN_4324 | ||
912 | |a GBV_ILN_4325 | ||
912 | |a GBV_ILN_4326 | ||
912 | |a GBV_ILN_4333 | ||
912 | |a GBV_ILN_4334 | ||
912 | |a GBV_ILN_4335 | ||
912 | |a GBV_ILN_4336 | ||
912 | |a GBV_ILN_4338 | ||
912 | |a GBV_ILN_4393 | ||
912 | |a GBV_ILN_4700 | ||
951 | |a AR | ||
952 | |d 9 |j 2016 |e 5 |b 29 |c 09 |h 727-741 |
allfields |
10.1007/s13042-016-0602-3 doi (DE-627)SPR029604907 (SPR)s13042-016-0602-3-e DE-627 ger DE-627 rakwb eng Zare, Ali Asghar verfasserin aut Recognition of a real-time signer-independent static Farsi sign language based on fourier coefficients amplitude 2016 Text txt rdacontent Computermedien c rdamedia Online-Ressource cr rdacarrier © Springer-Verlag Berlin Heidelberg 2016 Abstract A sign language is a language which uses manual communication and body language to convey meaning, as opposed to acoustically conveyed sound patterns. This can involve simultaneously combining hand shapes, orientation and movement of the hands, arms or body, and facial expressions to fluidly express a speaker’s thoughts. Sign language is a preliminary communication way for individuals with hearing and speech problems. Considering that more than a hundred million people all around the world are annoyed by hearing loss and impaired speech, it is needed to design a system for automatic sign language interpreter as an interface between deaf-and-dumb and ordinary people can feel it strongly. Given this, in this article we aimed to design such a computer vision-based translation interface. Farsi sign language recognition is one of the most challenging fields of study is given because of some features such as the wide range of similar gestures, hands orientation, complicated background, and ambient light variation. A Farsi sign language recognition system is based on computer vision which is capable of real-time gesture processing and is independent of the signer. Furthermore, there is no need to use glove or marker by the signer in the proposed system. After hand segmentation from video frames, the proposed algorithm extracts boundary of the dominant hand to compute the normalized accumulation angle and represents the boundary, so that the invariance to transition and scale change of the features is realized at this stage. 
Afterward, Fourier coefficients amplitude is extracted as preferred features at the frequency domain, while invariance to rotation of the features is added at this point. Then the frequency features, as extracted features for gesture recognition, are applied to inputs of feed-forward multilayer perception neural network. The proposed method is presented to make recognition system independent of the signer and retrofit it against signer’s distance changes from camera using features of powerful invariant extraction against transition, scale change, and rotation. Data classification is carried out by three classifiers including Bayes, K-NN, and neural network. Performance of the classifiers was also compared. Training set of gestures comprised 250 samples for 10 gestures and 5 positions and orientations that were performed by 5 individuals. Recognition results showed an outstanding recognition rate of the system. Wrist cropping algorithm (dpeaa)DE-He213 Static gesture recognition (dpeaa)DE-He213 Hand detection (dpeaa)DE-He213 Skin segmentation (dpeaa)DE-He213 Cumulative angular function (dpeaa)DE-He213 Neural network (dpeaa)DE-He213 Zahiri, Seyed Hamid aut Enthalten in International journal of machine learning and cybernetics Heidelberg : Springer, 2010 9(2016), 5 vom: 29. 
Sept., Seite 727-741 (DE-627)635135132 (DE-600)2572473-3 1868-808X nnns volume:9 year:2016 number:5 day:29 month:09 pages:727-741 https://dx.doi.org/10.1007/s13042-016-0602-3 lizenzpflichtig Volltext GBV_USEFLAG_A SYSFLAG_A GBV_SPRINGER GBV_ILN_11 GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_31 GBV_ILN_32 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_65 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_74 GBV_ILN_90 GBV_ILN_95 GBV_ILN_100 GBV_ILN_101 GBV_ILN_105 GBV_ILN_110 GBV_ILN_120 GBV_ILN_138 GBV_ILN_150 GBV_ILN_151 GBV_ILN_161 GBV_ILN_170 GBV_ILN_171 GBV_ILN_187 GBV_ILN_213 GBV_ILN_224 GBV_ILN_230 GBV_ILN_250 GBV_ILN_281 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_636 GBV_ILN_702 GBV_ILN_2001 GBV_ILN_2003 GBV_ILN_2004 GBV_ILN_2005 GBV_ILN_2006 GBV_ILN_2007 GBV_ILN_2008 GBV_ILN_2009 GBV_ILN_2010 GBV_ILN_2011 GBV_ILN_2014 GBV_ILN_2015 GBV_ILN_2020 GBV_ILN_2021 GBV_ILN_2025 GBV_ILN_2026 GBV_ILN_2027 GBV_ILN_2031 GBV_ILN_2034 GBV_ILN_2037 GBV_ILN_2038 GBV_ILN_2039 GBV_ILN_2044 GBV_ILN_2048 GBV_ILN_2049 GBV_ILN_2050 GBV_ILN_2055 GBV_ILN_2057 GBV_ILN_2059 GBV_ILN_2061 GBV_ILN_2064 GBV_ILN_2065 GBV_ILN_2068 GBV_ILN_2070 GBV_ILN_2086 GBV_ILN_2088 GBV_ILN_2093 GBV_ILN_2106 GBV_ILN_2107 GBV_ILN_2108 GBV_ILN_2110 GBV_ILN_2111 GBV_ILN_2112 GBV_ILN_2113 GBV_ILN_2116 GBV_ILN_2118 GBV_ILN_2119 GBV_ILN_2122 GBV_ILN_2129 GBV_ILN_2143 GBV_ILN_2144 GBV_ILN_2147 GBV_ILN_2148 GBV_ILN_2152 GBV_ILN_2153 GBV_ILN_2188 GBV_ILN_2190 GBV_ILN_2232 GBV_ILN_2336 GBV_ILN_2446 GBV_ILN_2470 GBV_ILN_2472 GBV_ILN_2507 GBV_ILN_2522 GBV_ILN_2548 GBV_ILN_4035 GBV_ILN_4037 GBV_ILN_4046 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4242 GBV_ILN_4246 GBV_ILN_4249 GBV_ILN_4251 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4326 GBV_ILN_4333 GBV_ILN_4334 GBV_ILN_4335 GBV_ILN_4336 GBV_ILN_4338 GBV_ILN_4393 GBV_ILN_4700 AR 9 2016 5 29 09 727-741 |
spelling |
10.1007/s13042-016-0602-3 doi (DE-627)SPR029604907 (SPR)s13042-016-0602-3-e DE-627 ger DE-627 rakwb eng Zare, Ali Asghar verfasserin aut Recognition of a real-time signer-independent static Farsi sign language based on fourier coefficients amplitude 2016 Text txt rdacontent Computermedien c rdamedia Online-Ressource cr rdacarrier © Springer-Verlag Berlin Heidelberg 2016 Abstract A sign language is a language which uses manual communication and body language to convey meaning, as opposed to acoustically conveyed sound patterns. This can involve simultaneously combining hand shapes, orientation and movement of the hands, arms or body, and facial expressions to fluidly express a speaker’s thoughts. Sign language is a preliminary communication way for individuals with hearing and speech problems. Considering that more than a hundred million people all around the world are annoyed by hearing loss and impaired speech, it is needed to design a system for automatic sign language interpreter as an interface between deaf-and-dumb and ordinary people can feel it strongly. Given this, in this article we aimed to design such a computer vision-based translation interface. Farsi sign language recognition is one of the most challenging fields of study is given because of some features such as the wide range of similar gestures, hands orientation, complicated background, and ambient light variation. A Farsi sign language recognition system is based on computer vision which is capable of real-time gesture processing and is independent of the signer. Furthermore, there is no need to use glove or marker by the signer in the proposed system. After hand segmentation from video frames, the proposed algorithm extracts boundary of the dominant hand to compute the normalized accumulation angle and represents the boundary, so that the invariance to transition and scale change of the features is realized at this stage. 
Afterward, Fourier coefficients amplitude is extracted as preferred features at the frequency domain, while invariance to rotation of the features is added at this point. Then the frequency features, as extracted features for gesture recognition, are applied to inputs of feed-forward multilayer perception neural network. The proposed method is presented to make recognition system independent of the signer and retrofit it against signer’s distance changes from camera using features of powerful invariant extraction against transition, scale change, and rotation. Data classification is carried out by three classifiers including Bayes, K-NN, and neural network. Performance of the classifiers was also compared. Training set of gestures comprised 250 samples for 10 gestures and 5 positions and orientations that were performed by 5 individuals. Recognition results showed an outstanding recognition rate of the system. Wrist cropping algorithm (dpeaa)DE-He213 Static gesture recognition (dpeaa)DE-He213 Hand detection (dpeaa)DE-He213 Skin segmentation (dpeaa)DE-He213 Cumulative angular function (dpeaa)DE-He213 Neural network (dpeaa)DE-He213 Zahiri, Seyed Hamid aut Enthalten in International journal of machine learning and cybernetics Heidelberg : Springer, 2010 9(2016), 5 vom: 29. 
Sept., Seite 727-741 (DE-627)635135132 (DE-600)2572473-3 1868-808X nnns volume:9 year:2016 number:5 day:29 month:09 pages:727-741 https://dx.doi.org/10.1007/s13042-016-0602-3 lizenzpflichtig Volltext GBV_USEFLAG_A SYSFLAG_A GBV_SPRINGER GBV_ILN_11 GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_31 GBV_ILN_32 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_65 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_74 GBV_ILN_90 GBV_ILN_95 GBV_ILN_100 GBV_ILN_101 GBV_ILN_105 GBV_ILN_110 GBV_ILN_120 GBV_ILN_138 GBV_ILN_150 GBV_ILN_151 GBV_ILN_161 GBV_ILN_170 GBV_ILN_171 GBV_ILN_187 GBV_ILN_213 GBV_ILN_224 GBV_ILN_230 GBV_ILN_250 GBV_ILN_281 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_636 GBV_ILN_702 GBV_ILN_2001 GBV_ILN_2003 GBV_ILN_2004 GBV_ILN_2005 GBV_ILN_2006 GBV_ILN_2007 GBV_ILN_2008 GBV_ILN_2009 GBV_ILN_2010 GBV_ILN_2011 GBV_ILN_2014 GBV_ILN_2015 GBV_ILN_2020 GBV_ILN_2021 GBV_ILN_2025 GBV_ILN_2026 GBV_ILN_2027 GBV_ILN_2031 GBV_ILN_2034 GBV_ILN_2037 GBV_ILN_2038 GBV_ILN_2039 GBV_ILN_2044 GBV_ILN_2048 GBV_ILN_2049 GBV_ILN_2050 GBV_ILN_2055 GBV_ILN_2057 GBV_ILN_2059 GBV_ILN_2061 GBV_ILN_2064 GBV_ILN_2065 GBV_ILN_2068 GBV_ILN_2070 GBV_ILN_2086 GBV_ILN_2088 GBV_ILN_2093 GBV_ILN_2106 GBV_ILN_2107 GBV_ILN_2108 GBV_ILN_2110 GBV_ILN_2111 GBV_ILN_2112 GBV_ILN_2113 GBV_ILN_2116 GBV_ILN_2118 GBV_ILN_2119 GBV_ILN_2122 GBV_ILN_2129 GBV_ILN_2143 GBV_ILN_2144 GBV_ILN_2147 GBV_ILN_2148 GBV_ILN_2152 GBV_ILN_2153 GBV_ILN_2188 GBV_ILN_2190 GBV_ILN_2232 GBV_ILN_2336 GBV_ILN_2446 GBV_ILN_2470 GBV_ILN_2472 GBV_ILN_2507 GBV_ILN_2522 GBV_ILN_2548 GBV_ILN_4035 GBV_ILN_4037 GBV_ILN_4046 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4242 GBV_ILN_4246 GBV_ILN_4249 GBV_ILN_4251 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4326 GBV_ILN_4333 GBV_ILN_4334 GBV_ILN_4335 GBV_ILN_4336 GBV_ILN_4338 GBV_ILN_4393 GBV_ILN_4700 AR 9 2016 5 29 09 727-741 |
allfields_unstemmed |
10.1007/s13042-016-0602-3 doi (DE-627)SPR029604907 (SPR)s13042-016-0602-3-e DE-627 ger DE-627 rakwb eng Zare, Ali Asghar verfasserin aut Recognition of a real-time signer-independent static Farsi sign language based on fourier coefficients amplitude 2016 Text txt rdacontent Computermedien c rdamedia Online-Ressource cr rdacarrier © Springer-Verlag Berlin Heidelberg 2016 Abstract A sign language is a language which uses manual communication and body language to convey meaning, as opposed to acoustically conveyed sound patterns. This can involve simultaneously combining hand shapes, orientation and movement of the hands, arms or body, and facial expressions to fluidly express a speaker’s thoughts. Sign language is a preliminary communication way for individuals with hearing and speech problems. Considering that more than a hundred million people all around the world are annoyed by hearing loss and impaired speech, it is needed to design a system for automatic sign language interpreter as an interface between deaf-and-dumb and ordinary people can feel it strongly. Given this, in this article we aimed to design such a computer vision-based translation interface. Farsi sign language recognition is one of the most challenging fields of study is given because of some features such as the wide range of similar gestures, hands orientation, complicated background, and ambient light variation. A Farsi sign language recognition system is based on computer vision which is capable of real-time gesture processing and is independent of the signer. Furthermore, there is no need to use glove or marker by the signer in the proposed system. After hand segmentation from video frames, the proposed algorithm extracts boundary of the dominant hand to compute the normalized accumulation angle and represents the boundary, so that the invariance to transition and scale change of the features is realized at this stage. 
Afterward, Fourier coefficients amplitude is extracted as preferred features at the frequency domain, while invariance to rotation of the features is added at this point. Then the frequency features, as extracted features for gesture recognition, are applied to inputs of feed-forward multilayer perception neural network. The proposed method is presented to make recognition system independent of the signer and retrofit it against signer’s distance changes from camera using features of powerful invariant extraction against transition, scale change, and rotation. Data classification is carried out by three classifiers including Bayes, K-NN, and neural network. Performance of the classifiers was also compared. Training set of gestures comprised 250 samples for 10 gestures and 5 positions and orientations that were performed by 5 individuals. Recognition results showed an outstanding recognition rate of the system. Wrist cropping algorithm (dpeaa)DE-He213 Static gesture recognition (dpeaa)DE-He213 Hand detection (dpeaa)DE-He213 Skin segmentation (dpeaa)DE-He213 Cumulative angular function (dpeaa)DE-He213 Neural network (dpeaa)DE-He213 Zahiri, Seyed Hamid aut Enthalten in International journal of machine learning and cybernetics Heidelberg : Springer, 2010 9(2016), 5 vom: 29. 
Sept., Seite 727-741 (DE-627)635135132 (DE-600)2572473-3 1868-808X nnns volume:9 year:2016 number:5 day:29 month:09 pages:727-741 https://dx.doi.org/10.1007/s13042-016-0602-3 lizenzpflichtig Volltext GBV_USEFLAG_A SYSFLAG_A GBV_SPRINGER GBV_ILN_11 GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_31 GBV_ILN_32 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_65 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_74 GBV_ILN_90 GBV_ILN_95 GBV_ILN_100 GBV_ILN_101 GBV_ILN_105 GBV_ILN_110 GBV_ILN_120 GBV_ILN_138 GBV_ILN_150 GBV_ILN_151 GBV_ILN_161 GBV_ILN_170 GBV_ILN_171 GBV_ILN_187 GBV_ILN_213 GBV_ILN_224 GBV_ILN_230 GBV_ILN_250 GBV_ILN_281 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_636 GBV_ILN_702 GBV_ILN_2001 GBV_ILN_2003 GBV_ILN_2004 GBV_ILN_2005 GBV_ILN_2006 GBV_ILN_2007 GBV_ILN_2008 GBV_ILN_2009 GBV_ILN_2010 GBV_ILN_2011 GBV_ILN_2014 GBV_ILN_2015 GBV_ILN_2020 GBV_ILN_2021 GBV_ILN_2025 GBV_ILN_2026 GBV_ILN_2027 GBV_ILN_2031 GBV_ILN_2034 GBV_ILN_2037 GBV_ILN_2038 GBV_ILN_2039 GBV_ILN_2044 GBV_ILN_2048 GBV_ILN_2049 GBV_ILN_2050 GBV_ILN_2055 GBV_ILN_2057 GBV_ILN_2059 GBV_ILN_2061 GBV_ILN_2064 GBV_ILN_2065 GBV_ILN_2068 GBV_ILN_2070 GBV_ILN_2086 GBV_ILN_2088 GBV_ILN_2093 GBV_ILN_2106 GBV_ILN_2107 GBV_ILN_2108 GBV_ILN_2110 GBV_ILN_2111 GBV_ILN_2112 GBV_ILN_2113 GBV_ILN_2116 GBV_ILN_2118 GBV_ILN_2119 GBV_ILN_2122 GBV_ILN_2129 GBV_ILN_2143 GBV_ILN_2144 GBV_ILN_2147 GBV_ILN_2148 GBV_ILN_2152 GBV_ILN_2153 GBV_ILN_2188 GBV_ILN_2190 GBV_ILN_2232 GBV_ILN_2336 GBV_ILN_2446 GBV_ILN_2470 GBV_ILN_2472 GBV_ILN_2507 GBV_ILN_2522 GBV_ILN_2548 GBV_ILN_4035 GBV_ILN_4037 GBV_ILN_4046 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4242 GBV_ILN_4246 GBV_ILN_4249 GBV_ILN_4251 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4326 GBV_ILN_4333 GBV_ILN_4334 GBV_ILN_4335 GBV_ILN_4336 GBV_ILN_4338 GBV_ILN_4393 GBV_ILN_4700 AR 9 2016 5 29 09 727-741 |
allfieldsGer |
10.1007/s13042-016-0602-3 doi (DE-627)SPR029604907 (SPR)s13042-016-0602-3-e DE-627 ger DE-627 rakwb eng Zare, Ali Asghar verfasserin aut Recognition of a real-time signer-independent static Farsi sign language based on fourier coefficients amplitude 2016 Text txt rdacontent Computermedien c rdamedia Online-Ressource cr rdacarrier © Springer-Verlag Berlin Heidelberg 2016 Abstract A sign language is a language which uses manual communication and body language to convey meaning, as opposed to acoustically conveyed sound patterns. This can involve simultaneously combining hand shapes, orientation and movement of the hands, arms or body, and facial expressions to fluidly express a speaker’s thoughts. Sign language is a preliminary communication way for individuals with hearing and speech problems. Considering that more than a hundred million people all around the world are annoyed by hearing loss and impaired speech, it is needed to design a system for automatic sign language interpreter as an interface between deaf-and-dumb and ordinary people can feel it strongly. Given this, in this article we aimed to design such a computer vision-based translation interface. Farsi sign language recognition is one of the most challenging fields of study is given because of some features such as the wide range of similar gestures, hands orientation, complicated background, and ambient light variation. A Farsi sign language recognition system is based on computer vision which is capable of real-time gesture processing and is independent of the signer. Furthermore, there is no need to use glove or marker by the signer in the proposed system. After hand segmentation from video frames, the proposed algorithm extracts boundary of the dominant hand to compute the normalized accumulation angle and represents the boundary, so that the invariance to transition and scale change of the features is realized at this stage. 
Afterward, Fourier coefficients amplitude is extracted as preferred features at the frequency domain, while invariance to rotation of the features is added at this point. Then the frequency features, as extracted features for gesture recognition, are applied to inputs of feed-forward multilayer perception neural network. The proposed method is presented to make recognition system independent of the signer and retrofit it against signer’s distance changes from camera using features of powerful invariant extraction against transition, scale change, and rotation. Data classification is carried out by three classifiers including Bayes, K-NN, and neural network. Performance of the classifiers was also compared. Training set of gestures comprised 250 samples for 10 gestures and 5 positions and orientations that were performed by 5 individuals. Recognition results showed an outstanding recognition rate of the system. Wrist cropping algorithm (dpeaa)DE-He213 Static gesture recognition (dpeaa)DE-He213 Hand detection (dpeaa)DE-He213 Skin segmentation (dpeaa)DE-He213 Cumulative angular function (dpeaa)DE-He213 Neural network (dpeaa)DE-He213 Zahiri, Seyed Hamid aut Enthalten in International journal of machine learning and cybernetics Heidelberg : Springer, 2010 9(2016), 5 vom: 29. 
Sept., Seite 727-741 (DE-627)635135132 (DE-600)2572473-3 1868-808X nnns volume:9 year:2016 number:5 day:29 month:09 pages:727-741 https://dx.doi.org/10.1007/s13042-016-0602-3 lizenzpflichtig Volltext GBV_USEFLAG_A SYSFLAG_A GBV_SPRINGER GBV_ILN_11 GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_31 GBV_ILN_32 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_65 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_74 GBV_ILN_90 GBV_ILN_95 GBV_ILN_100 GBV_ILN_101 GBV_ILN_105 GBV_ILN_110 GBV_ILN_120 GBV_ILN_138 GBV_ILN_150 GBV_ILN_151 GBV_ILN_161 GBV_ILN_170 GBV_ILN_171 GBV_ILN_187 GBV_ILN_213 GBV_ILN_224 GBV_ILN_230 GBV_ILN_250 GBV_ILN_281 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_636 GBV_ILN_702 GBV_ILN_2001 GBV_ILN_2003 GBV_ILN_2004 GBV_ILN_2005 GBV_ILN_2006 GBV_ILN_2007 GBV_ILN_2008 GBV_ILN_2009 GBV_ILN_2010 GBV_ILN_2011 GBV_ILN_2014 GBV_ILN_2015 GBV_ILN_2020 GBV_ILN_2021 GBV_ILN_2025 GBV_ILN_2026 GBV_ILN_2027 GBV_ILN_2031 GBV_ILN_2034 GBV_ILN_2037 GBV_ILN_2038 GBV_ILN_2039 GBV_ILN_2044 GBV_ILN_2048 GBV_ILN_2049 GBV_ILN_2050 GBV_ILN_2055 GBV_ILN_2057 GBV_ILN_2059 GBV_ILN_2061 GBV_ILN_2064 GBV_ILN_2065 GBV_ILN_2068 GBV_ILN_2070 GBV_ILN_2086 GBV_ILN_2088 GBV_ILN_2093 GBV_ILN_2106 GBV_ILN_2107 GBV_ILN_2108 GBV_ILN_2110 GBV_ILN_2111 GBV_ILN_2112 GBV_ILN_2113 GBV_ILN_2116 GBV_ILN_2118 GBV_ILN_2119 GBV_ILN_2122 GBV_ILN_2129 GBV_ILN_2143 GBV_ILN_2144 GBV_ILN_2147 GBV_ILN_2148 GBV_ILN_2152 GBV_ILN_2153 GBV_ILN_2188 GBV_ILN_2190 GBV_ILN_2232 GBV_ILN_2336 GBV_ILN_2446 GBV_ILN_2470 GBV_ILN_2472 GBV_ILN_2507 GBV_ILN_2522 GBV_ILN_2548 GBV_ILN_4035 GBV_ILN_4037 GBV_ILN_4046 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4242 GBV_ILN_4246 GBV_ILN_4249 GBV_ILN_4251 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4326 GBV_ILN_4333 GBV_ILN_4334 GBV_ILN_4335 GBV_ILN_4336 GBV_ILN_4338 GBV_ILN_4393 GBV_ILN_4700 AR 9 2016 5 29 09 727-741 |
allfieldsSound |
10.1007/s13042-016-0602-3 doi (DE-627)SPR029604907 (SPR)s13042-016-0602-3-e DE-627 ger DE-627 rakwb eng Zare, Ali Asghar verfasserin aut Recognition of a real-time signer-independent static Farsi sign language based on fourier coefficients amplitude 2016 Text txt rdacontent Computermedien c rdamedia Online-Ressource cr rdacarrier © Springer-Verlag Berlin Heidelberg 2016 Abstract A sign language is a language which uses manual communication and body language to convey meaning, as opposed to acoustically conveyed sound patterns. This can involve simultaneously combining hand shapes, orientation and movement of the hands, arms or body, and facial expressions to fluidly express a speaker’s thoughts. Sign language is a preliminary communication way for individuals with hearing and speech problems. Considering that more than a hundred million people all around the world are annoyed by hearing loss and impaired speech, it is needed to design a system for automatic sign language interpreter as an interface between deaf-and-dumb and ordinary people can feel it strongly. Given this, in this article we aimed to design such a computer vision-based translation interface. Farsi sign language recognition is one of the most challenging fields of study is given because of some features such as the wide range of similar gestures, hands orientation, complicated background, and ambient light variation. A Farsi sign language recognition system is based on computer vision which is capable of real-time gesture processing and is independent of the signer. Furthermore, there is no need to use glove or marker by the signer in the proposed system. After hand segmentation from video frames, the proposed algorithm extracts boundary of the dominant hand to compute the normalized accumulation angle and represents the boundary, so that the invariance to transition and scale change of the features is realized at this stage. 
Afterward, Fourier coefficients amplitude is extracted as preferred features at the frequency domain, while invariance to rotation of the features is added at this point. Then the frequency features, as extracted features for gesture recognition, are applied to inputs of feed-forward multilayer perception neural network. The proposed method is presented to make recognition system independent of the signer and retrofit it against signer’s distance changes from camera using features of powerful invariant extraction against transition, scale change, and rotation. Data classification is carried out by three classifiers including Bayes, K-NN, and neural network. Performance of the classifiers was also compared. Training set of gestures comprised 250 samples for 10 gestures and 5 positions and orientations that were performed by 5 individuals. Recognition results showed an outstanding recognition rate of the system. Wrist cropping algorithm (dpeaa)DE-He213 Static gesture recognition (dpeaa)DE-He213 Hand detection (dpeaa)DE-He213 Skin segmentation (dpeaa)DE-He213 Cumulative angular function (dpeaa)DE-He213 Neural network (dpeaa)DE-He213 Zahiri, Seyed Hamid aut Enthalten in International journal of machine learning and cybernetics Heidelberg : Springer, 2010 9(2016), 5 vom: 29. 
Sept., Seite 727-741 (DE-627)635135132 (DE-600)2572473-3 1868-808X nnns volume:9 year:2016 number:5 day:29 month:09 pages:727-741 https://dx.doi.org/10.1007/s13042-016-0602-3 lizenzpflichtig Volltext GBV_USEFLAG_A SYSFLAG_A GBV_SPRINGER GBV_ILN_11 GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_31 GBV_ILN_32 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_65 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_74 GBV_ILN_90 GBV_ILN_95 GBV_ILN_100 GBV_ILN_101 GBV_ILN_105 GBV_ILN_110 GBV_ILN_120 GBV_ILN_138 GBV_ILN_150 GBV_ILN_151 GBV_ILN_161 GBV_ILN_170 GBV_ILN_171 GBV_ILN_187 GBV_ILN_213 GBV_ILN_224 GBV_ILN_230 GBV_ILN_250 GBV_ILN_281 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_636 GBV_ILN_702 GBV_ILN_2001 GBV_ILN_2003 GBV_ILN_2004 GBV_ILN_2005 GBV_ILN_2006 GBV_ILN_2007 GBV_ILN_2008 GBV_ILN_2009 GBV_ILN_2010 GBV_ILN_2011 GBV_ILN_2014 GBV_ILN_2015 GBV_ILN_2020 GBV_ILN_2021 GBV_ILN_2025 GBV_ILN_2026 GBV_ILN_2027 GBV_ILN_2031 GBV_ILN_2034 GBV_ILN_2037 GBV_ILN_2038 GBV_ILN_2039 GBV_ILN_2044 GBV_ILN_2048 GBV_ILN_2049 GBV_ILN_2050 GBV_ILN_2055 GBV_ILN_2057 GBV_ILN_2059 GBV_ILN_2061 GBV_ILN_2064 GBV_ILN_2065 GBV_ILN_2068 GBV_ILN_2070 GBV_ILN_2086 GBV_ILN_2088 GBV_ILN_2093 GBV_ILN_2106 GBV_ILN_2107 GBV_ILN_2108 GBV_ILN_2110 GBV_ILN_2111 GBV_ILN_2112 GBV_ILN_2113 GBV_ILN_2116 GBV_ILN_2118 GBV_ILN_2119 GBV_ILN_2122 GBV_ILN_2129 GBV_ILN_2143 GBV_ILN_2144 GBV_ILN_2147 GBV_ILN_2148 GBV_ILN_2152 GBV_ILN_2153 GBV_ILN_2188 GBV_ILN_2190 GBV_ILN_2232 GBV_ILN_2336 GBV_ILN_2446 GBV_ILN_2470 GBV_ILN_2472 GBV_ILN_2507 GBV_ILN_2522 GBV_ILN_2548 GBV_ILN_4035 GBV_ILN_4037 GBV_ILN_4046 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4242 GBV_ILN_4246 GBV_ILN_4249 GBV_ILN_4251 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4326 GBV_ILN_4333 GBV_ILN_4334 GBV_ILN_4335 GBV_ILN_4336 GBV_ILN_4338 GBV_ILN_4393 GBV_ILN_4700 AR 9 2016 5 29 09 727-741 |
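The rotation-invariance claim in the abstract can be illustrated with a minimal sketch (the function name and the ellipse test contour are illustrative, not taken from the paper): treating the boundary as a complex-valued sequence, rotating the contour multiplies every Fourier coefficient by a unit-magnitude phasor, so the coefficient amplitudes are unchanged; dividing by the first amplitude additionally removes scale.

```python
import numpy as np

def fourier_amplitude_features(boundary, n_coeffs=16):
    """Rotation-invariant shape features from a closed boundary.

    boundary: (N, 2) array of ordered (x, y) contour points.
    Returns the amplitudes of the first n_coeffs non-DC Fourier
    coefficients, normalized by the first amplitude for scale invariance.
    """
    z = boundary[:, 0] + 1j * boundary[:, 1]  # complex representation
    z = z - z.mean()                          # translation invariance
    c = np.fft.fft(z) / len(z)                # Fourier coefficients
    amps = np.abs(c[1:n_coeffs + 1])          # drop the DC term
    return amps / amps[0]                     # scale invariance

# A rotated copy of the same contour yields identical features, because
# rotating (x, y) by angle theta maps z to exp(1j*theta) * z.
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
contour = np.column_stack([2 * np.cos(t), np.sin(t)])  # an ellipse
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
f1 = fourier_amplitude_features(contour)
f2 = fourier_amplitude_features(contour @ R.T)
print(np.allclose(f1, f2))  # True
```

This is only a sketch of the general Fourier-descriptor technique under the assumptions above; the paper's exact normalization may differ.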
language |
English |
source |
Enthalten in International journal of machine learning and cybernetics 9(2016), 5 vom: 29. Sept., Seite 727-741 volume:9 year:2016 number:5 day:29 month:09 pages:727-741 |
sourceStr |
Enthalten in International journal of machine learning and cybernetics 9(2016), 5 vom: 29. Sept., Seite 727-741 volume:9 year:2016 number:5 day:29 month:09 pages:727-741 |
format_phy_str_mv |
Article |
institution |
findex.gbv.de |
topic_facet |
Wrist cropping algorithm Static gesture recognition Hand detection Skin segmentation Cumulative angular function Neural network |
isfreeaccess_bool |
false |
container_title |
International journal of machine learning and cybernetics |
authorswithroles_txt_mv |
Zare, Ali Asghar @@aut@@ Zahiri, Seyed Hamid @@aut@@ |
publishDateDaySort_date |
2016-09-29T00:00:00Z |
hierarchy_top_id |
635135132 |
id |
SPR029604907 |
language_de |
englisch |
|
author |
Zare, Ali Asghar |
spellingShingle |
Zare, Ali Asghar misc Wrist cropping algorithm misc Static gesture recognition misc Hand detection misc Skin segmentation misc Cumulative angular function misc Neural network Recognition of a real-time signer-independent static Farsi sign language based on fourier coefficients amplitude |
authorStr |
Zare, Ali Asghar |
ppnlink_with_tag_str_mv |
@@773@@(DE-627)635135132 |
format |
electronic Article |
delete_txt_mv |
keep |
author_role |
aut aut |
collection |
springer |
remote_str |
true |
illustrated |
Not Illustrated |
issn |
1868-808X |
topic_title |
Recognition of a real-time signer-independent static Farsi sign language based on fourier coefficients amplitude Wrist cropping algorithm (dpeaa)DE-He213 Static gesture recognition (dpeaa)DE-He213 Hand detection (dpeaa)DE-He213 Skin segmentation (dpeaa)DE-He213 Cumulative angular function (dpeaa)DE-He213 Neural network (dpeaa)DE-He213 |
topic |
misc Wrist cropping algorithm misc Static gesture recognition misc Hand detection misc Skin segmentation misc Cumulative angular function misc Neural network |
topic_unstemmed |
misc Wrist cropping algorithm misc Static gesture recognition misc Hand detection misc Skin segmentation misc Cumulative angular function misc Neural network |
topic_browse |
misc Wrist cropping algorithm misc Static gesture recognition misc Hand detection misc Skin segmentation misc Cumulative angular function misc Neural network |
format_facet |
Elektronische Aufsätze Aufsätze Elektronische Ressource |
format_main_str_mv |
Text Zeitschrift/Artikel |
carriertype_str_mv |
cr |
hierarchy_parent_title |
International journal of machine learning and cybernetics |
hierarchy_parent_id |
635135132 |
hierarchy_top_title |
International journal of machine learning and cybernetics |
isfreeaccess_txt |
false |
familylinks_str_mv |
(DE-627)635135132 (DE-600)2572473-3 |
title |
Recognition of a real-time signer-independent static Farsi sign language based on fourier coefficients amplitude |
ctrlnum |
(DE-627)SPR029604907 (SPR)s13042-016-0602-3-e |
title_full |
Recognition of a real-time signer-independent static Farsi sign language based on fourier coefficients amplitude |
author_sort |
Zare, Ali Asghar |
journal |
International journal of machine learning and cybernetics |
journalStr |
International journal of machine learning and cybernetics |
lang_code |
eng |
isOA_bool |
false |
recordtype |
marc |
publishDateSort |
2016 |
contenttype_str_mv |
txt |
container_start_page |
727 |
author_browse |
Zare, Ali Asghar Zahiri, Seyed Hamid |
container_volume |
9 |
format_se |
Elektronische Aufsätze |
author-letter |
Zare, Ali Asghar |
doi_str_mv |
10.1007/s13042-016-0602-3 |
title_sort |
recognition of a real-time signer-independent static farsi sign language based on fourier coefficients amplitude |
title_auth |
Recognition of a real-time signer-independent static Farsi sign language based on fourier coefficients amplitude |
abstract |
Abstract A sign language is a language which uses manual communication and body language to convey meaning, as opposed to acoustically conveyed sound patterns. This can involve simultaneously combining hand shapes, orientation and movement of the hands, arms or body, and facial expressions to fluidly express a speaker’s thoughts. Sign language is the primary means of communication for individuals with hearing and speech impairments. Given that more than a hundred million people around the world are affected by hearing loss and impaired speech, the need for an automatic sign language interpreter serving as an interface between deaf and hearing people is strongly felt. Accordingly, in this article we aimed to design such a computer vision-based translation interface. Farsi sign language recognition is one of the most challenging fields of study because of characteristics such as the wide range of similar gestures, hand orientation, complicated backgrounds, and ambient light variation. The proposed Farsi sign language recognition system is based on computer vision, is capable of real-time gesture processing, and is independent of the signer. Furthermore, the signer does not need to wear a glove or marker. After hand segmentation from the video frames, the proposed algorithm extracts the boundary of the dominant hand and computes the normalized cumulative angle to represent the boundary, so that invariance of the features to translation and scale change is achieved at this stage. Afterward, the Fourier coefficient amplitudes are extracted as the preferred features in the frequency domain, adding invariance to rotation at this point. The frequency features are then applied as inputs to a feed-forward multilayer perceptron neural network.
The proposed method makes the recognition system independent of the signer and robust to changes in the signer’s distance from the camera, using features that are strongly invariant to translation, scale change, and rotation. Data classification is carried out by three classifiers: Bayes, K-NN, and neural network; the performance of the classifiers is also compared. The training set comprised 250 samples, covering 10 gestures in 5 positions and orientations performed by 5 individuals. Recognition results showed an outstanding recognition rate for the system. © Springer-Verlag Berlin Heidelberg 2016
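The "normalized cumulative angle" boundary representation mentioned in the abstract can be sketched as follows (a minimal illustration of the standard cumulative angular function, assuming roughly uniform sampling along the contour; the function name and the circle test data are my own, not the authors' code): the turning angle is accumulated along the closed boundary and the linear trend of one full turn over the perimeter is subtracted, leaving a descriptor that depends only on shape, not on position or size.

```python
import numpy as np

def cumulative_angular_function(boundary):
    """Normalized cumulative angle descriptor of a closed boundary.

    boundary: (N, 2) array of ordered contour points.
    The edge directions are accumulated along the curve and the linear
    trend of one full turn (2*pi over the perimeter) is removed, so the
    result is invariant to translation and scale, since angles do not
    depend on position or size.
    """
    edges = np.diff(boundary, axis=0, append=boundary[:1])      # closing edges
    directions = np.unwrap(np.arctan2(edges[:, 1], edges[:, 0]))
    cum = directions - directions[0]          # accumulated turning angle
    s = np.arange(len(cum)) / len(cum)        # normalized arc position
    return cum - 2.0 * np.pi * s              # remove the full-turn trend

# For a circle the descriptor is numerically zero everywhere, and
# scaling or translating the contour does not change it.
t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
circle = np.column_stack([np.cos(t), np.sin(t)])
desc = cumulative_angular_function(circle)
desc_scaled = cumulative_angular_function(3.0 * circle + 5.0)
print(np.max(np.abs(desc)) < 1e-6, np.allclose(desc, desc_scaled))
```

Features such as these, or their Fourier amplitudes, would then feed the classifiers the abstract compares; the paper's exact normalization may differ from this sketch.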
abstractGer |
Abstract A sign language is a language which uses manual communication and body language to convey meaning, as opposed to acoustically conveyed sound patterns. This can involve simultaneously combining hand shapes, orientation and movement of the hands, arms or body, and facial expressions to fluidly express a speaker’s thoughts. Sign language is a preliminary communication way for individuals with hearing and speech problems. Considering that more than a hundred million people all around the world are annoyed by hearing loss and impaired speech, it is needed to design a system for automatic sign language interpreter as an interface between deaf-and-dumb and ordinary people can feel it strongly. Given this, in this article we aimed to design such a computer vision-based translation interface. Farsi sign language recognition is one of the most challenging fields of study is given because of some features such as the wide range of similar gestures, hands orientation, complicated background, and ambient light variation. A Farsi sign language recognition system is based on computer vision which is capable of real-time gesture processing and is independent of the signer. Furthermore, there is no need to use glove or marker by the signer in the proposed system. After hand segmentation from video frames, the proposed algorithm extracts boundary of the dominant hand to compute the normalized accumulation angle and represents the boundary, so that the invariance to transition and scale change of the features is realized at this stage. Afterward, Fourier coefficients amplitude is extracted as preferred features at the frequency domain, while invariance to rotation of the features is added at this point. Then the frequency features, as extracted features for gesture recognition, are applied to inputs of feed-forward multilayer perception neural network. 
The proposed method is presented to make recognition system independent of the signer and retrofit it against signer’s distance changes from camera using features of powerful invariant extraction against transition, scale change, and rotation. Data classification is carried out by three classifiers including Bayes, K-NN, and neural network. Performance of the classifiers was also compared. Training set of gestures comprised 250 samples for 10 gestures and 5 positions and orientations that were performed by 5 individuals. Recognition results showed an outstanding recognition rate of the system. © Springer-Verlag Berlin Heidelberg 2016 |
abstract_unstemmed |
Abstract A sign language is a language which uses manual communication and body language to convey meaning, as opposed to acoustically conveyed sound patterns. This can involve simultaneously combining hand shapes, orientation and movement of the hands, arms or body, and facial expressions to fluidly express a speaker’s thoughts. Sign language is a primary means of communication for individuals with hearing and speech impairments. Considering that more than a hundred million people around the world are affected by hearing loss and impaired speech, there is a strong need for an automatic sign language interpreter serving as an interface between deaf and hearing people. Given this, in this article we aimed to design such a computer vision-based translation interface. Farsi sign language recognition is one of the most challenging fields of study because of features such as the wide range of similar gestures, hand orientation, complicated backgrounds, and ambient light variation. The proposed Farsi sign language recognition system is based on computer vision, is capable of real-time gesture processing, and is independent of the signer. Furthermore, the signer does not need to wear a glove or marker. After segmenting the hand from the video frames, the proposed algorithm extracts the boundary of the dominant hand and computes the normalized accumulated angle to represent the boundary, so that invariance of the features to translation and scale change is achieved at this stage. Afterward, the amplitudes of the Fourier coefficients are extracted as the preferred features in the frequency domain, adding invariance to rotation at this point. The frequency features extracted for gesture recognition are then applied to the inputs of a feed-forward multilayer perceptron neural network. 
The proposed method makes the recognition system independent of the signer and robust against changes in the signer’s distance from the camera, using features that are strongly invariant to translation, scale change, and rotation. Data classification is carried out by three classifiers: Bayes, K-NN, and a neural network, and their performance is compared. The training set comprised 250 samples covering 10 gestures in 5 positions and orientations, performed by 5 individuals. Recognition results showed an outstanding recognition rate for the system. © Springer-Verlag Berlin Heidelberg 2016 |
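The invariant boundary features described in the abstract (translation- and scale-normalized contour, rotation-invariant Fourier coefficient amplitudes) can be sketched as follows. This is only an illustrative reconstruction, not the paper's implementation; the function name and parameters are hypothetical, and the boundary is assumed to be an ordered list of (x, y) points around a closed hand contour:

```python
import cmath

def fourier_descriptor_magnitudes(boundary, n_coeffs=8):
    """Illustrative sketch (not the paper's code): invariant shape
    features from a closed boundary given as ordered (x, y) points.

    Treat each point as a complex number z = x + iy and take the DFT.
    - Translation shifts only F[0], so F[0] is dropped.
    - Uniform scaling multiplies all |F[k]| equally, so dividing by
      |F[1]| removes scale.
    - Rotation (and starting-point shift) changes only the phases,
      so the magnitudes |F[k]| are rotation-invariant.
    """
    z = [complex(x, y) for x, y in boundary]
    N = len(z)
    # Discrete Fourier transform of the complex boundary signal.
    F = [sum(z[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N)) / N
         for k in range(N)]
    scale = abs(F[1]) or 1.0  # guard against a degenerate contour
    return [abs(F[k]) / scale for k in range(2, 2 + n_coeffs)]
```

Shifting, uniformly scaling, or rotating the input contour leaves these descriptors unchanged, which is what makes the recognition signer-distance- and orientation-independent in spirit.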
collection_details |
GBV_USEFLAG_A SYSFLAG_A GBV_SPRINGER GBV_ILN_11 GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_31 GBV_ILN_32 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_65 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_74 GBV_ILN_90 GBV_ILN_95 GBV_ILN_100 GBV_ILN_101 GBV_ILN_105 GBV_ILN_110 GBV_ILN_120 GBV_ILN_138 GBV_ILN_150 GBV_ILN_151 GBV_ILN_161 GBV_ILN_170 GBV_ILN_171 GBV_ILN_187 GBV_ILN_213 GBV_ILN_224 GBV_ILN_230 GBV_ILN_250 GBV_ILN_281 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_636 GBV_ILN_702 GBV_ILN_2001 GBV_ILN_2003 GBV_ILN_2004 GBV_ILN_2005 GBV_ILN_2006 GBV_ILN_2007 GBV_ILN_2008 GBV_ILN_2009 GBV_ILN_2010 GBV_ILN_2011 GBV_ILN_2014 GBV_ILN_2015 GBV_ILN_2020 GBV_ILN_2021 GBV_ILN_2025 GBV_ILN_2026 GBV_ILN_2027 GBV_ILN_2031 GBV_ILN_2034 GBV_ILN_2037 GBV_ILN_2038 GBV_ILN_2039 GBV_ILN_2044 GBV_ILN_2048 GBV_ILN_2049 GBV_ILN_2050 GBV_ILN_2055 GBV_ILN_2057 GBV_ILN_2059 GBV_ILN_2061 GBV_ILN_2064 GBV_ILN_2065 GBV_ILN_2068 GBV_ILN_2070 GBV_ILN_2086 GBV_ILN_2088 GBV_ILN_2093 GBV_ILN_2106 GBV_ILN_2107 GBV_ILN_2108 GBV_ILN_2110 GBV_ILN_2111 GBV_ILN_2112 GBV_ILN_2113 GBV_ILN_2116 GBV_ILN_2118 GBV_ILN_2119 GBV_ILN_2122 GBV_ILN_2129 GBV_ILN_2143 GBV_ILN_2144 GBV_ILN_2147 GBV_ILN_2148 GBV_ILN_2152 GBV_ILN_2153 GBV_ILN_2188 GBV_ILN_2190 GBV_ILN_2232 GBV_ILN_2336 GBV_ILN_2446 GBV_ILN_2470 GBV_ILN_2472 GBV_ILN_2507 GBV_ILN_2522 GBV_ILN_2548 GBV_ILN_4035 GBV_ILN_4037 GBV_ILN_4046 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4242 GBV_ILN_4246 GBV_ILN_4249 GBV_ILN_4251 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4326 GBV_ILN_4333 GBV_ILN_4334 GBV_ILN_4335 GBV_ILN_4336 GBV_ILN_4338 GBV_ILN_4393 GBV_ILN_4700 |
container_issue |
5 |
title_short |
Recognition of a real-time signer-independent static Farsi sign language based on fourier coefficients amplitude |
url |
https://dx.doi.org/10.1007/s13042-016-0602-3 |
remote_bool |
true |
author2 |
Zahiri, Seyed Hamid |
author2Str |
Zahiri, Seyed Hamid |
ppnlink |
635135132 |
mediatype_str_mv |
c |
isOA_txt |
false |
hochschulschrift_bool |
false |
doi_str |
10.1007/s13042-016-0602-3 |
up_date |
2024-07-04T01:40:01.818Z |
_version_ |
1803610700454559744 |
fullrecord_marcxml |
<?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>01000caa a22002652 4500</leader><controlfield tag="001">SPR029604907</controlfield><controlfield tag="003">DE-627</controlfield><controlfield tag="005">20230331104715.0</controlfield><controlfield tag="007">cr uuu---uuuuu</controlfield><controlfield tag="008">201007s2016 xx |||||o 00| ||eng c</controlfield><datafield tag="024" ind1="7" ind2=" "><subfield code="a">10.1007/s13042-016-0602-3</subfield><subfield code="2">doi</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-627)SPR029604907</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(SPR)s13042-016-0602-3-e</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-627</subfield><subfield code="b">ger</subfield><subfield code="c">DE-627</subfield><subfield code="e">rakwb</subfield></datafield><datafield tag="041" ind1=" " ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">Zare, Ali Asghar</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Recognition of a real-time signer-independent static Farsi sign language based on fourier coefficients amplitude</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="c">2016</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="a">Text</subfield><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="a">Computermedien</subfield><subfield code="b">c</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="a">Online-Ressource</subfield><subfield code="b">cr</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="500" 
ind1=" " ind2=" "><subfield code="a">© Springer-Verlag Berlin Heidelberg 2016</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield code="a">Abstract A sign language is a language which uses manual communication and body language to convey meaning, as opposed to acoustically conveyed sound patterns. This can involve simultaneously combining hand shapes, orientation and movement of the hands, arms or body, and facial expressions to fluidly express a speaker’s thoughts. Sign language is a preliminary communication way for individuals with hearing and speech problems. Considering that more than a hundred million people all around the world are annoyed by hearing loss and impaired speech, it is needed to design a system for automatic sign language interpreter as an interface between deaf-and-dumb and ordinary people can feel it strongly. Given this, in this article we aimed to design such a computer vision-based translation interface. Farsi sign language recognition is one of the most challenging fields of study is given because of some features such as the wide range of similar gestures, hands orientation, complicated background, and ambient light variation. A Farsi sign language recognition system is based on computer vision which is capable of real-time gesture processing and is independent of the signer. Furthermore, there is no need to use glove or marker by the signer in the proposed system. After hand segmentation from video frames, the proposed algorithm extracts boundary of the dominant hand to compute the normalized accumulation angle and represents the boundary, so that the invariance to transition and scale change of the features is realized at this stage. Afterward, Fourier coefficients amplitude is extracted as preferred features at the frequency domain, while invariance to rotation of the features is added at this point. 
Then the frequency features, as extracted features for gesture recognition, are applied to inputs of feed-forward multilayer perception neural network. The proposed method is presented to make recognition system independent of the signer and retrofit it against signer’s distance changes from camera using features of powerful invariant extraction against transition, scale change, and rotation. Data classification is carried out by three classifiers including Bayes, K-NN, and neural network. Performance of the classifiers was also compared. Training set of gestures comprised 250 samples for 10 gestures and 5 positions and orientations that were performed by 5 individuals. Recognition results showed an outstanding recognition rate of the system.</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Wrist cropping algorithm</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Static gesture recognition</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Hand detection</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Skin segmentation</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Cumulative angular function</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Neural network</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Zahiri, Seyed Hamid</subfield><subfield code="4">aut</subfield></datafield><datafield tag="773" ind1="0" ind2="8"><subfield code="i">Enthalten in</subfield><subfield code="t">International journal of machine learning and cybernetics</subfield><subfield code="d">Heidelberg : Springer, 
2010</subfield><subfield code="g">9(2016), 5 vom: 29. Sept., Seite 727-741</subfield><subfield code="w">(DE-627)635135132</subfield><subfield code="w">(DE-600)2572473-3</subfield><subfield code="x">1868-808X</subfield><subfield code="7">nnns</subfield></datafield><datafield tag="773" ind1="1" ind2="8"><subfield code="g">volume:9</subfield><subfield code="g">year:2016</subfield><subfield code="g">number:5</subfield><subfield code="g">day:29</subfield><subfield code="g">month:09</subfield><subfield code="g">pages:727-741</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://dx.doi.org/10.1007/s13042-016-0602-3</subfield><subfield code="z">lizenzpflichtig</subfield><subfield code="3">Volltext</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_USEFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SYSFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_SPRINGER</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_11</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_20</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_22</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_23</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_24</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_31</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_32</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_39</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_40</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_60</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_62</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_63</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_65</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_69</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_70</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_73</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_74</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_90</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_95</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_100</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_101</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_105</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_110</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_120</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_138</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_150</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_151</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_161</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_170</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_171</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_187</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_213</subfield></datafield><datafield tag="912" ind1=" " 
ind2=" "><subfield code="a">GBV_ILN_224</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_230</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_250</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_281</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_285</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_293</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_370</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_602</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_636</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_702</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2001</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2003</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2004</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2005</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2006</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2007</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2008</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2009</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2010</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2011</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2014</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_2015</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2020</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2021</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2025</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2026</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2027</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2031</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2034</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2037</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2038</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2039</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2044</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2048</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2049</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2050</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2055</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2057</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2059</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2061</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2064</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2065</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_2068</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2070</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2086</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2088</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2093</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2106</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2107</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2108</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2110</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2111</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2112</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2113</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2116</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2118</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2119</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2122</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2129</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2143</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2144</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2147</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2148</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_2152</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2153</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2188</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2190</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2232</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2336</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2446</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2470</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2472</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2507</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2522</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2548</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4035</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4037</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4046</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4112</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4125</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4242</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4246</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4249</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4251</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_4305</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4306</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4307</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4313</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4322</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4323</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4324</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4325</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4326</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4333</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4334</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4335</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4336</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4338</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4393</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4700</subfield></datafield><datafield tag="951" ind1=" " ind2=" "><subfield code="a">AR</subfield></datafield><datafield tag="952" ind1=" " ind2=" "><subfield code="d">9</subfield><subfield code="j">2016</subfield><subfield code="e">5</subfield><subfield code="b">29</subfield><subfield code="c">09</subfield><subfield code="h">727-741</subfield></datafield></record></collection>
|
score |
7.4013834 |