Hands on Wheel Classification Based on Depth Images and Neural Networks
This paper describes a system that automatically observes whether the driver has his hands on the wheel, which is important for ensuring that he can intervene if necessary. To accomplish this, an artificial neural network is used that utilizes depth information captured by a camera in the roof module of the car...
Detailed description
Author(s): Schmitz Jan-Christoph [author]; Tilgner Stephan [author]; Kalischewski Kathrin [author]; Wagner Daniel [author]; Kummert Anton [author]
Format: E-Article
Language: English; French
Published: 2020
Published in: MATEC Web of Conferences - EDP Sciences, 2013, 308, p 06003 (2020)
Published in: volume:308, p 06003; year:2020
Links:
DOI / URN: 10.1051/matecconf/202030806003
Catalog ID: DOAJ054652065
LEADER | 01000caa a22002652 4500 | ||
---|---|---|---|
001 | DOAJ054652065 | ||
003 | DE-627 | ||
005 | 20230308183553.0 | ||
007 | cr uuu---uuuuu | ||
008 | 230227s2020 xx |||||o 00| ||eng c | ||
024 | 7 | |a 10.1051/matecconf/202030806003 |2 doi | |
035 | |a (DE-627)DOAJ054652065 | ||
035 | |a (DE-599)DOAJafa92e39474547279e3c30487ce12a53 | ||
040 | |a DE-627 |b ger |c DE-627 |e rakwb | ||
041 | |a eng |a fre | ||
050 | 0 | |a TA1-2040 | |
100 | 0 | |a Schmitz Jan-Christoph |e verfasserin |4 aut | |
245 | 1 | 0 | |a Hands on Wheel Classification Based on Depth Images and Neural Networks |
264 | 1 | |c 2020 | |
336 | |a Text |b txt |2 rdacontent | ||
337 | |a Computermedien |b c |2 rdamedia | ||
338 | |a Online-Ressource |b cr |2 rdacarrier | ||
520 | |a This paper describes a system that automatically observes whether the driver has his hands on the wheel, which is important for ensuring that he can intervene if necessary. To accomplish this, an artificial neural network is used that utilizes depth information captured by a camera in the roof module of the car. This means that the driver and the steering wheel are viewed from above. The classification system is described; it is designed to require as little computational effort as possible, since the target application runs on an embedded system in the car. A dataset is presented and the effect of the class imbalance it contains is studied. Furthermore, it is examined which part of the available data, i.e. the depth or the intensity image, is important to achieve the best possible performance. Finally, by examining a learning curve, an experiment is made to determine whether recording further training data would be worthwhile. | ||
653 | 0 | |a Engineering (General). Civil engineering (General) | |
700 | 0 | |a Tilgner Stephan |e verfasserin |4 aut | |
700 | 0 | |a Kalischewski Kathrin |e verfasserin |4 aut | |
700 | 0 | |a Wagner Daniel |e verfasserin |4 aut | |
700 | 0 | |a Kummert Anton |e verfasserin |4 aut | |
773 | 0 | 8 | |i In |t MATEC Web of Conferences |d EDP Sciences, 2013 |g 308, p 06003(2020) |w (DE-627)720166209 |w (DE-600)2673602-0 |x 2261236X |7 nnns |
773 | 1 | 8 | |g volume:308, p 06003 |g year:2020 |
856 | 4 | 0 | |u https://doi.org/10.1051/matecconf/202030806003 |z kostenfrei |
856 | 4 | 0 | |u https://doaj.org/article/afa92e39474547279e3c30487ce12a53 |z kostenfrei |
856 | 4 | 0 | |u https://www.matec-conferences.org/articles/matecconf/pdf/2020/04/matecconf_ictte2019_06003.pdf |z kostenfrei |
856 | 4 | 2 | |u https://doaj.org/toc/2261-236X |y Journal toc |z kostenfrei |
912 | |a GBV_USEFLAG_A | ||
912 | |a SYSFLAG_A | ||
912 | |a GBV_DOAJ | ||
912 | |a GBV_ILN_20 | ||
912 | |a GBV_ILN_22 | ||
912 | |a GBV_ILN_23 | ||
912 | |a GBV_ILN_24 | ||
912 | |a GBV_ILN_39 | ||
912 | |a GBV_ILN_40 | ||
912 | |a GBV_ILN_60 | ||
912 | |a GBV_ILN_62 | ||
912 | |a GBV_ILN_63 | ||
912 | |a GBV_ILN_65 | ||
912 | |a GBV_ILN_69 | ||
912 | |a GBV_ILN_70 | ||
912 | |a GBV_ILN_73 | ||
912 | |a GBV_ILN_95 | ||
912 | |a GBV_ILN_105 | ||
912 | |a GBV_ILN_110 | ||
912 | |a GBV_ILN_151 | ||
912 | |a GBV_ILN_161 | ||
912 | |a GBV_ILN_170 | ||
912 | |a GBV_ILN_213 | ||
912 | |a GBV_ILN_230 | ||
912 | |a GBV_ILN_285 | ||
912 | |a GBV_ILN_293 | ||
912 | |a GBV_ILN_370 | ||
912 | |a GBV_ILN_602 | ||
912 | |a GBV_ILN_2014 | ||
912 | |a GBV_ILN_2055 | ||
912 | |a GBV_ILN_4012 | ||
912 | |a GBV_ILN_4037 | ||
912 | |a GBV_ILN_4112 | ||
912 | |a GBV_ILN_4125 | ||
912 | |a GBV_ILN_4126 | ||
912 | |a GBV_ILN_4249 | ||
912 | |a GBV_ILN_4305 | ||
912 | |a GBV_ILN_4306 | ||
912 | |a GBV_ILN_4307 | ||
912 | |a GBV_ILN_4313 | ||
912 | |a GBV_ILN_4322 | ||
912 | |a GBV_ILN_4323 | ||
912 | |a GBV_ILN_4324 | ||
912 | |a GBV_ILN_4325 | ||
912 | |a GBV_ILN_4335 | ||
912 | |a GBV_ILN_4338 | ||
912 | |a GBV_ILN_4367 | ||
912 | |a GBV_ILN_4700 | ||
951 | |a AR | ||
952 | |d 308, p 06003 |j 2020 |
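In the MARC display above, vertical bars introduce one-character subfield codes (e.g. |a for the main value, |e for the relator term, |4 for the role code). A minimal sketch of extracting those code/value pairs in Python; the `parse_subfields` helper is a hypothetical illustration, not part of any catalog software:

```python
import re

def parse_subfields(text: str) -> dict:
    """Split the subfield portion of a pipe-delimited MARC display line,
    e.g. '|a Schmitz Jan-Christoph |e verfasserin |4 aut',
    into a mapping from subfield code to a list of values."""
    fields = {}
    # Each subfield starts with '|' plus a single letter or digit code,
    # followed by whitespace and the value (which runs to the next '|').
    for code, value in re.findall(r"\|([a-z0-9])\s+([^|]*)", text):
        fields.setdefault(code, []).append(value.strip())
    return fields
```

Repeated codes, such as the two |w identifiers in field 773, are collected into a list under the same key.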
callnumber-first |
T - Technology |
author |
Schmitz Jan-Christoph |
spellingShingle |
Schmitz Jan-Christoph misc TA1-2040 misc Engineering (General). Civil engineering (General) Hands on Wheel Classification Based on Depth Images and Neural Networks |
authorStr |
Schmitz Jan-Christoph |
ppnlink_with_tag_str_mv |
@@773@@(DE-627)720166209 |
format |
electronic Article |
delete_txt_mv |
keep |
author_role |
aut aut aut aut aut |
collection |
DOAJ |
remote_str |
true |
callnumber-label |
TA1-2040 |
illustrated |
Not Illustrated |
issn |
2261236X |
topic_title |
TA1-2040 Hands on Wheel Classification Based on Depth Images and Neural Networks |
topic |
misc TA1-2040 misc Engineering (General). Civil engineering (General) |
topic_unstemmed |
misc TA1-2040 misc Engineering (General). Civil engineering (General) |
topic_browse |
misc TA1-2040 misc Engineering (General). Civil engineering (General) |
format_facet |
Elektronische Aufsätze Aufsätze Elektronische Ressource |
format_main_str_mv |
Text Zeitschrift/Artikel |
carriertype_str_mv |
cr |
hierarchy_parent_title |
MATEC Web of Conferences |
hierarchy_parent_id |
720166209 |
hierarchy_top_title |
MATEC Web of Conferences |
isfreeaccess_txt |
true |
familylinks_str_mv |
(DE-627)720166209 (DE-600)2673602-0 |
title |
Hands on Wheel Classification Based on Depth Images and Neural Networks |
ctrlnum |
(DE-627)DOAJ054652065 (DE-599)DOAJafa92e39474547279e3c30487ce12a53 |
title_full |
Hands on Wheel Classification Based on Depth Images and Neural Networks |
author_sort |
Schmitz Jan-Christoph |
journal |
MATEC Web of Conferences |
journalStr |
MATEC Web of Conferences |
callnumber-first-code |
T |
lang_code |
eng fre |
isOA_bool |
true |
recordtype |
marc |
publishDateSort |
2020 |
contenttype_str_mv |
txt |
author_browse |
Schmitz Jan-Christoph Tilgner Stephan Kalischewski Kathrin Wagner Daniel Kummert Anton |
container_volume |
308, p 06003 |
class |
TA1-2040 |
format_se |
Elektronische Aufsätze |
author-letter |
Schmitz Jan-Christoph |
doi_str_mv |
10.1051/matecconf/202030806003 |
author2-role |
verfasserin |
title_sort |
hands on wheel classification based on depth images and neural networks |
callnumber |
TA1-2040 |
title_auth |
Hands on Wheel Classification Based on Depth Images and Neural Networks |
abstract |
This paper describes a system to automatically observe if the driver has his hands on the wheel, which is important to know that he can intervene if necessary. To accomplish this an artificial neural network is used, which utilizes depth information captured by a camera in the roof module of the car. This means that the driver and the steering wheel are viewed from above. The created classification system is described. It is designed to require as little computational effort as possible, since the target application is on an embedded system in the car. A dataset is presented and the effect of a class imbalance that is incorporated in it is studied. Furthermore, it is examined which part, i.e. the depth or the intensity image, of the available data is important to achieve the best possible performance. Finally, by examining a learning curve, an experiment is made to find out whether the recording of further training data would be reasonable. |
collection_details |
GBV_USEFLAG_A SYSFLAG_A GBV_DOAJ GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_65 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_95 GBV_ILN_105 GBV_ILN_110 GBV_ILN_151 GBV_ILN_161 GBV_ILN_170 GBV_ILN_213 GBV_ILN_230 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_2014 GBV_ILN_2055 GBV_ILN_4012 GBV_ILN_4037 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4126 GBV_ILN_4249 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4335 GBV_ILN_4338 GBV_ILN_4367 GBV_ILN_4700 |
title_short |
Hands on Wheel Classification Based on Depth Images and Neural Networks |
url |
https://doi.org/10.1051/matecconf/202030806003 https://doaj.org/article/afa92e39474547279e3c30487ce12a53 https://www.matec-conferences.org/articles/matecconf/pdf/2020/04/matecconf_ictte2019_06003.pdf https://doaj.org/toc/2261-236X |
remote_bool |
true |
author2 |
Tilgner Stephan Kalischewski Kathrin Wagner Daniel Kummert Anton |
ppnlink |
720166209 |
callnumber-subject |
TA - General and Civil Engineering |
mediatype_str_mv |
c |
isOA_txt |
true |
hochschulschrift_bool |
false |
doi_str |
10.1051/matecconf/202030806003 |
callnumber-a |
TA1-2040 |