Neuromorphic Driver Monitoring Systems: A Computationally Efficient Proof-of-Concept for Driver Distraction Detection
Driver Monitoring Systems (DMS) represent a promising approach for enhancing driver safety within vehicular technologies. This research explores the integration of neuromorphic event camera technology into DMS, offering faster and more localized detection of changes due to motion or lighting in an imaged scene. When applied to the observation of a human subject, an event camera provides a new level of sensing capability compared with conventional imaging systems. The study focuses on applying event cameras to DMS, augmented by submanifold sparse neural network (SSNN) models to reduce computational complexity. To validate the effectiveness of the proposed machine-learning pipeline built on event data, driver distraction is adopted as a critical use case. The SSNN model is trained on synthetic event data generated from the publicly available Drive&Act and Driver Monitoring Dataset (DMD) using a video-to-event conversion algorithm (V2E). The proposed approach achieves performance comparable to state-of-the-art methods, with an accuracy of 86.25% on the Drive&Act dataset and 80% on the comprehensive DMD dataset, while significantly reducing computational complexity. In addition, to demonstrate the generalization of the results, the network is also evaluated on a locally acquired event dataset gathered with a commercially available neuromorphic event sensor.
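The pipeline described above rests on two ideas: synthesizing event-camera data from ordinary video (the V2E step) and exploiting the sparsity of the result with submanifold sparse convolutions. The sketch below is a minimal, hypothetical illustration of those ideas in plain NumPy, not the authors' code or the actual V2E tool: events are emitted wherever the log-intensity change between consecutive frames exceeds a contrast threshold, then accumulated into a count map whose non-zero entries are the only sites a submanifold sparse network would need to process.

```python
import numpy as np

def frames_to_events(prev_frame, next_frame, threshold=0.2, eps=1e-3):
    """Emit simplified DVS-style events from two grayscale frames.

    A real converter such as V2E also models per-pixel memory, noise and
    timing; this toy version only thresholds the log-intensity difference,
    which is the core idea behind synthetic event generation.
    Returns an (N, 3) array of (y, x, polarity) rows.
    """
    d = np.log(next_frame + eps) - np.log(prev_frame + eps)
    ys, xs = np.nonzero(np.abs(d) >= threshold)
    polarity = np.sign(d[ys, xs]).astype(np.int8)
    return np.stack([ys, xs, polarity], axis=1)

def accumulate_events(events, height, width):
    """Accumulate signed events into a 2-D frame; most pixels stay zero."""
    frame = np.zeros((height, width), dtype=np.int32)
    np.add.at(frame, (events[:, 0], events[:, 1]), events[:, 2])
    return frame

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    h, w = 64, 64
    prev = rng.uniform(0.1, 1.0, size=(h, w))
    nxt = prev.copy()
    nxt[20:30, 20:30] *= 1.6          # a small moving / brightening region
    ev = frames_to_events(prev, nxt)
    acc = accumulate_events(ev, h, w)
    density = np.count_nonzero(acc) / acc.size
    print(f"{len(ev)} events, active-site density {density:.1%}")
```

In a real pipeline the converter would also model pixel bandwidth, noise and precise timestamps, and the accumulated representation would be passed to a sparse CNN library rather than inspected directly; the point of the sketch is only that the active-site density printed at the end is typically a small fraction of the frame, which is where the computational savings come from.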
Detailed description
Author(s): Waseem Shariff; Mehdi Sefidgar Dilmaghani; Paul Kielty; Joe Lemley; Muhammad Ali Farooq; Faisal Khan; Peter Corcoran
Format: E-Article
Language: English
Published: 2023
Subject headings: Distraction recognition; driver monitoring system (DMS); event-based vision; neuromorphic sensing; submanifold convolutions; computational complexity
Published in: IEEE Open Journal of Vehicular Technology (IEEE, 2020-), volume 4 (2023), pages 836-848
Links: https://doi.org/10.1109/OJVT.2023.3325656 (open access); https://doaj.org/article/d25f3cd6134346d99139fedee29d8133 (open access); https://ieeexplore.ieee.org/document/10287603/ (open access)
DOI: 10.1109/OJVT.2023.3325656
Catalog ID: DOAJ098385364
LEADER | 01000caa a22002652 4500 | ||
---|---|---|---|
001 | DOAJ098385364 | ||
003 | DE-627 | ||
005 | 20240414005450.0 | ||
007 | cr uuu---uuuuu | ||
008 | 240413s2023 xx |||||o 00| ||eng c | ||
024 | 7 | |a 10.1109/OJVT.2023.3325656 |2 doi | |
035 | |a (DE-627)DOAJ098385364 | ||
035 | |a (DE-599)DOAJd25f3cd6134346d99139fedee29d8133 | ||
040 | |a DE-627 |b ger |c DE-627 |e rakwb | ||
041 | |a eng | ||
050 | 0 | |a TA1001-1280 | |
050 | 0 | |a HE1-9990 | |
100 | 0 | |a Waseem Shariff |e verfasserin |4 aut | |
245 | 1 | 0 | |a Neuromorphic Driver Monitoring Systems: A Computationally Efficient Proof-of-Concept for Driver Distraction Detection |
264 | 1 | |c 2023 | |
336 | |a Text |b txt |2 rdacontent | ||
337 | |a Computermedien |b c |2 rdamedia | ||
338 | |a Online-Ressource |b cr |2 rdacarrier | ||
520 | |a Driver Monitoring Systems (DMS) represent a promising approach for enhancing driver safety within vehicular technologies. This research explores the integration of neuromorphic event camera technology into DMS, offering faster and more localized detection of changes due to motion or lighting in an imaged scene. When applied to the observation of a human subject, an event camera provides a new level of sensing capability compared with conventional imaging systems. The study focuses on applying event cameras to DMS, augmented by submanifold sparse neural network (SSNN) models to reduce computational complexity. To validate the effectiveness of the proposed machine-learning pipeline built on event data, driver distraction is adopted as a critical use case. The SSNN model is trained on synthetic event data generated from the publicly available Drive&Act and Driver Monitoring Dataset (DMD) using a video-to-event conversion algorithm (V2E). The proposed approach achieves performance comparable to state-of-the-art methods, with an accuracy of 86.25% on the Drive&Act dataset and 80% on the comprehensive DMD dataset, while significantly reducing computational complexity. In addition, to demonstrate the generalization of the results, the network is also evaluated on a locally acquired event dataset gathered with a commercially available neuromorphic event sensor. | ||
650 | 4 | |a Distraction recognition | |
650 | 4 | |a driver monitoring system (DMS) | |
650 | 4 | |a event based vision | |
650 | 4 | |a neuromorphic sensing | |
650 | 4 | |a submanifold convolutions | |
650 | 4 | |a computational complexity | |
653 | 0 | |a Transportation engineering | |
653 | 0 | |a Transportation and communications | |
700 | 0 | |a Mehdi Sefidgar Dilmaghani |e verfasserin |4 aut | |
700 | 0 | |a Paul Kielty |e verfasserin |4 aut | |
700 | 0 | |a Joe Lemley |e verfasserin |4 aut | |
700 | 0 | |a Muhammad Ali Farooq |e verfasserin |4 aut | |
700 | 0 | |a Faisal Khan |e verfasserin |4 aut | |
700 | 0 | |a Peter Corcoran |e verfasserin |4 aut | |
773 | 0 | 8 | |i In |t IEEE Open Journal of Vehicular Technology |d IEEE, 2020 |g 4(2023), Seite 836-848 |w (DE-627)1688452915 |w (DE-600)3006290-1 |x 26441330 |7 nnns |
773 | 1 | 8 | |g volume:4 |g year:2023 |g pages:836-848 |
856 | 4 | 0 | |u https://doi.org/10.1109/OJVT.2023.3325656 |z kostenfrei |
856 | 4 | 0 | |u https://doaj.org/article/d25f3cd6134346d99139fedee29d8133 |z kostenfrei |
856 | 4 | 0 | |u https://ieeexplore.ieee.org/document/10287603/ |z kostenfrei |
856 | 4 | 2 | |u https://doaj.org/toc/2644-1330 |y Journal toc |z kostenfrei |
912 | |a GBV_USEFLAG_A | ||
912 | |a SYSFLAG_A | ||
912 | |a GBV_DOAJ | ||
912 | |a GBV_ILN_20 | ||
912 | |a GBV_ILN_22 | ||
912 | |a GBV_ILN_23 | ||
912 | |a GBV_ILN_24 | ||
912 | |a GBV_ILN_31 | ||
912 | |a GBV_ILN_39 | ||
912 | |a GBV_ILN_40 | ||
912 | |a GBV_ILN_60 | ||
912 | |a GBV_ILN_62 | ||
912 | |a GBV_ILN_63 | ||
912 | |a GBV_ILN_65 | ||
912 | |a GBV_ILN_69 | ||
912 | |a GBV_ILN_70 | ||
912 | |a GBV_ILN_73 | ||
912 | |a GBV_ILN_95 | ||
912 | |a GBV_ILN_105 | ||
912 | |a GBV_ILN_110 | ||
912 | |a GBV_ILN_151 | ||
912 | |a GBV_ILN_161 | ||
912 | |a GBV_ILN_170 | ||
912 | |a GBV_ILN_213 | ||
912 | |a GBV_ILN_230 | ||
912 | |a GBV_ILN_285 | ||
912 | |a GBV_ILN_293 | ||
912 | |a GBV_ILN_370 | ||
912 | |a GBV_ILN_602 | ||
912 | |a GBV_ILN_2014 | ||
912 | |a GBV_ILN_4012 | ||
912 | |a GBV_ILN_4037 | ||
912 | |a GBV_ILN_4112 | ||
912 | |a GBV_ILN_4125 | ||
912 | |a GBV_ILN_4126 | ||
912 | |a GBV_ILN_4249 | ||
912 | |a GBV_ILN_4305 | ||
912 | |a GBV_ILN_4306 | ||
912 | |a GBV_ILN_4307 | ||
912 | |a GBV_ILN_4313 | ||
912 | |a GBV_ILN_4322 | ||
912 | |a GBV_ILN_4323 | ||
912 | |a GBV_ILN_4324 | ||
912 | |a GBV_ILN_4325 | ||
912 | |a GBV_ILN_4335 | ||
912 | |a GBV_ILN_4338 | ||
912 | |a GBV_ILN_4367 | ||
912 | |a GBV_ILN_4700 | ||
951 | |a AR | ||
952 | |d 4 |j 2023 |h 836-848 |
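Field 520 and the subject headings above attribute the efficiency of the approach to submanifold sparse convolutions, which evaluate the kernel only at active (non-zero) input sites rather than at every pixel. The following back-of-the-envelope comparison uses made-up numbers (resolution, channel counts and event density are assumptions, not figures from the article) to show how the multiply-accumulate cost scales with active-site density:

```python
# Hypothetical comparison of dense vs. submanifold-sparse 3x3 convolution cost.
# All numbers below are illustrative assumptions, not values from the article.
height, width = 260, 346           # e.g. a DAVIS-class event sensor resolution
in_channels, out_channels = 2, 16  # polarity channels in, feature maps out
kernel = 3 * 3
active_density = 0.05              # fraction of pixels carrying events

dense_macs = height * width * kernel * in_channels * out_channels
sparse_macs = int(active_density * height * width) * kernel * in_channels * out_channels

print(f"dense:  {dense_macs:,} MACs per layer")
print(f"sparse: {sparse_macs:,} MACs per layer "
      f"(~{dense_macs / sparse_macs:.0f}x fewer at {active_density:.0%} density)")
```

At the assumed 5% density the sparse layer needs roughly 20x fewer multiply-accumulates than its dense counterpart, which is consistent with the abstract's claim of significantly reduced computational complexity without implying the paper's actual figures.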