Deep Learning-Based Energy Expenditure Estimation in Assisted and Non-Assisted Gait Using Inertial, EMG, and Heart Rate Wearable Sensors
Energy expenditure is a key rehabilitation outcome and is starting to be used in robotics-based rehabilitation through human-in-the-loop control to tailor robot assistance towards reducing patients’ energy effort. However, it is usually assessed by indirect calorimetry which entails a certain degree of invasiveness and provides delayed data, which is not suitable for controlling robotic devices. This work proposes a deep learning-based tool for steady-state energy expenditure estimation based on more ergonomic sensors than indirect calorimetry. The study innovates by estimating the energy expenditure in assisted and non-assisted conditions and in slow gait speeds similarly to impaired subjects. This work explores and benchmarks the long short-term memory (LSTM) and convolutional neural network (CNN) as deep learning regressors. As inputs, we fused inertial data, electromyography, and heart rate signals measured by on-body sensors from eight healthy volunteers walking with and without assistance from an ankle-foot exoskeleton at 0.22, 0.33, and 0.44 m/s. LSTM and CNN were compared against indirect calorimetry using a leave-one-subject-out cross-validation technique. Results showed the suitability of this tool, especially CNN, that demonstrated root-mean-squared errors of 0.36 W/kg and high correlation (ρ > 0.85) between target and estimation (R̄² = 0.79). CNN was able to discriminate the energy expenditure between assisted and non-assisted gait, basal, and walking energy expenditure, throughout three slow gait speeds. CNN regressor driven by kinematic and physiological data was shown to be a more ergonomic technique for estimating the energy expenditure, contributing to the clinical assessment in slow and robotic-assisted gait and future research concerning human-in-the-loop control.
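The abstract describes a pipeline in which inertial, EMG, and heart rate signals are fused and fed to LSTM/CNN regressors of steady-state energy expenditure. The sketch below is a hypothetical illustration of that idea, not the authors' implementation: the window length, step size, channel layout, network sizes, and all function and class names are assumptions.

```python
# Hypothetical sketch: fuse windowed wearable-sensor channels and regress
# steady-state energy expenditure (W/kg) with a small 1D CNN.
import numpy as np
import torch
import torch.nn as nn

def make_windows(inertial, emg, heart_rate, win=500, step=250):
    """Stack synchronised streams (samples x channels) and cut fixed-length
    windows; each window is one regression example."""
    fused = np.concatenate([inertial, emg, heart_rate], axis=1)   # (T, C)
    starts = range(0, fused.shape[0] - win + 1, step)
    return np.stack([fused[s:s + win].T for s in starts])         # (N, C, win)

class EnergyExpenditureCNN(nn.Module):
    """Minimal 1D-CNN regressor: convolutions over time, global average
    pooling, single linear output for energy expenditure."""
    def __init__(self, n_channels):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.head = nn.Linear(64, 1)

    def forward(self, x):                  # x: (batch, channels, time)
        return self.head(self.features(x).squeeze(-1)).squeeze(-1)
```

A typical (assumed) training setup would minimise a mean-squared-error loss between the network output and the indirect-calorimetry target for each window, e.g. `loss = nn.MSELoss()(model(batch), targets)`.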
Authors: João M. Lopes; Joana Figueiredo; Pedro Fonseca; João J. Cerqueira; João P. Vilas-Boas; Cristina P. Santos
Format: E-Article
Language: English
Published: 2022
Keywords: artificial intelligence; deep learning; energy expenditure; gait rehabilitation; human-in-the-loop; robotics-based rehabilitation
Published in: Sensors - MDPI AG, 2003, 22(2022), 20, p 7913 (volume: 22; year: 2022; number: 20, p 7913)
Links: https://doi.org/10.3390/s22207913 ; https://doaj.org/article/8242a98108c84002a76310da858da395 ; https://www.mdpi.com/1424-8220/22/20/7913
DOI: 10.3390/s22207913
Catalog ID: DOAJ021642672
LEADER | 01000caa a22002652 4500 | ||
---|---|---|---|
001 | DOAJ021642672 | ||
003 | DE-627 | ||
005 | 20240414173820.0 | ||
007 | cr uuu---uuuuu | ||
008 | 230226s2022 xx |||||o 00| ||eng c | ||
024 | 7 | |a 10.3390/s22207913 |2 doi | |
035 | |a (DE-627)DOAJ021642672 | ||
035 | |a (DE-599)DOAJ8242a98108c84002a76310da858da395 | ||
040 | |a DE-627 |b ger |c DE-627 |e rakwb | ||
041 | |a eng | ||
050 | 0 | |a TP1-1185 | |
100 | 0 | |a João M. Lopes |e verfasserin |4 aut | |
245 | 1 | 0 | |a Deep Learning-Based Energy Expenditure Estimation in Assisted and Non-Assisted Gait Using Inertial, EMG, and Heart Rate Wearable Sensors |
264 | 1 | |c 2022 | |
336 | |a Text |b txt |2 rdacontent | ||
337 | |a Computermedien |b c |2 rdamedia | ||
338 | |a Online-Ressource |b cr |2 rdacarrier | ||
520 | |a Energy expenditure is a key rehabilitation outcome and is starting to be used in robotics-based rehabilitation through human-in-the-loop control to tailor robot assistance towards reducing patients’ energy effort. However, it is usually assessed by indirect calorimetry which entails a certain degree of invasiveness and provides delayed data, which is not suitable for controlling robotic devices. This work proposes a deep learning-based tool for steady-state energy expenditure estimation based on more ergonomic sensors than indirect calorimetry. The study innovates by estimating the energy expenditure in assisted and non-assisted conditions and in slow gait speeds similarly to impaired subjects. This work explores and benchmarks the long short-term memory (LSTM) and convolutional neural network (CNN) as deep learning regressors. As inputs, we fused inertial data, electromyography, and heart rate signals measured by on-body sensors from eight healthy volunteers walking with and without assistance from an ankle-foot exoskeleton at 0.22, 0.33, and 0.44 m/s. LSTM and CNN were compared against indirect calorimetry using a leave-one-subject-out cross-validation technique. Results showed the suitability of this tool, especially CNN, that demonstrated root-mean-squared errors of 0.36 W/kg and high correlation (ρ > 0.85) between target and estimation (R̄² = 0.79). CNN was able to discriminate the energy expenditure between assisted and non-assisted gait, basal, and walking energy expenditure, throughout three slow gait speeds. CNN regressor driven by kinematic and physiological data was shown to be a more ergonomic technique for estimating the energy expenditure, contributing to the clinical assessment in slow and robotic-assisted gait and future research concerning human-in-the-loop control. | ||
650 | 4 | |a artificial intelligence | |
650 | 4 | |a deep learning | |
650 | 4 | |a energy expenditure | |
650 | 4 | |a gait rehabilitation | |
650 | 4 | |a human-in-the-loop | |
650 | 4 | |a robotics-based rehabilitation | |
653 | 0 | |a Chemical technology | |
700 | 0 | |a Joana Figueiredo |e verfasserin |4 aut | |
700 | 0 | |a Pedro Fonseca |e verfasserin |4 aut | |
700 | 0 | |a João J. Cerqueira |e verfasserin |4 aut | |
700 | 0 | |a João P. Vilas-Boas |e verfasserin |4 aut | |
700 | 0 | |a Cristina P. Santos |e verfasserin |4 aut | |
773 | 0 | 8 | |i In |t Sensors |d MDPI AG, 2003 |g 22(2022), 20, p 7913 |w (DE-627)331640910 |w (DE-600)2052857-7 |x 14248220 |7 nnns |
773 | 1 | 8 | |g volume:22 |g year:2022 |g number:20, p 7913 |
856 | 4 | 0 | |u https://doi.org/10.3390/s22207913 |z kostenfrei |
856 | 4 | 0 | |u https://doaj.org/article/8242a98108c84002a76310da858da395 |z kostenfrei |
856 | 4 | 0 | |u https://www.mdpi.com/1424-8220/22/20/7913 |z kostenfrei |
856 | 4 | 2 | |u https://doaj.org/toc/1424-8220 |y Journal toc |z kostenfrei |
912 | |a GBV_USEFLAG_A | ||
912 | |a SYSFLAG_A | ||
912 | |a GBV_DOAJ | ||
912 | |a GBV_ILN_20 | ||
912 | |a GBV_ILN_22 | ||
912 | |a GBV_ILN_23 | ||
912 | |a GBV_ILN_24 | ||
912 | |a GBV_ILN_31 | ||
912 | |a GBV_ILN_39 | ||
912 | |a GBV_ILN_40 | ||
912 | |a GBV_ILN_60 | ||
912 | |a GBV_ILN_62 | ||
912 | |a GBV_ILN_63 | ||
912 | |a GBV_ILN_65 | ||
912 | |a GBV_ILN_69 | ||
912 | |a GBV_ILN_70 | ||
912 | |a GBV_ILN_73 | ||
912 | |a GBV_ILN_95 | ||
912 | |a GBV_ILN_105 | ||
912 | |a GBV_ILN_110 | ||
912 | |a GBV_ILN_151 | ||
912 | |a GBV_ILN_161 | ||
912 | |a GBV_ILN_170 | ||
912 | |a GBV_ILN_206 | ||
912 | |a GBV_ILN_213 | ||
912 | |a GBV_ILN_230 | ||
912 | |a GBV_ILN_285 | ||
912 | |a GBV_ILN_293 | ||
912 | |a GBV_ILN_370 | ||
912 | |a GBV_ILN_602 | ||
912 | |a GBV_ILN_2005 | ||
912 | |a GBV_ILN_2009 | ||
912 | |a GBV_ILN_2011 | ||
912 | |a GBV_ILN_2014 | ||
912 | |a GBV_ILN_2055 | ||
912 | |a GBV_ILN_2057 | ||
912 | |a GBV_ILN_2111 | ||
912 | |a GBV_ILN_2507 | ||
912 | |a GBV_ILN_4012 | ||
912 | |a GBV_ILN_4037 | ||
912 | |a GBV_ILN_4112 | ||
912 | |a GBV_ILN_4125 | ||
912 | |a GBV_ILN_4126 | ||
912 | |a GBV_ILN_4249 | ||
912 | |a GBV_ILN_4305 | ||
912 | |a GBV_ILN_4306 | ||
912 | |a GBV_ILN_4307 | ||
912 | |a GBV_ILN_4313 | ||
912 | |a GBV_ILN_4322 | ||
912 | |a GBV_ILN_4323 | ||
912 | |a GBV_ILN_4324 | ||
912 | |a GBV_ILN_4325 | ||
912 | |a GBV_ILN_4335 | ||
912 | |a GBV_ILN_4338 | ||
912 | |a GBV_ILN_4367 | ||
912 | |a GBV_ILN_4700 | ||
951 | |a AR | ||
952 | |d 22 |j 2022 |e 20, p 7913 |
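For completeness, the evaluation protocol summarised in the abstract above (leave-one-subject-out cross-validation against indirect calorimetry, reported as RMSE in W/kg, Pearson ρ, and adjusted R²) could be sketched as follows. The `fit`/`predict` model interface, the `build_model` factory, and the helper names are illustrative assumptions, not part of the cited article.

```python
# Hedged sketch of leave-one-subject-out evaluation with the metrics
# named in the abstract: RMSE (W/kg), Pearson correlation, adjusted R^2.
import numpy as np

def rmse(y_true, y_pred):
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))        # W/kg

def pearson_rho(y_true, y_pred):
    return float(np.corrcoef(y_true, y_pred)[0, 1])

def adjusted_r2(y_true, y_pred, n_features):
    # Assumes more test samples than features in each fold.
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    n = len(y_true)
    return float(1.0 - (1.0 - r2) * (n - 1) / (n - n_features - 1))

def leave_one_subject_out(X, y, subjects, build_model, n_features):
    """Train on all subjects but one, test on the held-out subject,
    and average the per-subject metrics across folds."""
    scores = []
    for held_out in np.unique(subjects):
        train, test = subjects != held_out, subjects == held_out
        model = build_model()              # assumed fit/predict interface
        model.fit(X[train], y[train])
        y_hat = model.predict(X[test])
        scores.append((rmse(y[test], y_hat),
                       pearson_rho(y[test], y_hat),
                       adjusted_r2(y[test], y_hat, n_features)))
    return np.mean(scores, axis=0)         # mean RMSE, rho, adjusted R^2
```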