UltraSense: A non-intrusive approach for human activity identification using heterogeneous ultrasonic sensor grid for smart home environment
Abstract: Recognizing human activities non-intrusively remains a challenging and active area of research. It is a central requirement for human-centric applications such as assisted living for elderly care, healthcare, and smart home environments. Considering that people spend more than 90% (Klepeis et al. in J Exposure Sci Environ Epidemiol 11(3):231, 2001) of their time indoors, a proper indoor activity monitoring system can help detect abnormal behavior of the occupants. Existing approaches rely on intrusive or invasive methods such as cameras or wearable devices. In this work, we present a non-invasive, non-intrusive sensing technique that uses an array of heterogeneous ultrasonic sensors for human activity monitoring. The ultrasonic sensors are placed in two separate deployments: as sensor grids and at different positions on the door frame. The proposed system senses a stream of events as occupants perform activities categorized as primary, postural, and group activities. The primary activities considered are sitting, standing, and falling. Postural activities are the intermediate transitional states between primary activities. These activities, when performed in groups, are treated as group activities. Beyond activity recognition, the system can identify different indoor movements, count room occupancy, and identify occupants. Based on the collected data, the results show that the proposed system achieves an accuracy of more than 90% in detecting the different activities and improves on existing work. The final outcome of this work is a prototype that can be developed into smart ceiling panels for indoor human activity monitoring.
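The abstract distinguishes primary postures (sitting, standing, fall) sensed by a ceiling-mounted grid of ultrasonic distance sensors. As a purely illustrative sketch of that idea (not the authors' algorithm; the mounting height, thresholds, and function name below are hypothetical example values), a single grid cell's distance reading can be thresholded into a coarse posture class:

```python
# Illustrative sketch only: mapping one ceiling-to-target ultrasonic
# distance reading to a coarse posture label. CEILING_HEIGHT_CM and the
# height thresholds are hypothetical, not taken from the paper.

CEILING_HEIGHT_CM = 270  # assumed mounting height of the sensor grid


def classify_posture(distance_cm: float) -> str:
    """Map a ceiling-to-target distance to a coarse posture label."""
    height = CEILING_HEIGHT_CM - distance_cm  # apparent target height
    if height > 140:   # roughly head height of a standing adult
        return "standing"
    if height > 60:    # roughly head height when seated
        return "sitting"
    if height > 10:    # reflecting surface close to the floor
        return "fall"
    return "empty"     # nothing but the floor in view


# A stream of readings from one grid cell, e.g. one sample per second:
readings = [100, 105, 160, 165, 250, 250]
print([classify_posture(r) for r in readings])
# → ['standing', 'standing', 'sitting', 'sitting', 'fall', 'fall']
```

A real deployment would fuse readings across the grid and over time (the paper's postural activities are transitions between such states), but the single-cell threshold above conveys the sensing principle.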
Detailed description

| | |
|---|---|
| Author: | Ghosh, Arindam [author] |
| Format: | E-Article |
| Language: | English |
| Published: | 2019 |
| Subject headings: | |
| Note: | © Springer-Verlag GmbH Germany, part of Springer Nature 2019 |
| Contained in: | Journal of ambient intelligence and humanized computing - Berlin : Springer, 2010, 14(2019), 12, 19 March, pages 15809-15830 |
| Contained in: | volume:14 ; year:2019 ; number:12 ; day:19 ; month:03 ; pages:15809-15830 |
| Links: | |
| DOI / URN: | 10.1007/s12652-019-01260-y |
| Catalog ID: | SPR054185610 |
---|
LEADER | 01000naa a22002652 4500 | ||
---|---|---|---|
001 | SPR054185610 | ||
003 | DE-627 | ||
005 | 20231228070402.0 | ||
007 | cr uuu---uuuuu | ||
008 | 231228s2019 xx |||||o 00| ||eng c | ||
024 | 7 | |a 10.1007/s12652-019-01260-y |2 doi | |
035 | |a (DE-627)SPR054185610 | ||
035 | |a (SPR)s12652-019-01260-y-e | ||
040 | |a DE-627 |b ger |c DE-627 |e rakwb | ||
041 | |a eng | ||
100 | 1 | |a Ghosh, Arindam |e verfasserin |0 (orcid)0000-0001-9700-3555 |4 aut | |
245 | 1 | 0 | |a UltraSense: A non-intrusive approach for human activity identification using heterogeneous ultrasonic sensor grid for smart home environment |
264 | 1 | |c 2019 | |
336 | |a Text |b txt |2 rdacontent | ||
337 | |a Computermedien |b c |2 rdamedia | ||
338 | |a Online-Ressource |b cr |2 rdacarrier | ||
500 | |a © Springer-Verlag GmbH Germany, part of Springer Nature 2019 | ||
520 | |a Abstract Recognizing human activities non-intrusively has prevailed as a challenging and active area of research. In real life, it is a major requirement for human-centric applications like assisted living for elderly care, health-care and creating a smart home environment etc. Considering that people spend more than 90% (Klepeis et al. in J Exposure Sci Environ Epidemiol 11(3):231, 2001) of their time indoors, a proper indoor activity monitoring system will be helpful to monitor the abnormal behavior of the occupants. Existing approaches have implemented intrusive or invasive methods such as a camera or wearable devices. In this work, we present a non-invasive, non-intrusive sensing technique using an array of heterogeneous ultrasonic sensors for human activity monitoring. The ultrasonic sensors are placed in two separate deployments as sensor grids and in different positions of the door-frame. The proposed system senses a stream of events as the occupants perform different activities categorized as primary, postural and group activities. The primary activities considered are sitting, standing and fall. The postural activities are intermediate transitional states in the primary activities. These activities when performed in groups, are considered as a group activity. Other than activities it can identify different indoor movements, count room occupancy and identify occupants. Based on the collected data, the results show that the proposed system achieves an accuracy of more than 90% for detection of different activities and shows improvement over existing works. The final outcome of this work can be seen as developing the current prototype into smart ceiling panels that can be easily used in the indoor environment for human activity monitoring. | ||
650 | 4 | |a Human activity |7 (dpeaa)DE-He213 | |
650 | 4 | |a Non-invasive |7 (dpeaa)DE-He213 | |
650 | 4 | |a Non-intrusive |7 (dpeaa)DE-He213 | |
650 | 4 | |a Device free |7 (dpeaa)DE-He213 | |
650 | 4 | |a Group activity |7 (dpeaa)DE-He213 | |
650 | 4 | |a Postural activity |7 (dpeaa)DE-He213 | |
650 | 4 | |a Ultrasonic sensor |7 (dpeaa)DE-He213 | |
650 | 4 | |a Movement tracking |7 (dpeaa)DE-He213 | |
650 | 4 | |a Human counting |7 (dpeaa)DE-He213 | |
650 | 4 | |a Human identification |7 (dpeaa)DE-He213 | |
700 | 1 | |a Chakraborty, Amartya |4 aut | |
700 | 1 | |a Chakraborty, Dhruv |4 aut | |
700 | 1 | |a Saha, Mousumi |4 aut | |
700 | 1 | |a Saha, Sujoy |4 aut | |
773 | 0 | 8 | |i Enthalten in |t Journal of ambient intelligence and humanized computing |d Berlin : Springer, 2010 |g 14(2019), 12 vom: 19. März, Seite 15809-15830 |w (DE-627)620775734 |w (DE-600)2543187-0 |x 1868-5145 |7 nnns |
773 | 1 | 8 | |g volume:14 |g year:2019 |g number:12 |g day:19 |g month:03 |g pages:15809-15830 |
856 | 4 | 0 | |u https://dx.doi.org/10.1007/s12652-019-01260-y |z lizenzpflichtig |3 Volltext |
912 | |a GBV_USEFLAG_A | ||
912 | |a SYSFLAG_A | ||
912 | |a GBV_SPRINGER | ||
912 | |a GBV_ILN_11 | ||
912 | |a GBV_ILN_20 | ||
912 | |a GBV_ILN_22 | ||
912 | |a GBV_ILN_23 | ||
912 | |a GBV_ILN_24 | ||
912 | |a GBV_ILN_31 | ||
912 | |a GBV_ILN_32 | ||
912 | |a GBV_ILN_39 | ||
912 | |a GBV_ILN_40 | ||
912 | |a GBV_ILN_60 | ||
912 | |a GBV_ILN_62 | ||
912 | |a GBV_ILN_63 | ||
912 | |a GBV_ILN_65 | ||
912 | |a GBV_ILN_69 | ||
912 | |a GBV_ILN_70 | ||
912 | |a GBV_ILN_73 | ||
912 | |a GBV_ILN_74 | ||
912 | |a GBV_ILN_90 | ||
912 | |a GBV_ILN_95 | ||
912 | |a GBV_ILN_100 | ||
912 | |a GBV_ILN_101 | ||
912 | |a GBV_ILN_105 | ||
912 | |a GBV_ILN_110 | ||
912 | |a GBV_ILN_120 | ||
912 | |a GBV_ILN_138 | ||
912 | |a GBV_ILN_150 | ||
912 | |a GBV_ILN_151 | ||
912 | |a GBV_ILN_161 | ||
912 | |a GBV_ILN_170 | ||
912 | |a GBV_ILN_171 | ||
912 | |a GBV_ILN_187 | ||
912 | |a GBV_ILN_213 | ||
912 | |a GBV_ILN_224 | ||
912 | |a GBV_ILN_230 | ||
912 | |a GBV_ILN_250 | ||
912 | |a GBV_ILN_281 | ||
912 | |a GBV_ILN_285 | ||
912 | |a GBV_ILN_293 | ||
912 | |a GBV_ILN_370 | ||
912 | |a GBV_ILN_602 | ||
912 | |a GBV_ILN_636 | ||
912 | |a GBV_ILN_702 | ||
912 | |a GBV_ILN_2001 | ||
912 | |a GBV_ILN_2003 | ||
912 | |a GBV_ILN_2004 | ||
912 | |a GBV_ILN_2005 | ||
912 | |a GBV_ILN_2006 | ||
912 | |a GBV_ILN_2007 | ||
912 | |a GBV_ILN_2008 | ||
912 | |a GBV_ILN_2009 | ||
912 | |a GBV_ILN_2010 | ||
912 | |a GBV_ILN_2011 | ||
912 | |a GBV_ILN_2014 | ||
912 | |a GBV_ILN_2015 | ||
912 | |a GBV_ILN_2020 | ||
912 | |a GBV_ILN_2021 | ||
912 | |a GBV_ILN_2025 | ||
912 | |a GBV_ILN_2026 | ||
912 | |a GBV_ILN_2027 | ||
912 | |a GBV_ILN_2031 | ||
912 | |a GBV_ILN_2034 | ||
912 | |a GBV_ILN_2037 | ||
912 | |a GBV_ILN_2038 | ||
912 | |a GBV_ILN_2039 | ||
912 | |a GBV_ILN_2044 | ||
912 | |a GBV_ILN_2048 | ||
912 | |a GBV_ILN_2049 | ||
912 | |a GBV_ILN_2050 | ||
912 | |a GBV_ILN_2055 | ||
912 | |a GBV_ILN_2057 | ||
912 | |a GBV_ILN_2059 | ||
912 | |a GBV_ILN_2061 | ||
912 | |a GBV_ILN_2064 | ||
912 | |a GBV_ILN_2065 | ||
912 | |a GBV_ILN_2068 | ||
912 | |a GBV_ILN_2070 | ||
912 | |a GBV_ILN_2086 | ||
912 | |a GBV_ILN_2088 | ||
912 | |a GBV_ILN_2093 | ||
912 | |a GBV_ILN_2106 | ||
912 | |a GBV_ILN_2107 | ||
912 | |a GBV_ILN_2108 | ||
912 | |a GBV_ILN_2110 | ||
912 | |a GBV_ILN_2111 | ||
912 | |a GBV_ILN_2112 | ||
912 | |a GBV_ILN_2113 | ||
912 | |a GBV_ILN_2116 | ||
912 | |a GBV_ILN_2118 | ||
912 | |a GBV_ILN_2119 | ||
912 | |a GBV_ILN_2122 | ||
912 | |a GBV_ILN_2129 | ||
912 | |a GBV_ILN_2143 | ||
912 | |a GBV_ILN_2144 | ||
912 | |a GBV_ILN_2147 | ||
912 | |a GBV_ILN_2148 | ||
912 | |a GBV_ILN_2152 | ||
912 | |a GBV_ILN_2153 | ||
912 | |a GBV_ILN_2188 | ||
912 | |a GBV_ILN_2190 | ||
912 | |a GBV_ILN_2232 | ||
912 | |a GBV_ILN_2336 | ||
912 | |a GBV_ILN_2446 | ||
912 | |a GBV_ILN_2470 | ||
912 | |a GBV_ILN_2472 | ||
912 | |a GBV_ILN_2507 | ||
912 | |a GBV_ILN_2522 | ||
912 | |a GBV_ILN_2548 | ||
912 | |a GBV_ILN_4035 | ||
912 | |a GBV_ILN_4037 | ||
912 | |a GBV_ILN_4046 | ||
912 | |a GBV_ILN_4112 | ||
912 | |a GBV_ILN_4125 | ||
912 | |a GBV_ILN_4242 | ||
912 | |a GBV_ILN_4246 | ||
912 | |a GBV_ILN_4249 | ||
912 | |a GBV_ILN_4251 | ||
912 | |a GBV_ILN_4277 | ||
912 | |a GBV_ILN_4305 | ||
912 | |a GBV_ILN_4306 | ||
912 | |a GBV_ILN_4307 | ||
912 | |a GBV_ILN_4313 | ||
912 | |a GBV_ILN_4322 | ||
912 | |a GBV_ILN_4323 | ||
912 | |a GBV_ILN_4324 | ||
912 | |a GBV_ILN_4325 | ||
912 | |a GBV_ILN_4326 | ||
912 | |a GBV_ILN_4333 | ||
912 | |a GBV_ILN_4334 | ||
912 | |a GBV_ILN_4335 | ||
912 | |a GBV_ILN_4336 | ||
912 | |a GBV_ILN_4338 | ||
912 | |a GBV_ILN_4393 | ||
912 | |a GBV_ILN_4700 | ||
951 | |a AR | ||
952 | |d 14 |j 2019 |e 12 |b 19 |c 03 |h 15809-15830 |
code="b">cr</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="500" ind1=" " ind2=" "><subfield code="a">© Springer-Verlag GmbH Germany, part of Springer Nature 2019</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield code="a">Abstract Recognizing human activities non-intrusively has prevailed as a challenging and active area of research. In real life, it is a major requirement for human-centric applications like assisted living for elderly care, health-care and creating a smart home environment etc. Considering that people spend more than 90% (Klepeis et al. in J Exposure Sci Environ Epidemiol 11(3):231, 2001) of their time indoors, a proper indoor activity monitoring system will be helpful to monitor the abnormal behavior of the occupants. Existing approaches have implemented intrusive or invasive methods such as a camera or wearable devices. In this work, we present a non-invasive, non-intrusive sensing technique using an array of heterogeneous ultrasonic sensors for human activity monitoring. The ultrasonic sensors are placed in two separate deployments as sensor grids and in different positions of the door-frame. The proposed system senses a stream of events as the occupants perform different activities categorized as primary, postural and group activities. The primary activities considered are sitting, standing and fall. The postural activities are intermediate transitional states in the primary activities. These activities when performed in groups, are considered as a group activity. Other than activities it can identify different indoor movements, count room occupancy and identify occupants. Based on the collected data, the results show that the proposed system achieves an accuracy of more than 90% for detection of different activities and shows improvement over existing works. 
The final outcome of this work can be seen as developing the current prototype into smart ceiling panels that can be easily used in the indoor environment for human activity monitoring.</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Human activity</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Non-invasive</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Non-intrusive</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Device free</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Group activity</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Postural activity</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Ultrasonic sensor</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Movement tracking</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Human counting</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Human identification</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Chakraborty, Amartya</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Chakraborty, Dhruv</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Saha, Mousumi</subfield><subfield 
code="4">aut</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Saha, Sujoy</subfield><subfield code="4">aut</subfield></datafield><datafield tag="773" ind1="0" ind2="8"><subfield code="i">Enthalten in</subfield><subfield code="t">Journal of ambient intelligence and humanized computing</subfield><subfield code="d">Berlin : Springer, 2010</subfield><subfield code="g">14(2019), 12 vom: 19. März, Seite 15809-15830</subfield><subfield code="w">(DE-627)620775734</subfield><subfield code="w">(DE-600)2543187-0</subfield><subfield code="x">1868-5145</subfield><subfield code="7">nnns</subfield></datafield><datafield tag="773" ind1="1" ind2="8"><subfield code="g">volume:14</subfield><subfield code="g">year:2019</subfield><subfield code="g">number:12</subfield><subfield code="g">day:19</subfield><subfield code="g">month:03</subfield><subfield code="g">pages:15809-15830</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://dx.doi.org/10.1007/s12652-019-01260-y</subfield><subfield code="z">lizenzpflichtig</subfield><subfield code="3">Volltext</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_USEFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SYSFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_SPRINGER</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_11</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_20</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_22</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_23</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_24</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_31</subfield></datafield><datafield tag="912" ind1=" " ind2=" 
"><subfield code="a">GBV_ILN_32</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_39</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_40</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_60</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_62</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_63</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_65</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_69</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_70</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_73</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_74</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_90</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_95</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_100</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_101</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_105</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_110</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_120</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_138</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_150</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_151</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_161</subfield></datafield><datafield tag="912" 
ind1=" " ind2=" "><subfield code="a">GBV_ILN_170</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_171</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_187</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_213</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_224</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_230</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_250</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_281</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_285</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_293</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_370</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_602</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_636</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_702</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2001</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2003</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2004</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2005</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2006</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2007</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2008</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_2009</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2010</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2011</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2014</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2015</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2020</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2021</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2025</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2026</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2027</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2031</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2034</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2037</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2038</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2039</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2044</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2048</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2049</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2050</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2055</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2057</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_2059</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2061</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2064</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2065</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2068</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2070</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2086</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2088</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2093</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2106</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2107</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2108</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2110</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2111</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2112</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2113</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2116</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2118</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2119</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2122</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2129</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_2143</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2144</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2147</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2148</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2152</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2153</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2188</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2190</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2232</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2336</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2446</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2470</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2472</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2507</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2522</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2548</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4035</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4037</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4046</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4112</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4125</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_4242</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4246</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4249</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4251</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4277</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4305</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4306</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4307</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4313</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4322</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4323</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4324</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4325</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4326</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4333</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4334</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4335</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4336</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4338</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4393</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4700</subfield></datafield><datafield tag="951" ind1=" " ind2=" "><subfield code="a">AR</subfield></datafield><datafield 
tag="952" ind1=" " ind2=" "><subfield code="d">14</subfield><subfield code="j">2019</subfield><subfield code="e">12</subfield><subfield code="b">19</subfield><subfield code="c">03</subfield><subfield code="h">15809-15830</subfield></datafield></record></collection>
author |
Ghosh, Arindam |
spellingShingle |
Ghosh, Arindam misc Human activity misc Non-invasive misc Non-intrusive misc Device free misc Group activity misc Postural activity misc Ultrasonic sensor misc Movement tracking misc Human counting misc Human identification UltraSense: A non-intrusive approach for human activity identification using heterogeneous ultrasonic sensor grid for smart home environment |
authorStr |
Ghosh, Arindam |
ppnlink_with_tag_str_mv |
@@773@@(DE-627)620775734 |
format |
electronic Article |
delete_txt_mv |
keep |
author_role |
aut aut aut aut aut |
collection |
springer |
remote_str |
true |
illustrated |
Not Illustrated |
issn |
1868-5145 |
topic_title |
UltraSense: A non-intrusive approach for human activity identification using heterogeneous ultrasonic sensor grid for smart home environment Human activity (dpeaa)DE-He213 Non-invasive (dpeaa)DE-He213 Non-intrusive (dpeaa)DE-He213 Device free (dpeaa)DE-He213 Group activity (dpeaa)DE-He213 Postural activity (dpeaa)DE-He213 Ultrasonic sensor (dpeaa)DE-He213 Movement tracking (dpeaa)DE-He213 Human counting (dpeaa)DE-He213 Human identification (dpeaa)DE-He213 |
topic |
misc Human activity misc Non-invasive misc Non-intrusive misc Device free misc Group activity misc Postural activity misc Ultrasonic sensor misc Movement tracking misc Human counting misc Human identification |
topic_unstemmed |
misc Human activity misc Non-invasive misc Non-intrusive misc Device free misc Group activity misc Postural activity misc Ultrasonic sensor misc Movement tracking misc Human counting misc Human identification |
topic_browse |
misc Human activity misc Non-invasive misc Non-intrusive misc Device free misc Group activity misc Postural activity misc Ultrasonic sensor misc Movement tracking misc Human counting misc Human identification |
format_facet |
Elektronische Aufsätze Aufsätze Elektronische Ressource |
format_main_str_mv |
Text Zeitschrift/Artikel |
carriertype_str_mv |
cr |
hierarchy_parent_title |
Journal of ambient intelligence and humanized computing |
hierarchy_parent_id |
620775734 |
hierarchy_top_title |
Journal of ambient intelligence and humanized computing |
isfreeaccess_txt |
false |
familylinks_str_mv |
(DE-627)620775734 (DE-600)2543187-0 |
title |
UltraSense: A non-intrusive approach for human activity identification using heterogeneous ultrasonic sensor grid for smart home environment |
ctrlnum |
(DE-627)SPR054185610 (SPR)s12652-019-01260-y-e |
title_full |
UltraSense: A non-intrusive approach for human activity identification using heterogeneous ultrasonic sensor grid for smart home environment |
author_sort |
Ghosh, Arindam |
journal |
Journal of ambient intelligence and humanized computing |
journalStr |
Journal of ambient intelligence and humanized computing |
lang_code |
eng |
isOA_bool |
false |
recordtype |
marc |
publishDateSort |
2019 |
contenttype_str_mv |
txt |
container_start_page |
15809 |
author_browse |
Ghosh, Arindam Chakraborty, Amartya Chakraborty, Dhruv Saha, Mousumi Saha, Sujoy |
container_volume |
14 |
format_se |
Elektronische Aufsätze |
author-letter |
Ghosh, Arindam |
doi_str_mv |
10.1007/s12652-019-01260-y |
normlink |
(ORCID)0000-0001-9700-3555 |
normlink_prefix_str_mv |
(orcid)0000-0001-9700-3555 |
title_sort |
ultrasense: a non-intrusive approach for human activity identification using heterogeneous ultrasonic sensor grid for smart home environment |
title_auth |
UltraSense: A non-intrusive approach for human activity identification using heterogeneous ultrasonic sensor grid for smart home environment |
abstract |
Abstract Recognizing human activities non-intrusively remains a challenging and active area of research. In real life, it is a major requirement for human-centric applications such as assisted living for elderly care, healthcare, and smart home environments. Considering that people spend more than 90% (Klepeis et al. in J Exposure Sci Environ Epidemiol 11(3):231, 2001) of their time indoors, a proper indoor activity monitoring system can help detect abnormal behavior of the occupants. Existing approaches have relied on intrusive or invasive methods such as cameras or wearable devices. In this work, we present a non-invasive, non-intrusive sensing technique using an array of heterogeneous ultrasonic sensors for human activity monitoring. The ultrasonic sensors are placed in two separate deployments: as sensor grids and at different positions on the door frame. The proposed system senses a stream of events as the occupants perform different activities, categorized as primary, postural, and group activities. The primary activities considered are sitting, standing, and falling. The postural activities are intermediate transitional states within the primary activities. These activities, when performed in groups, are considered group activities. Beyond these activities, the system can identify different indoor movements, count room occupancy, and identify occupants. Based on the collected data, the results show that the proposed system achieves an accuracy of more than 90% for detection of different activities and improves over existing works. The final outcome of this work is the prospect of developing the current prototype into smart ceiling panels that can be easily used in indoor environments for human activity monitoring. © Springer-Verlag GmbH Germany, part of Springer Nature 2019
collection_details |
GBV_USEFLAG_A SYSFLAG_A GBV_SPRINGER GBV_ILN_11 GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_31 GBV_ILN_32 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_65 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_74 GBV_ILN_90 GBV_ILN_95 GBV_ILN_100 GBV_ILN_101 GBV_ILN_105 GBV_ILN_110 GBV_ILN_120 GBV_ILN_138 GBV_ILN_150 GBV_ILN_151 GBV_ILN_161 GBV_ILN_170 GBV_ILN_171 GBV_ILN_187 GBV_ILN_213 GBV_ILN_224 GBV_ILN_230 GBV_ILN_250 GBV_ILN_281 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_636 GBV_ILN_702 GBV_ILN_2001 GBV_ILN_2003 GBV_ILN_2004 GBV_ILN_2005 GBV_ILN_2006 GBV_ILN_2007 GBV_ILN_2008 GBV_ILN_2009 GBV_ILN_2010 GBV_ILN_2011 GBV_ILN_2014 GBV_ILN_2015 GBV_ILN_2020 GBV_ILN_2021 GBV_ILN_2025 GBV_ILN_2026 GBV_ILN_2027 GBV_ILN_2031 GBV_ILN_2034 GBV_ILN_2037 GBV_ILN_2038 GBV_ILN_2039 GBV_ILN_2044 GBV_ILN_2048 GBV_ILN_2049 GBV_ILN_2050 GBV_ILN_2055 GBV_ILN_2057 GBV_ILN_2059 GBV_ILN_2061 GBV_ILN_2064 GBV_ILN_2065 GBV_ILN_2068 GBV_ILN_2070 GBV_ILN_2086 GBV_ILN_2088 GBV_ILN_2093 GBV_ILN_2106 GBV_ILN_2107 GBV_ILN_2108 GBV_ILN_2110 GBV_ILN_2111 GBV_ILN_2112 GBV_ILN_2113 GBV_ILN_2116 GBV_ILN_2118 GBV_ILN_2119 GBV_ILN_2122 GBV_ILN_2129 GBV_ILN_2143 GBV_ILN_2144 GBV_ILN_2147 GBV_ILN_2148 GBV_ILN_2152 GBV_ILN_2153 GBV_ILN_2188 GBV_ILN_2190 GBV_ILN_2232 GBV_ILN_2336 GBV_ILN_2446 GBV_ILN_2470 GBV_ILN_2472 GBV_ILN_2507 GBV_ILN_2522 GBV_ILN_2548 GBV_ILN_4035 GBV_ILN_4037 GBV_ILN_4046 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4242 GBV_ILN_4246 GBV_ILN_4249 GBV_ILN_4251 GBV_ILN_4277 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4326 GBV_ILN_4333 GBV_ILN_4334 GBV_ILN_4335 GBV_ILN_4336 GBV_ILN_4338 GBV_ILN_4393 GBV_ILN_4700 |
container_issue |
12 |
title_short |
UltraSense: A non-intrusive approach for human activity identification using heterogeneous ultrasonic sensor grid for smart home environment |
url |
https://dx.doi.org/10.1007/s12652-019-01260-y |
remote_bool |
true |
author2 |
Chakraborty, Amartya; Chakraborty, Dhruv; Saha, Mousumi; Saha, Sujoy
ppnlink |
620775734 |
mediatype_str_mv |
c |
isOA_txt |
false |
hochschulschrift_bool |
false |
doi_str |
10.1007/s12652-019-01260-y |