dc.creator | Sanchez-Comas, Andres | |
dc.creator | Synnes, Kåre | |
dc.creator | Hallberg, Josef | |
dc.date | 2020-09-11T19:06:55Z | |
dc.date | 2020-09-11T19:06:55Z | |
dc.date | 2018 | |
dc.date.accessioned | 2023-10-03T20:08:53Z | |
dc.date.available | 2023-10-03T20:08:53Z | |
dc.identifier | 1424-3210 | |
dc.identifier | 1424-8220 | |
dc.identifier | https://hdl.handle.net/11323/7091 | |
dc.identifier | https://doi.org/10.3390/s20154227 | |
dc.identifier | Corporación Universidad de la Costa | |
dc.identifier | REDICUC - Repositorio CUC | |
dc.identifier | https://repositorio.cuc.edu.co/ | |
dc.identifier.uri | https://repositorioslatinoamericanos.uchile.cl/handle/2250/9174532 | |
dc.description | Activity recognition (AR) from an applied perspective of ambient assisted living (AAL) and smart homes (SH) has become a subject of great interest. Promising a better quality of life, AR applied in contexts such as health, security, and energy consumption can lead to solutions capable of reaching even the people most in need. This study was motivated by the fact that the development, deployment, and transfer of AR solutions to society and industry depend not only on software but also on the hardware devices used. The present paper identifies contributions of hardware to activity recognition through a scientific literature review of the Web of Science (WoS) database. The review found four dominant groups of technologies used for AR in SH and AAL (smartphones, wearables, video, and electronic components) and two emerging technologies (Wi-Fi and assistive robots), with considerable overlap among them across research works. Through bibliometric network analysis, the review identified gaps and potential new combinations of technologies for advancing this emerging worldwide field. It also relates the use of these six technologies to applications in health conditions, health care, emotion recognition, occupancy, mobility, posture recognition, localization, fall detection, and generic activity recognition. The above can serve as a road map that enables readers to carry out approachable projects, deploy applications in different socioeconomic contexts, and establish networks with the community involved in this topic. The analysis shows that the activity recognition research field accepts that specific goals cannot be achieved with a single hardware technology but can be reached through joint solutions; this paper shows how such technologies work together to that end. | |
dc.format | application/pdf | |
dc.language | eng | |
dc.publisher | Corporación Universidad de la Costa | |
dc.relation | https://www.mdpi.com/1424-8220/20/15/4227 | |
dc.relation | 70. Khan, M.A.A.H.; Roy, N.; Hossain, H.M.S. Wearable Sensor-Based Location-Specific Occupancy Detection in Smart Environments. Mob. Inf. Syst. 2018, 2018, 4570182. | |
dc.relation | 71. Iwasawa, Y.; Eguchi Yairi, I.; Matsuo, Y. Combining human action sensing of wheelchair users and machine learning for autonomous accessibility data collection. IEICE Trans. Inf. Syst. 2016, E99D, 1153–1161. | |
dc.relation | 72. Gupta, H.P.; Chudgar, H.S.; Mukherjee, S.; Dutta, T.; Sharma, K. A Continuous Hand Gestures Recognition Technique for Human-Machine Interaction Using Accelerometer and Gyroscope Sensors. IEEE Sens. J. 2016, 16, 6425–6432. | |
dc.relation | 73. Saha, J.; Chowdhury, C.; Biswas, S. Two phase ensemble classifier for smartphone based human activity recognition independent of hardware configuration and usage behaviour. Microsyst. Technol. 2018, 24, 2737–2752. | |
dc.relation | 74. Liu, Z.; Yin, J.; Li, J.; Wei, J.; Feng, Z. A new action recognition method by distinguishing ambiguous postures. Int. J. Adv. Robot. Syst. 2018, 15, 1–8. | |
dc.relation | 75. Yao, B.; Hagras, H.; Alghazzawi, D.; Alhaddad, M.J. A Big Bang—Big Crunch Type-2 Fuzzy Logic System for Machine-Vision-Based Event Detection and Summarization in Real-World Ambient-Assisted Living. IEEE Trans. Fuzzy Syst. 2016, 24, 1307–1319. | |
dc.relation | 76. Trindade, P.; Langensiepen, C.; Lee, K.; Adama, D.A.; Lotfi, A. Human activity learning for assistive robotics using a classifier ensemble. Soft Comput. 2018, 22, 7027–7039. | |
dc.relation | 77. Wang, S.; Chen, L.; Zhou, Z.; Sun, X.; Dong, J. Human fall detection in surveillance video based on PCANet. Multimed. Tools Appl. 2016, 75, 11603–11613. | |
dc.relation | 78. Eldib, M.; Deboeverie, F.; Philips, W.; Aghajan, H. Behavior analysis for elderly care using a network of low-resolution visual sensors. J. Electron. Imaging 2016, 25, 041003. | |
dc.relation | 79. Wickramasinghe, A.; Shinmoto Torres, R.L.; Ranasinghe, D.C. Recognition of falls using dense sensing in an ambient assisted living environment. Pervasive Mob. Comput. 2017, 34, 14–24. | |
dc.relation | 80. Chen, Z.; Wang, Y. Infrared–ultrasonic sensor fusion for support vector machine–based fall detection. J. Intell. Mater. Syst. Struct. 2018, 29, 2027–2039. | |
dc.relation | 81. Chen, Z.; Wang, Y.; Liu, H. Unobtrusive Sensor based Occupancy Facing Direction Detection and Tracking using Advanced Machine Learning Algorithms. IEEE Sens. J. 2018, 18, 1–1. | |
dc.relation | 82. Wang, J.; Zhang, X.; Gao, Q.; Feng, X.; Wang, H. Device-Free Simultaneous Wireless Localization and Activity Recognition With Wavelet Feature. IEEE Trans. Veh. Technol. 2017, 66, 1659–1669. | |
dc.relation | 83. Rus, S.; Grosse-Puppendahl, T.; Kuijper, A. Evaluating the recognition of bed postures using mutual capacitance sensing. J. Ambient Intell. Smart Environ. 2017, 9, 113–127. | |
dc.relation | 84. Cheng, A.L.; Georgoulas, C.; Bock, T. Automation in Construction Fall Detection and Intervention based on Wireless Sensor Network Technologies. Autom. Constr. 2016, 71, 116–136. | |
dc.relation | 85. Hossain, H.M.S.; Khan, M.A.A.H.; Roy, N. Active learning enabled activity recognition. Pervasive Mob. Comput. 2017, 38, 312–330. | |
dc.relation | 86. Aziz Shah, S.; Ren, A.; Fan, D.; Zhang, Z.; Zhao, N.; Yang, X. Internet of Things for Sensing: A Case Study in the Healthcare System. Appl. Sci. 2018, 8, 1–16. | |
dc.relation | 87. Jiang, J.; Pozza, R.; Gunnarsdóttir, K.; Gilbert, N.; Moessner, K. Using Sensors to Study Home Activities. J. Sens. Actuator Netw. 2017, 6, 32. | |
dc.relation | 88. Luo, X.; Guan, Q.; Tan, H.; Gao, L.; Wang, Z.; Luo, X. Simultaneous Indoor Tracking and Activity Recognition Using Pyroelectric Infrared Sensors. Sensors 2017, 17, 1–18. | |
dc.relation | 89. Gill, S.; Seth, N.; Scheme, E. A multi-sensor matched filter approach to robust segmentation of assisted gait. Sensors 2018, 18, 16–23. | |
dc.relation | 90. Sasakawa, D. Human Posture Identification Using a MIMO Array. Electronics 2018, 7, 1–13. | |
dc.relation | 91. Suyama, T. A network-type brain machine interface to support activities of daily living. IEICE Trans. Commun. 2016, E99B, 1930–1937. | |
dc.relation | 92. Li, W.; Tan, B.; Piechocki, R. Passive Radar for Opportunistic Monitoring in E-Health Applications. IEEE J. Transl. Eng. Health Med. 2018, 6, 1–10. | |
dc.rights | CC0 1.0 Universal | |
dc.rights | http://creativecommons.org/publicdomain/zero/1.0/ | |
dc.rights | info:eu-repo/semantics/openAccess | |
dc.rights | http://purl.org/coar/access_right/c_abf2 | |
dc.source | Sensors | |
dc.subject | Smart home | |
dc.subject | AAL | |
dc.subject | Ambient assisted living | |
dc.subject | Activity recognition | |
dc.subject | Hardware | |
dc.subject | Review | |
dc.title | Hardware for recognition of human activities: a review of smart home and AAL related technologies | |
dc.type | Journal article | |
dc.type | http://purl.org/coar/resource_type/c_6501 | |
dc.type | Text | |
dc.type | info:eu-repo/semantics/article | |
dc.type | info:eu-repo/semantics/publishedVersion | |
dc.type | http://purl.org/redcol/resource_type/ART | |
dc.type | info:eu-repo/semantics/acceptedVersion | |
dc.type | http://purl.org/coar/version/c_ab4af688f83e57aa | |