dc.creatorSanchez-Comas, Andres
dc.creatorSynnes, Kåre
dc.creatorHallberg, Josef
dc.date2020-09-11T19:06:55Z
dc.date2018
dc.date.accessioned2023-10-03T20:08:53Z
dc.date.available2023-10-03T20:08:53Z
dc.identifier1424-3210
dc.identifier1424-8220
dc.identifierhttps://hdl.handle.net/11323/7091
dc.identifierhttps://doi.org/10.3390/s20154227
dc.identifierCorporación Universidad de la Costa
dc.identifierREDICUC - Repositorio CUC
dc.identifierhttps://repositorio.cuc.edu.co/
dc.identifier.urihttps://repositorioslatinoamericanos.uchile.cl/handle/2250/9174532
dc.descriptionActivity recognition (AR), viewed from the applied perspective of ambient assisted living (AAL) and smart homes (SH), has become a subject of great interest. Promising a better quality of life, AR applied in contexts such as health, security, and energy consumption can lead to solutions that reach even the people most in need. This study was motivated by the fact that the development, deployment, and technological maturity of AR solutions transferred to society and industry rest not only on software development but also on the hardware devices used. The current paper identifies contributions of hardware used for activity recognition through a review of the scientific literature in the Web of Science (WoS) database. This work found four dominant groups of technologies used for AR in SH and AAL (smartphones, wearables, video, and electronic components) and two emerging technologies (Wi-Fi and assistive robots). Many of these technologies overlap across research works. Through bibliometric network analysis, the present review identified gaps and new potential combinations of technologies for advances in this emerging worldwide field. The review also relates the use of these six technologies to applications in health conditions, health care, emotion recognition, occupancy, mobility, posture recognition, localization, fall detection, and generic activity recognition. The above can serve as a road map that allows readers to execute approachable projects, deploy applications in different socioeconomic contexts, and establish networks with the community involved in this topic. This analysis shows that the activity recognition research field accepts that specific goals cannot be achieved with a single hardware technology but can be with joint solutions; this paper shows how such technologies work together in this regard.
dc.formatapplication/pdf
dc.languageeng
dc.publisherCorporación Universidad de la Costa
dc.relationhttps://www.mdpi.com/1424-8220/20/15/4227
dc.rightsCC0 1.0 Universal
dc.rightshttp://creativecommons.org/publicdomain/zero/1.0/
dc.rightsinfo:eu-repo/semantics/openAccess
dc.rightshttp://purl.org/coar/access_right/c_abf2
dc.sourceSensors
dc.subjectSmart home
dc.subjectAAL
dc.subjectAmbient assisted living
dc.subjectActivity recognition
dc.subjectHardware
dc.subjectReview
dc.titleHardware for recognition of human activities: a review of smart home and AAL related technologies
dc.typeJournal article
dc.typehttp://purl.org/coar/resource_type/c_6501
dc.typeText
dc.typeinfo:eu-repo/semantics/article
dc.typeinfo:eu-repo/semantics/publishedVersion
dc.typehttp://purl.org/redcol/resource_type/ART
dc.typeinfo:eu-repo/semantics/acceptedVersion
dc.typehttp://purl.org/coar/version/c_ab4af688f83e57aa