dc.creatorOsimani, César
dc.creatorOjeda-Castelo, Juan Jesus
dc.creatorPiedra-Fernandez, Jose A.
dc.date.accessioned2023-03-09T08:40:35Z
dc.date.accessioned2023-09-07T15:18:14Z
dc.date.available2023-03-09T08:40:35Z
dc.date.available2023-09-07T15:18:14Z
dc.date.created2023-03-09T08:40:35Z
dc.identifier1989-1660
dc.identifierhttps://reunir.unir.net/handle/123456789/14309
dc.identifierhttps://doi.org/10.9781/ijimai.2023.01.001
dc.identifier.urihttps://repositorioslatinoamericanos.uchile.cl/handle/2250/8731640
dc.description.abstractIn the last couple of years, there has been an increasing need for Human-Computer Interaction (HCI) systems that can be controlled without touching the device, such as ATMs, self-service kiosks in airports, and terminals in public offices. Hand gestures offer a natural alternative for achieving control without touching the devices. This paper presents a solution that recognizes hand gestures by analyzing three-dimensional landmarks using deep learning. These landmarks are extracted with a model built with machine learning techniques from a single standard RGB camera, defining the skeleton of the hand with 21 landmarks distributed as follows: one on the wrist and four on each finger. This study proposes a deep neural network, trained on 9 gestures, that receives the 21 points of the hand as input. One of the main contributions, which considerably improves performance, is a first layer that normalizes and transforms the landmarks. In our experimental analysis, we reach an accuracy of 99.87% in recognizing the 9 hand gestures.
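The sketch below is a minimal illustration of the pipeline described in the abstract, not the authors' implementation: a 9-class classifier over 21 three-dimensional hand landmarks whose first step normalizes the points (here assumed to be wrist-centering and rescaling). The use of TensorFlow/Keras, the layer sizes, and the exact normalization are assumptions for illustration only.

# Illustrative sketch (assumptions: Keras classifier, wrist-centering + scale normalization)
import numpy as np
import tensorflow as tf

NUM_LANDMARKS = 21   # 1 wrist landmark + 4 per finger, as stated in the abstract
NUM_GESTURES = 9     # number of gesture classes reported in the abstract

def normalize_landmarks(points):
    """Center the 21 (x, y, z) points on the wrist and scale them to unit size.

    This is an assumed normalization step standing in for the paper's
    normalization/transformation layer.
    """
    points = np.asarray(points, dtype=np.float32).reshape(NUM_LANDMARKS, 3)
    points -= points[0]                            # wrist (landmark 0) becomes the origin
    scale = float(np.max(np.linalg.norm(points, axis=1))) or 1.0
    return (points / scale).flatten()              # 63-dimensional feature vector

# Simple fully connected classifier over the normalized landmark vector.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(NUM_LANDMARKS * 3,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(NUM_GESTURES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])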
dc.languageeng
dc.publisherInternational Journal of Interactive Multimedia and Artificial Intelligence (IJIMAI)
dc.relation;In Press
dc.relationhttps://www.ijimai.org/journal/bibcite/reference/3238
dc.rightsopenAccess
dc.subjectartificial neural networks
dc.subjectcomputer vision
dc.subjecthand gesture
dc.subjectpoint cloud
dc.subjectIJIMAI
dc.titlePoint Cloud Deep Learning Solution for Hand Gesture Recognition
dc.typearticle

