info:eu-repo/semantics/article
Hand gesture recognition in real world scenarios using approximate string matching
Date
2020-04-24
Record:
Alonso, Diego Gabriel; Teyseyre, Alfredo Raul; Soria, Alvaro; Berdun, Luis Sebastian; Hand gesture recognition in real world scenarios using approximate string matching; Springer; Multimedia Tools And Applications; 24-4-2020; 1-22
1380-7501
CONICET Digital
CONICET
Author
Alonso, Diego Gabriel
Teyseyre, Alfredo Raul
Soria, Alvaro
Berdun, Luis Sebastian
Abstract
New interaction paradigms combined with emerging technologies have led to the creation of diverse Natural User Interface (NUI) devices in the market. These devices enable the recognition of body gestures, allowing users to interact with applications in a more direct, expressive, and intuitive way. In particular, the Leap Motion Controller (LMC) has received plenty of attention from NUI application developers because it allows them to address limitations in tracking gestures made with the hands. Although this device is able to recognize the position of several parts of the hands, developers are still left with the difficult task of recognizing gestures. For this reason, several authors have approached this problem using machine learning techniques. We propose a classifier based on Approximate String Matching (ASM). In short, we encode the trajectories of the hand joints as character sequences using the K-means algorithm and then analyze these sequences with ASM. It should be noted that, when using the K-means algorithm, we select the number of clusters for each part of the hands by considering the Silhouette Coefficient. Furthermore, we identify other important factors that improve recognition accuracy. For the experiments, we generated a balanced dataset including different types of gestures and evaluated our approach with a cross-validation scheme. Experimental results showed the robustness of the approach in terms of recognition of different gesture types, processing time, and allocated memory. Moreover, our approach achieved higher performance rates than well-known algorithms proposed in the current state of the art for gesture recognition.
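
The following Python sketch only illustrates the pipeline outlined in the abstract: quantizing a joint trajectory into a character sequence with K-means (choosing k via the Silhouette Coefficient) and classifying by approximate string matching against per-class templates. The function names (fit_codebook, encode, edit_distance, classify), the k range of 2 to 10, the use of a single template per gesture class, and plain Levenshtein distance as the ASM measure are assumptions for illustration, not details taken from the paper.

# Minimal sketch of the described approach (assumed details, not the authors' exact pipeline).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score


def fit_codebook(points, k_range=range(2, 11)):
    """Fit K-means on a joint's 3D samples, picking k by the Silhouette Coefficient."""
    best_score, best_model = -1.0, None
    for k in k_range:
        model = KMeans(n_clusters=k, n_init=10, random_state=0).fit(points)
        score = silhouette_score(points, model.labels_)
        if score > best_score:
            best_score, best_model = score, model
    return best_model


def encode(trajectory, codebook):
    """Map each 3D sample of a joint trajectory to a cluster label (a 'character')."""
    return codebook.predict(trajectory).tolist()


def edit_distance(a, b):
    """Levenshtein distance between two label sequences (the ASM core)."""
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (ca != cb))
    return dp[-1]


def classify(sequence, templates):
    """Return the gesture label whose template sequence is closest under edit distance."""
    return min(templates, key=lambda label: edit_distance(sequence, templates[label]))

In use, one would fit a codebook per tracked hand joint on training data, encode each recorded gesture once as a sequence of cluster labels, and call classify on newly observed sequences; the paper's per-joint cluster selection and additional accuracy factors would replace the fixed choices assumed here.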