dc.contributor: Universidade Estadual Paulista (UNESP)
dc.contributor: Universidade Federal de Sergipe (UFS)
dc.contributor: Brazilian Institute of Neuroscience and Neurotechnology-BRAINN
dc.date.accessioned: 2022-04-28T19:45:31Z
dc.date.accessioned: 2022-12-20T01:26:20Z
dc.date.available: 2022-04-28T19:45:31Z
dc.date.available: 2022-12-20T01:26:20Z
dc.date.created: 2022-04-28T19:45:31Z
dc.date.issued: 2021-11-05
dc.identifier: ACM International Conference Proceeding Series, p. 61-64.
dc.identifier: http://hdl.handle.net/11449/222587
dc.identifier: 10.1145/3470482.3479618
dc.identifier: 2-s2.0-85116586541
dc.identifier.uri: https://repositorioslatinoamericanos.uchile.cl/handle/2250/5402717
dc.description.abstract: With the evolution of motion-capture devices, natural user interfaces have been applied in several areas, such as neuromotor rehabilitation supported by virtual environments. This paper presents a smartphone application that lets the user interact with a virtual environment and allows the captured data to be stored, processed, and used in machine learning models. The application submits the recordings, together with information about the movement, to a remote database so that supervised machine learning can be applied. As a proof of concept, we generated a dataset of 232 instances divided into 8 classes of movements, captured with our application, and used it to train models that classify these movements. The high accuracy of the models shows the feasibility of using body-articulation data for a classification task after some data transformations.
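The pipeline the abstract describes (flatten body-articulation coordinates into feature vectors, then classify them with a supervised model) can be sketched as follows. This is an illustrative stand-in, not the paper's actual method: the class names, the synthetic data, and the nearest-centroid classifier are all assumptions made for the sketch.

```python
import math
import random

def flatten_pose(joints):
    """Flatten a list of (x, y, z) joint coordinates into one feature vector."""
    return [coord for joint in joints for coord in joint]

class NearestCentroid:
    """Minimal supervised classifier: label of the closest class centroid."""
    def fit(self, X, y):
        self.centroids = {}
        for label in set(y):
            rows = [x for x, lbl in zip(X, y) if lbl == label]
            # Mean of each feature column gives the class centroid.
            self.centroids[label] = [sum(col) / len(rows) for col in zip(*rows)]
        return self

    def predict(self, x):
        return min(self.centroids,
                   key=lambda lbl: math.dist(x, self.centroids[lbl]))

# Synthetic "recordings": two hypothetical movement classes, each a noisy
# cloud of joint positions around a per-class anchor pose.
random.seed(0)
def sample(anchor):
    return flatten_pose([tuple(c + random.gauss(0, 0.05) for _ in range(3))
                         for c in anchor])

X = [sample([0.1, 0.5]) for _ in range(10)] + [sample([0.9, 0.2]) for _ in range(10)]
y = ["shoulder_flexion"] * 10 + ["elbow_extension"] * 10

model = NearestCentroid().fit(X, y)
pred = model.predict(sample([0.1, 0.5]))  # classifies a new noisy sample
```

With well-separated classes like these, `pred` comes back as `"shoulder_flexion"`; the paper's actual models and data transformations are not specified in this record.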
dc.language: eng
dc.relation: ACM International Conference Proceeding Series
dc.source: Scopus
dc.subject: augmented reality
dc.subject: computer vision
dc.subject: motion capture
dc.subject: supervised machine learning
dc.title: Upper Limb Motion Tracking and Classification: A Smartphone Approach
dc.type: Conference proceedings