dc.creatorYauri Vidalon, Jose Elias
dc.creatorDe Martino, Jose Mario
dc.date2016
dc.date2017-11-13T13:56:18Z
dc.date2017-11-13T13:56:18Z
dc.date.accessioned2018-03-29T06:09:41Z
dc.date.available2018-03-29T06:09:41Z
dc.identifier978-3-319-48881-3; 978-3-319-48880-6
dc.identifierComputer Vision - ECCV 2016 Workshops, Pt II. Springer Int Publishing AG, v. 9914, p. 391 - 402, 2016.
dc.identifier0302-9743
dc.identifierWOS:000389501700027
dc.identifier10.1007/978-3-319-48881-3_27
dc.identifierhttps://link.springer.com/chapter/10.1007%2F978-3-319-48881-3_27
dc.identifierhttp://repositorio.unicamp.br/jspui/handle/REPOSIP/329833
dc.identifier.urihttp://repositorioslatinoamericanos.uchile.cl/handle/2250/1366858
dc.descriptionThe simultaneous and sequential nature of sign language production, which combines hand gestures and body motion with facial expressions, still challenges sign language recognition algorithms. This paper presents a method to recognize Brazilian Sign Language (Libras) using the Kinect sensor. Skeleton information is used to segment sign gestures from a continuous stream, while depth information provides distinctive features. The method was assessed on a new dataset of 107 medical signs selected from common dialogues in health-care centers. A dynamic time warping k-nearest neighbor (DTW-kNN) classifier, evaluated with a leave-one-out cross-validation strategy, reported outstanding results.
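dc.descriptionThe classification pipeline named in the abstract (DTW distance, 1-nearest-neighbor, leave-one-out cross-validation) can be illustrated with a minimal sketch. This is an assumption-laden illustration of the general DTW-kNN technique, not the authors' implementation: the feature extraction from Kinect skeleton/depth data is omitted, and each sign is assumed to be a sequence of per-frame feature vectors.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two sequences,
    each shaped (frames, feature_dims)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])  # frame-to-frame cost
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def loocv_1nn_accuracy(sequences, labels):
    """Leave-one-out cross-validation with a 1-nearest-neighbor
    classifier under the DTW distance; returns overall accuracy."""
    correct = 0
    for i, query in enumerate(sequences):
        # Exclude the held-out sample itself from the neighbor search.
        dists = [dtw_distance(query, s) if j != i else np.inf
                 for j, s in enumerate(sequences)]
        if labels[int(np.argmin(dists))] == labels[i]:
            correct += 1
    return correct / len(sequences)
```

In this sketch each sign recording is compared against every other recording, so the cost grows quadratically with dataset size; for 107 signs with a few samples each, as in the paper's dataset, this remains tractable.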
dc.description9914
dc.description391
dc.description402
dc.description14th European Conference on Computer Vision (ECCV)
dc.descriptionOCT 08-16, 2016
dc.descriptionAmsterdam, NETHERLANDS
dc.languageEnglish
dc.publisherSpringer Int Publishing AG
dc.publisherCham
dc.relationComputer Vision - ECCV 2016 Workshops, PT II
dc.rightsfechado (closed access)
dc.sourceWOS
dc.subjectSign Language
dc.subjectIsolated Sign Language Recognition
dc.subjectBrazilian Sign Language
dc.subjectLibras
dc.subjectDynamic Time Warping
dc.subjectK-nearest Neighbor
dc.titleBrazilian Sign Language Recognition Using Kinect
dc.typeConference proceedings