dc.creator: Martínez, Luz
dc.creator: Ruiz del Solar, Javier
dc.creator: Sun, Li
dc.creator: Siebert, J. Paul
dc.creator: Aragon-Camarasa, Gerardo
dc.date.accessioned: 2019-10-30T15:40:22Z
dc.date.available: 2019-10-30T15:40:22Z
dc.date.created: 2019-10-30T15:40:22Z
dc.date.issued: 2019
dc.identifier: Robotics and Autonomous Systems, Volume 118
dc.identifier: 09218890
dc.identifier: 10.1016/j.robot.2019.05.010
dc.identifier: https://repositorio.uchile.cl/handle/2250/172602
dc.description.abstract: We present a robot vision approach to deformable object classification, with direct application to autonomous service robots. Our approach is based on the assumption that continuous perception provides robots with greater visual competence for deformable object interpretation and classification. Our approach classifies the category of clothing items by continuously perceiving the dynamic interactions of the garment's material and shape as it is being picked up. For this, we continuously extract visual features from an RGB-D video sequence and fuse these features by means of the Locality-Constrained Group Sparse Representation (LGSR) algorithm. To evaluate the performance of our approach, we created a fully annotated database featuring 150 garment videos in random configurations. Experiments demonstrate that, by continuously observing an object deform, our approach achieves a classification score of 66.7%, outperforming state-of-the-art approaches by ∼27.3%.
dc.language: en
dc.publisher: Elsevier B.V.
dc.rights: http://creativecommons.org/licenses/by-nc-nd/3.0/cl/
dc.rights: Attribution-NonCommercial-NoDerivs 3.0 Chile
dc.source: Robotics and Autonomous Systems
dc.subject: Continuous perception
dc.subject: Deformable object classification
dc.subject: Robot vision
dc.title: Continuous perception for deformable objects understanding
dc.type: Journal articles