dc.date.accessioned: 2021-08-23T22:53:36Z
dc.date.accessioned: 2022-10-19T00:21:38Z
dc.date.available: 2021-08-23T22:53:36Z
dc.date.available: 2022-10-19T00:21:38Z
dc.date.created: 2021-08-23T22:53:36Z
dc.date.issued: 2016
dc.identifier: http://hdl.handle.net/10533/251209
dc.identifier: 1151018
dc.identifier: WOS:000400012302004
dc.identifier.uri: https://repositorioslatinoamericanos.uchile.cl/handle/2250/4482472
dc.description.abstract: In this paper, we introduce a new hierarchical model for human action recognition using body joint locations. Our model can categorize complex actions in videos and perform spatio-temporal annotation of the atomic actions that compose the complex action being performed. That is, for each atomic action, the model generates temporal annotations by estimating its starting and ending times, as well as spatial annotations by inferring the human body parts involved in executing the action. Our model includes three key novel properties: (i) it can be trained with no spatial supervision, as it automatically discovers active body parts from temporal action annotations alone; (ii) it jointly learns flexible representations for motion poselets and actionlets that encode the visual variability of body parts and atomic actions; (iii) it includes a mechanism to discard idle or non-informative body parts, which increases its robustness to common pose estimation errors. We evaluate the performance of our method on multiple action recognition benchmarks. Our model consistently outperforms baselines and state-of-the-art action recognition methods.
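To make the kind of output described in the abstract concrete, the following minimal sketch (a hypothetical illustration, not the authors' implementation) shows one plausible way to represent the spatio-temporal annotations the model produces: for each atomic action, a start time, an end time, and the body parts involved. All class and field names here are assumptions chosen for illustration only.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical data structures mirroring the annotation format described in
# the abstract; not the authors' implementation.

@dataclass
class AtomicActionAnnotation:
    label: str             # atomic action category (placeholder name)
    start_frame: int       # temporal annotation: estimated starting frame
    end_frame: int         # temporal annotation: estimated ending frame
    body_parts: List[str]  # spatial annotation: body parts inferred as active

@dataclass
class ComplexActionAnnotation:
    label: str  # complex action category for the whole video
    atomic_actions: List[AtomicActionAnnotation] = field(default_factory=list)

# Example usage with made-up values.
annotation = ComplexActionAnnotation(
    label="complex_action_example",
    atomic_actions=[
        AtomicActionAnnotation("atomic_action_1", start_frame=0, end_frame=45,
                               body_parts=["right_arm", "torso"]),
        AtomicActionAnnotation("atomic_action_2", start_frame=46, end_frame=120,
                               body_parts=["left_arm"]),
    ],
)
print(annotation.atomic_actions[0].body_parts)
```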
dc.language: eng
dc.relation: https://doi.org/10.1109/CVPR.2016.218
dc.relation: handle/10533/111557
dc.relation: 10.1109/CVPR.2016.218
dc.relation: handle/10533/111541
dc.relation: handle/10533/108045
dc.rights: info:eu-repo/semantics/article
dc.rights: info:eu-repo/semantics/openAccess
dc.rights: Atribución-NoComercial-SinDerivadas 3.0 Chile
dc.rights: http://creativecommons.org/licenses/by-nc-nd/3.0/cl/
dc.title: A Hierarchical Pose-Based Approach to Complex Action Understanding Using Dictionaries of Actionlets and Motion Poselets
dc.type: Article


This item belongs to the following institution