dc.contributor: Universidade Estadual Paulista (Unesp)
dc.date.accessioned: 2018-11-26T17:48:34Z
dc.date.available: 2018-11-26T17:48:34Z
dc.date.created: 2018-11-26T17:48:34Z
dc.date.issued: 2017-01-01
dc.identifier: 2017 Latin American Robotics Symposium (LARS) and 2017 Brazilian Symposium on Robotics (SBR). New York: IEEE, 5 p., 2017.
dc.identifier: http://hdl.handle.net/11449/163959
dc.identifier: WOS:000426897500033
dc.description.abstract: ORB-SLAM2 is one of the better-known open-source SLAM implementations available. However, its dependence on visual features causes it to fail in featureless environments. In the present work, we propose a new technique to improve the visual odometry results given by ORB-SLAM2, using a tightly coupled sensor fusion approach to integrate camera and odometer data. We use odometer readings to improve the tracking results by adding graph constraints between frames, and we introduce a new method for preventing tracking loss. We test our method on three different datasets and show an improvement in the estimated trajectory, allowing continuous tracking without losses.
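The fusion idea summarized in the abstract — using odometer readings to constrain consecutive frames and to keep tracking alive when visual features vanish — can be sketched as follows. This is a hypothetical illustration under assumed names (`compose`, `predict_pose` are invented here), not the paper's implementation; the actual method adds these relative-motion measurements as graph constraints inside ORB-SLAM2's optimization, whereas this sketch shows only the underlying odometry dead-reckoning step.

```python
import math

def compose(pose, delta):
    """Compose a 2D pose (x, y, theta) with a relative motion
    (dx, dy, dtheta) expressed in the pose's local frame.
    This is the relative-pose measurement an odometer provides
    between two consecutive frames."""
    x, y, th = pose
    dx, dy, dth = delta
    return (x + dx * math.cos(th) - dy * math.sin(th),
            y + dx * math.sin(th) + dy * math.cos(th),
            th + dth)

def predict_pose(last_pose, odom_delta):
    """Hypothetical fallback: when visual tracking fails in a
    featureless region, predict the current pose from the last
    estimate plus the accumulated odometer increment, instead of
    declaring tracking lost."""
    return compose(last_pose, odom_delta)
```

For example, `predict_pose((0.0, 0.0, 0.0), (2.0, 0.0, 0.0))` advances the robot 2 m along its heading. In a tightly coupled system these same relative-pose terms would enter the pose-graph cost function as edges between keyframes, so the optimizer balances them against visual reprojection errors.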
dc.language: eng
dc.publisher: IEEE
dc.relation: 2017 Latin American Robotics Symposium (LARS) and 2017 Brazilian Symposium on Robotics (SBR)
dc.rights: Open access
dc.source: Web of Science
dc.title: ORB-ODOM: Stereo and Odometer Sensor Fusion for Simultaneous Localization and Mapping
dc.type: Conference proceedings