dc.contributorMedeiros, Adelardo Adelino Dantas de
dc.contributorhttp://lattes.cnpq.br/1423358177316450
dc.contributorhttp://lattes.cnpq.br/6787525856497063
dc.contributorAlsina, Pablo Javier
dc.contributorhttp://lattes.cnpq.br/3653597363789712
dc.contributorPedrosa, Diogo Pinheiro Fernandes
dc.contributorhttp://lattes.cnpq.br/3276436982330644
dc.creatorSantiago, Gutemberg Santos
dc.date.accessioned2008-12-15
dc.date.accessioned2014-12-17T14:55:06Z
dc.date.accessioned2022-10-05T22:59:16Z
dc.date.available2008-12-15
dc.date.available2014-12-17T14:55:06Z
dc.date.available2022-10-05T22:59:16Z
dc.date.created2008-12-15
dc.date.created2014-12-17T14:55:06Z
dc.date.issued2008-05-30
dc.identifierSANTIAGO, Gutemberg Santos. Navegação cooperativa de um robô humanóide e um robô com rodas usando informação visual. 2008. 74 f. Dissertação (Mestrado em Automação e Sistemas; Engenharia de Computação; Telecomunicações) - Universidade Federal do Rio Grande do Norte, Natal, 2008.
dc.identifierhttps://repositorio.ufrn.br/jspui/handle/123456789/15197
dc.identifier.urihttp://repositorioslatinoamericanos.uchile.cl/handle/2250/3943756
dc.description.abstractThis work presents a cooperative navigation system for a humanoid robot and a wheeled robot using visual information, aiming to navigate the non-instrumented humanoid robot using information obtained from the instrumented wheeled robot. Although the humanoid has no sensors for its own navigation, it can be remotely controlled by infrared signals. Thus, the wheeled robot can control the humanoid by positioning itself behind it and, through visual information, locating and navigating it. The location of the wheeled robot is obtained by merging information from odometers and from landmark detection, using the Extended Kalman Filter. The landmarks are visually detected, and their features are extracted by image processing; the resulting parameters are used directly in the Extended Kalman Filter. Thus, while the wheeled robot locates and navigates the humanoid, it simultaneously calculates its own location and maps the environment (SLAM). Navigation is carried out through heuristic algorithms based on the errors between the actual and desired pose of each robot. The main contribution of this work is the implementation of a cooperative navigation system for two robots based on visual information, which can be extended to other robotic applications, such as controlling robots without interfering with their hardware or attaching communication devices.
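The abstract describes fusing odometry with visual landmark detections via an Extended Kalman Filter to localize the wheeled robot. As an illustration only (the dissertation's actual state vector, motion model, and landmark parameterization are not given in this record), a minimal EKF predict/update sketch for a planar pose (x, y, theta) with range-bearing landmark observations might look like:

```python
import numpy as np

def ekf_predict(mu, Sigma, u, R):
    """Predict step: propagate the pose (x, y, theta) with odometry u = (d, dtheta)."""
    x, y, th = mu
    d, dth = u
    mu_new = np.array([x + d * np.cos(th), y + d * np.sin(th), th + dth])
    # Jacobian of the motion model with respect to the state
    G = np.array([[1.0, 0.0, -d * np.sin(th)],
                  [0.0, 1.0,  d * np.cos(th)],
                  [0.0, 0.0,  1.0]])
    return mu_new, G @ Sigma @ G.T + R

def ekf_update(mu, Sigma, z, landmark, Q):
    """Update step: correct the pose with a range-bearing observation z of a known landmark."""
    lx, ly = landmark
    dx, dy = lx - mu[0], ly - mu[1]
    q = dx**2 + dy**2
    z_hat = np.array([np.sqrt(q), np.arctan2(dy, dx) - mu[2]])
    # Jacobian of the observation model with respect to the state
    H = np.array([[-dx / np.sqrt(q), -dy / np.sqrt(q), 0.0],
                  [ dy / q,          -dx / q,         -1.0]])
    S = H @ Sigma @ H.T + Q           # innovation covariance
    K = Sigma @ H.T @ np.linalg.inv(S)  # Kalman gain
    innov = z - z_hat
    innov[1] = (innov[1] + np.pi) % (2 * np.pi) - np.pi  # wrap bearing to [-pi, pi)
    return mu + K @ innov, (np.eye(3) - K @ H) @ Sigma
```

In the full visual-SLAM setting of the thesis, the state would also carry the landmark positions being mapped; here they are assumed known to keep the sketch short.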
dc.publisherUniversidade Federal do Rio Grande do Norte
dc.publisherBR
dc.publisherUFRN
dc.publisherPrograma de Pós-Graduação em Engenharia Elétrica
dc.publisherAutomação e Sistemas; Engenharia de Computação; Telecomunicações
dc.rightsAcesso Aberto
dc.subjectCooperação multi-robôs
dc.subjectSLAM visual
dc.subjectFusão sensorial
dc.subjectFiltro de Kalman
dc.subjectCalibração de câmera
dc.subjectTransformada de Hough
dc.subjectCooperative robotics
dc.subjectVisual SLAM
dc.subjectSensor fusion
dc.subjectKalman filter
dc.subjectCamera calibration
dc.subjectHough transform
dc.titleNavegação cooperativa de um robô humanóide e um robô com rodas usando informação visual
dc.typemasterThesis