dc.contributorRomero Cano, Victor Adolfo
dc.creatorRojas Cediel, Davidson Daniel
dc.creatorCerquera Calderón, Andrés Felipe
dc.date.accessioned2020-04-23T13:18:47Z
dc.date.accessioned2022-09-22T18:43:33Z
dc.date.available2020-04-23T13:18:47Z
dc.date.available2022-09-22T18:43:33Z
dc.date.created2020-04-23T13:18:47Z
dc.date.issued2020-03-09
dc.identifierhttp://red.uao.edu.co//handle/10614/12242
dc.identifier.urihttp://repositorioslatinoamericanos.uchile.cl/handle/2250/3457329
dc.description.abstractThis project presents the development of a robotic perception system for estimating traversability in outdoor, unstructured environments, and its implementation on the Jackal ground mobile robot. The project falls within one of the research lines of the Semillero de Investigación en Robótica at the Universidad Autónoma de Occidente (UAO), and is intended to support forest characterization tasks. The system estimates traversability through several techniques based on the explicit detection of obstacles; it then replans the route defined at the outset and, finally, executes it on the Jackal. Its development started from an extensive survey of the state of the art in traversability estimation, followed by a screening of the alternatives, their combination (including libraries such as PCL and costmap_2d), their re-parameterization, and the use of LIDAR, stereo camera, GPS, and IMU sensors, yielding more robust solutions. ROS was used as middleware, with the algorithms implemented in C++. Finally, physical tests were carried out to verify correct operation, in two stages: inside the UAO robotics laboratory and outside it (outdoor, unstructured environments), the latter being the most important stage for this project. These tests showed that the algorithms perform properly as long as their parameters are set correctly.
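The abstract describes a concrete pipeline stage: explicit obstacle detection over sensor point clouds with PCL, feeding a costmap_2d layer under ROS, implemented in C++. As a rough illustration of that stage only, here is a minimal sketch, not taken from the thesis: a ROS node that keeps the points of an incoming cloud lying in an assumed above-ground height band and republishes them as an obstacle cloud that a costmap_2d obstacle layer could consume. The topic names ("velodyne_points", "obstacle_cloud") and the 0.15-2.0 m band are illustrative assumptions, not values from the project.

// Minimal sketch of explicit obstacle detection with PCL under ROS.
// All topic names and thresholds are assumptions for illustration.
#include <ros/ros.h>
#include <sensor_msgs/PointCloud2.h>
#include <pcl_conversions/pcl_conversions.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/filters/passthrough.h>

ros::Publisher obstacle_pub;

void cloudCallback(const sensor_msgs::PointCloud2ConstPtr& msg) {
  pcl::PointCloud<pcl::PointXYZ>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZ>);
  pcl::fromROSMsg(*msg, *cloud);

  // Keep only points clearly above the (assumed flat) ground plane;
  // these are treated as explicit obstacles.
  pcl::PassThrough<pcl::PointXYZ> pass;
  pass.setInputCloud(cloud);
  pass.setFilterFieldName("z");
  pass.setFilterLimits(0.15, 2.0);  // assumed height band in the robot frame, meters

  pcl::PointCloud<pcl::PointXYZ> obstacles;
  pass.filter(obstacles);

  sensor_msgs::PointCloud2 out;
  pcl::toROSMsg(obstacles, out);
  out.header = msg->header;  // preserve frame and timestamp for the costmap
  obstacle_pub.publish(out);
}

int main(int argc, char** argv) {
  ros::init(argc, argv, "obstacle_filter");
  ros::NodeHandle nh;
  obstacle_pub = nh.advertise<sensor_msgs::PointCloud2>("obstacle_cloud", 1);
  ros::Subscriber sub = nh.subscribe("velodyne_points", 1, cloudCallback);
  ros::spin();
  return 0;
}

In practice the height band would come out of the re-parameterization step the abstract mentions, and on uneven terrain the ground would be estimated (for example by plane fitting) rather than assumed flat.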
dc.languagespa
dc.publisherUniversidad Autónoma de Occidente
dc.publisherIngeniería Mecatrónica
dc.publisherDepartamento de Automática y Electrónica
dc.publisherFacultad de Ingeniería
dc.rightshttps://creativecommons.org/licenses/by-nc-nd/4.0/
dc.rightsinfo:eu-repo/semantics/openAccess
dc.rightsAttribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0)
dc.rightsAll Rights Reserved - Universidad Autónoma de Occidente
dc.sourceinstname:Universidad Autónoma de Occidente
dc.sourcereponame:Repositorio Institucional UAO
dc.subjectMechatronics Engineering Program
dc.subjectTraversability estimation
dc.subjectROS
dc.subjectLIDAR
dc.subjectJackal ground mobile robot
dc.subjectExplicit obstacle detection
dc.titleTraversability estimation system for ground mobile robots
dc.typeUndergraduate degree project