Robotic module for lulo (Solanum Quitoense) classification using computer vision

dc.creator: Contreras Parada, Pedro Alexander
dc.creator: Peña Cortés, César Augusto
dc.creator: Riaño Jaimes, Cristhian Iván
dc.date: 2018-11-26T13:12:32Z
dc.date: 2014-07-01
dc.date.accessioned: 2023-10-03T18:56:59Z
dc.date.available: 2023-10-03T18:56:59Z
dc.identifier: Contreras Parada, P., Peña Cortés, C., & Riaño Jaimes, C. (2014). Módulo robótico para la clasificación de lulos (Solanum Quitoense) implementando visión artificial. INGE CUC, 10(1), 51-62. Retrieved from https://revistascientificas.cuc.edu.co/ingecuc/article/view/343
dc.identifier: 0122-6517 (print), 2382-4700 (electronic)
dc.identifier: http://hdl.handle.net/11323/1819
dc.identifier: 2382-4700
dc.identifier: Corporación Universidad de la Costa
dc.identifier: 0122-6517
dc.identifier: REDICUC - Repositorio CUC
dc.identifier: https://repositorio.cuc.edu.co/
dc.identifier.uri: https://repositorioslatinoamericanos.uchile.cl/handle/2250/9166439
dc.description: En este artículo se expone el diseño e implementación de un módulo robótico para la clasificación de lulos integrando técnicas de control, visión artificial y robótica. El proceso que permite la clasificación de lulos opera sobre algoritmos para el control de un brazo robótico de 5 GDL que, basado en la información obtenida con técnicas de visión artificial, permite seleccionar lulos con características previamente definidas por el usuario, como tamaño, color e imperfecciones en el fruto. Se inicia realizando un diseño CAD del módulo, el cual permite establecer las propiedades físicas, definir materiales y técnicas de fabricación, y realizar el estudio de movimiento de los distintos mecanismos involucrados en el proceso. Posteriormente se implementaron los algoritmos de control, la cinemática del brazo robótico y los algoritmos de visión artificial. El resultado se resume en un módulo robótico de carácter académico que permite la clasificación de lulos.
dc.description: This paper describes the design and implementation of a robotic module for Solanum Quitoense classification using control, computer vision, and robotics techniques. The process that allows the classification of Solanum Quitoense (or lulo) relies on algorithms that control a 5-degree-of-freedom robotic arm which, based on information obtained through computer vision techniques, selects lulos with features previously defined by the user, such as size, color, and imperfections in the fruit. The process begins with a CAD design of the module, which allows establishing its physical properties, defining materials and manufacturing techniques, and performing a motion study of the different mechanisms involved. Subsequently, the control algorithms, the robotic arm kinematics, and the computer vision algorithms are implemented. The result is an academic robotic module for lulo classification.
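
As a concrete illustration of the size/color/blemish criteria described in the abstract, the following Python sketch shows one way such a classification step could be implemented with OpenCV. It is a minimal sketch only: the function name classify_lulo, the HSV thresholds, the pixel-to-centimeter scale, and the acceptance limits are illustrative assumptions, not values taken from the paper.

    # Hypothetical lulo classification by size, color and surface blemishes.
    # All thresholds are illustrative assumptions, not the authors' values.
    # Uses the OpenCV 4.x API.
    import cv2
    import numpy as np

    def classify_lulo(image_bgr, px_per_cm=20.0):
        hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)

        # Segment the fruit: ripe lulos are orange-yellow (assumed hue range).
        mask = cv2.inRange(hsv, (10, 80, 80), (35, 255, 255))
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None  # no fruit found in the image
        fruit = max(contours, key=cv2.contourArea)

        # Size: equivalent diameter of the largest blob, converted to cm.
        diameter_cm = 2.0 * np.sqrt(cv2.contourArea(fruit) / np.pi) / px_per_cm

        # Color: mean hue inside the fruit region (lower hue = more orange, riper).
        fruit_mask = np.zeros(mask.shape, np.uint8)
        cv2.drawContours(fruit_mask, [fruit], -1, 255, -1)
        mean_hue = cv2.mean(hsv, mask=fruit_mask)[0]

        # Imperfections: fraction of dark pixels inside the fruit region,
        # used here as a crude proxy for blemishes.
        dark = ((hsv[:, :, 2] < 60) & (fruit_mask > 0)).sum()
        blemish_ratio = dark / max(int((fruit_mask > 0).sum()), 1)

        accept = (4.0 <= diameter_cm <= 7.0) and mean_hue < 30 and blemish_ratio < 0.05
        return {"diameter_cm": diameter_cm, "mean_hue": mean_hue,
                "blemish_ratio": blemish_ratio,
                "grade": "accept" if accept else "reject"}

The abstract also states that the kinematics of the 5-DOF arm is implemented so the gripper can reach the selected fruit. A standard way to express forward kinematics is through Denavit-Hartenberg parameters; the sketch below does this for a generic 5-DOF arm with placeholder link values, not the module's real dimensions.

    # Hypothetical forward kinematics of a 5-DOF arm via standard DH parameters.
    import numpy as np

    def dh(theta, d, a, alpha):
        ct, st = np.cos(theta), np.sin(theta)
        ca, sa = np.cos(alpha), np.sin(alpha)
        return np.array([[ct, -st * ca,  st * sa, a * ct],
                         [st,  ct * ca, -ct * sa, a * st],
                         [0.0,      sa,       ca,      d],
                         [0.0,     0.0,      0.0,    1.0]])

    def forward_kinematics(q, dh_table):
        """q: five joint angles (rad); dh_table: rows of (d, a, alpha)."""
        T = np.eye(4)
        for theta, (d, a, alpha) in zip(q, dh_table):
            T = T @ dh(theta, d, a, alpha)
        return T  # homogeneous end-effector pose in the base frame

    # Placeholder link values in meters (assumed, for illustration only).
    DH_TABLE = [(0.10, 0.00, np.pi / 2), (0.00, 0.12, 0.0), (0.00, 0.10, 0.0),
                (0.00, 0.00, np.pi / 2), (0.08, 0.00, 0.0)]
    pose = forward_kinematics([0.0, 0.3, -0.5, 0.2, 0.0], DH_TABLE)

The inverse kinematics listed among the keywords inverts this mapping (pose to joint angles), typically solved geometrically or numerically for a 5-DOF arm.
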
dc.format: application/pdf
dc.language: spa
dc.publisher: Corporación Universidad de la Costa
dc.relation: INGE CUC; Vol. 10, No. 1 (2014)
dc.relation: INGE CUC
dc.relation: [1] J. Billingsley, D. Oetomo, and J. Reid, "Agricultural robotics [TC Spotlight]", IEEE Robotics & Automation Magazine, vol. 16, no. 4, pp. 16-19, Dec. 2009.
dc.relation: [2] A. Ming and H. Ma, "A blob detector in color images", in Proceedings of the 6th ACM International Conference on Image and Video Retrieval, pp. 364-370, 2007.
dc.relation: [3] T. Theodoridis and H. Hu, "Toward intelligent security robots: A survey", IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 42, no. 6, pp. 1219-1230, Nov. 2012.
dc.relation: [4] C. Riaño, C. Peña, and A. Pardo, "Approach in the optimal development of parallel robot for educational applications", in Proceedings of the WSEAS International Conference on Recent Advances in Intelligent Control, Modelling and Simulation (ICMS), pp. 145-150, Jan. 2014.
dc.relation: [5] C. Peña, E. Yime, and I. Banfield, "Optimal dimensional calculation of a Stewart-Gough type parallel platform for pedagogical applications using genetic algorithms", INGE CUC, vol. 7, no. 1, pp. 127-138, Oct. 2011.
dc.relation: [6] M. J. Schuster, D. Jain, M. Tenorth, and M. Beetz, "Learning organizational principles in human environments", in IEEE International Conference on Robotics and Automation (ICRA), pp. 3867-3874, May 2012.
dc.relation: [7] A. Ollero, Robótica: manipuladores y robots móviles. Marcombo, 2001, p. 464.
dc.relation: [8] J. Marsot, "QFD: A methodological tool for integration of ergonomics at the design stage", Applied Ergonomics, vol. 36, no. 2, pp. 185-192, Mar. 2005.
dc.relation: [9] L. Sooyeon, K. Youngshin, J. Youn, P. Sehyeok, and K. Jaehyun, "Contrast-preserved chroma enhancement technique using YCbCr color space", IEEE Transactions on Consumer Electronics, vol. 58, no. 2, pp. 641-645, May 2012.
dc.relation: [10] A. Barrientos, L. Peñín, C. Balaguer, and R. Aracil, Fundamentos de robótica, 2nd ed. McGraw-Hill Interamericana de España, 2007, p. 512.
dc.relation: [11] I. Zabalza, "Síntesis cinemática y dinámica de mecanismos. Manipulador paralelo 6-RKS", Ph.D. thesis, Universidad Pública de Navarra, p. 267, Dec. 1999.
dc.relation: [12] R. Norton, Diseño de maquinaria, 4th ed. Pearson, 2011, p. 888.
dc.relation: [13] J. Marsot and L. Claudon, "Design and ergonomics. Methods for integrating ergonomics at hand tool design stage", International Journal of Occupational Safety and Ergonomics, vol. 10, no. 1, pp. 13-23, 2004.
dc.relation: [14] C. Jianjie, Y. Suihuai, C. Guoding, and W. Haijun, "Research on product design quality control methods based on QFD", in IEEE 11th International Conference on Computer-Aided Industrial Design & Conceptual Design (CAIDCD), vol. 1, pp. 35-39, Nov. 2010.
dc.relation: [15] Y. Akao, M. Sperry, and R. Fiorentino, QFD: prendre en compte les besoins du client dans la conception du produit, 1st ed. Afnor, 1993, p. 349.
dc.relation: [16] Y.-C. Chang and J. F. Reid, "RGB calibration for color image analysis in machine vision", IEEE Transactions on Image Processing, vol. 5, no. 10, pp. 1414-1422, Oct. 1996.
dc.relation: [17] C. Damerval and S. Meignen, "Blob detection with wavelet maxima lines", IEEE Signal Processing Letters, vol. 14, no. 1, pp. 39-42, Jan. 2007.
dc.relation: [18] I. Young, J. Gerbrands, and L. van Vliet, "Fundamentals of image processing", Delft University of Technology, 2nd ed., 1998, p. 111.
dc.relation: [19] M. Moghimi and H. Pourghassem, "Shadow detection based on combinations of HSV color space and orthogonal transformation in surveillance videos", in Iranian Conference on Intelligent Systems (ICIS), pp. 1-6, Feb. 2014.
dc.relation: [20] M. Zanuy, Tratamiento digital de voz e imagen y aplicación a la multimedia, Marcombo, 2000, p. 288.
dc.relation: [21] A. Pardo and J. Díaz, Fundamentos en sistemas de control automático, Universidad de Pamplona, 2004, p. 155.
dc.relation: [22] H. Bay, T. Tuytelaars, and L. Van Gool, "SURF: Speeded up robust features", in Computer Vision - ECCV 2006, Springer, 2006, pp. 404-417.
dc.relation: [23] H. Wang, Y.-H. Liu, and W. Chen, "Uncalibrated visual tracking control without visual velocity", IEEE Transactions on Control Systems Technology, vol. 18, no. 6, pp. 1359-1370, Nov. 2010.
dc.relation: [24] F. Chaumette and S. Hutchinson, "Visual servo control. I. Basic approaches", IEEE Robotics & Automation Magazine, vol. 13, no. 4, pp. 82-90, Dec. 2006.
dc.relation: [25] E. Martínez, C. Peña, and P. Cárdenas, "Optimización dimensional de un robot paralelo tipo delta basado en el menor consumo de energía", Ciencia e Ingeniería Neogranadina, vol. 21, no. 1, pp. 73-88, 2011.
dc.rights: info:eu-repo/semantics/openAccess
dc.rights: http://purl.org/coar/access_right/c_abf2
dc.source: INGE CUC
dc.source: https://revistascientificas.cuc.edu.co/ingecuc/article/view/343
dc.subject: Visión Artificial
dc.subject: Brazo Robótico
dc.subject: Cinemática Inversa
dc.subject: Cinemática Directa
dc.subject: Computer Vision
dc.subject: Robotic Arm
dc.subject: Inverse Kinematics
dc.subject: Forward Kinematics
dc.title: Módulo robótico para la clasificación de lulos (Solanum Quitoense) implementando visión artificial
dc.title: Robotic module for lulo (Solanum Quitoense) classification using computer vision
dc.type: Artículo de revista (journal article)
dc.type: http://purl.org/coar/resource_type/c_6501
dc.type: Text
dc.type: info:eu-repo/semantics/article
dc.type: info:eu-repo/semantics/publishedVersion
dc.type: http://purl.org/redcol/resource_type/ART
dc.type: info:eu-repo/semantics/acceptedVersion
dc.type: http://purl.org/coar/version/c_ab4af688f83e57aa

