dc.contributorPedraza Bonilla, César Augusto
dc.contributorRodríguez Mújica, Leonardo
dc.contributorPLaS - Programming Languages and Systems
dc.creatorOsorio Delgado, Anderson Kavir
dc.date.accessioned2021-10-11T15:05:06Z
dc.date.available2021-10-11T15:05:06Z
dc.date.created2021-10-11T15:05:06Z
dc.date.issued2021-09-13
dc.identifierhttps://repositorio.unal.edu.co/handle/unal/80482
dc.identifierUniversidad Nacional de Colombia
dc.identifierRepositorio Institucional Universidad Nacional de Colombia
dc.identifierhttps://repositorio.unal.edu.co/
dc.description.abstractWeed estimation is one of the most important tasks in the weed control process, because the estimate of the costs required to protect the crop depends on it. This work therefore presents a method that uses multispectral images captured by an unmanned aerial vehicle and convolutional neural networks to estimate the percentage of weed cover in lettuce crops. The method achieves an accuracy of 89% and an F-score of 94% for crop detection, with an average run time of 0.4 seconds without a GPU, and a correlation of 0.57 between its weed-cover assessment and that of a Ph.D. weed science expert. These results indicate that weed estimation using CNNs is more accurate and faster than estimation performed by experts, while remaining close to the tacit knowledge that is important for estimating the costs and resources needed for weed control.
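The following is a minimal, hypothetical sketch in Python (not taken from the thesis) of how the quantities reported in the abstract are commonly computed: accuracy and F-score for binary crop detection, percentage weed cover from a per-pixel segmentation mask, and the Pearson correlation between model and expert cover estimates. All function names, class labels, and input values below are illustrative assumptions, not the author's implementation.

import numpy as np

def detection_metrics(y_true, y_pred):
    # Accuracy and F-score for binary crop detection (True = crop present).
    y_true = np.asarray(y_true, dtype=bool)
    y_pred = np.asarray(y_pred, dtype=bool)
    tp = np.sum(y_true & y_pred)
    tn = np.sum(~y_true & ~y_pred)
    fp = np.sum(~y_true & y_pred)
    fn = np.sum(y_true & ~y_pred)
    accuracy = (tp + tn) / y_true.size
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f_score = (2 * precision * recall / (precision + recall)
               if (precision + recall) else 0.0)
    return accuracy, f_score

def weed_cover_percentage(class_mask, soil_label=0, weed_label=2):
    # Percentage of vegetated pixels classified as weed in a label mask
    # (labels are assumed: 0 = soil, 1 = crop, 2 = weed).
    vegetated = class_mask != soil_label
    weed = class_mask == weed_label
    return 100.0 * weed.sum() / max(int(vegetated.sum()), 1)

# Hypothetical per-plot cover estimates: model output vs. expert assessment.
model_cover = np.array([12.0, 30.5, 8.2, 45.0])
expert_cover = np.array([15.0, 25.0, 10.0, 50.0])
r = np.corrcoef(model_cover, expert_cover)[0, 1]
print(f"Pearson correlation with expert: {r:.2f}")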
dc.languagespa
dc.publisherUniversidad Nacional de Colombia
dc.publisherBogotá - Ingeniería - Maestría en Ingeniería - Ingeniería de Sistemas y Computación
dc.publisherDepartamento de Ingeniería de Sistemas e Industrial
dc.publisherFacultad de Ingeniería
dc.publisherBogotá, Colombia
dc.publisherUniversidad Nacional de Colombia - Sede Bogotá
dc.relation[Abdulsalam and Aouf, 2020] Abdulsalam, M. and Aouf, N. (2020). Deep weed detector/classifier network for precision agriculture. Mediterranean Conference on Control and Automation.
dc.relation[Ahmed et al., 2017] Ahmed, O. S., Shemrock, A., Chabot, D., Dillon, C., Williams, G., Wasson, R., and Franklin, S. E. (2017). Hierarchical land cover and vegetation classification using multispectral data acquired from an unmanned aerial vehicle. International Journal of Remote Sensing, 38(8):2037–2052.
dc.relation[Ambrosio et al., 2004] Ambrosio, L., Iglesias, L., Marin, C., and Del Monte, J. (2004). Evaluation of sampling methods and assessment of the sample size to estimate the weed seedbank in soil, taking into account spatial variability. Weed Research, 44(3):224–236.
dc.relation[Barrero et al., 2016] Barrero, O., Rojas, D., Gonzalez, C., and Perdomo, S. (2016). Weed detection in rice fields using aerial images and neural networks. IEEE Xplore. Universidad de Ibagué.
dc.relation[Behmann et al., 2015] Behmann, J., Mahlein, A. K., Rumpf, T., Römer, C., and Plümer, L. (2015). A review of advanced machine learning methods for the detection of biotic stress in precision crop protection. Precision Agriculture, 16(3):239–260.
dc.relation[Bell et al., 2004] Bell, G., Howell, B., Johnson, G., Raun, W., Solie, J., and Stone, M. (2004). Optical sensing of turfgrass chlorophyll content and tissue nitrogen. HortScience HortSci, 39(5):1130–1132
dc.relation[Betancourt, 2014] Betancourt, G. D. (2014). Sistema de visión por computador para detectar hierba no deseada en prototipo de cultivo de frijol usando ambiente controlado. Master's thesis, Universidad Católica de Colombia.
dc.relation[Binch and Fox, 2017] Binch, A. and Fox, C. W. (2017). Controlled comparison of machine vision algorithms for Rumex and Urtica detection in grassland. Computers and Electronics in Agriculture, 140:123–138.
dc.relation[Brown and Noble, 2005] Brown, R. B. and Noble, S. D. (2005). Site-specific weed management: Sensing requirements: What do we need to see? Weed Science, 53(2):252–258.
dc.relation[CCB, 2015] CCB, Cámara de Comercio de Bogotá (2015). Manual Lechuga: Programa de apoyo agrícola y agroindustrial, Vicepresidencia del Fortalecimiento Empresarial.
dc.relation[Chavan and Nandedkar, 2018] Chavan, T. R. and Nandedkar, A. V. (2018). Agroavnet for crops and weeds classification: A step forward in automatic farming. Computers and Electronics in Agriculture, 154:361–372.
dc.relation[Cheng and Matson, 2015] Cheng, B. and Matson, E. T. (2015). A feature-based machine learning agent for automatic rice and weed discrimination. Lecture Notes in Computer Science, pages 517–527.
dc.relation[Corredor, 2011] Corredor, G. P. (2011). Desarrollo de un sistema de control en la aplicación de técnicas selectivas de eliminación de maleza. Master’s thesis, Universidad Nacional de Colombia.
dc.relation[Doll and Piedrahita, 1978] Doll, J. D. and Piedrahita, W. (1978). Métodos de control de maleza en yuca. Centro internacional de agricultura tropical Santiago de Cali, Colombia.
dc.relation[Dyrmann et al., 2017] Dyrmann, M., Jørgensen, R. N., and Midtiby, H. S. (2017). RoboWeedSupport – detection of weed locations in leaf occluded cereal crops using a fully convolutional neural network. 11th European Conference on Precision Agriculture (ECPA).
dc.relation[Dyrmann et al., 2016a] Dyrmann, M., Karstoft, H., and Midtiby, H. S. (2016a). Plant species classification using deep convolutional neural network. Biosystems Engineering, 151:72–80.
dc.relation[Dyrmann et al., 2016b] Dyrmann, M., Mortensen, A. K., Midtiby, H. S., and Jørgensen, R. N. (2016b). Pixel-wise classification of weeds and crops in images by using a fully convolutional neural network. In International Conference on Agricultural Engineering.
dc.relation[Elstone et al., 2020] Elstone, L., How, K. Y., Brodie, S., Ghazali, M. Z., Heath, W. P., and Grieve, B. (2020). High speed crop and weed identification in lettuce fields for precision weeding. Sensors, 20(2).
dc.relation[Fuentes and Romero, 1991] Fuentes, L. and Romero, C. (1991). Una visión del problema de las malezas en Colombia. Agronomía Colombiana, 8(2):364–378.
dc.relation[Garcia and A., 1997] Garcia, B. and A., L. (1997). Malezas más comunes en Colombia. Produmedios, Bogotá, Colombia, 149.
dc.relation[Gómez, 1995] Gómez, J. F. (1995). Control de malezas. Cenicaña. El cultivo de la caña en la zona azucarera de Colombia, pages 143–152.
dc.relation[Hamuda et al., 2018] Hamuda, E., Mc Ginley, B., Glavin, M., and Jones, E. (2018). Improved image processing-based crop detection using Kalman filtering and the Hungarian algorithm. Computers and Electronics in Agriculture, 148:37–44.
dc.relation[Hernández, 2017] Hernández, S. (2017). Metodología para la discriminación de malezas basada en la respuesta espectral de la vegetación. Master's thesis, Universidad Nacional de Colombia.
dc.relation[Huang et al., 2018a] Huang, H., Deng, J., Lan, Y., Yang, A., Deng, X., Wen, S., and Zhang, Y. (2018a). Accurate weed mapping and prescription map generation based on fully convolutional networks using uav imagery. Sensors (Switzerland), 18(10).
dc.relation[Huang et al., 2018b] Huang, H., Lan, Y., Deng, J., Yang, A., Deng, X., Zhang, L., and Wen, S. (2018b). A semantic labeling approach for accurate weed mapping of high resolution uav imagery. Sensors (Switzerland), 18(7).
dc.relation[Huang et al., 2018c] Huang, H., Deng, J., Lan, Y., Yang, A., Deng, X., and Zhang, L. (2018c). A fully convolutional network for weed mapping of unmanned aerial vehicle (uav) imagery. PLoS ONE, 13(4).
dc.relation[Hung et al., 2014] Hung, C., Xu, Z., and Sukkarieh, S. (2014). Feature learning based approach for weed classification using high resolution aerial images from a digital camera mounted on a uav. Remote Sensing, 6.
dc.relation[INDAP, 2017] INDAP, Instituto de Desarrollo Agropecuario, Chile (2017). Manual de producción de Lechuga.
dc.relation[Kamilaris and Prenafeta-Boldú, 2018] Kamilaris, A. and Prenafeta-Boldú, F. X. (2018). Deep learning in agriculture: A survey. Computers and Electronics in Agriculture, 147:70–90.
dc.relation[Kharuf et al., 2018] Kharuf, G., Hernández, S., Orozco, M., Aday, O. de la C., and Delgado, M. (2018). Análisis de imágenes multiespectrales adquiridas con vehículos aéreos no tripulados. Ingeniería Electrónica, Automática y Comunicaciones, 39(2):79–91.
dc.relation[Koger et al., 2003] Koger, C. H., Shaw, D. R., Watson, C. E., and Reddy, K. N. (2003). Detecting late-season weed infestations in soybean (Glycine max). Weed Technology, 17:696–704.
dc.relation[Lameski et al., 2018] Lameski, P., Zdravevski, E., and Kulakov, A. (2018). Review of automated weed control approaches: An environmental impact perspective. ICT Innovations, 940:132–147.
dc.relation[Liakos et al., 2018] Liakos, K. G., Busato, P., Moshou, D., Pearson, S., and Bochtis, D. (2018). Machine learning in agriculture: A review. Sensors, 18(8).
dc.relation[Liu and Bruch, 2020] Liu, B. and Bruch, R. (2020). Weed detection for selective spraying: a review. Current Robotics Reports, 1.
dc.relation[López-Granados, 2011] López-Granados, F. (2011). Weed detection for site-specific weed management: mapping and real-time approaches. Weed Research, 51:1–11.
dc.relation[López-Granados et al., 2016] López-Granados, F., Torres-Sánchez, J., De Castro, A.-I., Serrano-Pérez, A., Mesas-Carrascosa, F.-J., and Peña, J.-M. (2016). Object-based early monitoring of a grass weed in a grass crop using high resolution uav imagery. Agronomy for Sustainable Development, 36:4.
dc.relation[López-Granados et al., 2015] López-Granados, F., Torres-Sánchez, J., Serrano-Pérez, A., de Castro, A. I., Mesas-Carrascosa, F. J., and Peña, J. M. (2015). Early season weed mapping in sunflower using uav technology: variability of herbicide treatment maps against weed thresholds. Precision Agriculture, 17(2):183–199.
dc.relation[McCool et al., 2017] McCool, C., Pérez, T., and Upcroft, B. (2017). Mixtures of lightweight deep convolutional neural networks: applied to agricultural robotics. IEEE Robotics and Automation Letters, 2(3):1344–1351.
dc.relation[Milioto et al., 2017] Milioto, A., Lottes, P., and Stachniss, C. (2017). Real-time blob-wise sugar beets vs weeds classification for monitoring fields using convolutional neural networks. In Proceedings of the International Conference on Unmanned Aerial Vehicles in Geomatics.
dc.relation[Montenegro and Parada, 2015] Montenegro, B. A. and Parada, R. C. (2015). Diseño e implementación de un sistema de detección de malezas en cultivos cundiboyacenses. Master’s thesis, Universidad Católica de Colombia.
dc.relation[Nieuwenhuizen et al., 2007] Nieuwenhuizen, A. T., Tang, L., Hofstee, J. W., Müller, J., and van Henten, E. J. (2007). Colour based detection of volunteer potatoes as weeds in sugar beet fields using machine vision. Precision Agriculture, 8(6):267–278.
dc.relation[Osorio et al., 2020] Osorio, K., Puerto, A., Pedraza, C., Jamaica, D., and Rodríguez, L. (2020). A deep learning approach for weed detection in lettuce crops using multispectral images. AgriEngineering, 2(3):471–488.
dc.relation[Pantazi et al., 2016] Pantazi, X.-E., Moshou, D., and Bravo, C. (2016). Active learning system for weed species recognition based on hyperspectral sensing. Biosyst. Eng, 146:193–202.
dc.relation[Pantazi et al., 2017] Pantazi, X. E., Tamouridou, A. A., Alexandridis, T. K., Lagopodi, A. L., Kashefi, J., and Moshou, D. (2017). Evaluation of hierarchical self-organising maps for weed mapping using uas multispectral imagery. Computers and Electronics in Agriculture, 139:224–230.
dc.relation[Partel et al., 2018] Partel, V., Charan Kakarla, S., and Ampatzidis, Y. (2018). Development and evaluation of a low-cost and smart technology for precision weed management utilizing artificial intelligence. Computers and Electronics in Agriculture, 157:339–350.
dc.relation[Peña et al., 2013] Peña, J. M., Torres-Sánchez, J., de Castro, A. I., Kelly, M., and López-Granados, F. (2013). Weed mapping in early-season maize fields using object-based analysis of unmanned aerial vehicle (uav) images. PLoS ONE, 8(10).
dc.relation[Potena et al., 2016] Potena, C., Nardi, D., and Pretto, A. (2016). Fast and accurate crop and weed identification with summarized train sets for precision agriculture. In International Conference on Intelligent Autonomous Systems, pages 105–121.
dc.relation[Puerto, 2019] Puerto, L. (2019). Clasificación y cuantificación de maleza en cultivos de hortalizas por medio de procesamiento de imágenes digitales multiespectrales. Master’s thesis, Universidad Nacional de Colombia.
dc.relation[Raja et al., 2020] Raja, R., Nguyen, T. T., Slaughter, D. C., and Fennimore, S. A. (2020). Real-time weed-crop classification and localisation technique for robotic weed control in lettuce. Biosystems Engineering, 192:257–274.
dc.relation[Redmon, 2016] Redmon, J. (2013–2016). Darknet: Open source neural networks in C. http://pjreddie.com/darknet/.
dc.relation[Redmon and Farhadi, 2018] Redmon, J. and Farhadi, A. (2018). YOLOv3: An incremental improvement. arXiv.
dc.relation[Redmon et al., 2016] Redmon, J., Divvala, S., Girshick, R., and Farhadi, A. (2016). You only look once: Unified, real-time object detection. arXiv.
dc.relation[Rumpf et al., 2012] Rumpf, T., Römer, C., Weis, M., Sökefeld, M., Gerhards, R., and Plümer, L. (2012). Sequential support vector machine classification for small-grain weed species discrimination with special regard to Cirsium arvense and Galium aparine. Computers and Electronics in Agriculture, 80:89–96.
dc.relation[Sa et al., 2018a] Sa, I., Chen, Z., Popovic, M., Khanna, R., Liebisch, F., Nieto, J., and Siegwart, R. (2018a). Weednet: Dense semantic weed classification using multispectral images and mav for smart farming. IEEE Robotics and Automation Letters, 3(1):588–595.
dc.relation[Sa et al., 2018b] Sa, I., Popovic, M., Khanna, R., Chen, Z., Lottes, P., Liebisch, F., and Siegwart, R. (2018b). WeedMap: A large-scale semantic weed mapping framework using aerial multispectral imaging and deep neural network for precision farming. Remote Sensing, 10(9).
dc.relation[Sharpe et al., 2019] Sharpe, S. M., Schumann, A. W., Yu, J., and Boyd, N. S. (2019). Vegetation detection and discrimination within vegetable plasticulture row-middles using a convolutional neural network. Precision Agriculture.
dc.relation[Simonyan and Zisserman, 2015] Simonyan, K. and Zisserman, A. (2015). Very deep convolutional networks for large-scale image recognition. CoRR, abs/1409.1556.
dc.relation[Sogamoso González, 2015] Sogamoso González, D. S., A. R. J. A., S. G. L. E. (2015). Integración de la mecatrónica al desarrollo de la agricultura de precisión aplicada al control mecánico de maleza. Master's thesis, Universidad Militar Nueva Granada, Bogotá, Colombia.
dc.relation[Srinivasan, 2006] Srinivasan, A. (2006). Precision agriculture: An overview. In Handbook of Precision Agriculture: Principles and Applications, pages 3–18.
dc.relation[Suh et al., 2018] Suh, H. K., IJsselmuiden, J., Hofstee, J. W., and van Henten, E. J. (2018). Transfer learning for the classification of sugar beet and volunteer potato under field conditions. Biosystems Engineering, 174:50–65.
dc.relation[Sun et al., 2018] Sun, J., He, X., Ge, X., Wu, X., Shen, J., and Song, Y. (2018). Detection of key organs in tomato based on deep migration learning in a complex background. Agriculture, 8(12):196.
dc.relation[Tao et al., 2018] Tao, T., Wu, S., Li, L., Li, J., Bao, S., and Wei, X. (2018). Design and experiments of weeding teleoperated robot spectral sensor for winter rape and weed identification. Advances in Mechanical Engineering, 10:5.
dc.relation[Tellaeche et al., 2008] Tellaeche, A., Burgos-Artizzu, X. P., Pajares, G., Ribeiro, A., and Fernández-Quintanilla, C. (2008). A new vision-based approach to differential spraying in precision agriculture. Computers and Electronics in Agriculture, 60(2):144–155.
dc.relation[Thorp and Tian, 2004] Thorp, K. R. and Tian, L. F. (2004). A review on remote sensing of weeds in agriculture. Precision Agriculture, 5(5):477–508.
dc.relation[Tzutalin, 2015] Tzutalin (2015). LabelImg. https://github.com/tzutalin/labelImg. Accessed: 2020-03-30.
dc.relation[Wang et al., 2019] Wang, A., Zhang, W., and Wei, X. (2019). A review on weed detection using ground-based machine vision and image processing techniques. Computers and Electronics in Agriculture, 158:226–240.
dc.relation[Xinshao and Cheng, 2015] Xinshao, W. and Cheng, C. (2015). Weed seeds classification based on PCANet deep learning baseline. In Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA), pages 408–415.
dc.relation[Tian et al., 2019] Tian, Y., Yang, G., Wang, Z., Wang, H., Li, E., and Liang, Z. (2019). Apple detection during different growth stages in orchards using the improved YOLO-V3 model. Computers and Electronics in Agriculture, 157:417–426.
dc.rightsAttribution-NonCommercial-ShareAlike 4.0 International
dc.rightshttp://creativecommons.org/licenses/by-nc-sa/4.0/
dc.rightsinfo:eu-repo/semantics/openAccess
dc.titleMétodo para la estimación de maleza en cultivos de lechuga utilizando aprendizaje profundo e imágenes multiespectrales
dc.typeMaster's thesis

