Crop and Pasture Science
Plant sciences, sustainable farming systems and food quality
RESEARCH ARTICLE

Deep learning-based object detection model for location and recognition of weeds in cereal fields using colour imagery

Hossein Akhtari https://orcid.org/0009-0000-7479-1448 A , Hossein Navid https://orcid.org/0000-0002-3694-5175 A * , Hadi Karimi B and Karl-Heinz Dammer C

A Department of Biosystems Engineering, Faculty of Agriculture, University of Tabriz, Tabriz, Iran.

B Agricultural Engineering Research Department, Kerman Agricultural and Resource Research and Education Center, AREEO, Kerman, Iran.

C Leibniz Institute for Agricultural Engineering and Bioeconomy (ATB), Max-Eyth-Allee 100, Potsdam 14469, Germany.

* Correspondence to: navid@tabrizu.ac.ir

Handling Editor: Davide Cammarano

Crop & Pasture Science 76, CP24243 https://doi.org/10.1071/CP24243
Submitted: 8 August 2024  Accepted: 19 March 2025  Published: 10 April 2025

© 2025 The Author(s) (or their employer(s)). Published by CSIRO Publishing

Abstract

Context

Automatic weed detection and control are crucial in precision agriculture, especially in cereal fields where overlapping crops and narrow row spacing present significant challenges. This research focused on detecting small weeds and on maintaining performance in dense images by using novel techniques.

Aims

This study investigated two recent convolutional neural networks (CNNs) with different architectures and detection models for weed detection in cereal fields. The feature pyramid network (FPN) technique was applied to improve performance. To tackle challenges such as high weed density and occlusion, a method of dividing images into smaller parts with pixel area thresholds was implemented, achieving an approximately 22% increase in average precision (AP).
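The paper does not detail the implementation of this image-splitting step, so the following Python sketch is only illustrative: it assumes a simple non-overlapping grid split and a hypothetical minimum box-area filter. The 2 × 2 grid and the 400 px² threshold are placeholder values, not figures reported in the study.

```python
# Illustrative sketch of splitting a field image into tiles before detection
# and filtering predicted boxes by pixel area. The 2 x 2 grid and the
# 400 px^2 threshold are placeholder assumptions, not values from the paper.
import numpy as np


def split_into_tiles(image: np.ndarray, rows: int = 2, cols: int = 2):
    """Split an H x W x 3 image into a rows x cols grid of tiles.

    Each tile is returned with its (x, y) offset so that detections made
    on the tile can be mapped back to full-image coordinates.
    """
    h, w = image.shape[:2]
    tile_h, tile_w = h // rows, w // cols
    tiles = []
    for r in range(rows):
        for c in range(cols):
            y0, x0 = r * tile_h, c * tile_w
            tiles.append((image[y0:y0 + tile_h, x0:x0 + tile_w], (x0, y0)))
    return tiles


def filter_boxes_by_area(boxes, min_area: float = 400.0) -> np.ndarray:
    """Keep only boxes [x1, y1, x2, y2] whose pixel area is >= min_area."""
    boxes = np.asarray(boxes, dtype=float).reshape(-1, 4)
    areas = (boxes[:, 2] - boxes[:, 0]) * (boxes[:, 3] - boxes[:, 1])
    return boxes[areas >= min_area]
```

In practice each tile would be passed to the detector separately and the surviving boxes shifted by the tile offsets before merging, which is the usual way a split-and-detect scheme handles small objects in dense scenes.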

Methods

The dataset comprises red–green–blue (RGB) images of cereal fields captured in Germany (2018–2019) at varying growth stages. Images were annotated with weed labels using ‘LabelImg’. Models were evaluated on precision, recall, prediction time, and detection rate.
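LabelImg exports annotations in Pascal VOC XML by default; assuming that format was used here, a minimal parser for collecting the weed bounding boxes from one annotation file could look as follows (the file path and the 'weed' class name are placeholders).

```python
# Minimal sketch for reading a LabelImg annotation file, assuming the
# Pascal VOC XML format that LabelImg writes by default. The path and the
# 'weed' label are placeholders.
import xml.etree.ElementTree as ET


def load_voc_boxes(xml_path: str, target_label: str = "weed"):
    """Return (xmin, ymin, xmax, ymax) boxes for every object with the label."""
    root = ET.parse(xml_path).getroot()
    boxes = []
    for obj in root.iter("object"):
        if obj.findtext("name") != target_label:
            continue
        bb = obj.find("bndbox")
        boxes.append(tuple(int(float(bb.findtext(tag)))
                           for tag in ("xmin", "ymin", "xmax", "ymax")))
    return boxes


if __name__ == "__main__":
    boxes = load_voc_boxes("annotations/field_0001.xml")  # placeholder path
    print(f"{len(boxes)} weed boxes in this image")
```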

Key results

The evaluation results showed that Faster R-CNN with a ResNet-50 backbone and FPN performed best in terms of the number of weeds detected. In the tests, the model detected 508 of 535 annotated weeds across 36 images, a detection rate of 94.95% with a 95% confidence interval of [92.76%, 96.51%]. Additionally, a method was proposed to boost average precision and recall in high-density weed images, enhancing detection operations.
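The quoted interval is consistent with a Wilson score interval for a binomial proportion; the short sketch below reproduces the reported figures under that assumption (the paper does not state which interval construction was used).

```python
# Reproducing the reported detection rate and 95% confidence interval.
# The abstract's interval matches a Wilson score interval for a binomial
# proportion; that choice is inferred from the numbers, not stated in the paper.
from math import sqrt


def wilson_interval(successes: int, trials: int, z: float = 1.96):
    """Wilson score interval for a binomial proportion (default 95%)."""
    p = successes / trials
    denom = 1.0 + z * z / trials
    centre = (p + z * z / (2 * trials)) / denom
    margin = z * sqrt(p * (1 - p) / trials + z * z / (4 * trials * trials)) / denom
    return centre - margin, centre + margin


detected, annotated = 508, 535
low, high = wilson_interval(detected, annotated)
print(f"Detection rate: {detected / annotated:.2%}, 95% CI: [{low:.2%}, {high:.2%}]")
# Detection rate: 94.95%, 95% CI: [92.76%, 96.51%]
```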

Conclusions

The results of this research showed that the presented algorithms and methods can effectively address the above-mentioned challenges.

Implications

This research evaluated deep learning models, recommends the best-performing one, and emphasises the importance of reliable weed identification at all growth stages.

Keywords: AI in precision agriculture, CNN-based weed detection, deep learning, digital agriculture, field image analysis, fully dense weed detection, weed detection optimization, weed mapping.
