Weed recognition using deep learning techniques on class-imbalanced imagery
A. S. M. Mahmudul Hasan A B, Ferdous Sohel A B *, Dean Diepeveen B C, Hamid Laga A D and Michael G. K. Jones B
A Information Technology, Murdoch University, Murdoch, WA 6150, Australia.
B Centre for Crop and Food Innovation, Food Futures Institute, Murdoch University, Murdoch, WA 6150, Australia.
C Department of Primary Industries and Regional Development, South Perth, WA 6151, Australia.
D Centre of Biosecurity and One Health, Harry Butler Institute, Murdoch University, Murdoch, WA 6150, Australia.
Crop & Pasture Science - https://doi.org/10.1071/CP21626
Submitted: 9 August 2021 Accepted: 9 December 2021 Published online: 11 April 2022
© 2022 The Author(s) (or their employer(s)). Published by CSIRO Publishing. This is an open access article distributed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (CC BY-NC-ND)
Abstract
Context: Most weed species can adversely impact agricultural productivity by competing for nutrients required by high-value crops. Manual weeding is not practical for large cropping areas. Many studies have been undertaken to develop automatic weed management systems for agricultural crops. In this process, one of the major tasks is to recognise weeds from images. However, weed recognition is challenging because weed and crop plants can be similar in colour, texture and shape, and these similarities can be exacerbated by the imaging, geographic and weather conditions under which the images are recorded. Advanced machine learning techniques can be used to recognise weeds from imagery.
Aims: In this paper, we have investigated five state-of-the-art deep neural networks, namely VGG16, ResNet-50, Inception-V3, Inception-ResNet-v2 and MobileNetV2, and evaluated their performance for weed recognition.
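As an illustration of how these architectures could be set up for weed recognition, the sketch below instantiates the five networks with ImageNet pre-trained weights and attaches a new classification head. This is a minimal sketch rather than the authors' code; the Keras/TensorFlow stack, the 224 × 224 input size and the `build_classifier` helper are assumptions introduced here for illustration.

```python
# A minimal sketch (not the authors' exact code) of instantiating the five
# evaluated architectures with ImageNet pre-trained weights in Keras.
# The 224x224 input size and the Keras/TensorFlow stack are assumptions.
import tensorflow as tf

ARCHITECTURES = {
    "VGG16": tf.keras.applications.VGG16,
    "ResNet-50": tf.keras.applications.ResNet50,
    "Inception-V3": tf.keras.applications.InceptionV3,
    "Inception-ResNet-v2": tf.keras.applications.InceptionResNetV2,
    "MobileNetV2": tf.keras.applications.MobileNetV2,
}

def build_classifier(name, num_classes, input_shape=(224, 224, 3)):
    """Attach a new classification head to an ImageNet pre-trained backbone."""
    base = ARCHITECTURES[name](include_top=False, weights="imagenet",
                               input_shape=input_shape, pooling="avg")
    outputs = tf.keras.layers.Dense(num_classes, activation="softmax")(base.output)
    return tf.keras.Model(inputs=base.input, outputs=outputs)
```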
Methods: We used several experimental settings and multiple dataset combinations. In particular, we constructed a large weed-crop dataset by combining several smaller datasets, mitigated class imbalance through data augmentation, and used this dataset to benchmark the deep neural networks. We investigated transfer learning in two ways: preserving the pre-trained weights to extract features, and fine-tuning the networks on the crop and weed image datasets.
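The sketch below illustrates the two transfer-learning settings described above, together with the kind of random augmentations that could be used to over-sample under-represented weed classes. The optimiser, learning rate and augmentation parameters are illustrative assumptions rather than the paper's reported configuration, and `compile_for_transfer` is a hypothetical helper.

```python
# A hedged sketch of (i) feature extraction with frozen pre-trained weights and
# (ii) fine-tuning the whole network; all hyperparameters are assumptions.
import tensorflow as tf

def compile_for_transfer(model, fine_tune=False, lr=1e-4):
    """Freeze the backbone for feature extraction, or unfreeze it for fine-tuning.

    `model` is assumed to end with a newly added classification head
    (for example, one built as in the previous sketch).
    """
    for layer in model.layers[:-1]:    # every layer except the new head
        layer.trainable = fine_tune    # False = feature extraction, True = fine-tuning
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=lr),
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Random augmentations that could generate extra samples for minority weed
# classes when balancing the combined dataset (parameters are assumptions).
augment = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal_and_vertical"),
    tf.keras.layers.RandomRotation(0.2),
    tf.keras.layers.RandomZoom(0.1),
])
```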
Key results: We found that VGG16 performed better than others on small-scale datasets, while ResNet-50 performed better than other deep networks on the large combined dataset.
Conclusions: This research shows that data augmentation and fine-tuning techniques improve the performance of deep learning models for classifying crop and weed images.
Implications: This research evaluates the performance of several deep learning models, offers directions for selecting the most appropriate models, and highlights the need for a large-scale benchmark weed dataset.
Keywords: crop and weed classification, digital agriculture, Inception-ResNet-V2, Inception-V3, machine learning, MobileNetV2, precision agriculture, ResNet-50, VGG16.