Research on DETR-based Weed Detection Algorithm

Authors

  • Wenbing Liao
  • Wenwen Li

DOI:

https://doi.org/10.54097/7n2ye068

Keywords:

Transformer, CBAM, Focal Loss, Weed Detection

Abstract

To address the complex conditions of field environments and the high visual similarity between corn seedlings and weeds, this study proposes an improved DETR (Detection Transformer) model for weed detection in corn fields. The improved model introduces the CBAM (Convolutional Block Attention Module) attention mechanism into DETR and replaces the traditional cross-entropy loss with a focal loss function to balance the numbers of positive and negative samples. Compared with the original model, the improved model increases mAP (mean Average Precision) by 1.12%, reaching 91.56%.
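
The article's implementation is not included on this page. As a rough sketch of the two components named in the abstract, the PyTorch-style code below shows a CBAM block (Woo et al., ECCV 2018) and a binary focal loss (Lin et al., ICCV 2017); the class and function names, the reduction ratio, kernel size, and the alpha/gamma defaults are illustrative assumptions rather than the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CBAM(nn.Module):
    """Convolutional Block Attention Module: channel attention followed by spatial attention."""

    def __init__(self, channels, reduction=16, kernel_size=7):
        super().__init__()
        # Channel attention: a shared MLP applied to average- and max-pooled channel descriptors
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )
        # Spatial attention: a conv over the channel-wise average and max maps
        self.spatial = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        b, c, _, _ = x.shape
        # Channel attention weights, shape (B, C, 1, 1)
        ca = torch.sigmoid(
            self.mlp(x.mean(dim=(2, 3))) + self.mlp(x.amax(dim=(2, 3)))
        ).view(b, c, 1, 1)
        x = x * ca
        # Spatial attention weights, shape (B, 1, H, W)
        sa = torch.sigmoid(
            self.spatial(torch.cat([x.mean(dim=1, keepdim=True),
                                    x.amax(dim=1, keepdim=True)], dim=1))
        )
        return x * sa


def focal_loss(logits, targets, alpha=0.25, gamma=2.0):
    """Binary focal loss: FL(p_t) = -alpha_t * (1 - p_t)^gamma * log(p_t).

    logits:  raw scores, shape (N,)
    targets: float labels in {0., 1.}, shape (N,)
    """
    # Per-element cross-entropy, then re-weighted so easy examples contribute less
    ce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    p = torch.sigmoid(logits)
    p_t = p * targets + (1 - p) * (1 - targets)               # probability of the true class
    alpha_t = alpha * targets + (1 - alpha) * (1 - targets)   # positive/negative balancing
    return (alpha_t * (1 - p_t) ** gamma * ce).mean()
```

In the improved model, a CBAM-style module is added to the DETR model and the focal loss stands in for the cross-entropy classification term; exactly where the authors insert the module and which loss terms they replace is not specified on this page.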

Published

21-01-2025

Issue

Vol. 11 No. 1 (2025)
Section

Articles

How to Cite

Liao, W., & Li, W. (2025). Research on DETR-based Weed Detection Algorithm. Frontiers in Computing and Intelligent Systems, 11(1), 97-101. https://doi.org/10.54097/7n2ye068