Paper ID: 2112.13986
Deep-CNN based Robotic Multi-Class Under-Canopy Weed Control in Precision Farming
Yayun Du, Guofeng Zhang, Darren Tsang, M. Khalid Jawed
Smart weeding systems that perform plant-specific operations can contribute to the sustainability of agriculture and the environment. Despite monumental advances in autonomous robotic technologies for precision weed management in recent years, under-canopy weeding in fields has yet to be realized. A prerequisite for such systems is reliable detection and classification of weeds so that the surrounding plants are not mistakenly sprayed and damaged. Real-time multi-class weed identification enables species-specific treatment of weeds and significantly reduces herbicide use. Here, our first contribution is \textit{AIWeeds}, the first adequately large, realistic image dataset for this task (each image contains one or multiple weed species): a library of about 10,000 annotated images of flax and the 14 most common weeds in fields and gardens, taken at 20 different locations in North Dakota, California, and Central China. Second, we provide a full pipeline from efficient model training to deployment of the TensorRT-optimized model onto a single-board computer. Based on \textit{AIWeeds} and this pipeline, we present a classification baseline using five benchmark CNN models. Among them, MobileNetV2, with both the shortest inference time and the lowest memory consumption, is the most suitable candidate for real-time applications. Finally, we deploy MobileNetV2 onto our own compact autonomous robot \textit{SAMBot} for real-time weed detection. The 90\% test accuracy achieved in previously unseen flax-field scenes (row spacing of 0.2--0.3 m) containing crops and weeds, distortion, blur, and shadows is a milestone towards precision weed control in the real world. We have publicly released the dataset and the code to generate the results at \url{https://github.com/StructuresComp/Multi-class-Weed-Classification}.
Submitted: Dec 28, 2021
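
The abstract describes a pipeline that trains a CNN classifier and deploys the TensorRT-optimized model on a single-board computer. The sketch below illustrates one plausible version of that workflow: fine-tuning an ImageNet-pretrained MobileNetV2 for flax-plus-14-weed classification and exporting it to ONNX for TensorRT conversion. It is not the authors' released code; the class count, image size, file names, and the trtexec conversion step are assumptions for illustration.

\begin{verbatim}
# Minimal sketch (not the authors' released code): adapt MobileNetV2 for
# 15-way flax/weed classification and export it to ONNX so it can be
# optimized with TensorRT and run on a single-board computer.
# NUM_CLASSES, IMG_SIZE, and file names are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 15   # flax + 14 weed species (assumed from the abstract)
IMG_SIZE = 224     # standard MobileNetV2 input resolution

# Load an ImageNet-pretrained MobileNetV2 and replace the classifier head.
model = models.mobilenet_v2(
    weights=models.MobileNet_V2_Weights.IMAGENET1K_V1)
model.classifier[1] = nn.Linear(model.last_channel, NUM_CLASSES)
model.eval()

# (Fine-tuning on the AIWeeds images would happen here with a standard
#  cross-entropy training loop; omitted for brevity.)

# Export to ONNX; TensorRT consumes the ONNX graph.
dummy = torch.randn(1, 3, IMG_SIZE, IMG_SIZE)
torch.onnx.export(
    model, dummy, "mobilenetv2_weeds.onnx",
    input_names=["input"], output_names=["logits"],
    opset_version=13,
)

# On the target single-board computer, a TensorRT engine can then be
# built from the ONNX file, for example:
#   trtexec --onnx=mobilenetv2_weeds.onnx \
#           --saveEngine=mobilenetv2_weeds.trt --fp16
\end{verbatim}

Exporting through ONNX and building the engine with trtexec (optionally in FP16) is a common route to low-latency inference on embedded GPUs, which is consistent with the abstract's emphasis on MobileNetV2's short inference time and low memory footprint.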