
The Stress Detection and Segmentation Strategy in Tea Plant at Canopy Level.

Abstract

Compared with traditional visual discrimination methods, deep learning and image processing methods can detect plant stresses efficiently and non-invasively, which is of great significance for the diagnosis and breeding of disease-resistant plant phenotypes. Current studies on plant disease and pest stresses focus mainly on the leaf scale; only a few address stress detection at the complex canopy scale. In this work, three tea plant stresses with similar symptoms that severely threaten the yield and quality of tea gardens are evaluated: the tea green leafhopper [Empoasca (Matsumurasca) onukii Matsuda], anthracnose (Gloeosporium theae-sinensis Miyake), and sunburn (a disease-like stress). A stress detection and segmentation method that fuses deep learning and image processing techniques at the canopy scale is proposed. First, a specified Faster RCNN algorithm is proposed for canopy-scale stress detection of tea plants. After the stress detection boxes are obtained, a new feature, RGReLU, is proposed for segmenting tea plant stress scabs. Finally, the canopy-scale detection model is transferred to the field scale using unmanned aerial vehicle (UAV) images. The results show that the proposed method achieves canopy-scale adaptive stress segmentation and outputs the scab type and the corresponding damage ratio. The mean average precision (mAP) of object detection reaches 76.07%, and the overall accuracy of scab segmentation reaches 88.85%. The results also show that the proposed method has strong generalization ability and that the model can be migrated and deployed to UAV scenarios. By fusing deep learning and image processing technology, fine-grained, quantitative canopy-scale stress monitoring can support wide-range scouting of tea gardens.

Copyright © 2022 Zhao, Zhang, Tang, Yu, Yan, Chen and Yuan.
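As a rough illustration of the post-detection segmentation step described in the abstract, the sketch below assumes the RGReLU feature is a ReLU-rectified red-minus-green channel difference computed inside each detection box, with an Otsu threshold used to produce the scab mask and the damage ratio. The feature definition, the threshold choice, and all function names here are assumptions for illustration, not the authors' implementation.

```python
# Hedged sketch of scab segmentation inside one detection box.
# Assumption: RGReLU = max(R - G, 0); the paper's exact definition may differ.
import numpy as np
from skimage.filters import threshold_otsu


def rg_relu_feature(box_rgb: np.ndarray) -> np.ndarray:
    """Assumed RGReLU feature: ReLU of the red-minus-green difference, in [0, 1]."""
    rgb = box_rgb.astype(np.float32) / 255.0
    diff = rgb[..., 0] - rgb[..., 1]      # red channel minus green channel
    return np.maximum(diff, 0.0)          # ReLU keeps only reddish responses


def segment_scab(box_rgb: np.ndarray) -> tuple[np.ndarray, float]:
    """Segment scab pixels in one detection box and return (mask, damage ratio)."""
    feat = rg_relu_feature(box_rgb)
    if feat.max() <= 0:                   # no reddish response: no scab detected
        return np.zeros(feat.shape, dtype=bool), 0.0
    mask = feat > threshold_otsu(feat)    # adaptive threshold on the feature map
    damage_ratio = float(mask.mean())     # scab pixels / all pixels in the box
    return mask, damage_ratio


if __name__ == "__main__":
    # Toy example: a synthetic 64x64 crop with a reddish patch standing in for a scab.
    crop = np.full((64, 64, 3), (60, 120, 60), dtype=np.uint8)  # green canopy background
    crop[20:40, 20:40] = (150, 80, 50)                          # reddish "scab" region
    mask, ratio = segment_scab(crop)
    print(f"estimated damage ratio: {ratio:.2%}")
```

In a full pipeline of the kind the abstract outlines, each box returned by the detector (Faster RCNN in the paper) would be cropped from the canopy image and passed to a routine like `segment_scab`, so that every detection carries both a stress class and a per-box damage ratio.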
