Front Plant Sci. 2025 Jul 7;16:1609222.
doi: 10.3389/fpls.2025.1609222. eCollection 2025.

High-throughput end-to-end aphid honeydew excretion behavior recognition method based on rapid adaptive motion-feature fusion

Zhongqiang Song et al. Front Plant Sci. 2025.

Abstract

Introduction: Aphids are major agricultural pests and vectors of plant viruses. Their Honeydew Excretion (HE) behavior is critical for investigating feeding activity and evaluating plant resistance levels. To address the low efficiency, poor real-time capability, and cumbersome procedures of conventional manual and chemical detection methods, this study introduces an end-to-end multi-target behavior detection framework that integrates spatiotemporal motion features with deep learning architectures to improve detection accuracy and operational efficiency.

Methods: This study established the first fine-grained dataset covering aphid Crawling Locomotion (CL), Leg Flicking (LF), and HE behaviors, providing standardized samples for algorithm training. A rapid adaptive motion feature fusion (RAMF) algorithm was developed to accurately extract high-granularity spatiotemporal motion features. In parallel, the RT-DETR detection model was deeply optimized: a spline-based adaptive nonlinear activation function was introduced, and a Kolmogorov-Arnold network was integrated into the deep feature stage of the ResNet50 backbone to form the RK50 module. These modifications enhance the model's ability to capture complex spatial relationships and subtle features.
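
The abstract does not specify how the rapid adaptive motion feature fusion is computed; the sketch below is only a minimal Python illustration of the general idea of fusing short-window motion cues with frame appearance before detection. The function name, recency weighting, and blend factor are assumptions for illustration, not the authors' RAMF implementation.

    # Minimal sketch (assumed, not the authors' RAMF code): weight frame
    # differences over a short temporal window, favoring recent frames, and
    # blend the resulting motion map with the current frame's appearance.
    import numpy as np

    def fuse_motion_features(frames, window=3, alpha=0.6):
        """frames: list of HxW uint8 grayscale frames, most recent last."""
        current = frames[-1].astype(np.float32) / 255.0
        motion = np.zeros_like(current)
        n = min(window, len(frames) - 1)
        weights = np.linspace(0.5, 1.0, num=n)      # recent frames weigh more
        for w, past in zip(weights, frames[-1 - n:-1]):
            motion += w * np.abs(current - past.astype(np.float32) / 255.0)
        motion /= max(weights.sum(), 1e-6)          # normalize the motion map
        # Blend appearance and motion into one map a detector could consume.
        return np.clip(alpha * current + (1.0 - alpha) * motion, 0.0, 1.0)

    # Example: four synthetic 64x64 frames.
    clip = [np.random.randint(0, 256, (64, 64), dtype=np.uint8) for _ in range(4)]
    fused = fuse_motion_features(clip, window=3)
    print(fused.shape)  # (64, 64)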

Results and discussion: Experimental results show that the proposed framework achieves an average precision of 85.9%. Compared with the model without the RK50 module, mAP50 improved by 2.9%, and performance in detecting small-target honeydew significantly surpassed that of mainstream algorithms. This study presents an innovative solution for automated monitoring of fine-grained aphid behaviors and provides a reference for insect behavior recognition research. The datasets, code, and model weights are available on GitHub (https://github.com/kuieless/RAMF-Aphid-Honeydew-Excretion-Behavior-Recognition).

Keywords: Kolmogorov-Arnold networks; RT-DETR-RK50; aphid behavior recognition; honeydew excretion detection; rapid adaptive motion feature fusion.


Conflict of interest statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Figures

Figure 1. Dataset images of different population densities and light intensities.
Figure 2. Flowchart of the RAMF framework.
Figure 3. RAMF processing workflow and visualization diagram.
Figure 4. Aphid behavior classification: (a) Crawling Locomotion; (b) Leg Flicking; (c) Honeydew Excretion.
Figure 5. Architecture diagram of the RT-DETR-RK50 network.
Figure 6. Network architecture diagrams of normal blocks and RK50 blocks.
Figure 7. Network model diagram of KAN conv.
Figure 8. Visual representation of motion feature extraction across different temporal windows.
Figure 9. Feature activation heatmap comparison across RT-DETR-RK50 variants: (a) R50, (b) RK50-1, (c) RK50-2 (ours), (d) RK50-3.
Figure 10. Confusion matrix comparison of detection models across different aphid behavior categories.
Figure 11. Comparison of training loss curves across different model architectures.
Figure 12. Detection results based on the model with the highest mAP50 metric.
Figure 13. Grad-CAM++ visualization results under varying lighting conditions and aphid densities: (a) different lighting only, (b) different density only, (c) HE target detection under different lighting and density.
Figure 14. Stage-wise honeydew excretion detection pipeline and results.


