YOLO-SAM AgriScan: A Unified Framework for Ripe Strawberry Detection and Segmentation with Few-Shot and Zero-Shot Learning
- PMID: 41471673
- PMCID: PMC12736712
- DOI: 10.3390/s25247678
Abstract
Traditional segmentation methods are slow and rely on manual annotations, which are labor-intensive. To address these limitations, we propose YOLO-SAM AgriScan, a unified framework that combines the fast object detection capabilities of YOLOv11 with the zero-shot segmentation power of the Segment Anything Model 2 (SAM2). Our approach adopts a hybrid paradigm for on-plant ripe strawberry segmentation, wherein YOLOv11 is fine-tuned using a few-shot learning strategy with minimal annotated samples, and SAM2 performs mask generation without additional supervision. This architecture eliminates the bottleneck of pixel-wise manual annotation and enables the scalable and efficient segmentation of strawberries in both controlled and natural farm environments. Experimental evaluations on two datasets, a custom-collected dataset and a publicly available benchmark, demonstrate strong detection and segmentation performance in both full-data and data-constrained scenarios. The proposed framework achieved a mean Dice score of 0.95 and an IoU of 0.93 on our collected dataset and maintained competitive performance on public data (Dice: 0.95, IoU: 0.92), demonstrating its robustness, generalizability, and practical relevance in real-world agricultural settings. Our results highlight the potential of combining few-shot detection and zero-shot segmentation to accelerate the development of annotation-light, intelligent phenotyping systems.
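The abstract describes a detect-then-segment pipeline: a few-shot fine-tuned YOLOv11 detector proposes bounding boxes, and SAM2 is prompted with those boxes to produce masks without pixel-level labels. The following is a minimal sketch of that idea, not the authors' released code. It assumes the publicly available ultralytics and sam2 Python packages; the checkpoint name "yolo11_strawberry.pt" and the image path "field_image.jpg" are hypothetical placeholders, and the Dice/IoU helpers simply restate the standard overlap metrics reported in the abstract.

```python
# Illustrative detect-then-segment sketch (assumptions: ultralytics + sam2 installed,
# a few-shot fine-tuned YOLO checkpoint and a test image exist locally).
import numpy as np
from PIL import Image
from ultralytics import YOLO
from sam2.sam2_image_predictor import SAM2ImagePredictor

# 1. Few-shot fine-tuned detector: returns boxes around ripe strawberries.
detector = YOLO("yolo11_strawberry.pt")            # hypothetical checkpoint name
image = np.array(Image.open("field_image.jpg").convert("RGB"))
detections = detector(image, conf=0.25)[0]         # single-image inference
boxes = detections.boxes.xyxy.cpu().numpy()        # (N, 4) boxes in XYXY format

# 2. Zero-shot segmenter: SAM2 is prompted with each detected box and produces
#    a binary mask without any additional pixel-wise supervision.
predictor = SAM2ImagePredictor.from_pretrained("facebook/sam2-hiera-large")
predictor.set_image(image)

masks = []
for box in boxes:
    mask, _, _ = predictor.predict(box=box, multimask_output=False)
    masks.append(mask[0].astype(bool))              # one mask per detected berry

# 3. Overlap metrics of the kind reported in the abstract, computed per mask
#    against a ground-truth annotation of the same shape.
def dice(pred: np.ndarray, gt: np.ndarray) -> float:
    inter = np.logical_and(pred, gt).sum()
    return 2.0 * inter / (pred.sum() + gt.sum() + 1e-9)

def iou(pred: np.ndarray, gt: np.ndarray) -> float:
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return inter / (union + 1e-9)
```

Because the boxes act as prompts, only the detector needs task-specific (few-shot) training; the segmenter stays frozen, which is what removes the pixel-wise annotation bottleneck described above.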
Keywords: SAM; YOLO; detection; few-shot; precision agriculture; segmentation; strawberries; zero-shot.
Conflict of interest statement
The authors declare no conflicts of interest.
