Front Plant Sci. 2018 Oct 23;9:1544. doi: 10.3389/fpls.2018.01544. eCollection 2018.

Aerial Imagery Analysis - Quantifying Appearance and Number of Sorghum Heads for Applications in Breeding and Agronomy


Wei Guo et al. Front Plant Sci. 2018.

Abstract

Sorghum (Sorghum bicolor (L.) Moench) is a C4 tropical grass that plays an essential role in providing nutrition to humans and livestock, particularly in marginal rainfall environments. The timing of head development and the number of heads per unit area are key adaptation traits to consider in agronomy and breeding but are time consuming and labor intensive to measure. We propose a two-step machine-based image-processing method to detect and count the number of heads from high-resolution images captured by unmanned aerial vehicles (UAVs) in a breeding trial. To demonstrate the performance of the proposed method, 52 images were manually labeled; the precision and recall of head detection were 0.87 and 0.98, respectively, and the coefficient of determination (R²) between the manual and new methods of counting was 0.84. To verify the utility of the method in breeding programs, a geolocation-based plot segmentation method was applied to pre-processed ortho-mosaic images to extract >1,000 plots from the original RGB images. Forty of these plots were randomly selected and labeled manually; the precision and recall of detection were 0.82 and 0.98, respectively, and the coefficient of determination between manual and algorithm counting was 0.56. The major source of error was related to plant morphology: heads were displayed both within and outside the plot in which the plants were sown and were therefore sometimes allocated to a neighboring plot. Finally, potential applications in yield estimation from UAV-based imagery in agronomy experiments and in scouting of production fields are discussed.

Keywords: UAV remote sensing; breeding field; high-throughput phenotyping; image analysis; sorghum head detecting and counting.
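For context on how figures such as precision, recall, and R² are typically obtained once detections have been matched to manually labeled head centers, the sketch below shows one common convention: true/false positives and missed heads for precision and recall, and the squared Pearson correlation between per-image manual and algorithm counts for R². This is an illustrative reconstruction, not the authors' code; the function names, inputs, and example numbers are hypothetical.

```python
# Illustrative sketch (not the authors' code) of the evaluation metrics
# reported in the abstract. All inputs below are hypothetical.
import numpy as np

def precision_recall(tp: int, fp: int, fn: int) -> tuple[float, float]:
    """Precision = TP / (TP + FP); recall = TP / (TP + FN)."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

def r_squared(manual_counts, algorithm_counts) -> float:
    """Coefficient of determination taken as the squared Pearson correlation
    between per-image manual and algorithm head counts (one common convention)."""
    manual = np.asarray(manual_counts, dtype=float)
    algo = np.asarray(algorithm_counts, dtype=float)
    return float(np.corrcoef(manual, algo)[0, 1] ** 2)

# Hypothetical example values, not data from the study:
p, r = precision_recall(tp=430, fp=64, fn=9)
print(f"precision={p:.2f}, recall={r:.2f}")
print(f"R2={r_squared([120, 98, 75, 140], [112, 101, 70, 133]):.2f}")
```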


Figures

FIGURE 1
Experimental field layout.
FIGURE 2
An example of image preparation for algorithm development: (A) original image, (B) cropped image, and (C) manually labeled cropped image in which each point marks a head (dataset 1).
FIGURE 3
Challenges of head detection in a real field. (A) Changing light conditions within one flight: (1) images taken under sunny conditions; (2) images taken under cloudy conditions. (B) Complex background: (1) soil/ground (shadowed partially/fully), (2) dead leaves, (3) green leaves, (4) shadowed leaves, and (5) grass. (C) The sorghum heads vary in color: (1) white, (2) green, (3) brown, and (4) orange. (D) The sorghum heads vary in size and shape: (1) heads from main stem, (2) heads from tillers, and (3) overlapping heads; note that the shape of the heads is compact in 1 and 2 but is expanded in 3.
FIGURE 4
Dataset 0 comprised 17 images used to collect training data for the pixel-based segmentation model. The images were selected to cover variation in light conditions, head color, head shape, and background.
FIGURE 5
The workflow of the proposed method for detecting and counting sorghum heads. (A) Original image. (B) Pseudo-color image showing the pixel classification produced by the DTSM: white heads, yellow; soil, gray; shadows, black; dead leaves, off-white; leaves, green; orange heads, dark orange; and green heads, light orange. (C) Detected head regions (left) and the same regions overlaid on the manually pointed head image (right); black dots indicate heads marked manually with Photoshop. (D) Head regions cropped from the original images based on (C). (E) Detected head regions and the number of heads counted; the numbers in the image indicate the number of sorghum heads, and 0 means an incorrect detection.
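As a rough illustration of the Figure 5 workflow (pixel classification with a decision-tree segmentation model, followed by grouping head-class pixels into regions), the sketch below uses scikit-learn and SciPy. It is a simplified, assumption-laden reconstruction rather than the published implementation: the class identifiers, tree depth, and region-size threshold are invented, and it counts one head per detected region, whereas the paper's second step also resolves regions containing multiple overlapping heads.

```python
# Simplified sketch (assumptions, not the paper's code) of Figure 5's workflow:
# a decision-tree model classifies each pixel, head-class pixels are grouped
# into connected regions, and small regions are discarded before counting.
import numpy as np
from scipy import ndimage
from sklearn.tree import DecisionTreeClassifier

HEAD_CLASSES = (0, 5, 6)      # hypothetical ids for white/orange/green heads
MIN_REGION_PIXELS = 50        # hypothetical size threshold for a head region

def train_pixel_classifier(rgb_samples: np.ndarray, labels: np.ndarray):
    """rgb_samples: (n, 3) pixel values; labels: (n,) class ids from hand labeling."""
    clf = DecisionTreeClassifier(max_depth=10)
    return clf.fit(rgb_samples, labels)

def detect_and_count_heads(image: np.ndarray, clf) -> int:
    """Classify every pixel of an RGB image, then count connected head regions."""
    h, w, _ = image.shape
    pixel_classes = clf.predict(image.reshape(-1, 3)).reshape(h, w)
    head_mask = np.isin(pixel_classes, HEAD_CLASSES)
    regions, n_regions = ndimage.label(head_mask)
    sizes = ndimage.sum(head_mask, regions, index=range(1, n_regions + 1))
    return int(np.sum(sizes >= MIN_REGION_PIXELS))
```

In practice, the training pixels for such a classifier would come from hand-labeled patches such as those collected in dataset 0 (Figure 4).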
FIGURE 6
An example of plot segmentation and identification from original images. (A) A plot is selected from a set of ortho-mosaic images. (B) The selected plot appears in several original images but in different locations. (C) The plot images are grouped and one is selected based on its distance from the central part of the image. (D) The selected plot is cropped from the corresponding original image. (E) The selected plot is rotated based on the corner detection and orientation calculations from (D).
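The selection step in panel (C) can be pictured as choosing, among all original images containing a given plot, the view whose plot centroid lies closest to the image center. The sketch below is a hypothetical illustration of that idea only; the dictionary keys and helper name are invented, and the geolocation-based grouping, cropping, and rotation steps are not reproduced.

```python
# Hypothetical illustration of Figure 6C: pick the original image in which the
# plot's centroid falls closest to the image center. Field names are invented.
import math

def pick_central_view(plot_views):
    """plot_views: iterable of dicts like
    {"image": "IMG_0001.JPG", "centroid": (x, y), "size": (width, height)}."""
    def distance_to_center(view):
        cx, cy = view["size"][0] / 2.0, view["size"][1] / 2.0
        x, y = view["centroid"]
        return math.hypot(x - cx, y - cy)
    return min(plot_views, key=distance_to_center)
```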
FIGURE 7
An example of head detection. Images contain (A) white heads, (B) green heads, and (C) brown heads. All of the images contain orange heads. The upper panels show the original images and the lower ones show the detected head regions (blue) and hand-labeled head centers (black dots). Almost all of the heads of the different colors were detected by the proposed model.
FIGURE 8
Accuracy of head counts obtained with the proposed method compared with manual counting: (A) dataset 1 (test), (B) dataset 1 (all), and (C) dataset 2 (all).
FIGURE 9
Reasons for incorrect counting: (A) the plot segmentation was not perfect, so parts of some heads were cut out (upper red oval); (B) some regions included multiple overlapping heads (left red circle), and some heads were covered by leaves (right red circle). The upper panels show the original images and the lower ones show the detected head regions (blue) and hand-labeled head centers (black dots).
