Review. Front Plant Sci. 2021 Jun 25;12:611940.
doi: 10.3389/fpls.2021.611940. eCollection 2021.

Robotic Technologies for High-Throughput Plant Phenotyping: Contemporary Reviews and Future Perspectives

Abbas Atefi et al.

Abstract

Phenotyping plants is an essential component of any effort to develop new crop varieties. As plant breeders seek to increase crop productivity and produce more food for the future, the amount of phenotype information they require will also increase. Traditional plant phenotyping relying on manual measurement is laborious, time-consuming, error-prone, and costly. Plant phenotyping robots have emerged as a high-throughput technology to measure morphological, chemical, and physiological properties of large numbers of plants. Several robotic systems have been developed to fulfill different phenotyping missions. In particular, robotic phenotyping has the potential to enable efficient monitoring of changes in plant traits over time, both in controlled environments and in the field. The operation of these robots can be challenging as a result of the dynamic nature of plants and agricultural environments. Here we discuss developments in phenotyping robots, the challenges that have been overcome, and those that remain outstanding. In addition, some prospective applications of phenotyping robots are also presented. We optimistically anticipate that autonomous and robotic systems will make great leaps forward in the next 10 years, advancing plant phenotyping research into a new era.

Keywords: agricultural robotics; autonomous robotic technology; computer vision; high-throughput plant phenotyping; phenotyping robot.


Conflict of interest statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Figures

FIGURE 1
Plant phenotyping robotic systems for indoor environments: (A) A multi-robot system equipped with a deep learning technique to determine optimal viewpoints for 3D model reconstruction (Wu et al., 2019), (B) Sensor-equipped robot to measure the reflectance spectra, temperature, and fluorescence of leaves (Bao et al., 2019c), (C) Robotic system to measure leaf reflectance and leaf temperature (Atefi et al., 2019), and (D) Robotic system for direct measurement of leaf chlorophyll concentrations (Alenyà et al., 2014).
FIGURE 2
Manual measurements of leaf reflectance (left), leaf temperature (middle), and chlorophyll content (right) (Atefi et al., 2019).
FIGURE 3
Plant phenotyping systems for outdoor environments: (A) Vinobot: robotic system including a six-DOF robotic manipulator and a 3D imaging sensor mounted on a mobile platform to measure plant height and LAI (Shafiekhani et al., 2017), (B) Robotanist: UGV-based robotic system equipped with a three-DOF robotic manipulator and a force gauge for stalk strength measurement (Mueller-Sim et al., 2017), (C) A robotic system to slide LeafSpec across an entire leaf to collect its hyperspectral images (Chen et al., 2021), (D) Thorvald II: VIS/NIR multispectral camera mounted on a mobile robot to measure NDVI (Grimstad and From, 2017), (E) BoniRob: autonomous robot platform using spectral imaging and 3D TOF cameras to measure plant height, stem thickness, biomass, and spectral reflection (Biber et al., 2012), (F) Ladybird: ground-based system consisting of a hyperspectral camera, a stereo camera, a thermal camera, and LIDAR to measure crop height, crop closure, and NDVI (Underwood et al., 2017), and (G) Flex-Ro: high-throughput plant phenotyping system equipped with a passive fiber optic, an RGB camera, an ultrasonic distance sensor, and an infrared radiometer for the measurement of NDVI, canopy coverage, and canopy height (Murman, 2019).
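Several of the outdoor platforms above (e.g., Thorvald II and Flex-Ro) report NDVI, which is derived from near-infrared and red reflectance. A minimal sketch of the standard NDVI formula, (NIR − Red) / (NIR + Red), applied to per-pixel reflectance arrays (function name and the divide-by-zero guard are our own additions, not from the paper):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance.

    Accepts scalars or arrays (e.g., per-pixel reflectance from a
    VIS/NIR multispectral camera); returns values in [-1, 1].
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    # Small epsilon avoids division by zero on dark pixels.
    return (nir - red) / (nir + red + 1e-12)

# Healthy vegetation reflects strongly in NIR and weakly in red,
# so NDVI is high, e.g. ndvi(0.5, 0.1) is about 0.67.
```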
FIGURE 4
Summary statistics of the phenotyping robotic systems: (A) Targeted plants, (B) Plant/canopy traits measured by the robots, (C) Average accuracy (R2) to measure the phenotypic traits, (D) Robot vision/sensing systems, and (E) Robot software systems.
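Figure 4C reports average accuracy as the coefficient of determination (R2), which compares robot-measured trait values against manual ground truth. A minimal sketch of the conventional R2 computation (function and variable names are ours; the paper does not specify its exact procedure):

```python
import numpy as np

def r_squared(observed, predicted):
    """Coefficient of determination: 1 - SS_res / SS_tot.

    `observed` would be manual ground-truth trait measurements,
    `predicted` the corresponding robot measurements.
    """
    obs = np.asarray(observed, dtype=float)
    pred = np.asarray(predicted, dtype=float)
    ss_res = np.sum((obs - pred) ** 2)          # residual sum of squares
    ss_tot = np.sum((obs - obs.mean()) ** 2)    # total sum of squares
    return 1.0 - ss_res / ss_tot
```

A value near 1 indicates the robot's measurements closely track the manual ones; lower values indicate larger disagreement.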
FIGURE 5
(A) Mobile Agricultural Robot Swarms (MARS) for the seeding process (The European Coordination Mobile Agricultural Robot Swarms (MARS). PDF file. November 11, 2016. http://echord.eu/public/wp-content/uploads/2018/01/Final-Report-MARS.pdf), (B) UAV-UGV cooperative system to measure environmental variables in a greenhouse (Roldán et al., 2016).
