Sensors (Basel). 2022 Mar 7;22(5):2065. doi: 10.3390/s22052065.

Tree Trunk Recognition in Orchard Autonomous Operations under Different Light Conditions Using a Thermal Camera and Faster R-CNN

Ailian Jiang et al.

Abstract

In orchard automation, a current challenge is recognizing natural landmarks and tree trunks to localize intelligent robots. To overcome low-light conditions and global navigation satellite system (GNSS) signal interruptions under a dense canopy, a thermal camera may be used to recognize tree trunks with a deep learning system. The objective of this study was therefore to use a thermal camera and deep learning to detect tree trunks at different times of day, including under low-light conditions, to enable robot navigation. Thermal images were collected from the dense canopies of two types of orchards (conventional and joint training systems) under high-light (12–2 PM), low-light (5–6 PM), and no-light (7–8 PM) conditions in August and September 2021 (summertime) in Japan. Trunk detection distance was first verified with the thermal camera, which showed average ranging errors of 0.16 m at 5 m, 0.24 m at 15 m, and 0.3 m at 20 m under high-, low-, and no-light conditions and at different camera orientations. Thermal imagery datasets were augmented to train, validate, and test the Faster R-CNN deep learning model for tree trunk detection. A total of 12,876 images were used to train the model, 2318 images to validate the training process, and 1288 images to test the model. The mAP of the model was 0.8529 for validation and 0.8378 for testing. The average object detection time was 83 ms per image and 90 ms per video frame with the thermal camera set at 11 FPS. The model was compared with YOLO v3 using the same datasets and training conditions; in these comparisons, Faster R-CNN achieved higher accuracy than YOLO v3 in tree trunk detection with the thermal camera. The results therefore show that Faster R-CNN can recognize objects in thermal images to enable robot navigation in orchards under different lighting conditions.
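As a concrete illustration of the detection step (this is not the authors' code), the minimal Python sketch below runs a Faster R-CNN detector over a single thermal frame using torchvision. The paper pairs Faster R-CNN with a VGG16 backbone (see Figure 6), whereas torchvision's stock constructor uses a ResNet-50 FPN backbone, so the sketch is an approximation; the checkpoint path, image filename, and 0.5 score threshold are hypothetical.

import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

NUM_CLASSES = 2  # background + "trunk" (assumed single-class setup)

# Stock torchvision Faster R-CNN; the paper's VGG16 backbone would need a custom build.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(
    weights=None, num_classes=NUM_CLASSES
)
# model.load_state_dict(torch.load("trunk_detector.pth"))  # hypothetical fine-tuned weights
model.eval()

# Thermal frames are single-channel; replicating to 3 channels keeps the
# RGB-shaped input stem unchanged.
img = Image.open("thermal_frame.png").convert("RGB")  # hypothetical file
with torch.no_grad():
    pred = model([to_tensor(img)])[0]

# Keep confident detections only and report their bounding boxes.
keep = pred["scores"] > 0.5
for box, score in zip(pred["boxes"][keep], pred["scores"][keep]):
    x1, y1, x2, y2 = box.tolist()
    print(f"trunk at ({x1:.0f},{y1:.0f})-({x2:.0f},{y2:.0f}), score={score:.2f}")

Per-frame inference latency in a pipeline of this style corresponds to the 83–90 ms detection times reported in the abstract, which is compatible with the camera's 11 FPS output.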

Keywords: Faster R-CNN; low-light conditions; orchards; thermal image; tree trunk detection.


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1
Object detection under different lighting conditions at a range of 25 m from the thermal camera at orientations of 0°, −30°, and 30° (green indicates the distance measured by the camera).

Figure 2
Aerial view of the experimental pear orchards: (a) conventionally planted pear orchard and (b) joint-tree training pear orchard at the Tsukuba-Plant Innovation Research Center (T-PIRC), University of Tsukuba, Japan.

Figure 3
Analysis of thermal images under different lighting conditions: (a,b) no-light conditions (7–8 PM), (c,d) high-light conditions (12–2 PM), and (e,f) low-light conditions (5–6 PM).

Figure 4
Faster R-CNN structure for tree trunk detection used in this research.

Figure 5
Faster R-CNN network structure, focusing on the region proposal network over the feature map.

Figure 6
VGG16 network for tree trunk detection.

Figure 7
Measured distance under different lighting conditions and orientations: (a–c) 0°, −30°, and 30° under high-light conditions; (d–f) 0°, −30°, and 30° under low-light conditions; (g–i) 0°, −30°, and 30° under no-light conditions.

Figure 8
Measurement errors of target objects at distances of 0 to 25 m under different lighting conditions and orientations of the thermal camera.

Figure 9
Faster R-CNN loss curves: (a) total loss; (b) bounding box loss; (c) classification loss; (d) Region Proposal Network classification loss; (e) Region Proposal Network bounding box loss.

Figure 10
Validation results of Faster R-CNN: (a–f) original images; (g–l) randomly flipped and rotated images.

Figure 11
Precision–recall curves of Faster R-CNN and YOLO v3 for validation.

Figure 12
Precision–recall curves of Faster R-CNN and YOLO v3 for testing.

Figure 13
Image results of the Faster R-CNN test: (a–d) no-light conditions, (e–h) high-light conditions, and (i–l) low-light conditions.
