Object Detection in Adverse Weather for Autonomous Driving through Data Merging and YOLOv8
- PMID: 37896564
- PMCID: PMC10611033
- DOI: 10.3390/s23208471
Abstract
For autonomous driving, perception is a primary and essential element: it provides insight into the ego vehicle's environment through sensors. Perception is challenging because it must cope with dynamic objects and continuous environmental changes, and it becomes harder still when adverse weather such as snow, rain, fog, night light, sandstorms, or strong daylight degrades the quality of sensor data. In this work, we aim to improve the accuracy of camera-based perception, specifically object detection for autonomous driving, in adverse weather. We propose improving YOLOv8-based object detection through transfer learning on data merged from several harsh-weather datasets. Two popular open-source datasets (ACDC and DAWN) and their merged version were used to detect primary objects on the road in harsh weather. Training weights were obtained by training on the individual datasets, their merged version, and several subsets of those datasets grouped by their characteristics. The resulting weights were then compared by evaluating detection performance on the datasets mentioned above and their subsets. The evaluation revealed that training on custom datasets significantly improved detection performance over the YOLOv8 base weights. Furthermore, adding more images through the feature-related data-merging technique steadily increased object detection performance.
Keywords: YOLOv8; autonomous driving; data merging; deep neural networks; harsh weather; object detection.
Conflict of interest statement
The authors declare no conflict of interest.