Radar sensor based machine learning approach for precise vehicle position estimation

Muhammad Sohail et al.

Sci Rep. 2023 Aug 24;13(1):13837. doi: 10.1038/s41598-023-40961-5.

Abstract

Estimating vehicle positions precisely is essential in Vehicular Ad hoc Networks (VANETs) for their safe, autonomous, and reliable operation. Conventional approaches to vehicle position estimation, such as the Global Positioning System (GPS) and the Global Navigation Satellite System (GNSS), suffer from significant data delays and transmission errors, which render them ineffective for precise position estimation, especially in dynamic environments. Moreover, existing radar-based approaches to position estimation use static values of range and azimuth, which makes them inefficient in highly dynamic environments. In this paper, we propose a radar-based relative vehicle position estimation method in which the dynamic range and azimuth of a Frequency Modulated Continuous Wave (FMCW) radar are used to precisely estimate a vehicle's position. In the position estimation process, the speed of the vehicle equipped with the radar sensor, called the reference vehicle, is considered such that a change in the vehicle's speed changes the range and azimuth of the radar sensor. For relative position estimation, the distance and relative speed between the reference vehicle and a nearby vehicle are used; only those vehicles with a higher likelihood of coming into contact with the reference vehicle are considered. The data recorded by the radar sensor are then used to calculate precision and Intersection Over Union (IOU) values with You Only Look Once (YOLO) version 4. Performance is evaluated under various real-time traffic scenarios in a MATLAB-based simulator. Results show that the proposed method achieves 80.0% precision in position estimation and an IOU value of up to 87.14%, thereby outperforming the state-of-the-art.
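As a rough illustration of the speed-dependent coverage adaptation described above, the following sketch maps the reference vehicle's speed to the radar range and azimuth values reported in Figures 5-7. It is a minimal example under those reported thresholds, not the authors' implementation; the function name select_radar_coverage and the Python form are assumptions.

```python
def select_radar_coverage(speed_kmh: float) -> tuple[float, float]:
    """Return (range_m, azimuth_deg) for the radar, given the reference
    vehicle's speed in km/h, using the thresholds shown in Figures 5-7."""
    if speed_kmh < 40:        # low speed: short range, wide field of view
        return 30.0, 120.0
    elif speed_kmh <= 120:    # medium speed (40-120 km/h): moderate range and azimuth
        return 50.0, 80.0
    else:                     # high speed (121 km/h or more): long range, narrow azimuth
        return 100.0, 50.0


if __name__ == "__main__":
    for v in (30, 80, 130):
        rng, az = select_radar_coverage(v)
        print(f"speed={v} km/h -> range={rng} m, azimuth={az} deg")
```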

Conflict of interest statement

The authors declare no competing interests.

Figures

Figure 1. A reference vehicle equipped with multiple sensors.
Figure 2. A reference vehicle using multiple radar sensors for vehicle detection.
Figure 3. A reference vehicle equipped with a single radar sensor using dynamic range and azimuth values.
Figure 4. Framework of the proposed method.
Figure 5. Speed of the reference vehicle is less than 40 km/h; range = 30 m, azimuth = 120°.
Figure 6. Speed of the reference vehicle is between 40 and 120 km/h; range = 50 m, azimuth = 80°.
Figure 7. Speed of the reference vehicle is 121 km/h or more; range = 100 m, azimuth = 50°.
Figure 8. Design of the proposed method as a flowchart.
Figure 9. Architecture of the YOLO algorithm.
Figure 10. Bounding box regression example: the yellow box represents a bounding box around an object.
Figure 11. Residual blocks example: the image is divided into grids.
Figure 12. Intersection over union example: diagram showing the predicted and real bounding boxes using the IOU technique (see the IOU sketch after this list).
Figure 13. Radar sensor signal processing mechanism.
Figure 14. Failure scenario 1, when the reference vehicle is moving at 30 km/h, and the proposed solution.
Figure 15. Failure scenario 2, when the reference vehicle is moving at 25 km/h, and the proposed solution.
Figure 16. Error distance at different speeds of the reference vehicle.
Figure 17. Comparison of average precision.
Figure 18. Precision comparison with previous studies.
Figure 19. IOU comparison of the proposed method with previous works.
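For reference, here is a short worked example of the Intersection Over Union (IOU) metric used to evaluate the predicted bounding boxes (Figure 12). This is the standard IOU definition for axis-aligned boxes, shown as a minimal sketch rather than the authors' code; the function name iou and the (x_min, y_min, x_max, y_max) box format are assumptions.

```python
def iou(box_a, box_b):
    """Compute IOU of two boxes given as (x_min, y_min, x_max, y_max)."""
    # Coordinates of the intersection rectangle
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])

    # Intersection area is zero when the boxes do not overlap
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0


if __name__ == "__main__":
    # Example: a predicted box versus a ground-truth box
    print(f"IOU = {iou((0, 0, 10, 10), (2, 2, 12, 12)):.4f}")  # prints IOU = 0.4706
```

The example compares a predicted box with a ground-truth box and yields an IOU of about 0.47; values closer to 1 indicate tighter agreement between the predicted and real bounding boxes.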
