Nat Commun. 2022 Sep 29;13(1):5724.
doi: 10.1038/s41467-022-33450-2.

Metasurface-enhanced light detection and ranging technology


Renato Juliano Martins et al. Nat Commun.

Abstract

Deploying advanced imaging solutions in robotic and autonomous systems by mimicking human vision requires the simultaneous acquisition of multiple fields of view, known as the peripheral and foveal regions. Among 3D computer vision techniques, LiDAR is currently the industrial choice for robotic vision. Notwithstanding efforts on LiDAR integration and optimization, commercially available devices have slow frame rates and low resolution, notably limited by the performance of mechanical or solid-state deflection systems. Metasurfaces are versatile optical components that can distribute the optical power into desired regions of space. Here, we report on an advanced LiDAR technology that cascades ultrafast low-FoV deflectors with large-area metasurfaces to achieve a large FoV (150°) and a high frame rate (kHz), providing simultaneous peripheral and central imaging zones. Combining our disruptive LiDAR technology with advanced learning algorithms offers perspectives to improve the perception and decision-making processes of ADAS and robotic systems.


Conflict of interest statement

A patent has been filed on this technology: Renato J. Martins, Samira Khadir, Massimo Giudici, and Patrice Genevet, "System and method for imaging in the optical domain", EP21305472 (2021).

Figures

Fig. 1
Fig. 1. Concept of a metasurface-augmented-FoV LiDAR.
a Schematic representation of the LiDAR system. A triggered laser source, emitting single pulses for ToF detection, is directed to a synchronized acousto-optic deflector (AOD) offering ultrafast light scanning with a low FoV (~2°). The deflected beam is directed to a scanning lens that scans the laser spot across the metasurface at different radial and azimuthal positions. The light transmitted through the metasurface is deviated according to the position of the impinging beam on the component, covering a scanning range between −75° and +75°. The light scattered from the scene is collected by a fast detector. Data are processed to extract the single-echo ToF for 2D and 3D imaging of the scene. b Detail of the cascaded AOD-metasurface deflection system. c Top-view photograph of the optical setup. d Bottom: graphical representation of the metasurface phase distribution along the radial axis. Top: representation of beam deflection according to the position of the incident beam on the metasurface. The inset equation gives the designed phase function. e Illustration of the axial symmetry of the laser impact point. f Photograph of the 1-cm metasurface fabricated by nanoimprint lithography. g SEM image of the sample showing the nanopillar building blocks of varying sizes employed to achieve beam deflection through lateral effective-refractive-index variations.
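The position-dependent deflection in (d) can be sketched with the generalized Snell's law: at normal incidence, a local radial phase gradient dφ/dr steers the transmitted beam to sin θ = (λ/2π)·dφ/dr. A minimal numerical sketch — the wavelength, metasurface radius, and linear phase-gradient profile below are illustrative assumptions, not the paper's exact design:

```python
import numpy as np

# Generalized Snell's law at normal incidence (transmission):
#   sin(theta_t) = (lambda / 2*pi) * d(phi)/dr
wavelength = 905e-9   # m, a typical LiDAR wavelength (assumption)
R = 5e-3              # m, radius of the ~1 cm metasurface (Fig. 1f)

def deflection_angle_deg(r, dphi_dr):
    """Transmitted deflection angle for a local phase gradient dphi_dr [rad/m]."""
    s = wavelength / (2 * np.pi) * dphi_dr
    return np.degrees(np.arcsin(np.clip(s, -1.0, 1.0)))

# A phase gradient growing linearly with r (hypothetical profile) maps
# radial position r in [-R, R] to deflection angles spanning -75..+75 deg:
max_grad = 2 * np.pi / wavelength * np.sin(np.radians(75.0))
r = np.linspace(-R, R, 5)
print(np.round(deflection_angle_deg(r, max_grad * r / R), 1).tolist())
# → [-75.0, -28.9, 0.0, 28.9, 75.0]
```

Note that the center of the component (r = 0) leaves the beam undeflected, consistent with the 0th-order central zone used in Fig. 4.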
Fig. 2
Fig. 2. 1D time-of-flight imaging.
a Photograph of the scene. b Ranging image of three objects placed on a table, using highly reflective tape to increase the intensity of the returned signal: (1) a post with a small reflector, (2) a round object with a reflector, and (3) a box with reflective tape wrapped around it. The graph shows the image at the correct ranging distances X (scanning dimension) and Z (ranging dimension), demonstrating that all three objects are sensed. c Positions of the individual objects according to the ranging image in (b). d Raw signal collected for the respective image, showing that objects oriented along the normal direction have higher scattering intensity; the inset displays the single pulses used to determine the ToF ranging distance.
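The single-echo ToF ranging used here reduces to d = c·Δt/2, where Δt is the round-trip delay between the emitted pulse and its echo. A minimal sketch:

```python
# Single-echo time-of-flight ranging: distance = c * round-trip delay / 2.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(echo_delay_s: float) -> float:
    """Target distance from the round-trip echo delay of one laser pulse."""
    return C * echo_delay_s / 2.0

# A 10 ns echo delay corresponds to roughly 1.5 m of range:
print(round(tof_distance_m(10e-9), 3))  # → 1.499
```

This also sets the timing resolution needed: resolving ~1.5 cm in depth requires timing the echo to ~100 ps.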
Fig. 3
Fig. 3. 3D imaging and wide-angle scanning capabilities.
a LiDAR line scanning of our laboratory room, showing the large FoV in both elevation (top) and azimuth (bottom) angles. Note the top picture, showing a scanning line profile covering the whole range from the floor to the ceiling of the testing room, over 150°. b 3D ranging demonstration (top): the scene (bottom) was set up with actors wearing reflective suits positioned at distances Z varying from 1.2 to 4.9 m. Colors encode distance. c Lissajous scanning using the deflection functions θ = A sin(αt + Ψ) and φ = B sin(βt) for different parameters α and β, illustrating the laser projection capabilities of fast beam scanning in a large-FoV configuration. Ψ was set to 0° and A = B = 30°, although any configuration can be actively changed.
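The Lissajous trajectories of (c) are straightforward to reproduce numerically; the frequency ratio α:β below is an illustrative assumption, not a value from the paper:

```python
import numpy as np

# Lissajous scan angles as in Fig. 3c:
#   theta(t) = A*sin(alpha*t + psi),  phi(t) = B*sin(beta*t)
A = B = 30.0            # deg, scan amplitude (A = B = 30 deg, psi = 0)
psi = 0.0
alpha, beta = 3.0, 2.0  # rad/s; hypothetical 3:2 frequency ratio

t = np.linspace(0.0, 2 * np.pi, 1000)
theta = A * np.sin(alpha * t + psi)   # elevation-like deflection angle
phi = B * np.sin(beta * t)            # azimuth-like deflection angle

# The (theta, phi) pairs trace a closed Lissajous figure bounded by +/-30 deg:
print(theta.max().round(1), phi.min().round(1))  # → 30.0 -30.0
```

A rational α:β ratio closes the curve; irrational ratios progressively fill the whole ±30° square, which is one way such scanners trade frame rate for spatial coverage.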
Fig. 4
Fig. 4. Multizone imaging.
a Schematic representation of human multizone vision, a concept to be adapted in ADAS systems. Mimicking this characteristic enables dual-purpose imaging: high resolution and long range in the center, and lower resolution over a larger FoV for the peripheral view. b Experimental realization to test the dual-zone imaging functionality of the LiDAR system, including a dual-detection scheme (inset) for simultaneous, multiplexed image collection. The central 0th-diffraction-order beam scans a small, high-resolution area at the center of the image, while the 1st diffracted order scans the whole field. c Results for the scanned scenes described in (b). Top: the large-FoV LiDAR ranging image, obtained by blocking the central part of the numerical aperture with an obstacle, as sketched in (b). Bottom: the high-resolution LiDAR ranging image of the central part of the scene, captured using the 0th-order diffraction beam and covering a FoV of about 2°.
Fig. 5
Fig. 5. Real-time measurement of fast events.
a Top: illustration of the scene: a mechanical chopper was set up with a nominal speed of 100 Hz, and some of its slabs were covered with reflective tape. Bottom: measurement of the rotation speed at three different frame rates. b Top: normalized intensity map along the radial axis, illustrating the dynamics of the wheel. Note the different slope at rotation angles around 3π/2, representing a slowdown. Bottom: single-frame intensity data illustrating various angular positions. c Top: photograph of the chopper and the size of the reflective tape. Bottom: ranging image for t = 1.0 ms and the measurement of the tape from the recovered data.
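Recovering the rotation speed in (a) amounts to tracking the angular position of the reflective tape frame by frame and fitting the unwrapped angle versus time. A sketch with synthetic angular positions standing in for the measured data:

```python
import numpy as np

# Synthetic measurement: angular position of the reflective tape in each
# LiDAR frame, for a chopper spinning at 100 rev/s sampled at a kHz-class
# frame rate (both values illustrative, matching the orders of magnitude
# quoted in the paper).
frame_rate = 1000.0   # frames per second
true_rps = 100.0      # chopper speed, revolutions per second
frames = np.arange(20)
angle = (2 * np.pi * true_rps * frames / frame_rate) % (2 * np.pi)

# Unwrap the wrapped angle and fit a line: the slope is the angular
# speed in rad/s, so slope / 2*pi recovers revolutions per second.
omega = np.polyfit(frames / frame_rate, np.unwrap(angle), 1)[0]
print(round(omega / (2 * np.pi), 1))  # → 100.0
```

Unambiguous unwrapping requires the wheel to advance less than half a turn per frame, i.e. a frame rate above twice the rotation rate, which is why a kHz frame rate comfortably resolves a 100 Hz chopper.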

