Metasurface-enhanced light detection and ranging technology

Using metasurfaces to enhance LiDAR systems, we amplified the field of view of ultrafast deflectors, providing beyond-human sensing capabilities.

Fast autonomous decision-making

A high degree of automation and ultra-fast decision-making are indispensable elements to pave the way towards smart and sustainable societies. To this end, technological terms such as the Internet-of-Things (IoT) and the Autonomy-of-Things (AoT) have emerged to describe networks of intercommunicating objects continuously exchanging information in the form of data. While the term IoT, in use since the 1990s, refers to a cluster of physical objects that are in principle static, its descendant AoT refers to a dynamic ensemble of moving elements. Characteristic examples of AoT objects are self-driving cars coexisting on the same highway, or robots operating unmanned industrial lines. In a dynamic AoT environment, the various constituents must be able to recognize other static and moving sub-systems, as well as to identify their position in space and time. For instance, autonomous vehicles must be able to detect obstacles on the highway and immediately take decisions to avoid them. One of the key technologies addressing realistic AoT cases is Light Detection and Ranging (LiDAR). In particular, Time-of-Flight (ToF) LiDAR is a depth-sensing technique that uses a pulsed laser in the near-infrared (NIR) to illuminate a scene point by point and detects the signal reflected from it by measuring the time needed for the full signal round trip. 2D imaging and ranging of every point of the object then enable its dynamic 3D reconstruction.
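
For readers new to ToF ranging, the toy calculation below illustrates the round-trip relation d = c·Δt/2 that underlies the technique; it is a minimal sketch of the principle, not the signal-processing chain of our instrument.

```python
# Minimal sketch of the time-of-flight ranging principle (illustrative only,
# not the authors' actual signal-processing pipeline).
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_to_range(round_trip_delay_s: float) -> float:
    """Convert a measured round-trip delay into a one-way distance (meters)."""
    return C * round_trip_delay_s / 2.0

# Example: a pulse returning after 667 ns corresponds to a target ~100 m away.
print(f"{tof_to_range(667e-9):.1f} m")
```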


Metasurface-enhanced beam steering devices

LiDAR systems are generally classified into scanning and non-scanning (flash) systems, depending on whether the laser beam scans the scene of interest. Scanning LiDARs, which currently dominate the continuously growing LiDAR market, are typically described in terms of three modules: illumination, scanning, and detection. In applications such as autonomous cars, among the most important requirements are a large field of view (FoV) and an imaging speed surpassing the capabilities of the human eye. These two fundamental aspects are essentially dictated by the LiDAR scanning module.

Fig. 1. Metasurface.

The beam scanning systems of interest in our work are based on flat nanostructured interfaces, also dubbed metasurfaces (MS), made by combining the optical responses of objects with subwavelength size and periodicity. These artificial interfaces emerged about a decade ago, unveiling the ability to engineer light's wavefront at will. However, owing to the intrinsically passive optical properties of MS elements, these components have not been considered for LiDAR scanners. Here we propose, for the first time, an active scanning scheme based on a passive MS by cascading it with ultrafast acousto-optic deflectors (AODs). The role of the MS is to amplify the ultra-narrow FoV of the AODs while benefiting from their MHz speed.
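
To make the angle-engineering idea concrete, the short sketch below evaluates the generalized Snell's law, sin θ_out = sin θ_in + (λ/2π)·dΦ/dx, for a phase-gradient interface; the wavelength and phase-gradient values are illustrative assumptions, not the design parameters of our metasurface.

```python
import numpy as np

# Illustrative sketch of phase-gradient beam deflection (generalized Snell's law).
# The wavelength and phase-gradient values are assumptions chosen for illustration.

def deflected_angle_deg(theta_in_deg: float, wavelength_m: float, dphi_dx_rad_per_m: float) -> float:
    """Output angle of a beam crossing a metasurface with a linear phase gradient."""
    sin_out = np.sin(np.radians(theta_in_deg)) + wavelength_m / (2 * np.pi) * dphi_dx_rad_per_m
    return float(np.degrees(np.arcsin(sin_out)))

# A small input tilt of 1 degree, combined with a strong local phase gradient,
# is redirected to a steep output angle of tens of degrees.
print(f"{deflected_angle_deg(1.0, 905e-9, 6.0e6):.1f} deg")
```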


Wide-angle dynamic scanning

Our approach leverages the compactness and the sub-wavelength topography of the MS, as well as the speed of the AODs, which allows random beam-access times of hundreds of nanoseconds. In particular, the incident beam is initially scanned over a narrow FoV in both the azimuthal and elevation angles by combining two AODs in a crossed configuration. The deflection angle of the focused beam impinging on the MS is then dynamically controlled by its impact position on the MS plane. Benefiting from its sub-wavelength periodicity, the MS strongly amplifies the AOD FoV by enabling deflection at steep angles with high efficiency. Our technology thus combines a large 2D FoV (>150° × 150°) with MHz beam scanning modulation, without compromising between them (see Fig. 2).
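
The toy model below traces this angle-to-position-to-angle chain along one axis, using the textbook AOD relation Δθ ≈ λΔf/v and a lens mapping x ≈ F·θ, followed by a simplified linear angular amplification by the MS; every numerical value (wavelength, acoustic velocity, focal length, gain) is a placeholder assumption rather than an experimental parameter.

```python
import numpy as np

# One-axis sketch of the AOD -> lens -> metasurface scanning chain.
# All numerical values are placeholder assumptions, not the experimental parameters.
WAVELENGTH = 905e-9        # optical wavelength (m), assumed NIR LiDAR line
ACOUSTIC_VELOCITY = 650.0  # acoustic velocity in the AOD crystal (m/s), assumed
FOCAL_LENGTH = 50e-3       # relay-lens focal length (m), assumed
ANGLE_GAIN = 30.0          # assumed (simplified, linear) angular amplification of the MS

def aod_deflection_rad(delta_f_hz: float) -> float:
    """Standard AOD relation: deflection = wavelength * frequency offset / acoustic velocity."""
    return WAVELENGTH * delta_f_hz / ACOUSTIC_VELOCITY

def scan_chain(delta_f_hz: float) -> tuple[float, float]:
    theta_aod = aod_deflection_rad(delta_f_hz)
    x_on_ms_mm = FOCAL_LENGTH * theta_aod * 1e3          # lens maps angle to position on the MS
    theta_out_deg = np.degrees(ANGLE_GAIN * theta_aod)   # MS amplifies the narrow deflection
    return x_on_ms_mm, theta_out_deg

for df in (5e6, 15e6, 25e6):  # acoustic-frequency offsets (Hz)
    x_mm, theta_deg = scan_chain(df)
    print(f"df = {df/1e6:.0f} MHz -> spot at {x_mm:.2f} mm on MS, output angle ~ {theta_deg:.0f} deg")
```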

Fig. 2. Schematic illustration of the MS-enhanced LiDAR scanner.

Multi-zone imaging

Fig. 3. Human vision.

The versatility of metasurfaces also enables multi-zone LiDAR imaging. To this end, a single interface can be employed to split the beam into different scanning regions: (i) high-resolution, low-FoV beam scanning in the central view and (ii) low-resolution, high-FoV peripheral scanning (Fig. 3). This configuration allows a single system to be multiplexed and to sense two different regions at the same time. Robotic systems aiming to reproduce human vision require peripheral and central visual fields, where several zones featuring different spatial resolutions are scanned simultaneously. The low-resolution peripheral field provides coarse scene exploration, which humans typically use to guide the eye so that the highly resolved foveal region delivers sharp imaging. For this purpose, our system uses a double-detector monitoring scheme, as illustrated in Fig. 4. The first detector collects light from the full numerical aperture (~2π solid angle) but blocks the central small numerical aperture (a beam blocker is placed in front of the detector). The second detector covers only the small NA corresponding to the narrow FoV produced by the zero-order light that is not deflected by the MS (a spatial filter is used to select the observation area).
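
As a toy illustration of this dual-detector logic (the zone boundary, field names, and routing function below are our own assumptions, not the implemented detection electronics), each return can be thought of as being routed to the detector that monitors its angular zone:

```python
# Toy model of the dual-zone detection idea: returns from the narrow central
# (zero-order) field go to detector 2, wide-angle returns go to detector 1.
# The 5-degree zone boundary is an illustrative assumption.
from dataclasses import dataclass

CENTRAL_HALF_ANGLE_DEG = 5.0  # assumed boundary between central and peripheral zones

@dataclass
class Return:
    azimuth_deg: float
    elevation_deg: float
    tof_s: float

def route_return(ret: Return) -> str:
    """Assign a LiDAR return to the detector monitoring its angular zone."""
    in_center = (abs(ret.azimuth_deg) <= CENTRAL_HALF_ANGLE_DEG
                 and abs(ret.elevation_deg) <= CENTRAL_HALF_ANGLE_DEG)
    return "detector_2_central_high_res" if in_center else "detector_1_peripheral_wide_fov"

print(route_return(Return(2.0, -1.5, 400e-9)))   # central, high-resolution zone
print(route_return(Return(60.0, 10.0, 400e-9)))  # peripheral, wide-FoV zone
```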

Fig. 4. Multi-zone LiDAR detection scheme.

Ultrafast frame rate imaging

Additionally, using the developed system we evaluated the possibility of achieving real-time frame-rate imaging by relying on the AOD scanning performance. To verify the very fast scanning speed of our system, we acquired dynamic images of the fastest object in our lab, i.e., an optical chopper.

For that, we placed a reflective tape on one of the blades of the rotating instrument and collected the corresponding ToF and reflected-amplitude signals. We tracked the center position of the reflective tape in both space and time, demonstrating LiDAR imaging at frame rates of thousands of frames per second (see GIF).
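
A back-of-the-envelope estimate shows why sub-microsecond random access translates into kilohertz frame rates; the grid sizes and per-point access times below are example values, not the acquisition settings of the chopper experiment.

```python
# Back-of-the-envelope frame-rate estimate for a point-scanning LiDAR.
# Grid sizes and per-point access times are illustrative assumptions.

def frame_rate_hz(n_x: int, n_y: int, access_time_s: float) -> float:
    """Frames per second when every point in an n_x * n_y grid costs one access time."""
    return 1.0 / (n_x * n_y * access_time_s)

print(f"{frame_rate_hz(30, 30, 500e-9):.0f} fps")   # ~2200 fps for a 30 x 30 frame
print(f"{frame_rate_hz(64, 64, 200e-9):.0f} fps")   # ~1200 fps for a denser, faster scan
```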


Perspectives

In summary, we realized an ultrafast beam scanning system composed of a fast deflector and a passive metasurface, achieving beam steering at MHz speed over a 150° × 150° 2D FoV and improving the wide-angle scanning rate of mechanical devices by five orders of magnitude. Our approach also offers random-access beam steering capabilities. Multi-zone ranging images mimicking human vision have been realized at high frame rates. Looking further ahead, incorporating this system into advanced driver-assistance systems (ADAS) could provide a disruptive solution for medium/long-range perception, in which the central view scans the front scene while the peripheral view provides additional sensing, for example for pedestrian safety. We finally demonstrated time-event series for imaging in the real-time regime (>1 kfps, and up to MHz frame rates for 1D scanning). Outperforming existing LiDAR technologies, our tool offers perspectives for future applications, particularly those associated with reducing the decision-making latency of robotic and advanced driver-assistance systems.

*You're welcome to take a look at our YouTube channel here (https://www.youtube.com/channel/UCmezaBH-xOxMjqk3bvqnlAg).
