Automotive safety drives advances in radar, LiDAR and cameras

Radar and LiDAR sensors and cameras work together to deliver advanced automotive safety solutions to protect drivers, passengers and pedestrians.

The latest generation of vehicles offers increasingly higher safety standards and an enhanced driving experience. At the core of these achievements lies a trio of cutting-edge technologies: radar, LiDAR and cameras. Each of these sensors brings unique capabilities to the table, and their synergistic interplay propels automotive safety and comfort to new heights.

The real power of these technologies is in their ability to work together seamlessly. Radar provides long-range detection and speed information, while LiDAR offers precise object location and classification. Cameras interpret the visual scene, adding context to the data collected by the other sensors.

By combining these capabilities, vehicles can achieve situational awareness that surpasses human capabilities. As technology continues to evolve, we can expect even more sophisticated sensor systems to emerge. This article presents some of the latest trends driving the evolution of these sensors and their integration into the vehicle to implement increasingly advanced capabilities.

Trends in radar technology

Automotive radar technology is essential for implementing advanced driver-assistance systems (ADAS). Modern vehicles are equipped with radar sensors that enable advanced functions such as automatic emergency braking, forward-collision warning, blind-spot detection, lane-change assist, rear-collision warning, adaptive cruise control and stop-and-go.

The shift to 77 GHz

Since 24-GHz ultra-wideband (UWB) radar was phased out in Europe and the U.S. in 2022, automotive radar in these markets now operates in the 77–81-GHz band, reserved for high-precision, short-range detection. The much wider available bandwidth significantly improves range resolution and accuracy, allowing sensors to separate two closely spaced objects. For instance, a 77-GHz radar can achieve a range resolution of about 4 cm, while a 24-GHz radar achieves only 75 cm. This finer resolution greatly enhances the detection of nearby objects.
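
These resolution figures follow from the standard FMCW relationship ΔR = c/2B, where B is the swept bandwidth. A quick sanity check in Python, assuming the full 4-GHz sweep of the 77–81-GHz band versus a 200-MHz sweep typical of legacy 24-GHz narrowband radar:

```python
c = 3e8  # speed of light, m/s

def range_resolution(bandwidth_hz: float) -> float:
    """Smallest separation at which two targets appear as distinct peaks: dR = c / (2B)."""
    return c / (2 * bandwidth_hz)

print(range_resolution(4e9))    # 77-81-GHz band, 4-GHz sweep: 0.0375 m (~4 cm)
print(range_resolution(200e6))  # legacy 24-GHz radar, 200-MHz sweep: 0.75 m (75 cm)
```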

4D radar

4D radar adds a fourth measurement, the vertical position of objects (also known as elevation), to the range, horizontal position and velocity that conventional radar already provides, delivering more precise and comprehensive data about objects in 3D space. The ability to accurately determine an object’s elevation using 4D and imaging radar is a crucial requirement for autonomous vehicles (AVs), which must correctly interpret the vertical domain, for example, to distinguish an overhead bridge from an obstacle on the road.
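
To see why elevation matters, consider how a detection is mapped into the vehicle’s coordinate frame. The sketch below uses the standard spherical-to-Cartesian conversion (the angle conventions are illustrative, not tied to any particular sensor):

```python
import math

def detection_to_cartesian(range_m, azimuth_deg, elevation_deg):
    """Convert a 4D-radar detection (range, azimuth, elevation) to x/y/z."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)  # forward
    y = range_m * math.cos(el) * math.sin(az)  # lateral
    z = range_m * math.sin(el)                 # height above the sensor
    return x, y, z

# A return 50 m ahead at 5 degrees of elevation sits ~4.4 m up: an overhead
# sign or bridge, not an obstacle in the driving path.
print(detection_to_cartesian(50, 0, 5))
```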

NXP Semiconductors recently announced a partnership with Tier 1 supplier sinPro to create an entry-level 4D imaging radar solution that will enter OEM production in the second half of 2024. The solution uses NXP’s specialized chipset, which combines the 16-nm FinFET S32R41 automotive imaging radar processor with the TEF82xx RFCMOS transceiver in a dual-cascading configuration.

Satellite architecture

In the typical “edge” architecture, advanced radar sensors process data locally and transmit the results over a CAN or Ethernet interface to an ADAS electronic control unit (ECU). This design is now evolving toward satellite architectures.

In these designs, sensor heads located around the vehicle transmit pre-processed data to a central ECU via a high-speed, 1-Gbit/s Ethernet interface. The satellite architecture consolidates data processing at the central processor, which works on minimally processed sensor data, in contrast to the edge architecture, where each radar sensor performs the complete processing chain autonomously.

Centralized processing enables efficient sensor-fusion algorithms, leading to more precise decision-making. Combining the sensor inputs with these algorithms enhances overall sensing performance and yields a more accurate perception map.

For example, Texas Instruments Inc. (TI) has developed the AWR2544 FMCW 77-GHz millimeter-wave radar-on-chip sensor (Figure 1) for satellite radar architectures. The device integrates a 77-GHz transceiver with four transmitters and four receivers, enabling longer-range detection, higher-accuracy decision-making for ADAS and improved overall performance.

In addition, the satellite radar chip incorporates a radar processing accelerator tailored for cost-effectiveness, as well as a 1-Gbit/s Ethernet interface that provides the throughput needed to generate and stream compressed range fast Fourier transform (FFT) data. The device supports Automotive Safety Integrity Level (ASIL) B and offers a secure execution environment through a hardware security module.
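
The range FFT mentioned above is the first processing step an FMCW radar performs: mixing the transmitted and received chirps produces a beat frequency proportional to target range, and an FFT of that beat signal turns each target into a peak in a range bin. A minimal simulation, with illustrative chirp parameters rather than the AWR2544’s actual configuration:

```python
import numpy as np

# Illustrative FMCW chirp parameters (not the AWR2544's actual configuration)
c = 3e8            # speed of light, m/s
B = 1e9            # chirp bandwidth, Hz
Tc = 40e-6         # chirp duration, s
S = B / Tc         # chirp slope, Hz/s
fs = 20e6          # ADC sample rate, Hz
N = int(fs * Tc)   # samples per chirp (800)

R_target = 40.0                        # simulated target range, m
f_beat = 2 * R_target * S / c          # beat frequency for that range (~6.7 MHz)
t = np.arange(N) / fs
beat = np.cos(2 * np.pi * f_beat * t)  # idealized, noise-free beat signal

spectrum = np.abs(np.fft.rfft(beat))   # the "range FFT"
k = int(np.argmax(spectrum))           # strongest range bin
print(k * c / (2 * B))                 # bin width is c/(2B) = 0.15 m -> ~40 m
```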

CPD systems

The child presence detection (CPD) feature detects children left behind inside a vehicle and triggers a warning within seconds. Based on UWB technology with sensing capabilities, the system enables vehicle manufacturers to meet upcoming Euro NCAP safety targets and U.S. regulations planned for 2025.

Leveraging its CoSmA UWB Digital Access solution, Continental AG has developed a CPD system (Figure 2) that also lets drivers use their smartphone as a car key for hands-free access. To detect children left behind, the UWB system acts as a radar, receiving its own transmitted UWB signals reflected back from the micro-motions of an occupant.

By detecting changes in the frequency or phase of the returned signal, the system can measure the distance and velocity of a moving target. Even the tiniest motion, such as the rise and fall of a child’s chest while breathing, can be detected by the sensors. If the child is accompanied by an adult, the CPD system does not raise an alarm.
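
The underlying measurement is a phase shift: a target that moves by Δd changes the round-trip path by 2Δd, shifting the echo phase by 4πΔd/λ. A rough calculation, assuming a mid-band UWB carrier around 7.5 GHz (the exact channel is implementation-specific), shows why a breathing child is detectable:

```python
import math

c = 3e8
f_carrier = 7.5e9           # assumed mid-band UWB carrier (6-8.5-GHz range)
wavelength = c / f_carrier  # ~4 cm

def echo_phase_shift_deg(displacement_m):
    """Phase change of the reflected signal when the target moves by displacement_m."""
    return math.degrees(4 * math.pi * displacement_m / wavelength)

# A 5-mm chest excursion while breathing shifts the echo phase by ~90 degrees,
# easily separable from the static reflections of an empty cabin.
print(echo_phase_shift_deg(0.005))  # 90.0
```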

Trends in LiDAR

LiDAR uses laser pulses to create a 3D map of the surroundings, thus enabling accurate object detection, distance measurement and environment understanding. This information is critical for achieving higher levels of autonomous driving.
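
The distance measurement itself is a time-of-flight calculation: the sensor times each laser pulse from emission to return, and because the pulse travels out and back, d = c·t/2. A one-line sketch:

```python
c = 3e8  # speed of light, m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Time-of-flight ranging: the pulse travels out and back, so d = c * t / 2."""
    return c * round_trip_s / 2

print(tof_distance_m(200e-9))  # a pulse returning after 200 ns hit a target 30 m away
```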

Today, the first Level 3 AVs are commercially available in all three major automotive markets: the U.S., China and Europe. Level 3 is classified as conditional driving automation, meaning the driver must be prepared to take back control of the vehicle upon request; at Levels 4 and 5, the driver can disengage entirely. Despite its higher cost, LiDAR is considered a more accurate sensing technology than radar, providing high-resolution 3D mapping under almost all weather conditions.

A key strategy for bringing the cost of LiDAR down to an acceptable level is product innovation. LiDAR manufacturers are currently focusing their research and development efforts on two areas: the transition from mechanical to solid-state LiDAR and the replacement of conventional discrete designs with integrated semiconductor chips.

Perception software

Perception software combines advanced LiDAR sensors with state-of-the-art AI algorithms. The sensors act as the eyes of the vehicle, perceiving the surroundings and acquiring data from the environment to produce a point cloud.

One example is MicroVision’s MAVIN N, a compact and customizable laser beam scanner that uses perception software to enable object detection, classification and tracking. The device, shown in Figure 3, integrates multiple solutions into a single, low-profile box.

MicroVision has optimized the perception software to process sensor measurements directly on the LiDAR sensor, using a highly efficient system-on-chip to keep power consumption low. Because the sensor-specific perception processing is done on-chip rather than on expensive external ECU hardware, the system architecture is simplified and costs are reduced.

Trends in cameras

ADAS solutions are increasingly reliant on cameras for features such as adaptive cruise control, automatic emergency braking, lane-keeping assistance and traffic-sign recognition. Combining cameras with other sensors such as radar and LiDAR enhances the precision and reliability of these systems.

The integration of cameras and LiDAR sensors in particular combines vision-based detection with depth perception, improving the accuracy of object detection and localization, which is especially critical for autonomous-driving applications.

Surround-view cameras

Cameras with 360° surround view are gaining popularity for their ability to provide a bird’s-eye perspective, aiding in parking and maneuvering. Surround-view cameras provide a complete visual image of the vehicle from above, warning drivers about pedestrians, vehicles or obstacles that may be in the vehicle’s path but not within their immediate line of sight.

A typical surround-view system places multiple wide-angle cameras at strategic positions around the vehicle, usually at the front, at the rear and under the side mirrors. Each camera captures a specific area, and together they cover all directions. Advanced software stitches these images into a bird’s-eye view of the vehicle’s surroundings, which is then displayed on the car’s infotainment screen.
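
At its core, the stitching step is a perspective warp: points on the (assumed flat) road plane in each camera image are remapped to a top-down grid via a homography obtained from calibration. A minimal single-camera sketch with OpenCV, using made-up calibration points and file names:

```python
import numpy as np
import cv2

img = cv2.imread("front_camera.jpg")  # hypothetical frame from the front camera
h, w = img.shape[:2]

# Four road-plane points in the image and their targets in the top-down view;
# in a real system these come from extrinsic calibration, not hand-picked values.
src = np.float32([[w * 0.3, h * 0.6], [w * 0.7, h * 0.6], [w * 0.95, h], [w * 0.05, h]])
dst = np.float32([[100, 0], [300, 0], [300, 400], [100, 400]])

H = cv2.getPerspectiveTransform(src, dst)        # 3x3 homography for the road plane
birdseye = cv2.warpPerspective(img, H, (400, 400))
cv2.imwrite("birdseye_front.jpg", birdseye)      # repeat per camera, then blend
```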

Thermal imaging

When applied to the automotive industry, thermal imaging can be used to enhance the safety of road users. For example, Valeo and Teledyne FLIR LLC are collaborating to deliver the first ASIL B thermal imaging technology for night-vision ADAS.

This system (Figure 4) will use Valeo’s ADAS software stack to provide nighttime functionality for applications such as automatic emergency braking in autonomous, commercial and passenger vehicles. Valeo will integrate Teledyne FLIR’s thermal vision technology to provide comprehensive night vision.

Original article source:

https://www.electronicproducts.com/automotive-safety-drives-advances-in-radar-lidar-and-cameras/

FAQ

  1. Why are radar, LiDAR and cameras important for automotive safety?

Radar, LiDAR and cameras are essential for advanced driver-assistance systems (ADAS) and autonomous vehicles. They provide real-time data about the vehicle’s surroundings, enabling safer navigation, collision avoidance and pedestrian detection.

  2. What role does radar play in vehicle safety?

Radar systems detect objects and their speed by using radio waves. This helps with adaptive cruise control, automatic emergency braking, blind-spot monitoring and lane-change assistance by providing precise measurements of distance and velocity.

  3. How does LiDAR improve automotive safety?

LiDAR (Light Detection and Ranging) uses laser pulses to create detailed 3D maps of the environment. This allows vehicles to perceive objects with high accuracy, making it essential for tasks like obstacle detection, high-precision mapping and enhancing the safety of autonomous-driving systems.

  4. What are the benefits of cameras in vehicle safety systems?

Cameras provide visual data to detect lane markings, traffic signs and pedestrians. They enable features like lane-keeping assistance, traffic-sign recognition and surround-view systems, improving situational awareness for drivers and enhancing vehicle safety.

  5. How are radar, LiDAR and cameras integrated in modern vehicles?

Modern vehicles often use a combination of these sensors in a sensor-fusion approach. By combining the strengths of radar (long-range detection), LiDAR (3D mapping) and cameras (visual context), vehicles can make more accurate decisions in real time, improving overall safety and reliability.

  6. What challenges do radar, LiDAR and camera systems face?

Challenges include sensor cost, integration complexity, environmental sensitivity (e.g., poor visibility or weather) and the need to process large amounts of data. Achieving redundancy across systems to ensure reliable performance is another key concern.

  7. How will future advances in radar, LiDAR and cameras improve automotive safety?

Ongoing improvements in sensor technology will lead to better resolution, faster processing, and lower costs. This will allow vehicles to detect and respond to potential hazards even more quickly, making self-driving cars safer and enhancing the effectiveness of driver assistance features in all vehicles.
