Sensor Simulation

Autonomous vehicles rely on sensors to perceive their surroundings and navigate accurately. To ensure this in every possible situation, vehicles, controllers, and sensors are validated efficiently at early development stages: they are tested in virtual test drives with an immense variety of test cases. Fengco and dSPACE address precisely this requirement by offering “Sensor Simulation”, a complete solution based on simulating a sensor’s physical phenomena and properties.

Sensor Simulation contains models that create a virtual 3D world to represent real objects in the surroundings of the vehicle. It also provides models of camera, radar, and lidar sensors to simulate the perception of this world. For this purpose, the models cover two essential aspects:

  • Sensor front end: The front end is the detection component of a sensor. In a camera, it comprises the lens and the analog-to-digital image converter. A radar front end model contains, for example, the signal modulation and the antenna pattern, whereas a lidar front end model covers the laser and LED characteristics as well as the arrangement of the photodetectors (receiving diodes).
  • Sensor environment: The environment model follows the physical laws of signal propagation, accounting for material properties and channel attenuation. It can simulate all details of the environment that the relevant sensor can detect, including other vehicles, roads, traffic signs, and roadside structures. A minimal configuration sketch of these two aspects follows this list.
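To make the two aspects concrete, the following sketch models a radar front end and an environment as simple data structures. All names, parameters, and default values are hypothetical illustrations; they are not the actual dSPACE Sensor Simulation configuration interface.

```python
from dataclasses import dataclass, field

# Hypothetical structures for illustration only; the actual dSPACE
# Sensor Simulation configuration interface may differ.

@dataclass
class RadarFrontEnd:
    """Detection component of the sensor: modulation and antenna."""
    modulation: str = "FMCW"           # signal modulation scheme
    center_frequency_hz: float = 77e9  # carrier frequency
    bandwidth_hz: float = 1e9          # sweep bandwidth
    antenna_pattern: str = "cosine"    # simplified antenna gain model

@dataclass
class EnvironmentModel:
    """Signal propagation through the simulated 3D world."""
    atmospheric_attenuation_db_per_km: float = 0.1
    materials: dict = field(default_factory=lambda: {
        "asphalt": 0.05,     # reflectivity per material, 0..1
        "metal": 0.95,
        "vegetation": 0.20,
    })

@dataclass
class SensorSimulation:
    front_end: RadarFrontEnd
    environment: EnvironmentModel

sim = SensorSimulation(RadarFrontEnd(), EnvironmentModel())
print(sim.front_end.modulation, sim.environment.materials["metal"])
```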

The sensor models provide sensor-realistic simulation data that can be used to validate functions for autonomous driving, or a subset of them such as data fusion or object detection. For best performance, the sensor models can be executed on dedicated platforms, such as the Sensor Simulation PC, which is equipped with powerful graphics processing units (GPUs).

Seamless Simulation

Sensor Simulation supports the reuse of models and test scenarios on different platforms. Tests created on the developer’s PC can therefore be performed on a simulator or in the cloud, and vice versa, which enables easy and fast scaling of parallelized tests. The entire validation process, from software-in-the-loop (SIL) to hardware-in-the-loop (HIL) simulation, is supported, allowing versatile test methods with real and virtual ECUs or sensors.

Flexible Integration

Sensor Simulation provides flexibility via an application programming interface (API) to support customized solutions. The online postprocessing API is an efficient means of implementing sensor-specific extensions directly in the product, for example, to adapt the output format or to integrate a sensor supplier’s own model. The resulting custom code is executed on the GPU as part of the overall simulation process, so the same code can be used in both the SIL and HIL domains.
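As an illustration of what such a postprocessing step might look like, the following sketch adapts a simulated point cloud to a compact supplier-specific binary format. The function name, the data layout, and the CPU-side NumPy stand-in for GPU execution are assumptions; the real online postprocessing API and its callback signature are product-specific.

```python
import numpy as np

# Hypothetical postprocessing hook, for illustration only.
def postprocess(point_cloud: np.ndarray) -> bytes:
    """Adapt the simulator's output to a supplier-specific wire format:
    quantize positions to millimeters and reflectivity to one byte."""
    # Assumed columns: x, y, z [m], reflectivity [0..1]
    xyz_mm = (point_cloud[:, :3] * 1000.0).astype(np.int32)
    refl_u8 = (point_cloud[:, 3] * 255.0).astype(np.uint8)
    # Pack as a simple binary record stream (12 bytes xyz + 1 byte refl).
    records = [xyz.tobytes() + bytes([r]) for xyz, r in zip(xyz_mm, refl_u8)]
    return b"".join(records)

cloud = np.array([[1.25, -0.50, 0.10, 0.8],
                  [8.00,  2.25, 0.05, 0.3]])
payload = postprocess(cloud)
print(len(payload))  # 2 records x 13 bytes = 26
```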

Camera Module

The Camera Module is designed for the simulation and validation of camera-based assistance and automated driving functions. It allows for injecting data in raw format into the electronics behind the imager chip. The software module enables camera simulation based on light propagation and the measurement principle of the imager chip, providing a realistic, physics-based simulation of camera sensors with a wide variety of configuration options, such as fish-eye effects. The Camera Module supports the simulation of several imager chips and multiple cameras for a 360° view.
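As one example of a configurable lens effect, the sketch below implements an equidistant fish-eye projection, a common lens model; the focal length and principal point are illustrative values, not Camera Module parameters.

```python
import math

# Minimal sketch of an equidistant fish-eye projection; parameter
# values are illustrative assumptions.
def fisheye_project(x, y, z, f_px=320.0, cx=640.0, cy=400.0):
    """Project a 3D camera-frame point to pixel coordinates using the
    equidistant model r = f * theta (a pinhole uses r = f * tan(theta))."""
    theta = math.atan2(math.hypot(x, y), z)  # angle from optical axis
    phi = math.atan2(y, x)                   # azimuth around the axis
    r = f_px * theta                         # equidistant mapping
    return cx + r * math.cos(phi), cy + r * math.sin(phi)

# A point 45 degrees off-axis stays on the sensor under the fish-eye
# model, whereas a pinhole projection would push it much farther out.
print(fisheye_project(1.0, 0.0, 1.0))
```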

Lidar Module

The Lidar Module allows for lidar sensor simulation based on the design principle of the sensor (laser and photodetector). The module provides a physics-based simulation of lidar sensors with a wide variety of configuration options. The environment simulation is based on ray tracing technology. The module allows for injecting lidar data in raw or point cloud format into a processing chain. For each point of a point cloud, different values are calculated, e.g., surface normal vector, azimuth angle, elevation angle, distance, reflectivity, relative velocity, material ID, and optical power. Additionally, the ray chronology of scanning sensors can be simulated and configured. Multiple echoes (e.g., from transmission through semitransparent materials) are also considered.
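The sketch below shows how some of these per-point values (azimuth, elevation, distance, and a simple optical power estimate) could be derived from a hit position. The function, field names, and the 1/R² radiometry are assumptions for illustration, not the Lidar Module's actual computation.

```python
import numpy as np

# Illustrative derivation of per-point lidar attributes; the output
# schema and the radiometry model are assumptions.
def point_attributes(hit_xyz, reflectivity, tx_power_w=1.0):
    """Compute azimuth, elevation, distance, and a toy received optical
    power estimate (free-space 1/R^2 falloff) for one lidar return."""
    x, y, z = hit_xyz
    distance = float(np.linalg.norm(hit_xyz))
    azimuth = float(np.degrees(np.arctan2(y, x)))
    elevation = float(np.degrees(np.arcsin(z / distance)))
    rx_power = tx_power_w * reflectivity / distance**2
    return {"azimuth_deg": azimuth, "elevation_deg": elevation,
            "distance_m": distance, "optical_power_w": rx_power}

print(point_attributes(np.array([10.0, 2.0, 0.5]), reflectivity=0.4))
```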

Radar Module

The module provides a realistic, physics-based simulation of radar signals in a virtual environment with a wide variety of configuration options. The environment simulation is based on ray tracing technology. The module allows for injecting radar data into a processing chain in various formats. In addition, an API can be used to extend the radar simulation with data processing algorithms. By default, two processing templates are shipped with the module. The first returns the channel impulse response of the environment as rays that contain information about the azimuth angle, elevation angle, distance, and relative velocity of hit objects; the reflected electric field is also returned as a polarization matrix. The customer can further process this data, e.g., to generate raw data such as voltages for the analog-to-digital converters (ADCs) of a real sensor.
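The following sketch illustrates the kind of downstream processing this template enables: converting ray responses (distance, relative velocity, amplitude) into baseband samples of an FMCW radar, from which ADC voltages could be derived. The chirp parameters and the simplified signal model are assumptions, not the module's internal implementation.

```python
import numpy as np

# Illustrative FMCW parameters; not the Radar Module's defaults.
C = 299_792_458.0   # speed of light [m/s]
F0 = 77e9           # carrier frequency [Hz]
SLOPE = 30e12       # chirp slope [Hz/s]
FS = 20e6           # ADC sampling rate [Hz]
N_SAMPLES = 512

def beat_signal(rays):
    """Sum one complex sinusoid per ray: the beat frequency encodes
    range (f_b = 2*R*S/c), the Doppler shift velocity (f_d = 2*v*f0/c)."""
    t = np.arange(N_SAMPLES) / FS
    signal = np.zeros(N_SAMPLES, dtype=complex)
    for distance_m, velocity_mps, amplitude in rays:
        f_beat = 2.0 * distance_m * SLOPE / C
        f_doppler = 2.0 * velocity_mps * F0 / C
        signal += amplitude * np.exp(2j * np.pi * (f_beat + f_doppler) * t)
    return signal  # would be quantized to ADC voltages in a real chain

rays = [(30.0, -5.0, 1.0), (75.0, 12.0, 0.4)]  # (R [m], v [m/s], amp)
spectrum = np.abs(np.fft.fft(beat_signal(rays)))
print(spectrum.argmax())  # FFT bin of the strongest target
```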

The second processing template returns a detection list: the radar response of the environment is evaluated, and the detections are sorted into an adjustable four-dimensional grid (three spatial dimensions and one velocity dimension).
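A minimal sketch of such a grid, assuming illustrative bin edges, could sort detections with a four-dimensional histogram:

```python
import numpy as np

# Detections as rows of x [m], y [m], z [m], radial velocity [m/s];
# values and bin edges are illustrative assumptions.
detections = np.array([
    [12.3, -1.5, 0.4,  -3.2],
    [12.5, -1.4, 0.5,  -3.0],
    [40.0,  6.0, 1.0,  10.5],
])

edges = [np.linspace(0, 100, 51),   # x: 2 m bins
         np.linspace(-20, 20, 41),  # y: 1 m bins
         np.linspace(-2, 6, 9),     # z: 1 m bins
         np.linspace(-30, 30, 31)]  # velocity: 2 m/s bins

grid, _ = np.histogramdd(detections, bins=edges)
print(grid.shape)       # (50, 40, 8, 30)
print(int(grid.sum()))  # all three detections fall inside the grid
```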
