Autonomous Vehicles & ADAS

Global automobile manufacturers and technology suppliers rely on dSPACE to make autonomous driving a reality. We provide the required simulation and validation solutions, including SIL, HIL, prototyping, data logging, data replay, data enrichment, sensor realism, scenario-based testing, scenario generation, and data and test management. Our portfolio includes solutions for use on a PC, on a HIL simulator, or in the cloud. We also offer consulting services on request.

End-to-End Development and Test Environment

To help you put the idea of autonomous driving on the road, we offer comprehensive solutions and services for data-driven development and validation. This ensures seamless, efficient data processing at all development stages, from data logging to release and sign-off tests.

  • Data logging: robust in-vehicle data logging system with outstanding performance to record sensor raw data and vehicle bus data 
  • Data enrichment: automated data anonymization and data annotation (ground truth) with best-in-class quality 
  • Scenario generation: automated generation of simulation scenarios from sensor raw data or object lists 
  • Sensor-realistic simulations: highly realistic, physics-based simulation of camera, lidar, and radar sensors 
  • Data replay: time-synchronous replay of sensor raw data and vehicle bus data with exceptionally high streaming bandwidth 
  • Scenario-based testing: validating perception, fusion, and planning algorithms for autonomous driving by automatically performing and evaluating millions of tests 
  • Data and test management: central management of simulation and test data (including variant handling and workflow management) 
  • Simulation platform: end-to-end solution for SIL, HIL, and large-scale simulation in the cloud 
  • Release testing: planning the validation and verification strategy according to ISO 26262 and ISO/PAS 21448 (SOTIF) to achieve homologation using optimized processes 
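As a rough illustration of scenario-based testing, the sketch below sweeps a hypothetical cut-in scenario over a small parameter grid and checks each variant against a simple safety metric. All names (`Scenario`, `min_gap_after_brake`), the 2 m safety margin, and the 6 m/s² braking deceleration are illustrative assumptions, not part of any dSPACE API; production toolchains sweep millions of variants against far richer evaluation criteria.

```python
from dataclasses import dataclass
from itertools import product

@dataclass
class Scenario:
    """Hypothetical cut-in scenario, parameterized for a test sweep."""
    ego_speed_mps: float      # ego vehicle speed
    gap_m: float              # initial gap to the cutting-in vehicle
    cut_in_speed_mps: float   # speed of the cutting-in vehicle

def min_gap_after_brake(s: Scenario, ego_decel_mps2: float = 6.0) -> float:
    """Closed-form worst-case gap: ego brakes at constant deceleration
    while the cutting-in vehicle continues at constant speed."""
    closing = s.ego_speed_mps - s.cut_in_speed_mps
    if closing <= 0:
        return s.gap_m  # no closing speed, the gap never shrinks
    t_match = closing / ego_decel_mps2  # time until speeds match
    shrink = closing * t_match - 0.5 * ego_decel_mps2 * t_match ** 2
    return s.gap_m - shrink

def run_sweep():
    """Run every parameter combination and record pass/fail."""
    results = []
    for v_ego, gap, v_cut in product([20, 25, 30], [15, 25, 40], [15, 20]):
        s = Scenario(v_ego, gap, v_cut)
        passed = min_gap_after_brake(s) > 2.0  # require a 2 m safety margin
        results.append((s, passed))
    return results

if __name__ == "__main__":
    res = run_sweep()
    print(sum(p for _, p in res), "of", len(res), "scenario variants passed")
```

In a real toolchain the metric would be evaluated on simulation traces rather than a closed-form formula, but the pattern — parameterize, sweep, evaluate automatically — is the same.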

Chain of Effects

The chain of effects in autonomous driving generally consists of several processing stages. First, the sensors' raw data is preprocessed (perception). The goal is to detect features, static and dynamic objects, and free spaces in the vehicle's environment on the basis of individual images or reflection points. In the subsequent stage, these results are merged into a consistent environment model (data fusion). For this, time synchronization and correlation of the sensor data are important. In addition, the exact location and lane position of the vehicle must be known, based on a high-definition map (localization).

Based on the environment model, the situation around the vehicle is analyzed, potential driving trajectories are planned, a decision for a specific maneuver is made, and longitudinal and lateral control are executed.
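The chain of effects described above can be sketched as a minimal processing pipeline. The stages below (`perceive`, `fuse`, `plan_and_control`) are deliberately trivial stand-ins for what are, in reality, complex algorithms; every name and the 2-second headway rule are illustrative assumptions, not a dSPACE interface.

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    obj_id: int
    x_m: float    # longitudinal position relative to the ego vehicle
    v_mps: float  # longitudinal speed

def perceive(raw_frames: dict) -> list:
    """Perception: detect objects from raw sensor data.
    Stubbed here; a real system runs detectors on images or point clouds."""
    return [DetectedObject(i, f["x"], f["v"])
            for i, f in enumerate(raw_frames["objects"])]

def fuse(detections: list) -> list:
    """Fusion: merge time-synchronized detections into one consistent
    environment model (trivially: a pass-through, sorted by range)."""
    return sorted(detections, key=lambda d: d.x_m)

def plan_and_control(env_model: list, ego_speed_mps: float) -> str:
    """Situation analysis, planning, and longitudinal control, reduced to
    a constant-headway rule on the closest object ahead."""
    ahead = [o for o in env_model if o.x_m > 0]
    if not ahead:
        return "cruise"
    desired_gap_m = 2.0 * ego_speed_mps  # 2-second time headway
    return "brake" if ahead[0].x_m < desired_gap_m else "cruise"

if __name__ == "__main__":
    env = fuse(perceive({"objects": [{"x": 30.0, "v": 10.0},
                                     {"x": 12.0, "v": 8.0}]}))
    # Closest object ahead is 12 m away, desired gap is 30 m at 15 m/s.
    print(plan_and_control(env, ego_speed_mps=15.0))
```

Validation tools exercise exactly this chain: stimulating it with simulated or replayed sensor data at the front and checking the control decisions that come out at the back.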

Sensors in the Simulation

A detailed and comprehensive simulation of the real world is the basis for successful validation. Suitable sensor models and the integration of real sensors into the test environment play an important role here. The range of sensor models extends from technology-independent variants, which generate object lists directly from the information provided by the environment model, to phenomenological and physics-based models, which are typically calculated on a high-performance GPU and feed raw data to the connected real sensors, such as cameras or radars.

There are different integration options for sensors, depending on the type of data and the layer to be stimulated. These options extend to direct stimulation of the sensor front end, either over the air (e.g., for radar) or via RF cable with GNSS (Global Navigation Satellite System) or V2X (Vehicle-to-X) signals. Using the real sensors in the test environment is often indispensable, since the signal preprocessing, the sensor data fusion, and the creation of the environment model in the sensor's control unit have a deep impact on the chain of effects.
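The simplest, technology-independent end of that model range — generating an object list directly from ground-truth simulation data — can be sketched as a range and field-of-view filter. The class, the function name, and the default parameters are illustrative assumptions, not a dSPACE interface; phenomenological and physics-based models add effects such as occlusion, noise, weather, and material reflectivity on top of this.

```python
import math
from dataclasses import dataclass

@dataclass
class GroundTruthObject:
    """Object position known exactly from the simulated environment model."""
    x_m: float   # forward of the sensor
    y_m: float   # left of the sensor

def ideal_object_list(objects, max_range_m=150.0, fov_deg=120.0):
    """Technology-independent ('ideal') sensor model: pass through the
    ground-truth objects that fall inside the sensor's range and
    horizontal field of view."""
    half_fov_rad = math.radians(fov_deg) / 2.0
    visible = []
    for o in objects:
        rng = math.hypot(o.x_m, o.y_m)         # distance from the sensor
        bearing = math.atan2(o.y_m, o.x_m)     # angle off the sensor axis
        if rng <= max_range_m and abs(bearing) <= half_fov_rad:
            visible.append(o)
    return visible

if __name__ == "__main__":
    objs = [GroundTruthObject(50.0, 0.0),    # straight ahead, in range
            GroundTruthObject(200.0, 0.0),   # beyond maximum range
            GroundTruthObject(10.0, 50.0)]   # outside the field of view
    print(len(ideal_object_list(objs)), "object(s) visible")
```

Because such a model skips raw-data generation entirely, it runs in real time on a standard CPU, which is why it is the typical choice for early closed-loop tests of fusion and planning functions.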

Get in Touch!