A Blueprint for New ADAS/AD Test Strategies

Scenario-based Open Road Testing

This user journey describes open road testing as it is used for the validation of ADAS functionality. The term open road refers to the fact that the system under test (SUT) is assessed in an open environment on public roads. On the one hand, this makes it very difficult to keep all test parameters in a desired state (especially the environment and the behavior of other vehicles); on the other hand, it allows the SUT to be tested under diverse and unknown environmental conditions.

Open road testing is used at the end of the V-cycle of a component with ADAS functionality. Beforehand, the basic functionality of the SUT is ensured by other test methods, such as proving-ground testing, which can control all test parameters and reproduce identical test conditions if necessary. Open road testing, on the other hand, can test the SUT with broad coverage of different and complex environmental situations. Moreover, it allows the tester to assess the vehicle under test (VUT) in the real world with all its unforeseeable circumstances, and in this way gives a very good idea of how the SUT would behave in production vehicles. Open road testing gives feedback on functionality in unknown scenarios, and later analysis can reduce the number of unknown/unsafe scenarios for the SUT.

User Journey
At the beginning, the user defines tests by writing test specifications. Open road tests can be divided into four categories:

  • Completely free test drives with little or no specification to examine the function of the SUT on the open road
  • Test drives with unspecific and semiformal driving conditions, using an operational design domain (ODD) or driving instructions from test protocols such as NCAP. The specification may include only the general setting (urban/rural/motorway) or an abstract definition of junction situations.
  • Abstract test specifications including an ODD or abstract scenario descriptions (comparable to functional and logical scenarios from the PEGASUS definition)
  • Concrete test specifications including concrete scenarios. This type is difficult to realize due to the uncontrollable environmental conditions in open road testing

In the latter two categories a specific route and time for the test drive can be selected to include specific junction scenarios and to determine weather and traffic conditions. In addition to the general setting, the test specifications define the variables and parameters of interest which must be recorded during the test drive.
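The specification categories and the recording requirements above can be sketched as a simple data model. This is only an illustration; the class and field names are invented here and are not part of any ASAM standard.

```python
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List, Optional


class SpecCategory(Enum):
    """The four open road test categories described above."""
    FREE_DRIVE = auto()   # little or no specification
    SEMIFORMAL = auto()   # ODD or driving instructions from test protocols
    ABSTRACT = auto()     # ODD or abstract (functional/logical) scenarios
    CONCRETE = auto()     # concrete scenarios; hard to realize on open roads


@dataclass
class OpenRoadTestSpec:
    """Minimal test specification: general setting plus signals to record."""
    category: SpecCategory
    setting: str                       # e.g. "urban", "rural", "motorway"
    route: Optional[str] = None        # selectable only in the latter two categories
    time_window: Optional[str] = None  # chosen to hit desired weather/traffic
    recorded_signals: List[str] = field(default_factory=list)

    def allows_route_selection(self) -> bool:
        # Per the text, a specific route and time can be selected
        # only for abstract and concrete specifications.
        return self.category in (SpecCategory.ABSTRACT, SpecCategory.CONCRETE)


spec = OpenRoadTestSpec(
    category=SpecCategory.ABSTRACT,
    setting="urban",
    route="junction-heavy inner-city loop",
    recorded_signals=["speed", "acceleration", "steering_angle"],
)
print(spec.allows_route_selection())  # True
```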

It is important to emphasize that in most cases it is sufficient to define only the operational design domain of the tests instead of using scenarios. This allows a more general view of the SUT and of the different situations that can occur during test execution. ASAM will soon (November 2021) release an ASAM OpenODD concept paper. Regulations such as the regulation for automated lane keeping systems (ALKS) often play a big role when designing open road tests. In this case, instead of an ODD or a scenario description, the regulation itself provides driving instructions and thereby defines the open road test.
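Conceptually, an ODD can be treated as a set of constraints on environment attributes, and each observed driving situation either lies inside or outside it. The sketch below illustrates this idea only; the attribute names and values are invented and do not follow ASAM OpenODD syntax.

```python
# Minimal sketch of an ODD as allowed value sets per attribute.
# Attribute names and values are illustrative, not ASAM OpenODD syntax.
ODD = {
    "road_type": {"motorway"},
    "weather": {"clear", "rain"},
    "speed_limit_kmh": range(60, 131),
}


def within_odd(conditions: dict, odd: dict = ODD) -> bool:
    """True if every observed condition lies inside the ODD's allowed set."""
    return all(conditions.get(key) in allowed for key, allowed in odd.items())


print(within_odd({"road_type": "motorway", "weather": "rain", "speed_limit_kmh": 120}))  # True
print(within_odd({"road_type": "urban", "weather": "clear", "speed_limit_kmh": 50}))     # False
```

During later analysis, the same predicate can flag recorded situations that fall outside the specified ODD, which is one way the unknown/unsafe scenario space mentioned above gets reduced.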

After the specification phase, the SUT is assessed in the predefined setting; typical test drives last around an hour. The main limit on drive duration is the high rate of data generation, which at some point exceeds the capacity of the on-board recorder. Depending on the use case, further drives are planned and executed to generate enough data for coverage. In addition to all the vehicle data, the behavior of the SUT is monitored and all associated data is recorded for later analysis.

Usually, the recorded data is not preselected during the drive; instead, all data is recorded on an SSD with large capacity (around 10 TB). This data contains raw data from sensors (camera, lidar, and radar if needed), which takes up most of the space, as well as vehicle data such as speed, acceleration, and steering angle. Additionally, localization data, e.g. from GPS devices, is recorded. One possibility to reduce the amount of recorded data is to restrict recording to relevant situations; this approach is investigated by the KIsSME research project.
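Because raw sensor streams dominate the data volume, the roughly one-hour drive limit follows from simple capacity arithmetic. Only the ~10 TB disk size comes from the text; the per-sensor data rates below are illustrative assumptions.

```python
# Illustrative data rates in MB/s; only the ~10 TB disk size is from the text.
rates_mb_per_s = {
    "camera": 1500.0,      # several raw high-resolution video streams
    "lidar": 400.0,
    "radar": 50.0,
    "vehicle_bus": 1.0,    # speed, acceleration, steering angle, ...
    "gps": 0.1,
}

disk_tb = 10
disk_mb = disk_tb * 1_000_000           # decimal TB -> MB
total_rate = sum(rates_mb_per_s.values())

hours_until_full = disk_mb / total_rate / 3600
print(f"recorder full after ~{hours_until_full:.1f} h")  # ~1.4 h
```

Under these assumed rates the recorder fills after roughly an hour and a half, which matches the typical drive duration stated above.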

After the test drive the raw data is processed and enriched. As an example, objects are identified from raw sensor data and tracked (if not already done on board) and the associated road network description is created and linked to the localization data of the vehicle.
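One enrichment step named above is linking the vehicle's localization data to a road network description. A minimal sketch of that linking is a nearest-segment lookup; the toy network below is invented, and real pipelines would link against an ASAM OpenDRIVE road description instead.

```python
import math

# Toy road network: segment id -> (x, y) reference point in a local metric
# frame. Invented for illustration; real data would come from ASAM OpenDRIVE.
road_segments = {
    "junction_A": (0.0, 0.0),
    "main_road": (50.0, 10.0),
    "side_street": (20.0, 80.0),
}


def link_to_road_network(position):
    """Attach the closest road segment to a recorded vehicle position."""
    x, y = position
    return min(
        road_segments,
        key=lambda seg: math.hypot(road_segments[seg][0] - x,
                                   road_segments[seg][1] - y),
    )


print(link_to_road_network((48.0, 12.0)))  # main_road
```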

As a next step the data is preselected in different ways:

  • Focus on relevant situations: Relevant situations that assess the functionality of the SUT are identified. Data associated with these situations are selected by time
  • Focus on test specifications: Situations that satisfy conditions of previously defined tests are selected by time
  • Focus on data type: Data which are needed for the evaluation of a specific test or situation are selected by type (e.g. just raw data from a specific sensor if this sensor is to be examined)
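The selection by time and by type can be sketched as two simple filters over the recorded stream. The frame layout below is a toy stand-in; in practice the recordings would live in ASAM MDF files.

```python
# Toy recorded frames: (timestamp_s, channel, payload). Invented layout;
# real recordings would typically be stored as ASAM MDF files.
frames = [
    (0.0, "camera", "..."),
    (0.5, "lidar", "..."),
    (1.0, "camera", "..."),
    (1.5, "speed", 13.4),
    (2.0, "camera", "..."),
]


def select_by_time(frames, start, end):
    """Focus on a relevant situation: keep frames inside a time window."""
    return [f for f in frames if start <= f[0] <= end]


def select_by_type(frames, channels):
    """Focus on a data type: keep only channels needed for evaluation."""
    return [f for f in frames if f[1] in channels]


cutout = select_by_time(frames, 0.8, 1.6)         # a relevant situation
camera_only = select_by_type(cutout, {"camera"})  # just one sensor of interest
print(len(cutout), len(camera_only))              # 2 1
```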

These steps result in a final data set that exactly defines concrete test specifications. By analyzing the VUT data recorded during the test drive, the associated test results are generated. Together they form concrete test cases which give valuable feedback on the functionality of the VUT during the open road test drive.

The final test data might be used to generate standardized scenario files which then can be used as an input for simulation-based testing.
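An export of a selected situation to a scenario file could look roughly like the sketch below. The element names are a simplified, illustrative subset inspired by ASAM OpenSCENARIO; a real export would have to follow the full schema.

```python
import xml.etree.ElementTree as ET


def export_scenario(ego_speed_mps: float, description: str) -> str:
    """Serialize a detected situation as a minimal scenario-like XML file.

    The tags below are a simplified subset inspired by ASAM OpenSCENARIO,
    not a schema-valid document.
    """
    root = ET.Element("OpenSCENARIO")
    ET.SubElement(root, "FileHeader", description=description)
    entities = ET.SubElement(root, "Entities")
    ET.SubElement(entities, "ScenarioObject", name="Ego")
    init = ET.SubElement(root, "Init")
    ET.SubElement(init, "SpeedAction", value=str(ego_speed_mps))
    return ET.tostring(root, encoding="unicode")


xml_text = export_scenario(22.2, "cut-in during rain, extracted from a drive")
print(xml_text)
```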

Depending on the outcome of the analysis, more test drives are planned with adjusted settings. If, for example, the VUT performed especially badly in specific weather conditions (such as rain) combined with a specific traffic situation (such as a traffic jam), more test drives with a high probability of this combination are planned and executed. The variability of open road testing allows flexible readjustment of the testing in progress, together with the advantage of very realistic test conditions.
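Identifying the condition combination that most warrants additional drives can be as simple as counting failures per combination, as in this sketch. The result tuples and labels are invented for illustration.

```python
from collections import Counter

# Toy analysis results: (weather, traffic, passed). Labels are illustrative.
results = [
    ("rain", "traffic_jam", False),
    ("rain", "traffic_jam", False),
    ("rain", "free_flow", True),
    ("clear", "traffic_jam", True),
    ("clear", "free_flow", True),
]

# Count failed assessments per (weather, traffic) combination and target
# the worst-performing one when planning follow-up drives.
failures = Counter((w, t) for w, t, passed in results if not passed)
worst = failures.most_common(1)[0][0]
print(worst)  # ('rain', 'traffic_jam')
```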

Requirements concerning Test Specifications


  • Requires scenario: Optional, depending on the use case
  • Requires hardware: Test vehicle
  • Requires coordination of tools and models: Yes; mostly analysis of recorded data (enrichment and preselection/analysis of data)
  • Requires exchangeability among test instances: Not required, but beneficial
  • Configuration of stubs/drivers/mock-ups: Similar to a production vehicle, since testing takes place at the end of the V-cycle; a test driver for operation is required
  • Kind of interface between test and scenario: Usage of an operational design domain or, optionally, abstract scenarios at the beginning to define the setting of the test; possible usage of concrete scenarios or ODD descriptions as test artifacts or as a generated result of the test drive
  • Covered by best practices, regulations, and standards: ISO OTX (test sequences), ASAM MDF (recorded data), ASAM OpenODD, optionally ASAM OpenDRIVE (road networks) and ASAM OpenSCENARIO for part of the scenario descriptions