

Revision as of 06:45, 2 February 2018 (added TESIS DYNAware GmbH as interested party in the OpenADx testbed)

Welcome to OpenADx

Automated Driving (AD) is clustered into three equally important technology areas:

1) In-vehicle technology

2) Cloud technology (backend)

3) Design, development, test and validation tools (tool chain)

OpenADx is focused on the AD tool chain. The goal is to accelerate AD development through open collaboration and open source.

OpenADx's vision is to ensure transparency and make the complex AD tool landscape more easily accessible for enterprise users.


AD is a complex challenge and therefore requires a multifaceted development process incorporating a variety of software tools. The tools the industry currently uses are very good, but they do not work seamlessly with one another, because they were never designed to work together. This industry-wide issue slows us down in the race to AD development. By pooling resources, we can remove the "friction" between widely used tools and create something of use to all of us: open, compatible and accessible.

[Image: Automated driving is a complex challenge]

Problem and benefits for OEMs and Tier1s

User insight: "Developing automated driving functions is extremely complicated and requires the use of many complex software tools which do not work efficiently with one another. What I need is a set of tools which work with each other seamlessly so that my teams can move through the development process more quickly and efficiently."

Benefit: The automated driving tool chain allows your team to work together more efficiently with a suite of highly integrated tools by enabling seamless transfer of data and code through each step of the automated driving development process.

Problem and benefits for tool and technology providers

User insight: "Currently, tools used to create automated driving applications do not work efficiently with one another. If our tool/technology is compatible with other widely used technologies and tools, it will ease the development process for our customers and make our products even more attractive to them."

Benefit: The seamless integration of your technology in the automated driving tool chain makes it more attractive to organizations developing automated driving applications by increasing their development efficiency.

Integrated tool chain for AD system development

Leveraging the current tool landscape and tying in players from industry and academia is a must. Therefore, our approach is two-fold. First, we will fine-tune the development tool chain to the needs of our industry. We do this by integrating existing products in the market, adjusting existing tools to our needs, and developing additional tools through Open Source Software (OSS) where none currently exist. Second, we will bring areas of expertise together in order to make the complex AD tool landscape more easily accessible for all stakeholders.

[Image: Integrated tool chain for AD system development]


We believe an initiative like this should be inclusive, not exclusive. It’s about removing barriers to efficient development with widely established tools. It’s about bundling industry competencies and sharing development. We plan to demonstrate our ability to work together on joint testbeds in an open source setting. This allows potential partners to engage with a limited initial investment. The testbeds produce demonstrable results and strengthen confidence in the approach.

The Idea of Testbeds

Testbeds are set up to produce demonstrable results that incubate potential open source projects. To realize a testbed, the idea is to prepare a use case/topic in a series of workshops and then to execute so-called Hack-Fests, which assemble developers from the cooperation partners for a defined period of time, e.g. 3-4 days, in which they build a demonstrator or prototype.

To identify testbed candidates, everybody is invited to propose ideas here. These proposals form the starting point for developing an idea towards the requirements for executing a Hack-Fest and for winning further interested parties.

To execute a Hack-Fest, we have identified these minimum requirements to make this a fruitful event:

  • A minimum of two partners collaborating on the testbed, be it companies, universities or research organizations
  • A minimum of 5 committed participants

Workflow for testbed candidates

  1. Formulation of a task in use-case form that has potential for further work
  2. Contribution of the use cases by interested partners as testbed candidates
  3. Preparation of the testbed candidates in one or a series of workshops to a state that is sufficient for the participants of a Hack-Fest to produce results
  4. Execution of the Hack-Fest
  5. Evaluation of the Hack-Fest results to decide on whether to follow the idea or to stop the effort
  6. Reworking the Hack-Fest results to build a contribution either to an existing project or as an initial contribution of a new open source project

Who We Are and How to Join

The initiative is still in an early stage, but more than twenty organizations have already shown interest. Now that we have a public website, we want to give newcomers an understanding of who is involved and how to interact with us.

Interested Parties

Please add the name of your organization if you are interested in OpenADx, or ask us to add it for you.

  • Bosch
  • Microsoft
  • TESIS DYNAware GmbH
  • itemis
  • German Aerospace Center (DLR)
  • Dassault Systemes (3DS)
  • MathWorks
  • Elektrobit
  • Renesas
  • CEA
  • ZF Friedrichshafen AG


  • We have a mailing list: subscribe for news and discussions.
  • We have workshops with introduction sessions and have just started to work on concrete testbeds to identify topics that we agree to collaborate on. Currently these workshops are weekly telecons. Please check the mailing list for invitations, or ask questions regarding content or participation.

Testbed Candidates

Candidate 1: Simulation


The challenge in the simulation of AD functions is the large number of test cases that can easily be generated by varying parameters. This volume of test cases should be handled by executing test cases in parallel across multiple instances of the simulation environment.
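The parameter-variation idea can be sketched as follows. This is a hedged illustration in Python: the function name `run_simulation`, the braking-scenario parameters, and the pass/fail rule are invented for the example and are not part of any OpenADx tool.

```python
# Hypothetical sketch: generate test cases by parameter variation and
# run them in parallel. All names and parameters are illustrative.
import itertools
from concurrent.futures import ProcessPoolExecutor

def run_simulation(case):
    """Stand-in for one simulation instance executing a single test case."""
    speed, distance, friction = case
    # A real instance would drive the actual simulation tools; here we
    # compute a dummy verdict: braking succeeds if the distance suffices.
    stopping_distance = speed ** 2 / (2 * friction * 9.81)
    return case, stopping_distance <= distance

# Varying three parameters already yields a full test-case matrix.
speeds = [10, 20, 30]          # m/s
distances = [40, 80, 120]      # m
frictions = [0.4, 0.7, 1.0]    # road friction coefficient
cases = list(itertools.product(speeds, distances, frictions))

if __name__ == "__main__":
    # Each worker process plays the role of one simulation instance.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(run_simulation, cases))
    failed = [case for case, ok in results if not ok]
    print(f"{len(cases)} cases, {len(failed)} failed")
```

In a real setup the worker would launch a containerized simulation environment per case rather than a local function, but the fan-out structure stays the same.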

Use Case Step 1

The target for step 1 of the testbed is the easy integration of simulation tools and the function under test, making it possible to feed a test scenario into the simulation and to get data out for further analysis and visualization.

Acceptance Criteria:

  • The simulation can easily be set up, necessary functionality for integrating and coordinating the different components of the simulation is available.
  • Sinks for simulation data, both from the individual components and from the communication between them, can be attached, and the data can be logged and processed during the simulation.
  • The framework allows the integration of multiple simulation tools and offers a standard communication mechanism. New tools can be integrated by providing a connector between the communication mechanism and the tool.
  • The simulation execution setup is prepared for containerization in order to support easy setup and execution in parallel on a scalable hardware platform without extensive installation and maintenance efforts.

Goal for Step 1: Create demonstrators that combine a simple function under test provided by Bosch with the different simulation tools, which come with a relevant plant model. For scenarios to demonstrate, it is planned to look into Pegasus, with the potential addition of traffic simulated by Eclipse SUMO. The scenario should be based on a closed-loop simulation that uses a model of identified objects as input for the control algorithm.

Base Architecture Step 1

The following picture shows the basic idea for the realization of step 1.


The base decision is to use DDS as the central messaging service between the simulation components. The DDS implementation has to be chosen to be ROS2-compatible in order to use the ROS2 mechanisms for data access. DDS/ROS2-compliant components are connected directly to the DDS layer, whereas simulation components not compatible with DDS are attached using a dedicated connector between the component and the DDS API.
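The connector idea can be sketched as follows. This is a hedged illustration: the `InMemoryBus` below is a toy stand-in for the DDS layer (a real setup would use a ROS2-compatible DDS implementation), and `LegacyTool` with its topics is invented for the example.

```python
# Minimal sketch: a simulation tool that does not speak DDS is attached
# to the message bus through a dedicated connector. The bus here is an
# in-memory stand-in for DDS; topic names and the tool are invented.
from collections import defaultdict

class InMemoryBus:
    """Toy publish/subscribe bus playing the role of the DDS layer."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers[topic]:
            callback(message)

class LegacyTool:
    """Stand-in for a simulation tool with its own proprietary API."""
    def step(self, ego_speed):
        return {"object_distance": 50.0 - ego_speed * 0.1}

class LegacyToolConnector:
    """Translates between the tool's native API and bus topics."""
    def __init__(self, bus, tool):
        self.bus, self.tool = bus, tool
        bus.subscribe("ego/state", self.on_ego_state)

    def on_ego_state(self, msg):
        # Drive the tool once per incoming state, publish its output.
        result = self.tool.step(msg["speed"])
        self.bus.publish("perception/objects", result)

bus = InMemoryBus()
LegacyToolConnector(bus, LegacyTool())
received = []
bus.subscribe("perception/objects", received.append)
bus.publish("ego/state", {"speed": 20.0})
```

Online visualization or introspection tools would attach to the same bus in exactly the same way as the connector's subscriber does here.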

The Simulation Framework is essentially a collection of support functionality that provides the glue between the participating components. It offers the connectors mentioned above, but also time management and other functionality needed to run the simulation smoothly. This component is the main target of the testbed's step 1, since it contains the functionality that is needed but not provided by existing tools.

On top of these base functionalities sit the simulation tools and the function under test, which form the core of the simulation, i.e., the representation of the environment and the control algorithms that observe or even act in this environment. Both provide information on the simulation state, in addition to the ROS2 means that record the communication between the components.

All recorded data is stored in defined measurement data formats for further processing. This is represented by the Measurement Data component in the picture, which is essentially an interface to testbed candidate 2, i.e., storing and further processing the recorded data in offline analysis.

For online analysis, tools for visualization or introspection can connect to the DDS communication in the same fashion as the simulation components. This connection also allows controlling the simulation, e.g., for debugging purposes. The mechanisms rely on the possibilities of ROS2 for interacting with the simulation components.

Tools currently under investigation for use in the demonstrator:

  • DDS - implementation (tbd)
  • Matlab/Simulink
  • DYNA4
  • AirSim
  • Gazebo
  • Dymola (Modelica) ?

Topics for Step 2

Step 1 creates a simulation environment that can run "out of the box" with simple configuration. A logical next step is to embed this environment in a management framework that manages scenarios and test cases, and that also supports gathering the results and evaluating whether important test criteria have been met or test cases have failed, at large scale and in parallel.


Relevant technologies for this testbed are

Candidate 2: Massive data ingest and management


Developing and validating AD functions requires the collection of data from various sensors, actuators and other sources. This ideally takes place in a fleet operation. However, due to the required scale and diversity of measurement systems, the AD community faces a massive amount of complex and heterogeneous data.

Descriptions / Goal

This challenge, i.e. AD data, exhibits all aspects of 'Big Data': it has massive volume, large variety, is recorded at high velocity, and usually exists in various formats. However, most available automotive systems for handling data captured in such an environment cannot process this data in an efficient workflow. Additionally, there is a painful gap between measurement systems and tools on the one hand, and data analytics and processing platforms on the other.

To solve this challenge, a common, open-source way of handling measurement data is needed. This should ideally take the form of common ETL processes that fill the gap between the heterogeneous measurement data and a suitable backend that already serves as an industry standard for data analysis.

This solution should incorporate the various formats, such as ADTF's .dat, ROS' .bag, Vector's .mdf, ...
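The extract step of such an ETL process can be sketched as a dispatch on the measurement file's format, normalizing each record into one unified schema. This is a hedged illustration: the reader functions below are placeholders (real ones would use the respective ADTF, rosbag and MDF libraries), and the unified record fields are invented for the example.

```python
# Hedged sketch of the extract step: pick a reader by format and emit
# records in one unified schema. Readers are placeholders only.
def read_dat(path):
    # Placeholder for an ADTF .dat reader.
    yield {"t": 0.0, "channel": "camera", "value": b"..."}

def read_bag(path):
    # Placeholder for a ROS .bag reader (e.g. via the rosbag package).
    yield {"t": 0.0, "channel": "/lidar/points", "value": b"..."}

def read_mdf(path):
    # Placeholder for a Vector .mdf reader.
    yield {"t": 0.0, "channel": "CAN.speed", "value": 13.9}

READERS = {".dat": read_dat, ".bag": read_bag, ".mdf": read_mdf}

def extract(path):
    """Dispatch on the file extension and emit unified records."""
    suffix = path[path.rfind("."):]
    for record in READERS[suffix](path):
        # Attach provenance so the original format stays traceable.
        yield {**record, "source": path}

records = list(extract("drive_2018_01_31.bag"))
```

The transform and load steps would then map these unified records into the target backend (e.g. a Hadoop environment, as suggested in work package 2 below).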

Suggested work packages

Work package 1 - Problem space

Understand all necessary aspects to clearly define a common problem space. The description of the challenge above may cover only some parts and reflects only the perspective of one stakeholder.

Work package 2 - ETL

Start a sample ETL implementation that provides an efficient ingest of ROS .bag files into a Hadoop environment.

Work package 3 - Data management

Evaluate what is needed to manage this massive, heterogeneous and globally distributed data.


Bosch Connected World, Feb. 21st and 22nd, 2018 (Berlin, Germany) - Upcoming

Meet us at Bosch Connected World. We will provide the possibility to experiment with OpenADx tooling in the Bosch Connected Experience, the hackathon at BCW, and we will be around for further discussion of the topics.

Workshop Testbed 1, Jan. 31st, 2018 (Stuttgart, Germany) - Current State of Planning

In this one-day event, we will discuss the tools available for the demonstrator and plan the realization, i.e.:

  • Identify the use cases for the demonstrator
  • Decide on the data used in the demonstrator
  • Identify things to be implemented in the different tools and in general
  • Plan the timeframe and identify preparation tasks for the participating companies and organizations

Workshop 3, Oct. 23rd, 2017 (Ludwigsburg, Germany) - Results

The workshop took place on the Unconference day of Eclipse Con Europe 2017.


  • 08:30 - 09:00: Welcome Coffee and Registration
  • 09:00 - 12:30: Automotive Unconference including an introduction of OpenADx
  • 12:30 - 14:00: Lunch and Poster Session
  • 14:00 - 17:30: Working sessions on OpenADx testbed candidates
  • 17:30 - 18:30: Meet & Greet

Sessions for testbed candidates:

Testbed candidate 1 (simulation), Moderator: Lars Geyer-Blaumeiser (Bosch)


  • Lars Geyer-Blaumeiser, Bosch Software Innovations
  • Thomas Titze, Microsoft
  • Maximilian Chucholowski, TESIS DYNAware GmbH
  • Diego Barral, MathWorks
  • Joachim Stroop, dSpace
  • Boutheina Bannour, CEA
  • Michael Langhammer, ITEMIS
  • Adam Fuhl, Bosch
  • Stefan Griesche, Bosch
  • Johannes Jeising, Bosch
  • Gerald Stieglbauer, AVL
  • Andreas Holzner, TNG
  • Andreas Graf, ITEMIS
  • Loic Cantat, SYSTEMX
  • Christof Hammel, Bosch

Identified and classified technologies:

  • Category Simulation Input:
    • Open Drive
    • Open CRG
    • Open Scenario
    • Oddlot
    • Pegasus
    • Matlab
    • Diversity Testcase Generator
    • Enable S3
  • Category Simulation:
    • Matlab/Simulink
    • VTD
    • OpenPass
    • DYNA4
    • CarMaker
    • Gazebo
    • Sumo
    • AirSim
    • Dymola (Modelica)
  • Category Data Evaluation:
    • Matlab
    • Diversity Test Verdict Computation (Log Analysis)
    • ROS
  • Category Visualization:
    • DYNA4 / Unity
    • AirSim / Unreal
    • ROS/Rviz
  • Category Simulation Framework:
    • Test Management Tools
    • FMU
    • Open Simulation Interface
    • DDS
    • ROS
    • Franca IDL
    • Arcosar
    • Autosar Adaptive

Possible Use Cases for Testbed Simulation:

  • Open loop simulation with recorded sensor data, DuT: Perception
  • Closed loop simulation without artificial sensor data, DuT: Planning
  • Closed loop simulation with artificial sensor data, DuT: Whole Car/System

Decision: First use case is closed loop simulation without artificial sensor data

Focus Technologies for Testbed Simulation:

  • Simulation Input: Open Drive, Open Scenario
  • Simulation: DYNA4, Matlab/Simulink, AirSim, Gazebo
  • Simulation Evaluation: Matlab
  • Visualization: AirSim, DYNA4, Rviz
  • Simulation Framework: DDS, ROS

Technology selection is furthermore based on the participation of interested parties.

Known contact persons:

  • DYNA4: TESIS, Maximilian Chucholowski
  • AirSim: Microsoft, Thomas Titze
  • Matlab/Simulink: MathWorks, Diego Barral
  • ROS: Bosch, Johannes Jeising

Open Issues:

  • Open Drive/Open Scenario: potential contact organization is the Automotive Simulation Center Stuttgart of the University of Stuttgart, which provides Oddlot as an editor for scenarios; the idea is to use contacts, e.g. from OpenPass, to get in touch
  • Gazebo: Open Robotics, if they are interested
  • DDS: DDS is only a specification; a concrete implementation is needed. All tools used need a DDS connection to ensure data transport from tool to tool, and DDS should be ROS-compatible for the testbed. Potential DDS contact organizations: PrismTech, eProsima

Potential constraints for the simulation:

  • Multiple scenario simulation in parallel
  • One simulation time throughout all simulation tools

Next steps:

  • The first Hack-Fest is planned for the first two weeks of December. Prior to that, an architecture workshop will take place, most likely as a phone conference in mid-November.
  • As preparation, the testbed idea above has to be made more precise; one or two preparation calls will be scheduled for that.
  • We will ask for concrete commitments for Hack-Fest participation; the concrete testbed content is of course driven by the participating organisations.

Testbed candidate 2 (massive data ingest), Moderator: Thomas Reinhardt (Bosch)

The idea:

The core vision for this testbed candidate is to close the gap between infrastructure optimized for 'data generation', i.e. measurement electronics, and state-of-the-art 'data analytics' frameworks. This gap is ever present in the struggle with heterogeneous data formats and the use-case-specific tool suites that come with them. A global, queryable index that offers an (integrated) analytics engine has not yet been established.

To close this gap, this testbed suggests the implementation of ETL (Extract, Transform, Load) processes that ingest a complete raw measurement data set into a data structure suitable for analytics and perform evaluations/processing on it. In other words: use well-established "Big Data" analytic methods to enrich the data, gain better insight, and provide easy access.

The scope:

To narrow down the scope of this testbed, only a development data pipeline should be in focus. This pipeline outputs extreme amounts of raw and processed data from a limited number of (fleet) vehicles. The testbed implements a suitable ETL process to ingest major data formats into a unified view (i.e. into a suitable data structure). A set of simple queries allows access to this transformed data, independent of its original format. As a second step, the testbed demonstrates data enrichment via easy "plug-and-play" of existing open source components, e.g. by using a neural network to create new labels. This newly created (or enriched) information is then fed back into the unified view. As a stretch goal, the testbed implements a connection to feed this information back into existing (automotive) development tools; for example, the testbed offers playback of queried and enriched data using an Rviz visualization.
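The unified view, the format-independent queries, and the enrichment feedback loop can be sketched as follows. This is a hedged illustration: the `UnifiedView` class, its record fields, and the trivial labeler (standing in for e.g. a neural-network component) are invented for the example.

```python
# Illustrative sketch: records from different source formats land in one
# queryable structure; an enrichment step adds labels that are fed back
# into the same view. All names are invented for illustration.
class UnifiedView:
    def __init__(self):
        self.records = []

    def ingest(self, record):
        self.records.append(dict(record))

    def query(self, **criteria):
        """Return records matching all given field/value criteria,
        independent of which original format they came from."""
        return [r for r in self.records
                if all(r.get(k) == v for k, v in criteria.items())]

    def enrich(self, labeler):
        """Plug-and-play enrichment: merge labels computed by `labeler`
        back into every record of the unified view."""
        for r in self.records:
            r.update(labeler(r))

view = UnifiedView()
view.ingest({"t": 1.0, "source": "a.bag", "speed": 8.0})
view.ingest({"t": 2.0, "source": "b.mdf", "speed": 25.0})

# Trivial stand-in for e.g. a neural-network labeler.
view.enrich(lambda r: {"urban": r["speed"] < 14.0})
urban = view.query(urban=True)
```

The stretch goal would add an export path from `query` results back into development tools, e.g. replaying them through a visualization front end.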

The challenge:

  • High complexity of measurement data: AD data consists of 1D, 2D and 3D time series at high velocity; rates can reach up to 3 GB/s
  • Source of usable data.

Open Questions

  • All about the data? A raw data set that is reasonably complex and can be open-sourced is required.
    • Is it possible to record a new data set for the testbed?
    • Can an internal data set of a testbed partner be cleared for release?
    • Use data from an autonomous racing team?

Next Steps

More Marketing

Invitation for more participants.

Follow-Up Call

Follow up Skype call on Monday, November 6th at 2pm CEST. Meeting will be provided by Microsoft (to be added here). Invitation also via OpenADx mailing list.

Follow-Up Architecture Workshop (OpenADx Workshop #4)

Architecture Workshop to refine testbed idea. In preparation for the upcoming hackfest:

  • Data sets for 'base line'.
  • Definition of concrete interfacing tools, formats, requirements, ...
  • Definition of a common architecture for a Data Ingest testbed.
  • Coordination of hackfest details

Workshop 2, Sept. 19, 2017 (Stuttgart, Germany) - Results

Strategy discussion results


- Certification organizations

- Universities

- Research institutes

- Standardization organizations like AUTOSAR can provide areas to watch out for

- These are in addition to the already identified OEMs, Tier 1s, hi-tech companies, tool providers and silicon chip vendors


- Need slide deck with a narrative to socialize inside of respective organizations

- Set up a public community under Eclipse management for contribution and preparation work prior to next workshop 23 October

- Weekly telecon to disseminate information – attendance voluntary based on availability

- Hosted communication at e.g. 3D Experience Forums by e.g. Dassault

Execution and milestones

- Further workshop to detail testbeds/incubators - October eclipsecon Europe

- Testbed/incubator execution (Hackathon/Hackfest) – November 2017

- Group composition for hackathon to be determined based on results of testbed/incubator layout

- Testbed/incubator completed by the end of 2017

- Announcement – venue dependent on testbed/incubator results and feasible timing (forums: CES, BCW, Software in Automobiles, Toulouse, Embedded World, WCX – sponsor is SAE International)

Forms of contribution by community members

- Manpower – e.g. programming capacity

- Monetary investment

- Material support - e.g. a car for demonstration purposes

Testbed/Incubator discussion results

1. Perception simulation and high quality rendering vs functional validation & simulation (FX to handle different simulation use cases)

- Simulation (formerly #11) – Workshop 1

- ROS in the loop

2. Driver to Driver communications

- Hybrid human vs. autonomous driving behavior differences – a very hard problem

3. Car to X communications

4. Standardization

- Reference / canonical data set - Data Sharing across companies

- Share crash incident data across companies rapidly to accelerate safety resolution

- Open Scenarios - Common ways of defining scenarios

5. API standardization – a common way of calling / creating APIs

6. Management & usage of big data

- Massive data ingest

7. Standardization interface for authority approvals based on defined data (independent of producer tool chain)

- There’s major value in driving agreement on how data is stored; can we drive agreement on which sensors are used and the format of the sensor data?

- This could benefit government agencies that will test using the data to determine the safety of the vehicles

8. Heterogeneous data formats and data fusion

- Compatibility of measured data

- Visualization of data from different sources (lidar, video, measurements) could be very valuable

[Image: Testbed/incubator discussion results]


- It was agreed that we should concentrate on the top ranked topics #4, #6, #8.

- Proposed procedure is to set up a frequent call to discuss the topics and identify concrete tasks to be executed in "Hackfests"

- On the Unconference day of EclipseCon Europe on Oct. 23rd, the topics should be discussed further. The ideal result is to have them prepared for a "Hackfest".

- General agreement for participation in the concretization of the topics by XXXXXXX and XXXXX employees.

Workshop 1, Aug. 2, 2017 (Redmond, US) - Results

1. Gain points and pain points to address

Opportunities / gains

- Manage complexity / risk

- Reallocate portions of resource pool to more user relevant activities

- Commonality reduces risk

- Reduce cost in a non-automobile implementation area; the tool chain is outside of the vehicle

- AD development accelerator consists of known and certified components and services => ensure

Risks / pains

- What if this group of companies comes together but is not actually able to complete the project?

- Will this be fast enough to satisfy the TTM needs of the OEMs?

- How will sufficient safety levels be reached and assured?

- Preserve independence: this initiative might actually preserve the second-source needs of OEMs and Tier 1s somewhat fearful of the IT industry

2. Value proposition

- Reduced investment risk and cost reduction through shared development

- Improved capability to focus resources in more saleable areas

- Improved SW reliability and safety

- Standardization facilitates greater AD understanding and therefore greater acceptance in the political, legal and insurance communities

- Standardization promotes interchangeability @ OEM

3. Key success factors

=> Build momentum for the initiative

- Demonstrate that we can deliver; a lofty vision is OK, but initial projects must be:

=> Realistic / achievable

=> Clearly scoped

=> Measurable

- Start with a “coalition of the willing”

=> Not everyone will want to participate initially; reduce skepticism with results

- A common data pool to accelerate development

- Limit scope to non-differentiating factors

- Open APIs, Open framework and open collaboration => industry standard

=> Mix of multiple partners in each area… ”coopetition”

- Extensibility within the system to add modules without the OEM being required to open up all data access – that must remain a choice of the OEM

Potential naming options

[Image: OpenADx naming options]


Pitch deck to be used inside of respective organizations
