
BaSyx / Documentation / Components / DataBridge

Latest revision as of 05:36, 4 October 2023


DataBridge Component

The DataBridge supports integrating various protocols with Asset Administration Shells. Data can be acquired from various endpoints, transformed, and pushed into SubmodelElements. Provided as an easy-to-use, off-the-shelf component on DockerHub, it can easily be integrated into your own use cases. For a comprehensive example, see the Asset Integration Scenario.

Download

The DataBridge image is made available via Docker Hub and can be pulled by:

docker pull eclipsebasyx/databridge:1.0.0-SNAPSHOT

Alternatively, the command described in the Startup section will also download the image if it is not already present locally.

Startup

To easily start the DataBridge component, you can use the following command:

docker run --name=databridge -p 8085:8085 -v C:/tmp:/usr/share/config eclipsebasyx/databridge:1.0.0-SNAPSHOT

The host port 8085 is mapped to container port 8085. The configuration files are located in C:/tmp on the host and are made available to the Docker container at /usr/share/config via volume mapping. Alternatively, the DataBridge can be configured by passing the configuration files via environment variables.
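The run command above uses a Windows host path. On Linux or macOS, the same volume-mapped startup can be sketched as follows, assuming the configuration files live in a local databridge-config directory (the path is illustrative, not prescribed by the DataBridge). The docker run line itself requires a running Docker daemon and is therefore shown as a comment:

```shell
# Linux/macOS variant of the volume-mapped startup (path is illustrative).
CONFIG_DIR="$PWD/databridge-config"
mkdir -p "$CONFIG_DIR"

# Requires a running Docker daemon, hence shown as a comment here:
# docker run --name=databridge -p 8085:8085 \
#   -v "$CONFIG_DIR:/usr/share/config" \
#   eclipsebasyx/databridge:1.0.0-SNAPSHOT
echo "$CONFIG_DIR"
```

The configuration files (e.g., routes.json and the JSONata transformation files) would be placed in this directory before starting the container.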

Configuration via Environment Variables

The DataBridge expects the environment variables to follow the same naming scheme and content as the configuration files. For example, for the routes configuration, a "routes.json" environment variable needs to be defined with the content described in the routes.json documentation. Additionally, the names of the JSONata transformation files need to be explicitly configured as a JSON array via the jsonatatransformers variable, e.g., jsonatatransformers = ["jsonataA.json", "jsonataB.json"].
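Since the jsonatatransformers value is a plain JSON array of file names, it can be assembled in the shell before starting the container. A minimal sketch, assuming the transformation files sit in a local databridge-config directory (the directory and file names jsonataA.json/jsonataB.json are illustrative, taken from the example above):

```shell
# Collect the JSONata transformer file names into a JSON array
# (directory and file names are illustrative).
CONFIG_DIR="./databridge-config"
mkdir -p "$CONFIG_DIR"
touch "$CONFIG_DIR/jsonataA.json" "$CONFIG_DIR/jsonataB.json"

# List the files, strip the directory, quote each name, join with commas.
JSONATA_LIST="[$(ls "$CONFIG_DIR"/jsonata*.json \
  | xargs -n1 basename \
  | sed 's/.*/"&"/' \
  | paste -sd, -)]"
echo "$JSONATA_LIST"

# The value can then be handed to the container; this requires a running
# Docker daemon, so it is shown as a comment:
# docker run --name=databridge -p 8085:8085 \
#   -e "jsonatatransformers=$JSONATA_LIST" \
#   eclipsebasyx/databridge:1.0.0-SNAPSHOT
```

The same pattern applies to the other configuration files: each one is passed as an environment variable carrying the file's JSON content.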

Note:

  • Either volume mapping or environment variable configuration is mandatory; if no configuration files are defined or mapped, the image will throw an exception on startup.
  • Make sure that the Data Source, Transformer, and Data Sink components defined in the configuration are up and running before starting the DataBridge component.
  • Please use the latest version of the DataBridge image.
