
Kafka

A Kafka source can be integrated with DataBridge. The Kafka component communicates with the Apache Kafka message broker.

Configuration

To configure a Kafka source in DataBridge, provide a unique id and the Kafka broker server details (host and port), along with the Kafka consumer query parameters: topic, maxPollRecords, groupId, consumersCount, and seekTo. For more information on the query parameters, see the Kafka Query Parameters documentation.

Sample Configuration

[
	{
		"uniqueId": "property1",
		"serverUrl": "localhost",
		"serverPort": 9092,
		"topic": "first-topic",
		"maxPollRecords": 5000,
		"groupId": "basyx-updater",
		"consumersCount": 1,
		"seekTo": "BEGINNING"
	}
]

Disclaimer: Only the query parameters listed in the sample configuration are currently supported.

Similarly, you can configure multiple Kafka consumers inside the configuration file.
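For instance, a configuration file with two consumers might look like the following sketch. The unique ids, topics, and other parameter values here are illustrative placeholders; adjust them to your own broker and topics.

[
	{
		"uniqueId": "property1",
		"serverUrl": "localhost",
		"serverPort": 9092,
		"topic": "first-topic",
		"maxPollRecords": 5000,
		"groupId": "basyx-updater",
		"consumersCount": 1,
		"seekTo": "BEGINNING"
	},
	{
		"uniqueId": "property2",
		"serverUrl": "localhost",
		"serverPort": 9092,
		"topic": "second-topic",
		"maxPollRecords": 5000,
		"groupId": "basyx-updater",
		"consumersCount": 1,
		"seekTo": "BEGINNING"
	}
]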

Naming Convention

The name of the Kafka consumer configuration file should be kafkaconsumer.json.

Working Example

An integration example with Kafka as a data source, JSONata as a transformer, and an AAS as a data sink is available on GitHub as the DataBridge Example.
