The Airbyte Kafka source allows you to sync data from Kafka. Each Kafka topic is output as a corresponding stream.
Currently, this connector only reads data in JSON format. More formats (e.g. Apache Avro) will be supported in the future.
This source supports the following sync modes:
Full Refresh Sync
Incremental - Append Sync
To use the Kafka source, you'll need:
A Kafka cluster running version 1.0 or above.
Make sure Airbyte can access your Kafka brokers.
Airbyte must be allowed to read messages from the topics, and those topics must exist before Airbyte reads from them.
You can control which topics messages are read from via the topic_pattern configuration parameter. Messages can be read from a hardcoded, pre-defined topic.
To read all messages from a single hardcoded topic, enter its name in the topic_pattern field, e.g. setting topic_pattern to my-topic-name will read all messages from that topic.
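As a small sketch of the behavior described above, a literal topic_pattern selects exactly the topic whose name matches it (the topic names here are illustrative, not from a real cluster):

```python
# Illustrative topic names; topic_pattern here is a literal topic name,
# as described in the text above.
topics_on_cluster = ["my-topic-name", "orders", "payments"]
topic_pattern = "my-topic-name"

# Only the topic whose name exactly matches topic_pattern is selected.
selected = [t for t in topics_on_cluster if t == topic_pattern]
print(selected)  # ['my-topic-name']
```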
Similarly, you can control which topic partitions messages are read from via the topic_partitions configuration parameter.
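The two parameters above could be sketched as a configuration fragment like the following; the JSON shape and the partition-list format are assumptions for illustration, only the parameter names come from this page:

```python
# Hypothetical Kafka source configuration fragment. Only topic_pattern and
# topic_partitions are named in the docs; the value formats are assumed.
source_config = {
    "topic_pattern": "my-topic-name",  # which topic(s) to read messages from
    "topic_partitions": "0,1,2",       # which partitions to read (illustrative format)
}
```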
Set up the Kafka source in Airbyte
You should now have all the requirements needed to configure Kafka as a source in the UI. You can configure the following parameters on the Kafka source (though many of these are optional or have default values):
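To give a sense of the kinds of parameters involved, here is a hypothetical sketch whose keys mirror standard Kafka consumer properties (bootstrap servers, consumer group, offset reset policy); the exact field names exposed in the Airbyte UI may differ:

```python
# Hypothetical settings for a Kafka source; keys mirror standard Kafka
# consumer properties and are assumptions, not Airbyte's exact field names.
kafka_source_settings = {
    "bootstrap_servers": "broker1:9092,broker2:9092",  # how Airbyte reaches the cluster
    "topic_pattern": "my-topic-name",                  # topic to read, as described above
    "group_id": "airbyte-consumer-group",              # consumer group used for offset tracking
    "auto_offset_reset": "earliest",                   # where to start when no committed offset exists
}
```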