The official MongoDB Kafka Connect Sink connector.
The MongoDB Kafka sink connector is a Kafka Connect connector that reads data from Kafka topics and writes data to MongoDB.
The MongoDB user configured in the connection URI must have the `readWrite` role on the target database. For more granular access control, you can specify a custom role that allows the `insert`, `remove`, and `update` actions on the databases or collections. The `connection.uri` is in the form `mongodb+srv://username:password@cluster0.xxx.mongodb.net`.
Set up the kcctl client: see the kcctl documentation.
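kcctl talks to the Kafka Connect REST API, so after installing it you typically point it at your Connect cluster. As a hedged example (the context name `local` and the endpoint URL are assumptions about your deployment):

```bash
# Point kcctl at the Kafka Connect REST endpoint (URL is an assumption; adjust for your cluster).
kcctl config set-context local --cluster http://localhost:8083
```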
Create a MongoDB cluster. You can create one in a Kubernetes cluster with a YAML manifest.
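The original manifest is not reproduced here. As a minimal, test-only sketch (the `mongo` name, the `mongo:6.0` image, and the replica-set name `rs0` are assumptions), a single-member MongoDB replica set could be deployed with:

```yaml
# Single-node MongoDB replica set for local testing only (no auth, no persistent storage).
apiVersion: apps/v1
kind: StatefulSet
metadata:
  name: mongo
spec:
  serviceName: mongo
  replicas: 1
  selector:
    matchLabels:
      app: mongo
  template:
    metadata:
      labels:
        app: mongo
    spec:
      containers:
        - name: mongo
          image: mongo:6.0
          command: ["mongod", "--replSet", "rs0", "--bind_ip_all"]
          ports:
            - containerPort: 27017
---
# Headless service so the pod is reachable at mongo-0.mongo:27017 inside the cluster.
apiVersion: v1
kind: Service
metadata:
  name: mongo
spec:
  clusterIP: None
  selector:
    app: mongo
  ports:
    - port: 27017
```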
Initialize the local MongoDB cluster:
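With the StatefulSet sketched above (the pod name `mongo-0` is an assumption), the replica set could be initiated via `mongosh`:

```bash
# Initiate a single-member replica set inside the MongoDB pod.
kubectl exec -it mongo-0 -- mongosh --eval 'rs.initiate()'
```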
Create a JSON file like the following:
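The exact file is not included here; as an illustration (the connector name, topic, database, collection, and connection URI are placeholders matching the test cluster above), a sink connector configuration in the standard Kafka Connect REST payload format could look like:

```json
{
  "name": "mongo-sink",
  "config": {
    "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
    "tasks.max": "1",
    "topics": "my-topic",
    "connection.uri": "mongodb://mongo-0.mongo:27017",
    "database": "test",
    "collection": "sink",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "false"
  }
}
```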
Run the following command to create the connector:
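For instance, assuming the configuration was saved as `mongodb-sink.json` (the file name is a placeholder), kcctl can register it with:

```bash
# Create or update the connector from the JSON file defined above.
kcctl apply -f mongodb-sink.json
```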
If you want to use the MongoDB CDC handler for data sourced from MongoDB instances by the MongoDB source connector, you need to select `STRING` or `BYTES` as the value converter for both the MongoDB source and MongoDB sink connectors. Details can be found in the MongoDB CDC handler documentation.
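For example, the string value converter can be selected by setting the standard Kafka Connect converter property in each connector's configuration (a fragment to merge into the `config` object above, not a complete configuration):

```json
{
  "value.converter": "org.apache.kafka.connect.storage.StringConverter"
}
```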
The MongoDB Kafka sink connector is configured using the following required properties:
| Parameter | Description |
| --- | --- |
| `connection.uri` | The connection URI for the MongoDB server. |
| `database` | The MongoDB database name. |
| `topics` | A list of Kafka topics that the sink connector watches. (You can define either the `topics` or the `topics.regex` setting, but not both.) |
| `topics.regex` | A regular expression that matches the Kafka topics that the sink connector watches. (You can define either the `topics` or the `topics.regex` setting, but not both.) |
The full list of properties is also available in the official MongoDB Kafka Sink Connector documentation.