Connect to your cluster using the Kafka Go client
Note
This QuickStart assumes that you have created a Pulsar cluster with the Kafka protocol enabled, created a service account, and granted the service account produce and consume permissions to the namespace for the target topic.
This document describes how to connect to your Pulsar cluster using the Kafka Go client through Token authentication.
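With the Kafka protocol handler, token authentication is carried over SASL/PLAIN: the tenant/namespace is used as the SASL username and the JWT, prefixed with token:, as the SASL password. The sketch below shows how those settings map onto the Kafka client configuration; the helper name kopConfig and the placeholder values are illustrative only, and the complete program appears in the Steps section.
package main

import (
	"fmt"

	"github.com/confluentinc/confluent-kafka-go/v2/kafka"
)

// kopConfig is an illustrative helper: the tenant/namespace is the SASL
// username and the JWT, prefixed with "token:", is the SASL password.
func kopConfig(serverUrl, namespace, jwtToken string) *kafka.ConfigMap {
	return &kafka.ConfigMap{
		"bootstrap.servers": serverUrl,
		"security.protocol": "SASL_SSL",
		"sasl.mechanism":    "PLAIN",
		"sasl.username":     namespace,
		"sasl.password":     "token:" + jwtToken,
	}
}

func main() {
	// Placeholder values; see the Steps section for a complete program.
	cfg := kopConfig("SERVER-URL", "public/default", "YOUR-TOKEN")
	fmt.Println(cfg)
}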
Before you begin
Note
- Before getting the token of a service account, verify that the service account is authorized as a superuser or an admin of the tenants and namespaces.
- A token has a system-defined Time-To-Live (TTL) of 7 days. Before a token expires, ensure that you generate a new token for your service account.
Get the JWT token.
- On the left navigation pane, click Service Accounts.
- In the row of the service account you want to use, in the Token column, click Generate new token, then click the Copy icon to copy the token to your clipboard.
Get the service URL of your Pulsar cluster.
- On the left navigation pane, in the Admin area, click Pulsar Clusters.
- Select the Details tab, and in the Access Points area, click Copy at the end of the row of the Kafka Service URL (TCP).
Steps
Install the Kafka Go client.
go get -u github.com/confluentinc/confluent-kafka-go/v2/kafka
Build a Go application to produce and consume messages.
package main

import (
	"fmt"
	"time"

	"github.com/confluentinc/confluent-kafka-go/v2/kafka"
)

func main() {
	// Step 1: replace with your configurations
	serverUrl := "SERVER-URL"
	jwtToken := "YOUR-TOKEN"
	topicName := "persistent://public/default/test-go-topic"
	namespace := "public/default"
	password := "token:" + jwtToken

	// Step 2: create a producer to send messages
	producer, err := kafka.NewProducer(&kafka.ConfigMap{
		"bootstrap.servers": serverUrl,
		"security.protocol": "SASL_SSL",
		"sasl.mechanism":    "PLAIN",
		"sasl.username":     namespace,
		"sasl.password":     password,
	})
	if err != nil {
		panic(err)
	}
	defer producer.Close()

	err = producer.Produce(&kafka.Message{
		TopicPartition: kafka.TopicPartition{Topic: &topicName, Partition: kafka.PartitionAny},
		Value:          []byte("hello world"),
	}, nil)
	if err != nil {
		panic(err)
	}
	producer.Flush(1000)

	// wait for the delivery report
	e := <-producer.Events()
	message := e.(*kafka.Message)
	if message.TopicPartition.Error != nil {
		fmt.Printf("failed to deliver message: %v\n", message.TopicPartition)
	} else {
		fmt.Printf("delivered to topic %s [%d] at offset %v\n",
			*message.TopicPartition.Topic, message.TopicPartition.Partition, message.TopicPartition.Offset)
	}

	// Step 3: create a consumer to read messages
	consumer, err := kafka.NewConsumer(&kafka.ConfigMap{
		"bootstrap.servers":  serverUrl,
		"security.protocol":  "SASL_SSL",
		"sasl.mechanism":     "PLAIN",
		"sasl.username":      namespace,
		"sasl.password":      password,
		"session.timeout.ms": 6000,
		"group.id":           "my-group",
		"auto.offset.reset":  "earliest",
	})
	if err != nil {
		panic(fmt.Sprintf("Failed to create consumer: %s", err))
	}
	defer consumer.Close()

	topics := []string{topicName}
	err = consumer.SubscribeTopics(topics, nil)
	if err != nil {
		panic(fmt.Sprintf("Failed to subscribe to topics: %s", err))
	}

	// read one message, then exit
	for {
		fmt.Println("polling...")
		message, err = consumer.ReadMessage(1 * time.Second)
		if err == nil {
			fmt.Printf("consumed from topic %s [%d] at offset %v: %+v\n",
				*message.TopicPartition.Topic, message.TopicPartition.Partition,
				message.TopicPartition.Offset, string(message.Value))
			break
		}
	}
}
- serverUrl: the Kafka service URL of your Pulsar cluster.
- jwtToken: the token of your service account.
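Rather than hardcoding these values in Step 1, you may prefer to read them from the environment. A minimal sketch, assuming the illustrative variable names KAFKA_SERVICE_URL and PULSAR_JWT_TOKEN (not part of any product convention):
package main

import (
	"fmt"
	"os"
)

func main() {
	// Illustrative environment variable names; adjust them to your own setup.
	serverUrl := os.Getenv("KAFKA_SERVICE_URL") // Kafka service URL of the Pulsar cluster
	jwtToken := os.Getenv("PULSAR_JWT_TOKEN")   // token of the service account
	if serverUrl == "" || jwtToken == "" {
		fmt.Fprintln(os.Stderr, "KAFKA_SERVICE_URL and PULSAR_JWT_TOKEN must be set")
		os.Exit(1)
	}
	// Same "token:" prefix as in the program above.
	password := "token:" + jwtToken
	fmt.Println("bootstrap servers:", serverUrl, "- token length:", len(password))
}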
Run the Go application. You should see output similar to the following:
delivered to topic persistent://public/default/test-go-topic [0] at offset 29
polling...
polling...
polling...
polling...
polling...
polling...
polling...
consumed from topic persistent://public/default/test-go-topic [0] at offset 15: hello world
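If you prefer not to read the producer's global Events() channel, Produce also accepts a per-message delivery channel. The helper below is an illustrative variant of Step 2: add it to the same file as the program above (it reuses that file's fmt and kafka imports) and call it from main in place of the Step 2 block; the name produceWithDeliveryChan is not part of the client API.
// produceWithDeliveryChan sends one message and waits on a per-message
// delivery channel for the delivery report instead of reading Events().
func produceWithDeliveryChan(producer *kafka.Producer, topicName string, value []byte) error {
	deliveryChan := make(chan kafka.Event, 1)
	err := producer.Produce(&kafka.Message{
		TopicPartition: kafka.TopicPartition{Topic: &topicName, Partition: kafka.PartitionAny},
		Value:          value,
	}, deliveryChan)
	if err != nil {
		return err
	}
	// Block until the broker acknowledges (or rejects) the message.
	e := <-deliveryChan
	m := e.(*kafka.Message)
	if m.TopicPartition.Error != nil {
		return m.TopicPartition.Error
	}
	fmt.Printf("delivered to topic %s [%d] at offset %v\n",
		*m.TopicPartition.Topic, m.TopicPartition.Partition, m.TopicPartition.Offset)
	return nil
}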