StreamNative Cloud provides two ways to use Kafka:
- Kafka Service — A fully managed, native Apache Kafka service. Kafka Service runs native Apache Kafka on the Ursa Engine with no compatibility layer. Your existing Kafka clients, tools, and ecosystems work without modification. This is the recommended option for new Kafka workloads.
- Kafka Compatibility on Pulsar Clusters (via KSN) — Pulsar Clusters can serve Kafka clients through KSN, which translates the Kafka protocol to the underlying storage. This option lets you access your Pulsar data using Kafka clients. All clusters are powered by the Ursa Engine. Clusters previously called “Ursa” are cost-optimized Pulsar Clusters with Kafka compatibility.
If you are starting a new Kafka project or migrating from Amazon MSK, Confluent, or self-managed Kafka, use Kafka Service for full native Kafka support.
Kafka Compatibility on Pulsar Clusters (KSN)
The rest of this page covers Kafka compatibility when using Pulsar Clusters with KSN enabled. For native Kafka Service documentation, see the Kafka Service overview.
KSN is a protocol handler that enables Kafka clients to connect to Pulsar Clusters. Because KSN translates Kafka requests to Pulsar operations, some Kafka features have differences in behavior. The following tables detail the features supported by KSN on Latency Optimized and Cost Optimized Pulsar Clusters.
Kafka Protocol Support
KStreams and KSqlDB support on Cost-Optimized clusters has certain limitations: features that require transactions or topic compaction are not supported.
| Feature | Latency Optimized | Cost Optimized | Open-source KoP |
|---|---|---|---|
| Publish & Consume | YES | YES | YES |
| Kafka Schema Registry | YES | YES | YES |
| Transactions | YES | Coming Soon | YES |
| Compacted Topic | YES | Coming Soon | - |
| KStreams Integration | YES | YES, with limitations | - |
| KSqlDB Integration | YES | YES, with limitations | - |
Production Readiness
| Feature | Latency Optimized | Cost Optimized | Open-source KoP |
|---|---|---|---|
| OAuth Authentication | YES | YES | - |
| Kubernetes Authentication | YES | YES | - |
| RBAC | YES | YES | - |
| Schema Registry OAuth Authentication | YES | YES | - |
| Schema Registry RBAC | YES | YES | - |
| TLS | YES | YES | YES |
| Authorization | YES | YES | YES |
| Multi-AZ / Multi-Region Clusters | YES | YES | - |
| Geo-replication | YES | YES | - |
Deployments and Efficient Operations
| Feature | Latency Optimized | Cost Optimized | Open-source KoP |
|---|---|---|---|
| Serverless / Dedicated | YES | YES | - |
| BYOC | YES | YES | - |
| Private Cloud | YES | Coming Soon | - |
| On-prem (self-managed) | YES | Coming Soon | YES |
| Auto-scaling | YES | YES | - |
| Cloud Console / UI | YES | YES | - |
You can use StreamNative Cloud for your existing Kafka applications and services without changing any code. See the full list of supported Kafka clients and Kafka Compatibility.
Get started
To get started with StreamNative Cloud using Kafka, see the QuickStart guide to learn how to set up a cluster with Kafka protocol enabled and configure a Kafka client for producing and consuming messages.
For language-specific setup instructions, refer to our Kafka Client Guides which provide QuickStart tutorials for your preferred programming language.
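As a concrete sketch of what such client configuration looks like, the snippet below builds connection settings for the confluent-kafka Python client. The bootstrap URL, topic name, and token are placeholders, not values for a real cluster; take the real values from your cluster's details page or the client setup wizard, and note that the exact SASL username/password convention shown is an assumption to verify against the generated sample code.

```python
# Minimal sketch (not official sample code): connecting a Python Kafka
# client to a StreamNative cluster over SASL_SSL. All endpoint and
# credential values below are placeholders.

def build_config(bootstrap_servers: str, token: str) -> dict:
    """Build librdkafka-style settings for a SASL_SSL + PLAIN connection."""
    return {
        "bootstrap.servers": bootstrap_servers,
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "PLAIN",
        # Assumption: the API token is passed as the SASL password with a
        # conventional username; confirm against the wizard's sample code.
        "sasl.username": "user",
        "sasl.password": f"token:{token}",
    }


def produce_example() -> None:
    """Produce one message; requires `pip install confluent-kafka`."""
    from confluent_kafka import Producer

    conf = build_config("broker.example.streamnative.cloud:9093", "MY_TOKEN")
    producer = Producer(conf)
    producer.produce("my-topic", value=b"hello from a Kafka client")
    producer.flush()  # block until the message is delivered
```

Call `produce_example()` after filling in real values; the same settings map onto the Java, Go, and C/C++ clients, since they share librdkafka-style property names.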
Use Kafka Client Setup Wizard
After provisioning your cluster, StreamNative Console provides a step-by-step wizard to help you set up Kafka client libraries and tools. The wizard guides you through the basic setup and configuration process, including selecting or creating service accounts, downloading key files or tokens, installing client libraries, and generating sample code to run.
To get started with the Kafka client setup wizard, follow these steps.
1. Navigate to your StreamNative Cloud cluster.
2. On the left navigation pane, in the Clients section, click Kafka Clients.
3. Follow the wizard to generate the sample code you need for connecting to your StreamNative cluster.
You can then copy and run the generated sample code to produce and consume messages.
Kafka Clients
| Language | References | Description |
|---|---|---|
| Kafka Java client | QuickStart, Client Guide, Tutorial | Java producer and consumer shipped with Apache Kafka. |
| Kafka C/C++ client | QuickStart, Tutorial | librdkafka, a C/C++ library for Kafka. |
| Kafka Python client | QuickStart, Tutorial | Python client that provides a high-level producer, consumer, and AdminClient. |
| Kafka Go client | QuickStart, Tutorial | Go client that offers a producer and consumer for Kafka. |
| Kafka Node.js client | QuickStart, Tutorial | Node.js client that provides Kafka APIs. |
| Kafka .NET client | Tutorial | .NET client that provides a high-level producer, consumer, and AdminClient. |
You can use the Kafka CLI tools to connect to your StreamNative cluster; a quickstart guide is available here. For more information, see Use Kafka Tools with StreamNative Cloud.
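The CLI tools read connection settings from a properties file. A minimal sketch, with a placeholder endpoint and token (the exact JAAS username/password convention is an assumption to check against your cluster's generated configuration):

```properties
# client.properties — placeholder values, not a real cluster
bootstrap.servers=broker.example.streamnative.cloud:9093
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="user" \
  password="token:MY_TOKEN";
```

You would then pass this file to a tool via its config flag, for example `kafka-topics.sh --bootstrap-server broker.example.streamnative.cloud:9093 --command-config client.properties --list`.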
Kafka Connect
StreamNative Cloud provides full compatibility with Kafka Connect, offering two deployment options: you can self-host Kafka Connect connectors in your own environment (see examples here), or leverage fully managed Kafka Connect connectors running directly in StreamNative Cloud.
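For the self-hosted option, connectors are registered through the standard Kafka Connect REST API. A sketch of such a payload, using the FileStreamSinkConnector that ships with Apache Kafka (the connector name, topic, and file path are hypothetical):

```json
{
  "name": "my-file-sink",
  "config": {
    "connector.class": "org.apache.kafka.connect.file.FileStreamSinkConnector",
    "tasks.max": "1",
    "topics": "my-topic",
    "file": "/tmp/my-topic-out.txt"
  }
}
```

POSTing this to your Connect worker's `/connectors` endpoint starts a sink that writes records from `my-topic` to the named file.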
Kafka Streams
You can also build data streaming applications using Kafka Streams. See the Kafka Streams QuickStart for more information. Please note that StreamNative Cloud doesn’t host any Kafka Streams applications.
KSQL
You can also build data streaming applications using KSQL. See the KSQL QuickStart for more information. Please note that StreamNative Cloud doesn’t host the KSQL service.
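To illustrate the kind of application this enables, the statements below are a hypothetical KSQL sketch (stream, topic, and column names are invented for this example, not taken from your cluster):

```sql
-- Declare a stream over an existing Kafka topic.
CREATE STREAM pageviews (user_id VARCHAR, url VARCHAR)
  WITH (KAFKA_TOPIC = 'pageviews', VALUE_FORMAT = 'JSON');

-- Derive a continuously updated, filtered stream from it.
CREATE STREAM checkout_views AS
  SELECT user_id, url
  FROM pageviews
  WHERE url LIKE '%/checkout%';
```

Because the KSQL service itself is not hosted by StreamNative Cloud, you would run these statements from your own ksqlDB deployment pointed at your cluster.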
Advanced topics
Integrations