This article shows you how to add an Apache Kafka source to a Fabric eventstream.
Apache Kafka is an open-source, distributed platform for building scalable, real-time data systems. By integrating Apache Kafka as a source within your eventstream, you can seamlessly bring real-time events from your Apache Kafka cluster and process them before routing them to multiple destinations within Fabric.
Prerequisites
Access to the Fabric workspace with Contributor or above permissions.
An Apache Kafka cluster running.
Your Apache Kafka cluster must be publicly accessible and not be behind a firewall or secured in a virtual network. If it resides in a protected network, connect to it by using Eventstream connector virtual network injection.
If you plan to use TLS/mTLS settings, make sure the required certificates are available in an Azure Key Vault:
- Import the required certificates into Azure Key Vault in .pem format.
- The user who configures the source and previews data must have permission to access the certificates in the Key Vault (for example, Key Vault Certificate User or Key Vault Administrator).
- If the current user doesn’t have the required permissions, data can’t be previewed from this source in Eventstream.
Add Apache Kafka as a source
If you haven't added any source to your eventstream yet, select the Connect data sources tile. You can also select Add source > Connect data sources on the ribbon.
If you're adding the source to an already published eventstream, switch to Edit mode. On the ribbon, select Add source > Connect data sources.
On the Select a data source or Data sources page, select Apache Kafka.
Configure and connect to Apache Kafka
On the Connect page, select New connection.
In the Connection settings section, for Bootstrap Server, enter one or more Kafka bootstrap server addresses. Separate multiple addresses with commas (,).
In the Connection credentials section, if you have an existing connection to the Apache Kafka cluster, select it from the dropdown list for Connection. Otherwise, follow these steps:
- For Connection name, enter a name for the connection.
- For Authentication kind, confirm that API Key is selected.
- For Key and Secret, enter the API key and key secret.
Note
If you use only mTLS for authentication, you can enter any string in the Key field when you create the connection.
Select Connect.
Now, on the Connect page, follow these steps.
For Topic, enter the Kafka topic.
For Consumer group, enter the consumer group of your Apache Kafka cluster. This field specifies a dedicated consumer group for reading events.
Select a Reset auto offset value to specify where to start reading if there's no committed offset.
For Security protocol, select one of the following options:
- SASL_SSL: Use this option when your Kafka cluster uses SASL-based authentication. By default, the Kafka broker’s server certificate must be signed by a Certificate Authority (CA) included in the trusted CA list. If your Kafka cluster uses a custom CA, you can configure it by using TLS/mTLS settings.
- SSL (mTLS): Use this option when your Kafka cluster requires mTLS authentication. You must configure both a custom server CA certificate and a client certificate in TLS/mTLS settings.
The default SASL mechanism is typically PLAIN, unless configured otherwise. You can select the SCRAM-SHA-256 or SCRAM-SHA-512 mechanism that suits your security requirements.
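The choices above roughly correspond to standard Kafka consumer configuration keys. As an illustration only (the eventstream UI manages these settings for you; the file name and values here are hypothetical), the equivalent client-side properties might look like this:

```shell
# Illustrative sketch: write the Kafka consumer settings that correspond
# to the eventstream options (security protocol, SASL mechanism, offset
# reset, consumer group). Values are placeholders, not recommendations.
cat > client.properties <<'EOF'
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-256
auto.offset.reset=earliest
group.id=my-consumer-group
EOF
```

Here `auto.offset.reset=earliest` mirrors choosing the earliest option for Reset auto offset, and `security.protocol`/`sasl.mechanism` mirror the Security protocol and SASL mechanism selections.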
If your Kafka cluster uses a custom CA or requires mTLS, expand TLS/mTLS settings and configure the following options as needed:
- Trust CA certificate: Enable this option to configure the server CA certificate. Select your subscription, resource group, and key vault, and then provide the certificate name.
- Client certificate and key: Enable this option to configure the client certificate and key.
- Use the same CA certificate key vault: Select this checkbox when both certificates are stored in the same key vault. Then provide the certificate name.
- If you don't select this checkbox, select the subscription, resource group, and key vault, and then provide the certificate name.
Note
TLS/mTLS settings in this section are currently in preview.
For sources in a private network, ensure that the Azure Key Vault containing your certificates is connected to the Azure virtual network used by the streaming virtual network data gateway for Eventstream connector virtual network injection (for example, via a private endpoint).
TLS/mTLS certificate requirements
If you configured TLS/mTLS settings, refer to this section for certificate format specifications and common configuration mistakes when uploading to Azure Key Vault.
Certificate chain
| Certificate | Key size | Signed by | Purpose |
|---|---|---|---|
| CA certificate | 4096-bit RSA | Self-signed | Trust anchor - the broker verifies client certificates against this CA. |
| Server certificate | 2048-bit RSA | CA | Broker identity - the client verifies the broker is who it claims to be. |
| Client certificate | 2048-bit RSA | CA | Client identity - the broker verifies that the connector is authorized. |
Server certificate SAN requirements
The server certificate must include the broker's IP address and DNS name in the Subject Alternative Name (SAN) to pass hostname verification (ssl.endpoint.identification.algorithm=https):
subjectAltName:
DNS.1 = {broker FQDN}
DNS.2 = localhost
IP.1 = {broker public IP}
IP.2 = 127.0.0.1
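One way to produce a server certificate that satisfies these SAN requirements is with `openssl`. This is a minimal sketch: `kafka.example.com`, `203.0.113.10`, and the file names are placeholders you would replace with your broker's actual FQDN and public IP.

```shell
# 1. Create a self-signed CA (4096-bit RSA), the trust anchor.
openssl req -x509 -newkey rsa:4096 -nodes -days 365 \
  -keyout ca.key -out ca.crt -subj "/CN=example-ca"

# 2. Create the server key and a certificate signing request (2048-bit RSA).
openssl req -newkey rsa:2048 -nodes \
  -keyout server.key -out server.csr -subj "/CN=kafka.example.com"

# 3. Sign the server certificate with the CA, adding the required SAN
#    entries (broker FQDN, localhost, broker public IP, 127.0.0.1).
printf 'subjectAltName=DNS:kafka.example.com,DNS:localhost,IP:203.0.113.10,IP:127.0.0.1\n' > san.ext
openssl x509 -req -in server.csr -CA ca.crt -CAkey ca.key \
  -CAcreateserial -days 365 -out server.crt -extfile san.ext

# 4. Confirm the SAN entries made it into the certificate.
openssl x509 -in server.crt -noout -text | grep -A1 'Subject Alternative Name'
```

The final command should list the DNS and IP entries; if the broker's FQDN or IP is missing, hostname verification fails with `ssl.endpoint.identification.algorithm=https`.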
Upload certificates to Azure Key Vault
Certificates are uploaded as Azure Key Vault certificate objects in PEM format. The PEM bundle file is certificate + private key concatenated in one file:
-----BEGIN CERTIFICATE-----
MIIExjCCA...
-----END CERTIFICATE-----
-----BEGIN RSA PRIVATE KEY-----
MIIEpAIB...
-----END RSA PRIVATE KEY-----
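Assembling the bundle is a simple concatenation: certificate block first, then the private key block. A sketch using a throwaway self-signed certificate (file names are illustrative; modern OpenSSL emits a PKCS#8 `BEGIN PRIVATE KEY` header rather than the PKCS#1 `BEGIN RSA PRIVATE KEY` shown above; both are valid PEM):

```shell
# Create a throwaway certificate and private key for demonstration.
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -keyout client.key -out client.crt -subj "/CN=example-client"

# Concatenate certificate first, then the key, into one PEM bundle.
cat client.crt client.key > client-bundle.pem

# Sanity check: the bundle should contain one certificate block and
# one private key block.
grep 'BEGIN' client-bundle.pem
```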
Use an import policy that matches the key properties:
{
"secretProperties": {
"contentType": "application/x-pem-file"
},
"keyProperties": {
"exportable": true,
"keyType": "RSA",
"keySize": 4096,
"reuseKey": false
},
"issuerParameters": {
"name": "Unknown"
}
}
To import the certificate, run the following command:
az keyvault certificate import \
--vault-name {kvName} \
--name {certName} \
--file {pemBundleFile} \
--policy @{policyFile}
Common mistakes
| Avoid | Do this instead |
|---|---|
| Upload as PKCS#12/PFX | Use PEM format with contentType: application/x-pem-file. |
| Upload certificate without private key | The PEM bundle must contain both the certificate and the key. |
| Set keySize: 2048 for a 4096-bit key | The keySize value must match the actual key size. |
| Set issuerParameters.name: "Self" | Use "Unknown" for externally signed certificates. |
| Use Windows line endings (CRLF) | The PEM file must use Unix line endings (LF only). |
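The line-ending mistake is easy to fix from a shell. A sketch (the file names are illustrative; `tr -d '\r'` strips every carriage return, converting CRLF to LF):

```shell
# Simulate a PEM file saved with Windows (CRLF) line endings.
printf -- '-----BEGIN CERTIFICATE-----\r\nMIIExjCCA...\r\n-----END CERTIFICATE-----\r\n' > cert-crlf.pem

# Strip carriage returns so the file uses Unix (LF-only) line endings.
tr -d '\r' < cert-crlf.pem > cert.pem

# Verify no carriage returns remain (no output means the file is clean).
grep -c "$(printf '\r')" cert.pem || true
```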
View updated eventstream
You can see the Apache Kafka source added to your eventstream in Edit mode.
After you complete these steps, the Apache Kafka source is available for visualization in Live view.
Note
To preview events from this Apache Kafka source, ensure that the key used to create the cloud connection has read permission for consumer groups prefixed with "preview-".
For the Apache Kafka source, only messages in JSON format can be previewed.
Related content
Other connectors: