This article shows you how to add an Amazon MSK Kafka source to an eventstream.
Amazon MSK Kafka is a fully managed Kafka service that simplifies setup, scaling, and management. By integrating Amazon MSK Kafka as a source in your eventstream, you can seamlessly bring real-time events from your MSK Kafka cluster into Fabric and process them before routing them to multiple destinations within Fabric.
Prerequisites
Access to a workspace in the Fabric capacity license mode (or) the Trial license mode with Contributor or higher permissions.
An Amazon Managed Streaming for Kafka (MSK) cluster in active status.
Your Amazon MSK Kafka cluster must be publicly accessible and not be behind a firewall or secured in a virtual network. If it resides in a protected network, connect to it by using Eventstream connector virtual network injection.
If you plan to use TLS/mTLS settings, make sure the required certificates are available in an Azure Key Vault:
- Import the required certificates into Azure Key Vault in .pem format.
- The user who configures the source and previews data must have permission to access the certificates in the Key Vault (for example, Key Vault Certificate User or Key Vault Administrator).
- If the current user doesn’t have the required permissions, data can’t be previewed from this source in Eventstream.
Add Amazon MSK Kafka as a source
In Fabric Real-Time Intelligence, select Eventstream to create a new eventstream.
On the next screen, select Connect data sources, or select Add source -> Connect data sources.
On the Select a data source page, select View all sources.
Search for Amazon MSK Kafka, and then select Connect on the tile.
Configure and connect to Amazon MSK Kafka
On the Connect page, select New connection.
In the Connection settings section, for Bootstrap Server, enter one or more public Kafka bootstrap server endpoints. Use commas (,) to separate multiple servers.
To get the public endpoint, run the `aws kafka get-bootstrap-brokers` AWS CLI command, or select View client information for your cluster in the Amazon MSK console.
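The broker string returned by the AWS CLI is already comma separated, so you can paste it directly into the Bootstrap Server field. The following sketch uses a hypothetical broker list (the hostnames and the cluster ARN placeholder are illustrative):

```shell
# Hypothetical output of:
#   aws kafka get-bootstrap-brokers --cluster-arn <your-cluster-arn>
# (the BootstrapBrokerStringSaslScram field for SASL_SSL clusters)
BROKERS="b-1.mycluster.abc123.c2.kafka.us-east-1.amazonaws.com:9196,b-2.mycluster.abc123.c2.kafka.us-east-1.amazonaws.com:9196"

# Each comma-separated entry is one host:port broker endpoint
echo "$BROKERS" | tr ',' '\n' | wc -l
```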
In the Connection credentials section, if you have an existing connection to the Amazon MSK Kafka cluster, select it from the dropdown list for Connection. Otherwise, follow these steps:
- For Connection name, enter a name for the connection.
- For Authentication kind, confirm that API Key is selected.
- For Key and Secret, enter the API key and key secret for the Amazon MSK Kafka cluster.
Note
If you use only mTLS for authentication, you can enter any string for Key when you create the connection.
Select Connect.
Now, on the Connect page, follow these steps.
For Topic, enter the Kafka topic.
For Consumer group, enter the consumer group of your Kafka cluster. This field provides you with a dedicated consumer group for getting events.
For Reset auto offset, specify where to start reading when there's no committed offset.
For Security protocol, select one of the following options:
- SASL_SSL: Use this option when your Kafka cluster uses SASL-based authentication. By default, the Kafka broker’s server certificate must be signed by a Certificate Authority (CA) included in the trusted CA list. If your Kafka cluster uses a custom CA, you can configure it by using TLS/mTLS settings.
- SSL (mTLS): Use this option when your Kafka cluster requires mTLS authentication, and you must configure both a custom server CA certificate and a client certificate in TLS/mTLS settings.
The default SASL mechanism is SCRAM-SHA-512 and can't be changed.
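For reference, the equivalent settings in a standard Kafka client properties file look like the following fragment. Eventstream configures these for you through the UI; the values in angle brackets are placeholders:

```properties
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
  username="<api-key>" \
  password="<key-secret>";
group.id=<consumer-group>
auto.offset.reset=earliest
```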
If your Kafka cluster uses a custom CA or requires mTLS, expand TLS/mTLS settings and configure the following options as needed:
- Trust CA certificate: Enable this option to configure the server CA certificate. Select your subscription, resource group, and key vault, and then provide the certificate name.
- Client certificate and key: Enable this option to configure the client certificate and key.
- Use the same CA certificate key vault: Select this checkbox when both certificates are stored in the same key vault. Then provide the certificate name.
- If you don't select this checkbox, select the subscription, resource group, and key vault, and then provide the certificate name.
Note
TLS/mTLS settings in this section are currently in preview.
For sources in a private network, ensure that the Azure Key Vault containing your certificates is connected to the Azure virtual network used by the streaming virtual network data gateway for Eventstream connector virtual network injection (for example, via a private endpoint).
TLS/mTLS certificate requirements
If you configured TLS/mTLS settings, refer to this section for certificate format specifications and common configuration mistakes when uploading to Azure Key Vault.
Certificate chain
| Certificate | Key size | Signed by | Purpose |
|---|---|---|---|
| CA certificate | 4096-bit RSA | Self-signed | Trust anchor - the broker verifies client certificates against this CA. |
| Server certificate | 2048-bit RSA | CA | Broker identity - the client verifies the broker is who it claims to be. |
| Client certificate | 2048-bit RSA | CA | Client identity - the broker verifies that the connector is authorized. |
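As a sketch, a matching CA and client certificate pair can be generated with OpenSSL. The subject names and validity period are illustrative, not required values:

```shell
# 4096-bit self-signed CA: the trust anchor
openssl req -x509 -newkey rsa:4096 -nodes -keyout ca.key -out ca.crt \
  -days 365 -subj "/CN=example-ca"

# 2048-bit client key and CSR, then sign the CSR with the CA
openssl req -newkey rsa:2048 -nodes -keyout client.key -out client.csr \
  -subj "/CN=eventstream-client"
openssl x509 -req -in client.csr -CA ca.crt -CAkey ca.key -CAcreateserial \
  -out client.crt -days 365

# The broker verifies the client certificate against the CA, like this check:
openssl verify -CAfile ca.crt client.crt
```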
Server certificate SAN requirements
The server certificate must include the broker's IP address and DNS name in the Subject Alternative Name (SAN) to pass hostname verification (ssl.endpoint.identification.algorithm=https):
```
subjectAltName:
  DNS.1 = {broker FQDN}
  DNS.2 = localhost
  IP.1  = {broker public IP}
  IP.2  = 127.0.0.1
```
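A server certificate with these SAN entries can be generated and inspected with OpenSSL (1.1.1 or later for `-addext`). The hostname and IP address below are placeholders; substitute your broker's values:

```shell
# Self-signed server certificate with SAN entries for hostname verification
openssl req -x509 -newkey rsa:2048 -nodes -keyout server.key -out server.crt \
  -days 365 -subj "/CN=broker.example.com" \
  -addext "subjectAltName=DNS:broker.example.com,DNS:localhost,IP:203.0.113.10,IP:127.0.0.1"

# Inspect the SAN extension to confirm the entries are present
openssl x509 -in server.crt -noout -ext subjectAltName
```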
Upload certificates to Azure Key Vault
Certificates are uploaded as Azure Key Vault certificate objects in PEM format. The PEM bundle file is certificate + private key concatenated in one file:
```
-----BEGIN CERTIFICATE-----
MIIExjCCA...
-----END CERTIFICATE-----
-----BEGIN RSA PRIVATE KEY-----
MIIEpAIB...
-----END RSA PRIVATE KEY-----
```
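Building the bundle is a simple concatenation of the certificate followed by its private key. This sketch generates a throwaway certificate first so the example is self-contained (newer OpenSSL versions emit the key as a PKCS#8 `BEGIN PRIVATE KEY` block, which Key Vault also accepts):

```shell
# Generate a sample certificate and key (illustrative subject name)
openssl req -x509 -newkey rsa:2048 -nodes -keyout client.key -out client.crt \
  -days 365 -subj "/CN=demo"

# Concatenate: certificate first, then the private key
cat client.crt client.key > bundle.pem

# Expect two BEGIN markers: one for the certificate, one for the key
grep -c "BEGIN" bundle.pem
```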
Use an import policy that matches the key properties:
```json
{
  "secretProperties": {
    "contentType": "application/x-pem-file"
  },
  "keyProperties": {
    "exportable": true,
    "keyType": "RSA",
    "keySize": 4096,
    "reuseKey": false
  },
  "issuerParameters": {
    "name": "Unknown"
  }
}
```
To import the certificate, run the following command:
```shell
az keyvault certificate import \
  --vault-name {kvName} \
  --name {certName} \
  --file {pemBundleFile} \
  --policy @{policyFile}
```
Common mistakes
| Avoid | Do this instead |
|---|---|
| Upload as PKCS#12/PFX | Use PEM format with `contentType: application/x-pem-file`. |
| Upload a certificate without its private key | The PEM bundle must contain both the certificate and the key. |
| Set `keySize: 2048` for a 4096-bit key | The `keySize` value must match the actual key size. |
| Set `issuerParameters.name: "Self"` | Use `"Unknown"` for externally signed certificates. |
| Use Windows line endings (CRLF) | The PEM file must use Unix line endings (LF only). |
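The line-ending mistake is easy to catch before uploading. This sketch fabricates a CRLF file, strips the carriage returns, and verifies the result (the file names are illustrative):

```shell
# Simulate a PEM file saved with Windows (CRLF) line endings
printf -- "-----BEGIN CERTIFICATE-----\r\nabc\r\n-----END CERTIFICATE-----\r\n" > crlf.pem

# Strip carriage returns to get LF-only line endings
tr -d '\r' < crlf.pem > lf.pem

# Confirm no CR characters remain
grep -q $'\r' lf.pem && echo "still CRLF" || echo "LF only"
```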
View updated eventstream
You can see the Amazon MSK Kafka source added to your eventstream in Edit mode.
After you complete these steps, the Amazon MSK Kafka source is available for visualization in Live view.
Note
To preview events from this Amazon MSK Kafka source, ensure that the key used to create the cloud connection has read permission for consumer groups prefixed with "preview-".
For the Amazon MSK Kafka source, only messages in JSON format can be previewed.
Related content
Other connectors: