
Kafka Destination


Destination Info
  • Accepts Page, Alias, Group, Identify, and Track calls.
  • Refer to it as Kafka in the Integrations object.
  • This destination is in Beta.

Kafka provides a highly scalable and fault-tolerant messaging system that enables real-time data processing and stream processing at scale. When integrated with Segment, Kafka serves as a powerful backbone for managing and processing event data collected by Segment, allowing businesses to efficiently ingest, route, and analyze data across various applications and systems in real time.


Getting started


Create the Kafka Destination

  1. From your workspace's Destination catalog page, search for "Kafka".
  2. Select the "Kafka" tile and click Add Destination.
  3. Select an existing Source to connect to Kafka.
  4. Enter a name for your Kafka destination.

Configure the Kafka Destination


The way you've configured your Kafka cluster determines the authentication and encryption settings you'll need to apply to the Segment Kafka Destination. You may need the assistance of someone technical to provide values for the following Settings:

  1. On the Settings tab, enter values into the Client ID, Brokers, and Authentication Mechanism setting fields.
  2. Populate fields based on the value you selected from the Authentication Mechanism field:
    • Plain or SCRAM-SHA-256 / 512 authentication: provide values for Username and Password fields.
    • Client Certificate authentication: provide values for the SSL Client Key and SSL Client Certificate fields.
  3. Populate the SSL Certificate Authority field, if necessary.
  4. Save your changes and proceed to Configure the Send Action.
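As a sketch of how these settings fit together, here's how they might map onto a generic Kafka producer configuration. This is illustrative only: the setting names mirror this page, but the config keys follow kafka-python-style conventions and are assumptions, not Segment's implementation.

```python
def build_producer_config(settings: dict) -> dict:
    """Translate destination-style settings into producer-style config (illustrative)."""
    config = {
        # Client ID defaults to 'segment-actions-kafka-producer' per the Settings reference.
        "client_id": settings.get("client_id", "segment-actions-kafka-producer"),
        # "Brokers" accepts a comma-delimited string of host:port pairs.
        "bootstrap_servers": [b.strip() for b in settings["brokers"].split(",")],
    }
    mechanism = settings["authentication_mechanism"]
    if mechanism in ("plain", "scram-sha-256", "scram-sha-512"):
        # SASL mechanisms need the Username and Password fields.
        config["sasl_mechanism"] = mechanism.upper()
        config["sasl_plain_username"] = settings["username"]
        config["sasl_plain_password"] = settings["password"]
    elif mechanism == "client-certificate":
        # Client Certificate auth needs the SSL Client Key and SSL Client Certificate fields.
        config["ssl_keyfile"] = settings["ssl_client_key"]
        config["ssl_certfile"] = settings["ssl_client_certificate"]
    return config

cfg = build_producer_config({
    "brokers": "broker1:9092, broker2:9092",
    "authentication_mechanism": "scram-sha-512",
    "username": "svc-segment",
    "password": "secret",
})
```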

Configure the "Send" Action

  1. Select the Mappings tab and add a new Send mapping.
  2. Select a Topic to send data to. This field should auto-populate based on the credentials you provided in the Settings tab.
  3. Map your payload using the Payload field.
    (Optional) Specify partitioning preferences, Headers, and Message Key values.
  4. Save and enable the Action, then navigate back to the Kafka destination's Settings tab to enable and save the Destination.
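Conceptually, the Send mapping assembles a single Kafka record from the fields above. The helper below is a hypothetical sketch, not Segment's code, showing how topic, payload, key, headers, and partition combine; Segment serializes the payload as JSON.

```python
import json

def build_record(topic, payload, key=None, headers=None, partition=None):
    """Assemble a Kafka-style record from Send mapping fields (illustrative)."""
    record = {
        "topic": topic,
        # Segment sends data to Kafka in JSON format only.
        "value": json.dumps(payload).encode("utf-8"),
    }
    if key is not None:
        record["key"] = key.encode("utf-8")
    if headers:
        # Headers are key/value pairs; values are encoded as bytes here.
        record["headers"] = [(k, v.encode("utf-8")) for k, v in headers.items()]
    if partition is not None:
        record["partition"] = partition
    return record

rec = build_record("segment-events", {"event": "Order Completed"}, key="user-123")
```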


Build your own Mappings. Combine supported triggers with the following Kafka-supported actions:


Mapping limits per destination

Individual destination instances support a maximum of 50 mappings.

Send data to a Kafka topic

Send is a Cloud action. The default Trigger is type = "track" or type = "identify" or type = "page" or type = "screen" or type = "group"

  • Topic (string, required): The Kafka topic to send messages to. This field auto-populates from your Kafka instance.
  • Payload (object, required): The data to send to Kafka.
  • Headers (object, optional): Header data to send to Kafka. Format is Header key, Header value (optional).
  • Partition (integer, optional): The partition to send the message to.
  • Default Partition (integer, optional): The default partition to send the message to.
  • Message Key (string, optional): The key for the message.

Which Kafka Platforms are supported?


The Kafka Destination can send data to Topics on self-hosted Kafka Clusters, or to Clusters hosted on Managed Service platforms like Confluent Cloud and Aiven.

Which data formats are supported?


Segment sends data to Kafka in JSON format only. Segment does not yet support other formats, like Avro or Protobuf.

Which authentication mechanisms are supported?


The authentication method is controlled with the Authentication Mechanism setting.

Segment supports the following SASL-based authentication methods:

  • Plain
  • SCRAM-SHA-256
  • SCRAM-SHA-512
  • AWS

Segment also supports Client Certificate authentication.
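One way to picture the relationship between each mechanism and the settings it requires, as described in the Settings reference. The mechanism keys and setting names below are illustrative, not an API:

```python
# Which settings each authentication mechanism requires (illustrative mapping).
REQUIRED_FIELDS = {
    "plain": ["username", "password"],
    "scram-sha-256": ["username", "password"],
    "scram-sha-512": ["username", "password"],
    "aws": ["aws_access_key_id", "aws_secret_key"],
    "client-certificate": ["ssl_client_key", "ssl_client_certificate"],
}

def missing_settings(mechanism: str, settings: dict) -> list:
    """Return the required setting names that are not yet populated."""
    return [f for f in REQUIRED_FIELDS[mechanism] if not settings.get(f)]

gaps = missing_settings("aws", {"aws_access_key_id": "AKIAEXAMPLE"})  # -> ["aws_secret_key"]
```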

How is partitioning controlled?


The Send Action provides multiple ways to specify which Partition an event should be sent to.

  • Partition: Use this field to specify the partition Segment should send events to.
  • Default Partition: Use this field to specify a default partition. Segment uses this when you don't provide a value in the Partition field.
  • Message Key: Segment uses a hash of this field's value to determine which partition should receive an event. If you don't provide a Message Key, Segment uses a round-robin algorithm to select the partition to send the event to.
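The key-hash and round-robin behaviors can be sketched as follows. This is illustrative only: Kafka's Java client hashes keys with murmur2, while this sketch substitutes CRC32 purely to show that the same key always lands on the same partition.

```python
import itertools
import zlib

_round_robin = itertools.count()  # shared counter for keyless messages

def choose_partition(key, num_partitions):
    """Same key -> same partition; no key -> round-robin (illustrative)."""
    if key is not None:
        # Hash of the Message Key determines the partition deterministically.
        return zlib.crc32(key.encode("utf-8")) % num_partitions
    # No Message Key: rotate across partitions.
    return next(_round_robin) % num_partitions
```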

What is the "SSL - Reject Unauthorized Certificate Authority" field for?


This field specifies if Segment should reject server connections when a certificate is not signed by a trusted Certificate Authority (CA). This can be useful for testing purposes or when using a self-signed certificate.
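In TLS terms, the toggle roughly corresponds to whether certificate verification is enforced when connecting. A hedged Python sketch of that correspondence, not Segment's implementation:

```python
import ssl

def make_ssl_context(reject_unauthorized: bool, ca_pem_path=None) -> ssl.SSLContext:
    """Build a TLS context mirroring the reject-unauthorized toggle (illustrative)."""
    ctx = ssl.create_default_context()
    if ca_pem_path:
        # Trust a custom Certificate Authority, e.g. for a self-hosted cluster.
        ctx.load_verify_locations(cafile=ca_pem_path)
    if not reject_unauthorized:
        # Accept certificates not signed by a trusted CA (self-signed certs).
        # Useful for testing, but not advised in production.
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE
    return ctx
```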


You can send computed traits and audiences generated using Engage to this destination as a user property. To learn more about Engage, schedule a demo.

For user-property destinations, an identify call is sent to the destination for each user being added and removed. The property name is the snake_cased version of the audience name, with a true/false value to indicate membership. For example, when a user first completes an order in the last 30 days, Engage sends an Identify call with the property order_completed_last_30days: true. When the user no longer satisfies this condition (for example, it's been more than 30 days since their last order), Engage sets that value to false.

When you first create an audience, Engage sends an Identify call for every user in that audience. Later audience syncs only send updates for users whose membership has changed since the last sync.
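A hypothetical sketch of the Identify call shape described above, using a simple snake_casing of the audience name (the exact normalization Engage applies may differ):

```python
import re

def audience_identify(user_id: str, audience_name: str, in_audience: bool) -> dict:
    """Build an Identify payload with the audience as a boolean trait (illustrative)."""
    # snake_case the audience name: lowercase, non-word runs become underscores.
    trait = re.sub(r"\W+", "_", audience_name.strip().lower()).strip("_")
    return {
        "type": "identify",
        "userId": user_id,
        "traits": {trait: in_audience},
    }

call = audience_identify("u1", "Order Completed Last 30days", True)
```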


Real-time to batch destination sync frequency

Real-time audience syncs to Kafka may take six or more hours for the initial sync to complete. Upon completion, a sync frequency of two to three hours is expected.


Segment lets you change these destination settings from the Segment app without having to touch any code.

  • AWS Access Key ID (string, optional): The Access Key ID for your AWS IAM instance. Must be populated if using the AWS IAM Authentication Mechanism.
  • AWS Authorization Identity (string, optional): AWS IAM role ARN used for authorization. Only populate this field if using the AWS IAM Authentication Mechanism.
  • Brokers (string, required): The brokers for your Kafka instance, in the format host:port, for example localhost:9092. Accepts a comma-delimited string.
  • Client ID (string, required): The client ID for your Kafka instance. Default: segment-actions-kafka-producer.
  • Authentication Mechanism (select, required): The Authentication Mechanism to use. For SCRAM or PLAIN, populate the Username and Password fields. For Client Certificate, populate the SSL Client Key and SSL Client Certificate fields. Default: plain.
  • Password (password, optional): The password for your Kafka instance. Only populate this field if using the PLAIN or SCRAM Authentication Mechanisms.
  • AWS Secret Key (password, optional): The Secret Key for your AWS IAM instance. Must be populated if using the AWS IAM Authentication Mechanism.
  • SSL Certificate Authority (string, optional): The Certificate Authority for your Kafka instance. Exclude the first and last lines from the file, that is, -----BEGIN CERTIFICATE----- and -----END CERTIFICATE-----.
  • SSL Client Certificate (string, optional): The Client Certificate for your Kafka instance. Exclude the first and last lines from the file, that is, -----BEGIN CERTIFICATE----- and -----END CERTIFICATE-----.
  • SSL Enabled (boolean, optional): Indicates if SSL should be enabled. Default: true.
  • SSL Client Key (password, optional): The Client Key for your Kafka instance. Exclude the first and last lines from the file (the BEGIN and END marker lines).
  • SSL - Reject Unauthorized Certificate Authority (boolean, optional): Whether to reject unauthorized CAs. This can be useful when testing, but is not advised in production. Default: true.
  • Username (string, optional): The username for your Kafka instance. Only populate this field if using the PLAIN or SCRAM Authentication Mechanisms.
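Several of the certificate settings above expect PEM contents with the BEGIN/END marker lines removed. A small illustrative helper for preparing such values:

```python
def strip_pem_markers(pem: str) -> str:
    """Remove the -----BEGIN ...----- and -----END ...----- lines from PEM text."""
    lines = [line.strip() for line in pem.strip().splitlines()]
    return "\n".join(line for line in lines if not line.startswith("-----"))

# Hypothetical PEM content for demonstration (not a real certificate).
pem = """-----BEGIN CERTIFICATE-----
MIIBszCCARwExample
-----END CERTIFICATE-----"""

stripped = strip_pem_markers(pem)
```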