Configure message brokers
Kalix integrates with Google Cloud Pub/Sub and Confluent Cloud to connect with other services and to enable asynchronous messaging within Kalix services.
Confluent Cloud
Take the following steps to configure access to your Confluent Cloud Kafka broker for your Kalix project:
1. Open Confluent Cloud.
2. Go to your cluster. Create a new cluster if you don’t have one already.
3. Go to Data integration > Clients. Create a new client with the New client button.
4. Choose Java or Scala. Create a key with the Create Kafka cluster API key button. Copy the configuration shown in the user interface into a file, and save it to <kafka-broker-config-file> (see the example configuration after these steps).
5. Use kalix to configure the broker for your project:

   $ kalix projects config set broker \
       --broker-service kafka \
       --broker-config-file <kafka-broker-config-file> \
       --description "Access to Kafka in ..."
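The broker configuration file you save from the Confluent Cloud user interface is a standard Kafka client properties file. As a rough sketch, with placeholders instead of the real values generated for your cluster and API key, it typically looks like this:

# <kafka-broker-config-file> - client configuration generated by Confluent Cloud
bootstrap.servers=<bootstrap-server>:9092
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="<api-key>" password="<api-secret>";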
Create a topic
To create a topic, you can use either the Confluent Cloud user interface or the Confluent Cloud CLI.
Browser

1. Open Confluent Cloud.
2. Go to your cluster.
3. Go to the Topics page.
4. Use the Add Topic button.
5. Fill in the topic name, select the number of partitions, and use the Create with defaults button.

You can now use the topic to connect with Kalix.

Confluent Cloud CLI

ccloud kafka topic create TOPIC_ID

You can now use the topic to connect with Kalix.
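For example, to create a topic with an explicit number of partitions (the topic name and partition count below are illustrative):

ccloud kafka topic create customer-events --partitions 6

The number of partitions matters for ordering, as described under Delivery characteristics below.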
Use a Kafka broker locally
The docker-compose example file below shows how to set kalix.proxy.eventing.support to kafka. A volume is mounted from the current directory (.), which is expected to contain a kafka.properties file. The BROKER_CONFIG_FILE environment variable points to the kafka.properties file in the mounted /conf volume, which ensures that the service can connect to Kafka.
version: "3"
services:
  kalix-proxy:
    image: gcr.io/kalix-public/kalix-proxy:1.1.12
    command: -Dconfig.resource=dev-mode.conf -Dkalix.proxy.eventing.support=kafka
    ports:
      - "9000:9000"
    extra_hosts:
      - "host.docker.internal:host-gateway"
    environment:
      USER_FUNCTION_HOST: ${USER_FUNCTION_HOST:-host.docker.internal}
      USER_FUNCTION_PORT: ${USER_FUNCTION_PORT:-8080}
      BROKER_CONFIG_FILE: /conf/kafka.properties
    volumes:
      - .:/conf
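For local development, the mounted kafka.properties file can be very small. A minimal sketch, assuming a Kafka broker listening on port 9092 on your host machine (adjust the address to wherever your broker actually runs):

# kafka.properties in the directory mounted at /conf
bootstrap.servers=host.docker.internal:9092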
You can check out a complete example in Java Customer Registry (with Kafka).
Delivery characteristics
When your application consumes messages from Kafka, Kalix will try to deliver them to your service in 'at-least-once' fashion while preserving order.
Kafka partitions are consumed independently. When passing messages to a certain entity or using them to update a view row by specifying the id as the Cloud Event ce-subject attribute on the message, the same id must be used to partition the topic to guarantee that the messages are processed in order in the entity or view. Ordering is not guaranteed for messages arriving on different Kafka partitions.
Correct partitioning is especially important for topics that stream directly into views using the transform_update option: when messages for the same subject id are spread over different partitions, they may read stale data and lose updates.
Note: To achieve at-least-once delivery, messages that are not acknowledged will be redelivered. This means redeliveries of 'older' messages may arrive behind fresh deliveries of 'newer' messages. The first delivery of each message is always in-order, though.
When publishing messages to Kafka from Kalix, the ce-subject attribute, if present, is used as the Kafka partition key for the message.
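To see how this plays out on the consuming side, the sketch below produces a test message whose Kafka partition key matches its ce-subject header, so that all messages for the same entity id stay on one partition. It assumes the kcat CLI and illustrative topic and id values; for a Confluent Cloud cluster you would also supply the SASL settings from the saved properties file (for example via kcat's -F option), and your producer may need additional ce-* headers depending on how it encodes CloudEvents:

# Partition key (-k) and ce-subject header carry the same entity id
echo '{"customerId": "customer-abc", "name": "Jane"}' | \
  kcat -P -b <bootstrap-server>:9092 -t customer-events \
    -k customer-abc \
    -H ce-subject=customer-abc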
Google Cloud Pub/Sub
To configure access to your Google Cloud Pub/Sub broker for your Kalix project, you need to create a Google service account with access to your Google Cloud Pub/Sub broker and provide it to Kalix.
Details on doing this can be found in the Google documentation. We provide simplified steps below.
The service account must be granted the roles/pubsub.editor role.
Setting up the service account
To set up a service account and generate the key, follow these steps:
1. Navigate to https://console.cloud.google.com/.
2. From the blue bar, click the dropdown menu next to Google Cloud Platform.
3. Click New Project to create a project and save the <gcp-project-id>, which you will need later.
4. Enter the following gcloud commands to set up the gcloud environment:

   gcloud auth login
   gcloud projects list
   gcloud config set project <gcp-project-id>

5. Enter the following command to create the service account. The example uses the name kalix-broker, but you can use any name.

   gcloud iam service-accounts create kalix-broker

6. Enter the following command to grant the GCP Pub/Sub editor role to the service account. Substitute your project ID for <gcp-project-id>.

   gcloud projects add-iam-policy-binding <gcp-project-id> \
     --member "serviceAccount:kalix-broker@<gcp-project-id>.iam.gserviceaccount.com" \
     --role "roles/pubsub.editor"

7. Generate a key file for your service account:

   gcloud iam service-accounts keys create keyfile.json \
     --iam-account kalix-broker@<gcp-project-id>.iam.gserviceaccount.com
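If you want to confirm that the key and role binding were created, the following gcloud commands list them (an optional check, not required by Kalix):

gcloud iam service-accounts keys list \
  --iam-account kalix-broker@<gcp-project-id>.iam.gserviceaccount.com
gcloud projects get-iam-policy <gcp-project-id>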
Now you have a service account key file with which to configure Kalix to use your Google Cloud Pub/Sub broker. You can add the key file using either the Kalix Console or the Kalix CLI.
Browser

1. Open the project in the Kalix Console.
2. Select Integrations from the left-hand navigation menu.
3. Click + for the Google Cloud Pub/Sub integration option.
4. Copy the contents of keyfile.json into the editor and click Apply.

The project is now configured to use Google Pub/Sub as the message broker.

CLI

kalix projects config set broker \
  --broker-service google-pubsub \
  --gcp-key-file keyfile.json \
  --description "Google Pub/Sub in <gcp-project-id>"

The project is now configured to use Google Pub/Sub as the message broker.
Create a topic
To create a topic, you can use either the Google Cloud Console or the Google Cloud CLI.
Browser

1. Open the Google Cloud Console.
2. Go to the Pub/Sub product page.
3. Click CREATE TOPIC at the top of the screen.
4. Fill in the Topic ID field and choose any other options you need.
5. Click CREATE TOPIC in the modal dialog.

You can now use the topic to connect with Kalix.

Google Cloud CLI

gcloud pubsub topics create TOPIC_ID

You can now use the topic to connect with Kalix.
Delivery characteristics
When your application consumes messages from Google Pub/Sub, Kalix will try to deliver them to your service in 'at-least-once' fashion while preserving order, as long as:
- the GCP 'Subscription' has the 'Message ordering' flag enabled (this is the case by default for the subscriptions created by Kalix)
- the code that acts as a publisher has 'message ordering' enabled (if required by the client SDK used)
- an ordering key is provided for each message
When passing messages to a certain entity or using them to update a view row by specifying the id as the Cloud Event ce-subject attribute on the message, the same id must be used for the Google Pub/Sub ordering key to guarantee that the messages are processed in order by the entity or view.
Correct ordering is especially important for topics that stream directly into views using the transform_update option: when messages for the same subject id are spread over different ordering keys (or do not have ordering keys), they may read stale data and lose updates.
Note: To achieve at-least-once delivery, messages that are not acknowledged before the Ack deadline will be redelivered. This means redeliveries of 'older' messages may arrive behind fresh deliveries of 'newer' messages.
When publishing messages to Google Pub/Sub from Kalix, the ce-subject attribute, if present, is used as the ordering key for the message.
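For example, when publishing test messages to a topic that Kalix consumes, the same id can be supplied as both the ce-subject attribute and the ordering key. A sketch using the gcloud CLI, with an illustrative id (a full CloudEvent may need additional ce-* attributes depending on how your publisher encodes messages):

gcloud pubsub topics publish TOPIC_ID \
  --message='{"customerId": "customer-abc", "name": "Jane"}' \
  --attribute=ce-subject=customer-abc \
  --ordering-key=customer-abc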