Dapr and Kafka: easy binding

Lucas Jellema

Dapr is a runtime framework for distributed applications. Dapr acts as a personal assistant to any application, offering standardized access to facilities (state, secrets, configuration, pub/sub, actors) across many environments and technologies. Simply by using Dapr we can leverage a large number of Dapr components that, through the Dapr runtime, provide access to a wide variety of cloud services and on-premises technologies. The next figure shows some of these.

Some of the heavy lifting an application developer typically has to implement for interacting with the specific APIs and intricacies of these technologies is taken over by Dapr, allowing application developers to focus on business functionality instead of what is often not much more than plumbing.

In this article, I will show a quick and simple example of interaction with Kafka. It arose from a session on Microcks and AsyncAPI, where I needed a quick way to:

  • consume the events published by the Mock producer generated by Microcks (from the AsyncAPI interface definition) to a Kafka Topic
  • publish events according to the same AsyncAPI contract to a Kafka Topic to demonstrate how Microcks performs a test on these events

I have worked with Kafka before, but quickly ramping up client applications – and configuring their connection to the Kafka broker – always takes me longer than I would hope for. For what I needed, Dapr offers a simple and quick solution. And it also demonstrates quite well how the Dapr approach of components-configured-outside-of-applications works.

Consume Events Published to Kafka Topic

The first case implemented using Dapr can be visualized like this:


The consuming application is shown here on the far right as a Node application. It could have been built with any technology that can handle an HTTP POST request; any semi-modern programming language would do. The code required for receiving the event from Dapr (after Dapr has taken care of the interaction with Kafka) is very, very simple (and of course using Express for such a simple case is very much overdone):


The code in the rectangle handles the events that were consumed from the Kafka Topic by Dapr and are handed to the application in the form of an HTTP POST request.

The configuration of the Kafka Binding in the Dapr components.yaml file is also not very complex:

Here we see the name of the binding component; this name maps to the URL path at which the application receives the events (turned into HTTP POST requests): advanced-street-light-destination

The very last line shown – scopes: street-light-consumer – instructs Dapr to route events consumed by this binding component to the application with the app-id street-light-consumer. We will see in a little while how our application is identified through that app-id. Note: by default, incoming events are sent to an HTTP endpoint that corresponds to the name of the input binding. You can override this by setting the metadata property route.

We also see the Kafka broker configuration: localhost and port 9092. In this case the binding component consumes from only a single topic: CodeCafeAdvancedStreetlightsAPI-1.1-smartylighting-codecafe-streetlights-event-lighting-measured – a name generated by Microcks from the AsyncAPI definition. Multiple topics can be specified in a comma-separated list.
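Putting the elements described above together, the input binding component could look roughly like this (a sketch following the Dapr Kafka binding reference; the exact file in the screenshot may differ, and the consumerGroup value is an assumption):

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: advanced-street-light-destination
spec:
  type: bindings.kafka
  version: v1
  metadata:
  - name: brokers
    value: "localhost:9092"
  - name: topics
    value: "CodeCafeAdvancedStreetlightsAPI-1.1-smartylighting-codecafe-streetlights-event-lighting-measured"
  - name: consumerGroup
    value: "street-light-consumers"   # assumed value
  - name: authRequired
    value: "false"
scopes:
- street-light-consumer
```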

To get Dapr and the application started, we execute the following statement from the directory that contains both the app.js application file and the components.yaml file:

dapr run --app-id street-light-consumer --app-port 3000 --components-path . node app.js

This uses the Dapr CLI to run the Dapr "companion process" (a side car, though in this case not as a container within a K8S Pod but simply as a separate Linux process) on behalf of an application with app-id street-light-consumer that can be accessed at port 3000 (on localhost) and that is started with "node app.js". Finally, this Dapr companion process (aka personal assistant) for the Node application initializes and puts to work all components configured in the yaml files in the current directory. In this case, only a single component is configured: a Kafka input binding that consumes from the indicated topic on the configured Kafka broker and routes all messages to the application indicated in the scopes element.

The Dapr logging shows how the component is initialized.


And then the logging shows how the application reports the reception of the messages – that were consumed by Dapr from Kafka.


You should like the fact that the application does not depend on Kafka in any way. The application developer does not need to know about Kafka at all. If we decide to use a different messaging technology, then we will update the components.yaml file – but not the application itself. The interaction with Kafka (or any alternative message broker) is taken care of by the Dapr side car – aka the application’s personal assistant.

Publish Events to Kafka Topic

The second case tackled using Dapr can be visualized like this:


In this case, the objective is to get messages published to a (pre-existing) Kafka Topic. These messages are subsequently validated by Microcks in a test generated from the AsyncAPI interface contract. The challenge I use Dapr to address in this case is simply getting messages published with the least effort (because my real objective was to demonstrate Microcks).

Similar to what we saw before, here is the components.yaml file that configures the Dapr Kafka binding component:


The name of the binding – just as before – is advanced-street-light-destination. The Kafka broker configuration is similar to the previous example. A single topic can be specified to publish the outbound messages to, using the publishTopic property – in this case the topic is called streetlight-dropbox. Credentials to connect to the Kafka broker can be provided, but in this case none are needed.
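Based on that description, the outbound component definition could look roughly like this (again a sketch following the Dapr Kafka binding reference; the screenshot may differ in details):

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: advanced-street-light-destination
spec:
  type: bindings.kafka
  version: v1
  metadata:
  - name: brokers
    value: "localhost:9092"
  - name: publishTopic
    value: "streetlight-dropbox"
  - name: authRequired
    value: "false"
```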

A Dapr side car can be started without actually accompanying a specific application – rather like a PA without a corporate official to attend to. Anyone can ask this PA to do things for them, as we all can with this Dapr side car.

dapr run --components-path .


We need the HTTP port on which this Dapr side car is listening, so we can send it our own instructions. In this case it is 37919.

To have Dapr publish an event through the Kafka binding, we can use curl to send an HTTP POST request. It needs to be sent to:

Dapr-Host:Dapr-Port/v1.0/bindings/<NAME OF KAFKA BINDING>, which in this case means: http://localhost:37919/v1.0/bindings/advanced-street-light-destination


The full curl command looks as follows:

curl -X POST http://localhost:37919/v1.0/bindings/advanced-street-light-destination \
-H "Content-Type: application/json" \
-d '{"data": {"streetlightId": "dev99", "streetLocation": "Edisonbaan, Nieuwegein, The Netherlands", "lumens": 2500, "sentAt": "2022-01-08T09:52:16Z"}, "operation": "create"}'


This POST request instructs Dapr to publish an event with payload as defined in the data property of the request body’s JSON content.

I used the Kafdrop tool to inspect the messages on the streetlight-dropbox topic:


The curl command results in a message on a Kafka topic, thanks to Dapr and a tiny little bit of declarative configuration (dare I say: low code?).

End to End through Dapr

Of course I could combine the two cases and make even more Dapr fun:


Dapr takes care of both outbound publishing and inbound processing. In a diagram:


One Dapr side car is enough to handle both the inbound and the outbound case. The curl POST command invokes the outbound binding through the side car to have the event published to the Kafka topic streetlight-dropbox. That same side car has created a subscription on the topic and will therefore receive the message, which is then routed to the Node application at the path /advanced-street-light-destination.


Dapr Docs on Kafka Binding – https://docs.dapr.io/reference/components-reference/supported-bindings/kafka/

Dapr Docs – Example of using Kafka Binding to trigger an application – https://docs.dapr.io/developing-applications/building-blocks/bindings/howto-triggers/

