IBM Event Automation helps companies accelerate their event-driven projects, wherever they are on their journey. It provides multiple components (Event Streams, Event Endpoint Management, and Event Processing) which together lay the foundation of an event-driven architecture that can unlock the value of the streams of events that businesses already have.
A key goal of Event Automation is to be composable. The three components can be used together, or they can each be used to extend and enhance an existing event-driven deployment.
Today, I demonstrated some of the Event Automation components working with Azure Event Hubs for Apache Kafka. As Event Hubs provides a Kafka interface to Azure’s data streaming service, it naturally works with Event Automation. Even so, it can be helpful to inspire people by showing it for real, so even demos of obvious combinations can be valuable.
For example, Event Endpoint Management can enhance the value of topics in Event Hubs by offering management and governance, and by enabling governed reuse of those topics. Event Processing makes it easy to get insights from the events on Event Hubs topics, by providing an intuitive low-code authoring canvas to process them.
If I were going to run this for a while and wanted to optimise for my applications in Azure, I would likely have set this up like this, with the Event Gateways deployed close to the Azure Kafka endpoints.
This is similar to how I deployed Event Automation with Amazon MSK last year. If you’d like to know more about that deployment pattern, you can get all the details in that post.
Instead, as my objective today was just to get people thinking about what could be possible, I went with this quicker and simpler approach:
I captured screenshots of the initial set-up which I’m sharing here, to help anyone else who wants to recreate this sort of demo:
- Creating the Azure Event Hubs demo environment
- Preparing a demo Event Hub topic with demo messages
- Adding the demo topic to Event Endpoint Management
- Processing Azure Event Hubs topics in Event Processing
Creating the Azure Event Hubs demo environment
Go to the Azure services portal and search for Event Hubs.
Click into the Event Hubs marketplace page and click Create.
Create a new resource group and namespace for the demo instance.
Note that you need to choose the Standard, Premium, or Dedicated pricing tier, as those are the tiers that include the Kafka interface.
All the other options can be left to the default settings.
Click Create to start creating the instance.
It takes a brief moment for the deployment to complete.
Once the deployment is complete, click into the resource to start setting it up.
Confirm that the Kafka surface is shown as ENABLED, which it will be as long as you selected a pricing tier that supports the Kafka interface.
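If you prefer to script this set-up, the Azure CLI can create the same environment. A minimal sketch, using hypothetical resource names:

```sh
# Create a resource group for the demo (name and location are hypothetical)
az group create --name event-automation-demo --location eastus

# Create the Event Hubs namespace; Standard is the lowest tier
# that includes the Kafka interface
az eventhubs namespace create \
  --resource-group event-automation-demo \
  --name ea-demo-namespace \
  --sku Standard
```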
Preparing a demo Event Hub topic with demo messages
Click on + Event Hub to create a new topic.
I gave my topic the name orders.
Otherwise, the default settings are fine.
Click Create and wait for the new topic to be created.
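For a scripted set-up, the equivalent Azure CLI step (continuing the hypothetical names from above):

```sh
# Create the demo topic (an "event hub" in Azure terminology)
az eventhubs eventhub create \
  --resource-group event-automation-demo \
  --namespace-name ea-demo-namespace \
  --name orders
```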
The next thing to do is to put some messages on the new topic that can be used for testing.
Click on Shared access policies, then click Add.
Name the credentials that will be used to put test messages on the topic, and then click Create.
The Connection string-primary key is the password to use for a Kafka client.
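If you are scripting the set-up, the policy and its connection string are available through the CLI too (again with the hypothetical names from above):

```sh
# Create a shared access policy scoped to the orders topic
az eventhubs eventhub authorization-rule create \
  --resource-group event-automation-demo \
  --namespace-name ea-demo-namespace \
  --eventhub-name orders \
  --name test-producer \
  --rights Send Listen

# Retrieve the connection string to use as the Kafka password
az eventhubs eventhub authorization-rule keys list \
  --resource-group event-automation-demo \
  --namespace-name ea-demo-namespace \
  --eventhub-name orders \
  --name test-producer \
  --query primaryConnectionString --output tsv
```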
The standard Kafka console producer is sufficient for producing the small number of test messages needed for a demo.
The bootstrap servers value to use is the host name from the Event Hubs namespace overview page, followed by :9093.
The username to use is $ConnectionString.
The password to use is the Connection string-primary key value from the policy.
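Putting that together, a minimal sketch of a console producer session (the namespace name is the hypothetical one from above, and the connection string is abbreviated):

```sh
# azure.properties : SASL settings for the Event Hubs Kafka endpoint.
# The quoted heredoc keeps $ConnectionString literal; the SharedAccessKey
# is abbreviated here - use the full connection string from the policy.
cat > azure.properties << 'EOF'
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="$ConnectionString" \
  password="Endpoint=sb://ea-demo-namespace.servicebus.windows.net/;SharedAccessKeyName=test-producer;SharedAccessKey=...";
EOF

# Produce test messages: type one JSON event per line, for example
#   {"id":"ORD-0001","product":"Stainless Steel Water Bottle","quantity":2,"region":"EMEA"}
kafka-console-producer.sh \
  --bootstrap-server ea-demo-namespace.servicebus.windows.net:9093 \
  --topic orders \
  --producer.config azure.properties
```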
Adding the demo topic to Event Endpoint Management
The starting point is the Topics page in the Event Endpoint Management UI.
Adding the Azure Event Hubs topic alongside topics from other Kafka distributions is a nice way to illustrate the value of bringing topics from multiple Kafka clusters into a single catalog.
Click on Add topic.
For this demo, I wanted to consume the test messages that I produced directly to the topic, so I chose Consume events.
Give the cluster a name.
The server address to use is the host name from the Event Hubs namespace overview page, followed by :9093.
From the Azure portal, create a new policy for use by the Event Endpoint Management gateway. I called it event-endpoint-management.
Copy the Connection string-primary key value as the password that the Event Gateway can use to access Azure Event Hubs topics.
As before, the username is $ConnectionString.
This adds Azure Event Hubs to the list of clusters available in Event Endpoint Management.
Click Next.
Choose the topic to add to the catalog.
Documenting the topic helps to enable greater reuse. In this demo, I used the name to highlight that this topic was from Azure Event Hubs, but I equally could have used tags to do this instead.
For this demo, I had produced JSON messages, so I selected application/json for the encoding.
I copied one of the messages I produced earlier to use as a sample message in the catalog page.
Click Save.
The next step after describing the topic is to publish it to the catalog.
Click on Options, then click on Create option.
Choose the topic name that applications can use when consuming from this topic through Event Endpoint Management. As I already have “orders” topics from other Kafka clusters in the catalog, I gave this topic the alias ORDERS.AZURE.
Click Next and then click Publish.
The Azure Event Hubs topic is now in the catalog, alongside the other topics from my Event Streams clusters.
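At this point, any standard Kafka application can subscribe to the topic through the Event Gateway, using credentials generated from its catalog page. A minimal sketch using the console consumer; the gateway address, credentials, and truststore details are placeholders that depend on your deployment:

```sh
# gateway.properties : credentials generated from the catalog page.
# The truststore contains the Event Gateway's CA certificate; the
# file name and passwords here are placeholders.
cat > gateway.properties << 'EOF'
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="<username-from-catalog>" \
  password="<password-from-catalog>";
ssl.truststore.location=gateway-ca.jks
ssl.truststore.password=<truststore-password>
EOF

# Consume via the gateway, using the alias published to the catalog
kafka-console-consumer.sh \
  --bootstrap-server <event-gateway-address> \
  --topic ORDERS.AZURE \
  --consumer.config gateway.properties \
  --from-beginning
```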
Processing Azure Event Hubs topics in Event Processing
The final step was to show that these events can be processed using Flink jobs created in Event Processing’s low-code UI.
The server address for the Event Gateway can be copied from the catalog page for the Azure topic.
Paste the gateway address into the server field for the event source node.
Create credentials for accessing the Azure Event Hubs topic from the Event Endpoint Management catalog page.
Paste the credentials into the event source configuration in Event Processing.
The messages for this demo are JSON strings, so choose JSON as the message format to use for processing.
Copy the sample message from the Event Endpoint Management catalog page, and use that to configure the Event Processing event source.
The sample message is used to generate a table structure for the events that will be consumed from Azure Event Hubs. Click Configure to confirm.
To illustrate processing the messages, something quick like a Filter node works well.
This is a trivially simple example of the processing that can be done, but now that the events are available, it’s an easy jumping-off point for all the other kinds of processing described in our tutorials.
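Under the covers, Event Processing runs these flows as Apache Flink jobs, so a Filter node amounts to a WHERE clause over the table generated from the sample message. A rough sketch of the idea, using the hypothetical region field from the test messages above:

```sql
-- Conceptually what a Filter node adds to the generated Flink job:
-- keep only the order events matching a condition
SELECT *
FROM `ORDERS.AZURE`
WHERE region = 'EMEA';
```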
The aim of this demo was to inspire ideas about the ways that you can get more from Azure Event Hubs, by making it easy to:
- share and manage – using Event Endpoint Management
- process – using Event Processing
For more information about any of this, please see the Event Automation documentation, or get in touch.