Getting demo events onto IBM Event Streams topics in a hurry

March 13th, 2024

Sharing a couple of tips for quick-and-dirty demo setups.

I often need to put together demos of IBM Event Automation without much notice. The starting point is almost always needing to get a bunch of interesting events onto a Kafka topic.

What I need is a jumping-off point to illustrate the benefit of sharing streams of events in Event Endpoint Management, or the types of processing you can do in Event Processing. And to do that, I need a topic with events on it that will look interesting or relevant to whoever I’m demoing to.

If I’ve got time to do this properly, I’ll set up a generator that will give me a continuous stream of randomly-generated events (example). But if I’m in a hurry, I’ll use the REST Producer API and do something like this instead.
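
For example, a little loop that POSTs randomly-generated JSON events to the REST Producer endpoint. This is a rough sketch rather than anything production-worthy: the endpoint address, topic name and API key are placeholders, and you should check the Event Streams documentation for the exact REST Producer path and authentication mechanism your cluster expects.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class QuickDemoEvents {
    // placeholders - take these from your own Event Streams cluster:
    //  the REST Producer address, a topic, and a credential with
    //  permission to produce to that topic
    private static final String ENDPOINT =
        "https://my-rest-producer-route/topics/DEMO.ORDERS/records";
    private static final String APIKEY = "REPLACE-WITH-API-KEY";

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        for (int i = 0; i < 100; i++) {
            // a minimal JSON event - in a real demo I'd vary the payload
            //  to make it look interesting for whoever I'm demoing to
            String event = String.format(
                "{\"ordernumber\": %d, \"amount\": %.2f}", i, Math.random() * 100);

            HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(ENDPOINT))
                .header("Authorization", "Bearer " + APIKEY)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(event))
                .build();

            HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode() + " " + response.body());
        }
    }
}
```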

Read the rest of this entry »

Explaining regression in Scratch

February 15th, 2024

In this post, I want to share a preview of a new feature I’m adding to Machine Learning for Kids, to ask for feedback and ideas for projects that it could be used to make.

I’ll start by contrasting this new feature with what I’ve done with Machine Learning for Kids before, then I’ll share screenshots of the new feature, and finally I’ve got a ten-minute video showing the sort of school lesson that I think it could be used for.

Read the rest of this entry »

How to create a new Kafka Connect connector

January 28th, 2024

I’m helping with a hackathon this week to get developers to create their first Kafka Connect connectors. To help get them going, I’ll be starting the day with a quick crash course.

I’ve created a simple skeleton connector project, and I’ll be walking through how they can use that to skip the boilerplate stuff and jump straight into trying out their ideas for new connectors.
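
To give an idea of how little is involved, here is a rough sketch of the two classes a minimal source connector needs. This isn’t the skeleton project itself, just an illustration of the Kafka Connect API that it is built on, with hard-coded values standing in for real connector logic.

```java
import java.util.Collections;
import java.util.List;
import java.util.Map;

import org.apache.kafka.common.config.ConfigDef;
import org.apache.kafka.connect.connector.Task;
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.source.SourceConnector;
import org.apache.kafka.connect.source.SourceRecord;
import org.apache.kafka.connect.source.SourceTask;

/** Minimal source connector: hands its config straight to a single task. */
public class DemoSourceConnector extends SourceConnector {
    private Map<String, String> config;

    @Override public void start(Map<String, String> props) { config = props; }
    @Override public Class<? extends Task> taskClass() { return DemoSourceTask.class; }
    @Override public List<Map<String, String>> taskConfigs(int maxTasks) {
        return Collections.singletonList(config);
    }
    @Override public void stop() { }
    @Override public ConfigDef config() { return new ConfigDef(); }
    @Override public String version() { return "0.0.1"; }

    /** Task that emits one hard-coded record per poll - replace with your own idea. */
    public static class DemoSourceTask extends SourceTask {
        @Override public void start(Map<String, String> props) { }
        @Override public List<SourceRecord> poll() throws InterruptedException {
            Thread.sleep(1000);   // avoid spinning in a tight loop
            return Collections.singletonList(new SourceRecord(
                Collections.singletonMap("source", "demo"),   // source partition
                Collections.singletonMap("offset", 0L),       // source offset
                "DEMO.TOPIC",                                  // destination topic
                Schema.STRING_SCHEMA,
                "hello from a demo connector"));
        }
        @Override public void stop() { }
        @Override public String version() { return "0.0.1"; }
    }
}
```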

Read the rest of this entry »

“Local projects” in Machine Learning for Kids

January 19th, 2024

I added support for “local projects” (storing projects on your own computer) to Machine Learning for Kids this week. In this post, I want to give a little background.

Read the rest of this entry »

Processing XML with Kafka Connect

December 11th, 2023

In this tutorial, I’ll share examples of how to process XML data at various points in a Kafka Connect pipeline, using a new plugin from IBM Event Streams.

You can assemble a Kafka Connect pipeline in a huge number of ways, so I’m not going to attempt an exhaustive list here. Instead, I’ve come up with eight examples that are illustrative of the sort of use cases you can satisfy.
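
To give a flavour of what these look like before diving in, this is roughly how the plugin can be wired into a connector configuration. This is my own illustrative sketch, not one of the eight examples: the converter class name is taken from the open-source IBM plugin, so check the plugin documentation for the exact name and options in the version you install.

```properties
# illustrative standalone sink connector config
name=xml-demo-sink
connector.class=org.apache.kafka.connect.file.FileStreamSinkConnector
tasks.max=1
topics=XML.ORDERS
file=/tmp/orders.txt

# parse the XML payload of each message into Connect's structured representation
#  (class name from the ibm-messaging/kafka-connect-xml-converter plugin -
#   verify it against the plugin README)
value.converter=com.ibm.eventstreams.kafkaconnect.plugins.xml.XmlConverter
```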

I’ll summarise and link to my different examples here, so you can jump straight to the one that sounds the closest to your use case:

Read the rest of this entry »

You need two schemas to deserialize an Avro message… but which two?

November 17th, 2023

In this post, I want to talk about what happens when you use Avro to deserialize messages on a Kafka topic, why it actually needs two schemas, and what those schemas need to be.

I should start by pointing out that if you’re using a schema registry, you probably don’t need to worry about any of this. In fact, a TLDR for this whole post could be “You should be using a good schema registry and SerDes client”.

But there are times when this may be difficult to do, so knowing how to set up a deserializer correctly is helpful. (Even if you’re doing the right thing and using a Schema Registry, it is still interesting to poke at some of the details and know what is happening.)

The key thing to understand is that to deserialize binary-encoded Avro data, you need a copy of the schema that was used to serialize the data in the first place [1].


This gets interesting after your topic has been around for a while, and you have messages using a mixture of schema versions on the topic. Maybe over the lifetime of your app, you’ve needed to add new fields to your messages a couple of times.

If you want a consumer application to be able to consume all of the messages on this topic, what does that mean?
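
For a flavour of what that means in code, here is a minimal sketch using Avro’s Java API, where the two schemas are passed to the datum reader: the writer’s schema (the one the producer used when the message was written) and the reader’s schema (the one this consumer wants the record in). It assumes plain binary-encoded Avro bytes, with no schema registry framing on the message.

```java
import java.io.IOException;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.DecoderFactory;

public class AvroDeserializeSketch {

    /**
     * Deserialize binary-encoded Avro using both schemas.
     * Avro resolves the differences between them - for example,
     * filling in defaults for fields that were added later.
     */
    public static GenericRecord deserialize(byte[] payload,
                                            Schema writerSchema,
                                            Schema readerSchema) throws IOException {
        GenericDatumReader<GenericRecord> reader =
            new GenericDatumReader<>(writerSchema, readerSchema);
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(payload, null);
        return reader.read(null, decoder);
    }
}
```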

Read the rest of this entry »

Using IBM Event Automation with Amazon MSK

October 25th, 2023

Written with Chris Patmore

IBM Event Automation helps companies to accelerate their event-driven projects wherever businesses are on their journey. It provides multiple components (Event Streams, Event Endpoint Management, and Event Processing) which together lay the foundation of an event-driven architecture that can unlock the value of the streams of events that businesses have.

A key goal of Event Automation is to be composable. The three components can be used together, or they can each be used to extend and enhance an existing event-driven deployment.

Amazon MSK (Managed Streaming for Apache Kafka) is a hosted, managed Kafka service available in Amazon Web Services. If a business has started its event-driven journey using MSK, components from Event Automation can help to enhance it: by offering management and governance of their MSK topics, or by providing an intuitive low-code authoring canvas to process the events on those topics.

Working with Amazon MSK is a nice example of the benefits of this composability: it helps businesses to get more value from their existing MSK topics.

In this blog post, we want to show a few different examples of where this can be done. For each example, we’ll provide a high-level diagram and description. We’ll also share a demonstration that we created to show it in action.

Read the rest of this entry »

Connecting App Connect Enterprise to Event Endpoint Management

October 20th, 2023

Connecting IBM App Connect Enterprise to Kafka topics in IBM Event Endpoint Management requires careful configuration. In this post, I’ll share the steps I use to avoid missing any of the required values.

If this sounds familiar, it might be because I wrote a post like this about using App Connect Enterprise to work with topics from Event Streams. People seem to have found that post useful, so I thought I’d do something similar for topics in Event Endpoint Management this time.

To illustrate this, I’ll create a simple App Connect flow that consumes messages from a Kafka topic and publishes them to an MQTT topic.

The key to getting this to work correctly first time is to make sure that values are accurately copied from Event Endpoint Management to App Connect.

To help with this, I use a grid like the one below.

The instructions in this post start with Event Endpoint Management, and explain how to populate the grid with the information you need.

Then the instructions will switch to App Connect, and explain how to use the values in the grid to set up your App Connect flow.

| | What this is | Values you will see in my screenshots | Your value |
|---|---|---|---|
| A | Topic name | DEMO.ACE | |
| B | Bootstrap address | my-eem-gateway-ibm-egw-rt-event-automation.apps.dalelane.cp.fyre.ibm.com:443 | |
| C | SASL mechanism | PLAIN | |
| D | SASL config | org.apache.kafka.common.security.plain.PlainLoginModule required; | |
| E | Security protocol | SASL_SSL | |
| F | Certificate | eem-cert.jks | |
| G | Certificate password | STOREPASSW0RD | |
| H | Username | eem-9c8fc5d9-fddd-48dd-ab41-e062214166e5 | |
| I | Password | dd08a1fc-99be-4931-8059-70aef88c1f0c | |
| J | Policy project name | demo-policies | |
| K | Policy name | demo-eem-policy | |
| L | Security identity name | eem-credentials | |
| M | Truststore identity name | eem-truststore | |
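
App Connect itself captures these values through its Kafka nodes, policy project, and security identities, but it can help to see how grid rows B to I map onto standard Kafka client properties. The sketch below is just an illustration using a plain Java client configuration with the placeholder values from my screenshots; it isn’t an App Connect artefact.

```java
import java.util.Properties;

/**
 * Plain Kafka client configuration showing which standard client
 * properties the grid values (rows B-I) correspond to.
 */
public class EemClientConfig {
    public static Properties build() {
        Properties props = new Properties();
        props.put("bootstrap.servers",                                 // B
            "my-eem-gateway-ibm-egw-rt-event-automation.apps.dalelane.cp.fyre.ibm.com:443");
        props.put("sasl.mechanism", "PLAIN");                          // C
        props.put("sasl.jaas.config",                                  // D, plus H and I
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"eem-9c8fc5d9-fddd-48dd-ab41-e062214166e5\" "
            + "password=\"dd08a1fc-99be-4931-8059-70aef88c1f0c\";");
        props.put("security.protocol", "SASL_SSL");                    // E
        props.put("ssl.truststore.location", "eem-cert.jks");          // F
        props.put("ssl.truststore.password", "STOREPASSW0RD");         // G
        return props;
    }
}
```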

Read the rest of this entry »