Posts Tagged ‘kafka’

Talking at TechCon about AI and EDA

Sunday, March 22nd, 2026

IBM TechCon is an annual online technical event for engineers, creators, and integration specialists.

One of our sessions for this year was AI patterns in event-driven architectures:

You already have Kafka topics sending valuable event data through your systems. You’ve heard about the increasing adoption and promise of AI technologies. But how do these worlds overlap?

We’ll explain the ways that you can take your Kafka topics and use them not just for integration and analytics, but also to drive AI and ML — things like real-time anomaly detection, prediction models, personalization, decision pipelines, and more.

In this session, we’ll explain the four main patterns for how your existing Kafka topics (and the streams of events on them) can be leveraged as the foundation for AI/ML. We’ll show multiple technology approaches to implement each pattern – rather than trying to convince you to use any single specific tool, we’ll help you understand the high-level patterns and how you can get started with them.


session recording on video.ibm.com

This was adapted from a talk I gave at Current last year.

I got to go to a fancy convention centre in New Orleans for that talk, but I gave this one from a poorly-lit meeting room in my office… so in that respect at least, this one was less fun! 😉

But it’s a topic I find super interesting, so I am always pleased to have another chance to share my thoughts.

Talking at TechCon about metrics and monitoring

Saturday, March 21st, 2026

IBM TechCon is an annual online technical event for engineers, creators, and integration specialists.

One of our sessions for this year was an introduction to Monitoring your Event Driven Architecture:

This session will give you an insight into the life of an Event Automation administrator, responsible for a busy event-driven system where teams have been creating a variety of Kafka topics, integrations, stream processing apps, connectors, and much more. We’ll highlight the importance of metrics and monitoring for event driven architectures and introduce you to the tools that are available to help.

We’ll do this by showing you an event-driven environment where things have gotten out of hand. In our fictional scenario, users are being impacted by things like poorly configured topics, poorly written applications, poorly managed connectors, poorly configured stream processors…

In this session, we’ll walk you through how to bring control to the chaos. We’ll step through how to get an insight into what is happening, find out where the problems are, and put controls in place to mitigate their impact.


session recording on video.ibm.com

It was an introduction for beginners that you could sum up as a 40-minute plea for people to monitor their Kafka clusters and applications! Essentially, we set up a handful of naive and broken applications, and walked through how metrics and monitoring show you where the problems are hiding.

Watch it to be persuaded that metrics are important.

Or to watch how Matt had to jump in and help me when an Apple Magic Mouse decided it didn’t like scrolling any more, and I needed to get to things at the bottom of web pages!

Or just to marvel at how glamorous our offices are. 😉

Processing JSON with Kafka Connect

Wednesday, February 18th, 2026

In this post, I’ll share examples of how to process JSON data in a Kafka Connect pipeline, and explain the schema format that Kafka uses to describe JSON events. 

Using sink connectors

Kafka Connect sink connectors let you send the events on your Kafka topics to external systems. I’ve talked about this before, but to recap, the structure looks a bit like this:

Imagine that you have this JSON event on a Kafka topic. 

{
    "id": 12345678,
    "message": "Hello World",
    "isDemo": true
}

How should you configure Kafka Connect to send that somewhere? 

It depends…
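As a sketch of one common answer: if the topic carries plain JSON like the event above (no schema envelope), you can configure the sink connector to use Kafka’s JsonConverter with schemas disabled. The connector class, name, and topic below are placeholders for illustration, not a specific real connector:

```json
{
    "name": "my-sink-connector",
    "config": {
        "connector.class": "com.example.DemoSinkConnector",
        "topics": "demo.events",
        "value.converter": "org.apache.kafka.connect.json.JsonConverter",
        "value.converter.schemas.enable": "false"
    }
}
```

With `schemas.enable` set to `false`, the converter expects bare JSON values rather than the `{"schema": ..., "payload": ...}` envelope.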

(more…)

AI patterns in event driven architectures

Monday, November 3rd, 2025

I gave a talk at Current last week about how artificial intelligence and machine learning are used with Kafka topics. I had a lot of examples to share, so I wrote up my slides across several posts.

I’ll use this post to recap and link to my write-ups of each bit of the talk:

I started by talking about the different building blocks that are needed, and the sorts of choices that teams make.

Next, I talked about how projects to introduce AI into event driven architectures typically fall into one or more of these common patterns:

The most common, and the simplest: using AI to improve and augment the sorts of processing we can do with events. This can be as simple as using off-the-shelf pre-trained models to enrich a stream of events, and using this to filter or route the event as part of processing.

Perhaps the newest (and the pattern that is currently getting the most interest and attention) is to use streams of events to trigger agents, so that they can autonomously take actions in response.

Maybe the least obvious approach is to collect and store a projection of recent events, and use it to enhance an agentic AI by making it available as a queryable or searchable form of real-time context.

And finally, the longest-established pattern is simply to use the retained history of Kafka topics as a relevant source of historical training data, for training new bespoke models.

Using streams of events to train machine learning models

Sunday, November 2nd, 2025

In this post, I describe how event streams can be used as a source of training data for machine learning models.

I spoke at Current last week. I gave a talk about how artificial intelligence and machine learning are most commonly used with Kafka topics. I had a lot to say, so I didn’t manage to finish writing up my slides – but this post covers the last section of the talk.

It follows:

The talk covered the four main patterns for using AI/ML with events.

This pattern was where I talked about using events as a source of training data for models. This is perhaps the simplest and longest established approach – I’ve been writing about this for years, long pre-dating the current generative AI-inspired interest.
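The core of the pattern can be sketched in a few lines: replay the topic’s retained history and flatten each event into a feature row plus a label for model training. The event fields here ("amount", "country", "is_fraud") are invented for illustration; in practice you would consume from the real topic with a Kafka client rather than iterating over a list.

```python
# Minimal sketch: turning a replayed Kafka topic history into training data.
# Field names are hypothetical; a real pipeline would consume from Kafka.
import json

def events_to_training_rows(raw_events):
    """Flatten raw JSON events into feature rows and labels."""
    rows, labels = [], []
    for raw in raw_events:
        event = json.loads(raw)
        # Encode a numeric feature and a simple categorical flag
        rows.append([event["amount"], 1 if event["country"] == "GB" else 0])
        labels.append(event["is_fraud"])
    return rows, labels

# Stand-in for the replayed topic history
history = [
    '{"amount": 12.5, "country": "GB", "is_fraud": 0}',
    '{"amount": 900.0, "country": "US", "is_fraud": 1}',
]

X, y = events_to_training_rows(history)
print(X)  # [[12.5, 1], [900.0, 0]]
print(y)  # [0, 1]
```

From there, `X` and `y` can be fed to whichever training framework you prefer.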

(more…)

Using event streams to provide real-time context for agentic AI

Saturday, November 1st, 2025

In this post, I describe how event stream projections can be used to make agentic AI more effective.

I spoke at a Kafka / Flink conference on Wednesday. I gave a talk about how AI and ML are used with Kafka topics. I had a lot to say, so this is the fourth post in my write-up of the slides (and I’ve still got more to go!).

The talk was a whistlestop tour through the four main patterns for using artificial intelligence and machine learning with event streams.

This pattern was where I talked about using events as a source of context data for agents.
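As a minimal sketch of the idea: maintain a projection of recent events (here, the latest event per key, much like a compacted topic) that an agent can query as real-time context. The order-status events are invented for illustration; a real system would update the projection from a Kafka consumer loop.

```python
# Minimal sketch: a projection of recent events, queryable by an agent.
# Event shapes are hypothetical; updates would come from a Kafka consumer.
class EventProjection:
    def __init__(self):
        self._latest = {}

    def apply(self, event):
        # Keep only the most recent event per key
        self._latest[event["order_id"]] = event

    def query(self, order_id):
        # What an agent (or one of its tools) would call for current context
        return self._latest.get(order_id)

projection = EventProjection()
projection.apply({"order_id": "o-1", "status": "CREATED"})
projection.apply({"order_id": "o-1", "status": "SHIPPED"})

print(projection.query("o-1")["status"])  # SHIPPED
```

Exposing `query` as a tool is what makes the projection useful to an agent: it can ask "what is the current state of order o-1?" and get an answer grounded in the event stream.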

(more…)

Triggering agentic AI from event streams

Friday, October 31st, 2025

In this post, I describe how agentic AI can respond autonomously to event streams.

I spoke at Current on Wednesday, about the most common patterns for how AI and ML are used with Kafka topics. I had a lot of content I wanted to cover in the session, so it’s taking me a while to write it all down:

The premise of the talk was to describe the four main patterns for using AI/ML with events. This pattern was where I started focusing on agents.
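The shape of the pattern can be sketched simply: events arriving on a topic trigger an agent, which autonomously decides what action to take. The agent logic below is a trivial stand-in; a real implementation would call an LLM or agent framework, and consume from Kafka rather than a list.

```python
# Minimal sketch: events trigger an "agent" that decides on actions.
# Event types and the decision logic are hypothetical stand-ins.
def agent_decide(event):
    """Stand-in for an agent choosing an action based on the event."""
    if event["type"] == "payment_failed":
        return {"action": "open_ticket", "order": event["order_id"]}
    return {"action": "ignore", "order": event["order_id"]}

def handle_stream(events):
    # In a real system this would be a Kafka consumer poll loop
    return [agent_decide(e) for e in events]

actions = handle_stream([
    {"type": "payment_failed", "order_id": "o-42"},
    {"type": "order_created", "order_id": "o-43"},
])
print(actions[0])  # {'action': 'open_ticket', 'order': 'o-42'}
```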

(more…)

Using AI to augment event stream processing

Thursday, October 30th, 2025

In this post, I describe how artificial intelligence and machine learning are used to augment event stream processing.

I gave a talk at a Kafka / Flink conference yesterday about the four main patterns for using AI/ML with events. I had a lot to say, so it is taking me a few days to write up my slides.

The most common pattern for introducing AI into an event driven architecture is to use it to enhance event processing.

As part of event processing, you can have events, collections of events, or changes in events – and any of these can be sent to an AI service. The results can inform the processing or downstream workflows.
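This can be sketched as a processing step that sends each event to a model, enriches the event with the result, and uses that result to route it. The `classify` function here is a stand-in for a real model call (for example, an off-the-shelf sentiment model), and the event fields and topic names are invented for illustration.

```python
# Minimal sketch: enriching an event with a model result, then routing on it.
# classify() is a hypothetical stand-in for a real AI service call.
def classify(text):
    """Stand-in for an off-the-shelf sentiment model."""
    return "negative" if "broken" in text else "positive"

def process(event):
    # Enrich the event with the model output...
    event["sentiment"] = classify(event["message"])
    # ...and use it to route: negative reviews go to a priority topic
    if event["sentiment"] == "negative":
        event["route"] = "reviews.priority"
    else:
        event["route"] = "reviews.all"
    return event

out = process({"id": 1, "message": "my order arrived broken"})
print(out["route"])  # reviews.priority
```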

(more…)