In this post, I describe how artificial intelligence and machine learning are used to augment event stream processing.
I gave a talk at a Kafka / Flink conference yesterday about the four main patterns for using AI/ML with events. I had a lot to say, so it is taking me a few days to write up my slides as a series of posts covering:
- the building blocks used in AI/ML Kafka projects
- how AI / ML is used to augment event stream processing (this post)
- how agentic AI is used to respond autonomously to events
- how events can provide real-time context to agents
- how events can be used as a source of training data for models

The most common pattern for introducing AI into an event-driven architecture is to use it to enhance event processing.
During event processing, individual events, collections of events, or changes detected across events can each be sent to an AI service. The results can then inform the processing itself or trigger downstream workflows.
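As a minimal sketch of this pattern, here is what the enrichment step might look like in Python. The names here (`enrich_with_model`, `fraud_score`, the 0.5 threshold) are illustrative assumptions, not from the talk; in a real deployment the scoring function would typically be an HTTP call to a model-serving endpoint, and the events would arrive from a Kafka topic rather than a list.

```python
from dataclasses import dataclass
from typing import Callable, Iterable, Iterator

@dataclass
class Event:
    key: str
    payload: dict

def enrich_with_model(
    events: Iterable[Event],
    score: Callable[[dict], float],
) -> Iterator[Event]:
    """Send each event's payload to an AI service and attach the result.

    The enriched events flow on to downstream processing, which can
    branch on the model's output.
    """
    for event in events:
        result = score(event.payload)  # in practice: a call to a model endpoint
        yield Event(event.key, {**event.payload, "ml_score": result})

# Hypothetical stand-in for a remote model: flag large transactions.
def fraud_score(payload: dict) -> float:
    return 0.9 if payload.get("amount", 0) > 1000 else 0.1

events = [
    Event("txn-1", {"amount": 50}),
    Event("txn-2", {"amount": 5000}),
]
enriched = list(enrich_with_model(events, fraud_score))

# Downstream step: route high-scoring events to a review workflow.
flagged = [e.key for e in enriched if e.payload["ml_score"] > 0.5]
```

The key design point is that the model call sits inline in the stream: each event is scored as it passes through, and the score becomes part of the event, so any downstream consumer can act on it without calling the model again.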
