Triggering agentic AI from event streams

In this post, I describe how agentic AI can respond autonomously to event streams.

I spoke at Current on Wednesday about the most common patterns for how AI and ML are used with Kafka topics. I had a lot of content I wanted to cover in the session, so it’s taking me a while to write it all down.

The premise of the talk was to describe the four main patterns for using AI/ML with events. This is the pattern where I started focusing on agents.

Agentic AI means systems that can act autonomously.

This is often assumed to mean systems that respond to human requests in a chat interface, but an event-driven agent responds to events and autonomously decides to trigger a notification or business workflow.

As with the previous pattern, the simplest option is to use individual events to trigger agents.

That looks something like this: a stream of events lands on a Kafka topic and is used to trigger an agent. The agent uses a generative AI model, together with a collection of tools, to respond.

The results can be actions that are taken, workflows that are triggered, or data that is created and updated in external sink systems.
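Sketched in Python with the confluent-kafka client, that trigger loop could be as simple as this. The topic name, the connection config, and the `run_agent` function (standing in for the model-plus-tools agent) are placeholders for illustration, not anything from a real system:

```python
import json
from confluent_kafka import Consumer

# Placeholder: wraps the generative AI model plus whatever tools the agent needs
def run_agent(event: dict) -> None:
    ...

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "agent-trigger",
    "auto.offset.reset": "latest",
})
consumer.subscribe(["events"])  # the topic of triggering events

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        # each event on the topic becomes one invocation of the agent
        event = json.loads(msg.value())
        run_agent(event)
finally:
    consumer.close()
```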

This is easier to explain with a couple of examples.

Imagine a supermarket that has a topic for alerts on perishable goods risk – items where there is an unexpectedly high amount of stock which is going to spoil soon.

An automated workflow can decide to do something like create a time-sensitive special offer to encourage increased sales.

This can be enhanced by using Generative AI to create a customized recipe or meal plan using that special offer food item, to tempt customers even further.

That could look something like this.

A topic with high stock alerts can be used to trigger an agent.

The agent calls on a range of relevant tools to decide when a special offer is appropriate.

It makes the special offer more enticing by generating custom recipes and meal plans to include with it.

Finally the agent can trigger a business process to start pushing this offer out to customers.
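To give a rough idea of what that agent could look like, here is a minimal Python sketch: it takes one high-stock alert, applies a simple rule to decide whether an offer is worthwhile, asks an LLM for a recipe to go with it, and hands the result to a downstream workflow. The alert fields, the `worth_discounting` rule, the model name, and the `start_offer_workflow` call are all assumptions made for the sake of illustration:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set; any LLM client would do

# Placeholder "tool": a real agent might check sales forecasts, margins,
# or existing promotions before deciding an offer is appropriate.
def worth_discounting(alert: dict) -> bool:
    return alert["days_until_spoilage"] <= 3 and alert["excess_units"] > 100

# Placeholder: hand the offer to whatever runs business workflows
def start_offer_workflow(item: str, discount: int, recipe: str) -> None:
    ...

def handle_high_stock_alert(alert: dict) -> None:
    if not worth_discounting(alert):
        return
    # Use generative AI to make the offer more enticing
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # example model name
        messages=[{
            "role": "user",
            "content": f"Write a short, appealing recipe that features {alert['item']} "
                       "as the main ingredient, suitable for a supermarket special offer.",
        }],
    )
    recipe = response.choices[0].message.content
    start_offer_workflow(item=alert["item"], discount=25, recipe=recipe)
```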

Triggering the agent can be as simple as a Kafka Connect connector making HTTP calls to the agent’s webhook endpoint.
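For example, registering something like the Confluent HTTP Sink connector through the Kafka Connect REST API would POST each alert to the agent’s webhook. This is only a sketch: the topic, URLs, and converter settings are placeholders, and a different HTTP connector would use different property names:

```python
import requests

# Kafka Connect's REST API: POST /connectors registers a new connector
connect_url = "http://localhost:8083/connectors"

connector = {
    "name": "high-stock-alerts-to-agent",
    "config": {
        # Assumes the Confluent HTTP Sink connector is installed
        "connector.class": "io.confluent.connect.http.HttpSinkConnector",
        "topics": "high.stock.alerts",
        "http.api.url": "http://agent-host:8000/webhook",  # the agent's webhook endpoint
        "value.converter": "org.apache.kafka.connect.json.JsonConverter",
        "value.converter.schemas.enable": "false",
    },
}

resp = requests.post(connect_url, json=connector)
resp.raise_for_status()
```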

And the result is tailored special offers, generated in real time in response to stock levels.

This kind of approach can be taken further by pairing it with event stream processing.

Agents can be triggered not just by raw Kafka events, but by meaningful business events derived from them first.

That looks something like this: a stateful stream processor collects insights from multiple events, and uses them to invoke an agent to respond to those insights.

Imagine an event-driven holiday booking website.

Some topics receive infrequent events, like holiday purchases. Other topics receive high-throughput events, like click-tracking as customers browse the website for different destinations and activities.

Event processing could identify a pattern across these events: potential customers who have a sustained interest in a certain destination or activity that isn’t followed by a purchase.

Once this pattern is spotted, an agent could be triggered to generate a customised and personalised itinerary for the locations or activities that the potential customer has been looking at, to help encourage them to book. The agent can use tools to identify upcoming events at that destination, find special offers, pull in customer profile data, and more.

To make a demo of this, I used Flink to spot the pattern across the different holiday site topics and identify those potential customers worth trying to tempt.
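The demo’s detection logic isn’t reproduced here, but a simplified PyFlink sketch of the same idea could look like the following: count clicks per customer and destination over a window, and flag sustained interest. The table definition, field names, and threshold are assumptions, and the check that no purchase follows has been left out for brevity:

```python
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Click-tracking events arriving on a Kafka topic
# (requires the Flink Kafka SQL connector jar on the classpath)
t_env.execute_sql("""
    CREATE TABLE clicks (
        customer_id STRING,
        destination STRING,
        click_time TIMESTAMP(3),
        WATERMARK FOR click_time AS click_time - INTERVAL '10' SECOND
    ) WITH (
        'connector' = 'kafka',
        'topic' = 'website.clicks',
        'properties.bootstrap.servers' = 'localhost:9092',
        'properties.group.id' = 'interest-detector',
        'format' = 'json',
        'scan.startup.mode' = 'latest-offset'
    )
""")

# Sustained interest: many clicks on one destination within an hour
sustained_interest = t_env.sql_query("""
    SELECT
        customer_id,
        destination,
        COUNT(*) AS clicks,
        TUMBLE_END(click_time, INTERVAL '1' HOUR) AS window_end
    FROM clicks
    GROUP BY
        customer_id,
        destination,
        TUMBLE(click_time, INTERVAL '1' HOUR)
    HAVING COUNT(*) >= 5
""")

# In the demo these results go on to trigger the agent;
# printing them is enough for a sketch.
sustained_interest.execute().print()
```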

The pattern detection output was used to trigger a low-code LangFlow agent I created, which uses generative AI to create a customised, personalised holiday itinerary.
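Wiring the two together is just an HTTP call from the detection output to the agent. The sketch below assumes LangFlow’s run endpoint (`/api/v1/run/<flow id>`), which can vary between versions, and the flow ID, host, and detection fields are placeholders:

```python
import requests

# Placeholder flow id; LangFlow listens on port 7860 by default.
# Depending on your setup an "x-api-key" header may also be required.
LANGFLOW_URL = "http://localhost:7860/api/v1/run/itinerary-flow-id"

def trigger_itinerary_agent(detection: dict) -> str:
    """Ask the LangFlow agent for a personalised itinerary for one detected customer."""
    prompt = (
        f"Customer {detection['customer_id']} has shown sustained interest in "
        f"{detection['destination']} without booking. Create a personalised "
        "holiday itinerary for that destination."
    )
    resp = requests.post(
        LANGFLOW_URL,
        json={"input_value": prompt, "input_type": "chat", "output_type": "chat"},
    )
    resp.raise_for_status()
    return resp.text
```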

When Kafka events are a record that something interesting or important has happened, consider using them to trigger an agent to respond immediately, in the moment, to that interesting or important event.

In the next post, I’ll still be talking about agents, but switch to the pattern that uses events not to trigger them, but to provide context for them.
