In this demo, I give examples of different ways to enrich a stream of events – and the types of event processing that this can enable.
Yesterday, I presented this in a webinar with Matt Sunley (a replay recording is available). Matt started with some context, explaining how enriching events with data from external sources can enable event processing solutions that aren’t otherwise possible.
And then I ran through a demo, creating an event processing flow that included four types of enrichment.
For our demo, we chose a Customer Care use case for a clothing retailer – looking at how event processing could help companies to respond efficiently to customer concerns.
Event source
The starting point was a Kafka topic with a stream of customer reviews. I used the Amazon Reviews Polarity dataset for this (real customer reviews collected from Amazon) – wrapping it in a quick Java app that produced reviews of clothing products to a Kafka topic at random intervals of a second or so.
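For illustration, here is a rough sketch of what that producer app could look like. The broker address, topic name, and review JSON schema are all placeholders, not the ones from the demo:

```java
import java.util.List;
import java.util.Properties;
import java.util.Random;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class ReviewProducer {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // placeholder broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        Random random = new Random();
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            for (String review : loadClothingReviews()) {
                producer.send(new ProducerRecord<>("CUSTOMER.REVIEWS", review));
                // random interval of roughly a second between reviews
                Thread.sleep(500 + random.nextInt(1000));
            }
        }
    }

    // stand-in for reading clothing reviews from the Amazon Reviews Polarity
    // dataset - the real app would parse the dataset files
    private static List<String> loadClothingReviews() {
        return List.of(
            "{\"productid\": \"B0001\", \"userid\": \"U42\", \"review\": \"The seams came apart after one wash.\"}");
    }
}
```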
Database enrichment
Each event included a product ID, but no other information about the product being reviewed. Processing the reviews in the context of the products they relate to lets you identify insights such as which product types or styles are getting the most complaints.
The “products database” node looked up product info in a PostgreSQL database, and added it into the events being processed.
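Outside of a low-code tool, the equivalent lookup in plain Java would be a parameterised query. This is a minimal sketch, assuming a hypothetical products table with id, name, style, and category columns:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.Optional;

public class ProductEnricher {
    // hypothetical product attributes to add to each review event
    public record Product(String name, String style, String category) {}

    private final Connection db;

    public ProductEnricher(Connection db) {
        this.db = db;
    }

    // look up the product referred to by a review event's product ID
    public Optional<Product> lookup(String productId) throws SQLException {
        String sql = "SELECT name, style, category FROM products WHERE id = ?";
        try (PreparedStatement stmt = db.prepareStatement(sql)) {
            stmt.setString(1, productId);
            try (ResultSet rs = stmt.executeQuery()) {
                if (rs.next()) {
                    return Optional.of(new Product(
                        rs.getString("name"),
                        rs.getString("style"),
                        rs.getString("category")));
                }
                return Optional.empty();
            }
        }
    }
}
```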
API enrichment – using system of record
Each event included the ID of the user who submitted the review, but no other information about them. Processing reviews in the context of other data stored about the user lets you tailor the processing to that user’s details and preferences.
In the demo, we used this to filter customer reviews to only those from customers who have opted in to getting email replies.
The “CRM API” node looked up customer info using the user ID in the review event, and added contact details and preferences into the events being processed.
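As a rough illustration of this kind of lookup, the sketch below calls a hypothetical CRM REST endpoint with the user ID from the event; the URL and response shape are assumptions, not the actual CRM API from the demo:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class CrmEnricher {
    private final HttpClient http = HttpClient.newHttpClient();
    private final String crmBaseUrl;  // e.g. "https://crm.example.com/api/customers/"

    public CrmEnricher(String crmBaseUrl) {
        this.crmBaseUrl = crmBaseUrl;
    }

    // fetch contact details and preferences for the user ID in a review event
    public String lookupCustomer(String userId) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(crmBaseUrl + userId))
                .header("Accept", "application/json")
                .build();
        HttpResponse<String> response =
                http.send(request, HttpResponse.BodyHandlers.ofString());
        return response.body();  // JSON with contact details and preferences
    }
}
```

A downstream filter would then keep only the events where the returned preferences show the customer has opted in to email replies.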
API enrichment – using traditional machine learning models
Machine learning classifiers let you sort, group, and filter events in powerful ways that aren’t possible with declarative rules. You can train a machine learning model using example events, and invoke that model via a simple REST API to classify new events – using the results to inform additional event processing.
In the demo, we used this to classify the sentiment of the review text.
The “prepare sentiment prompt” node prepared the review text to submit to sentiment analysis.
The “sentiment analysis” node invoked the sentiment analysis model.
The “negative reviews” node filtered the events to keep only the reviews that were assessed as negative.
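To make the pattern concrete, here is a sketch of invoking a sentiment model over REST. The scoring URL and the JSON request and response shapes are assumptions about a generic model-serving endpoint, not the specific classifier used in the demo:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SentimentClassifier {
    private final HttpClient http = HttpClient.newHttpClient();
    private final URI scoringUrl;  // placeholder model-serving endpoint

    public SentimentClassifier(URI scoringUrl) {
        this.scoringUrl = scoringUrl;
    }

    // submit review text to the model, and return true for negative reviews
    public boolean isNegative(String reviewText) throws Exception {
        String payload = "{\"text\": \"" + escape(reviewText) + "\"}";
        HttpRequest request = HttpRequest.newBuilder()
                .uri(scoringUrl)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();
        HttpResponse<String> response =
                http.send(request, HttpResponse.BodyHandlers.ofString());
        // naive check - assumes a response such as {"sentiment": "negative"};
        // a real implementation would parse the JSON properly
        return response.body().contains("\"negative\"");
    }

    // minimal escaping for embedding text in a JSON string
    private static String escape(String text) {
        return text.replace("\\", "\\\\").replace("\"", "\\\"");
    }
}
```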
API enrichment – using generative AI
I’ve written about this idea before. In those demos, I had used generative AI to summarise data.
In this demo, we used generative AI to draft a suitable message to send to the customer in response to their review.
The “prepare reply prompt” node prepared the generative AI prompt.
The “watsonx” node invoked the generative AI API.
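As a sketch of that last step, the code below calls a watsonx.ai text generation REST endpoint using java.net.http. The endpoint URL, model ID, project ID, and bearer token are placeholders, and a real implementation would parse the JSON response rather than returning it raw:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ReplyDrafter {
    private final HttpClient http = HttpClient.newHttpClient();
    private final String endpoint;  // e.g. "https://eu-gb.ml.cloud.ibm.com/ml/v1/text/generation?version=2023-05-29"
    private final String iamToken;  // placeholder IBM Cloud IAM bearer token

    public ReplyDrafter(String endpoint, String iamToken) {
        this.endpoint = endpoint;
        this.iamToken = iamToken;
    }

    // build the prompt from the review, then ask the model to draft a reply
    public String draftReply(String customerName, String reviewText) throws Exception {
        String prompt = "Draft a short, polite reply to this review from "
                + customerName + ": " + reviewText;
        String payload = "{"
                + "\"model_id\": \"ibm/granite-13b-instruct-v2\","   // placeholder model
                + "\"project_id\": \"YOUR_PROJECT_ID\","             // placeholder project
                + "\"input\": \"" + escape(prompt) + "\"}";
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(endpoint))
                .header("Authorization", "Bearer " + iamToken)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();
        HttpResponse<String> response =
                http.send(request, HttpResponse.BodyHandlers.ofString());
        return response.body();  // JSON containing the generated reply text
    }

    // minimal escaping for embedding text in a JSON string
    private static String escape(String text) {
        return text.replace("\\", "\\\\").replace("\"", "\\\"");
    }
}
```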
Tags: apachekafka, ibmeventstreams, kafka