Posts Tagged ‘ibmeventstreams’

Analysing Wikipedia edits with IBM Event Processing

Monday, October 14th, 2024

In this post, I’ll share a demo I gave today to explain some of the processing nodes in the palette of IBM Event Processing.

I’ve found that demonstrations of Event Processing are easier to understand when I don’t need to explain the stream of events I’m processing in the first place. This means I’m always looking for interesting real-world event streams that are widely understood, as they can make for the most effective demos.

With this in mind, today I tried explaining a few of the Event Processing nodes by using them with a live stream of events representing pages that are being created and edited in the English Wikipedia.



Each event contains:

  • title of the page
  • who made the edit (user ID if logged in, or IP address if anonymous)
  • was this the creation of a new page, or an edit of an existing page?

Every edit on Wikipedia results in an event on the Kafka topic, so there are typically a few events a second. It’s not a super-high-throughput topic in Kafka terms, but there are enough events to try out interesting ideas.
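To give a rough idea of what these events look like to a Kafka client (outside of Event Processing itself), here is a minimal Java sketch that consumes the topic and picks out the fields listed above. The topic name, field names, and connection details are all assumptions for illustration rather than the exact ones used in the demo.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class WikipediaEditsConsumer {

    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // hypothetical connection details - replace with your own bootstrap address
        // (a real Event Streams cluster will also need TLS/SASL properties)
        props.put("bootstrap.servers", "my-kafka-bootstrap:9092");
        props.put("group.id", "wikipedia-demo");
        props.put("key.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");

        ObjectMapper mapper = new ObjectMapper();

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // hypothetical topic name
            consumer.subscribe(List.of("wikipedia.edits"));

            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
                for (ConsumerRecord<String, String> record : records) {
                    JsonNode event = mapper.readTree(record.value());
                    // field names are assumptions for illustration
                    System.out.printf("%s edited by %s (new page: %s)%n",
                                      event.get("title").asText(),
                                      event.get("user").asText(),
                                      event.get("isnew").asBoolean());
                }
            }
        }
    }
}
```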



Here are a few of the demos I gave today.

This is by no means an exhaustive list of what you could do with this data, but it was enough to let me show what the most commonly-used tools in the palette can do.

(more…)

Analysing social media sentiment with IBM Event Processing

Thursday, October 10th, 2024

aka “Who wants a Mario alarm clock?”

In this post, I want to share a quick demo of using Event Processing to process social media posts.


Background

A fun surprise from Nintendo today: they’ve introduced a new product! “Alarmo” is a game-themed alarm clock, with some interesting gesture recognition features.

I was (unsurprisingly!) tempted…

But that got me wondering how the rest of the Internet was reacting.

In this post, I want to share a (super-simple!) demo of one way to do this: using IBM Event Processing to create an Apache Flink job that looks at the sentiment of social media posts about this unusual new product.
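I won't claim this is how the flow actually scores sentiment, but as a rough stand-in to show the shape of the idea, here is a naive keyword-based scorer in Java. The keywords, class name, and sample posts are all made up purely for illustration.

```java
import java.util.List;
import java.util.Locale;

/**
 * Naive stand-in for the sentiment scoring used in the demo flow.
 * A real flow would use a proper sentiment analysis step; this
 * keyword counter only illustrates the shape of the processing.
 */
public class NaiveSentimentScorer {

    private static final List<String> POSITIVE = List.of("love", "want", "awesome", "tempted");
    private static final List<String> NEGATIVE = List.of("hate", "pointless", "expensive", "skip");

    /** Returns a score greater than 0 for positive posts, less than 0 for negative ones. */
    public static int score(String postText) {
        String text = postText.toLowerCase(Locale.ROOT);
        int score = 0;
        for (String word : POSITIVE) {
            if (text.contains(word)) score++;
        }
        for (String word : NEGATIVE) {
            if (text.contains(word)) score--;
        }
        return score;
    }

    public static void main(String[] args) {
        System.out.println(score("I love this, I'm very tempted"));  // prints 2
        System.out.println(score("Looks expensive, I'll skip it"));  // prints -2
    }
}
```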

(more…)

Taming the Kafka topics Wild West

Tuesday, September 17th, 2024

aka Approaches to managing Kafka topic creation with IBM Event Streams

How can you best operate central Kafka clusters that are shared by multiple development teams?

Administrators talk about wanting to let teams create Kafka topics when they need them, but worry that this will turn their Kafka clusters into a sprawling “Wild West”. At best, they describe a mess of anonymous topics that are named and configured inconsistently. At worst, they describe topics created or configured in ways that harm the Kafka cluster and impact other users.
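One simple illustration of the alternative: if topic creation is automated, names and configuration can always follow an agreed convention rather than being chosen ad-hoc. Here is a minimal sketch of that idea using the Kafka AdminClient; the topic name, settings, and connection details are all invented for this example, and it isn't meant as a specific recommendation.

```java
import java.util.Map;
import java.util.Properties;
import java.util.Set;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.NewTopic;

public class ControlledTopicCreation {

    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // hypothetical bootstrap address - a real Event Streams cluster
        // will also need TLS/SASL credentials
        props.put("bootstrap.servers", "my-kafka-bootstrap:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // apply a consistent naming convention and configuration,
            // rather than leaving every team to pick their own
            NewTopic topic = new NewTopic("ordersteam.orders.created", 3, (short) 3);
            topic.configs(Map.of(
                    "retention.ms", "604800000",   // 7 days
                    "cleanup.policy", "delete"));

            admin.createTopics(Set.of(topic)).all().get();
        }
    }
}
```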

With that in mind, I wanted to share a few ideas for how to control the topics that are created in your Event Streams cluster:

(more…)

Unleash real-time responsiveness by enriching streams of events

Wednesday, June 19th, 2024

In this demo, I give examples of different ways to enrich a stream of events – and the types of event processing that this can enable.

I presented this yesterday in a webinar with Matt Sunley (a replay recording is available). Matt started with some context, explaining how enriching events with data from external sources can enable event processing solutions that aren’t otherwise possible.

And then I ran through a demo, creating an event processing flow that included four types of enrichment.
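As a trivial illustration of the general idea (not one of the four enrichment types from the demo specifically), here is a Java sketch that enriches each event with reference data looked up by a key in the event. The field names and lookup table are hypothetical.

```java
import java.util.Map;

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;

/**
 * Minimal illustration of one kind of enrichment: adding reference data,
 * looked up by a key in the event, to each event as it flows through.
 */
public class EventEnricher {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // stand-in for an external reference data source (database, API, etc.)
    private static final Map<String, String> REGION_NAMES = Map.of(
            "EMEA", "Europe, Middle East and Africa",
            "APAC", "Asia Pacific");

    public static String enrich(String eventJson) throws Exception {
        ObjectNode event = (ObjectNode) MAPPER.readTree(eventJson);
        String regionCode = event.get("region").asText();
        // add a new field to the event using the looked-up reference data
        event.put("regionname", REGION_NAMES.getOrDefault(regionCode, "unknown"));
        return MAPPER.writeValueAsString(event);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(enrich("{\"orderid\":\"123\",\"region\":\"EMEA\"}"));
    }
}
```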

(more…)

Processing Apache Avro-serialized messages from Kafka using IBM App Connect Enterprise

Monday, May 13th, 2024

IBM App Connect Enterprise (ACE) is a broker for developing and hosting high-throughput, high-scale integrations between a large number of applications and systems, including Apache Kafka.

In this post, I’ll describe how to use App Connect Enterprise to process Kafka messages that were serialized using Apache Avro schemas.


This is an update to an earlier version of this post, reflecting updates to the sample code.

Background

Best practice when using Apache Kafka is to define Apache Avro schemas that describe the structure of your Kafka messages, and to store those schemas in a central registry that client applications can access at runtime.

If you want to use IBM App Connect Enterprise to develop and host integrations for processing those Kafka messages, you need App Connect to know how to:

  • retrieve the Avro schemas it needs using schema registry REST APIs
  • use the schemas to turn the binary stream of bytes on your Kafka topics into structured objects that ACE can manipulate and process
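As a sketch of what that second step involves in general (independently of how App Connect does it internally), this is how Avro-serialized bytes can be turned back into a structured record once the writer’s schema has been retrieved, for example from the schema registry’s REST API. The schema and field names here are invented for illustration, and depending on the registry, messages may also carry a schema identifier prefix or header that identifies which schema to fetch.

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.DecoderFactory;

public class AvroMessageDecoder {

    // hypothetical schema - in practice this would be retrieved from the
    // schema registry REST API rather than hard-coded
    private static final Schema SCHEMA = new Schema.Parser().parse(
        "{\"type\":\"record\",\"name\":\"Order\",\"fields\":[" +
        "{\"name\":\"id\",\"type\":\"string\"}," +
        "{\"name\":\"quantity\",\"type\":\"int\"}]}");

    /** Turns the raw bytes from a Kafka message into a structured record. */
    public static GenericRecord decode(byte[] messageBytes) throws Exception {
        GenericDatumReader<GenericRecord> reader = new GenericDatumReader<>(SCHEMA);
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(messageBytes, null);
        return reader.read(null, decoder);
    }
}
```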

(more…)

Using Mirror Maker 2 with IBM Event Streams to migrate to a new cluster

Thursday, April 18th, 2024

This is the sixth in a series of blog posts sharing examples of ways to use Mirror Maker 2 with IBM Event Streams.

Mirror Maker 2 is a powerful and flexible tool for moving Kafka events between Kafka clusters.

For this sixth post, I’ll look at using Mirror Maker to migrate your Kafka cluster to a new region.

I’ve broken this down into multiple stages. For each stage, I’ll explain the intent and share a demo script I’ve created to let you try this for yourself.

(more…)

Using Mirror Maker 2 with IBM Event Streams to restore from a backup cluster

Friday, April 12th, 2024

This is the fifth in a series of blog posts sharing examples of ways to use Mirror Maker 2 with IBM Event Streams.

Mirror Maker 2 is a powerful and flexible tool for moving Kafka events between Kafka clusters.

For this fifth post, I’ll look at using Mirror Maker to maintain a backup of your Kafka events, and at how to restore from that backup.

This is more complex than the previous posts as there are multiple stages involved. For each stage, I’ll explain the intent and share the demo script I’ve created to let you try this for yourself.

(more…)

Using Mirror Maker 2 with IBM Event Streams to create a failover cluster

Monday, April 8th, 2024

This is the fourth in a series of blog posts sharing examples of ways to use Mirror Maker 2 with IBM Event Streams.

Mirror Maker 2 is a powerful and flexible tool for moving Kafka events between Kafka clusters.

For this fourth post, I’ll look at using Mirror Maker to create an active/passive topology, with a backup cluster ready to fail over to.

(more…)