Posts Tagged ‘kafka’

Using IBM Event Endpoint Management with Kafka Connect

Wednesday, September 11th, 2024

You’ve discovered a topic in the IBM Event Endpoint Management catalog that someone in your company has shared. It looks useful, so you want to use that stream of events to maintain a local projection in your database.

Or maybe you’ve discovered a topic in the Catalog that is available to produce to, and you want to contribute events to it from your MQ queue.

What are the options for using Kafka Connect to produce to, or consume from, topics that you discover in Event Endpoint Management?

In this post, we’ll share options that you can consider, and briefly outline the pros and cons of each.

Co-written with Andrew Borley
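To make one of those options concrete before diving in: a common pattern is to use per-connector consumer.override.* properties, so that a single sink connector consumes through the Event Gateway using the credentials generated for you in the catalog, while the rest of the Connect cluster carries on pointing at its usual Kafka cluster. The sketch below registers such a connector through the Kafka Connect REST API; the connector class, hostnames, and credentials are all placeholders (and the SASL mechanism your gateway credentials use may differ) — the post weighs this approach against the alternatives.

```python
# Rough sketch: register a (hypothetical) sink connector that consumes a topic
# through the Event Endpoint Management gateway, using per-connector
# consumer.override.* properties. The connector class, hostnames, and
# credentials are placeholders - substitute the values from your EEM catalog
# subscription.
import requests

connector = {
    "name": "orders-db-projection",
    "config": {
        "connector.class": "com.example.jdbc.JdbcSinkConnector",  # placeholder sink
        "topics": "ORDERS.NEW",
        # point just this connector's consumer at the Event Gateway
        "consumer.override.bootstrap.servers": "my-event-gateway.example.com:443",
        "consumer.override.security.protocol": "SASL_SSL",
        "consumer.override.sasl.mechanism": "PLAIN",
        "consumer.override.sasl.jaas.config": (
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            'username="eem-generated-username" password="eem-generated-password";'
        ),
        "consumer.override.ssl.truststore.location": "/opt/kafka/certs/gateway-ca.jks",
        "consumer.override.ssl.truststore.password": "changeit",
    },
}

# assumes a Kafka Connect REST API on localhost:8083, and a worker that allows
# connector-level overrides (connector.client.config.override.policy=All)
resp = requests.post("http://localhost:8083/connectors", json=connector)
resp.raise_for_status()
print(resp.json())
```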

(more…)

Event Gateway topologies for IBM Event Endpoint Management

Sunday, June 30th, 2024

In this post, I share a few examples for how to run Event Gateways for Event Endpoint Management.

When we talk about Event Endpoint Management, we often draw logical diagrams like this, with Kafka client applications able to produce events to, and consume events from, back-end Kafka clusters via an Event Gateway.

When it comes to planning a deployment, we need to make decisions about the best way to create that logical Event Gateway layer. This typically includes running multiple gateways, but there are many different ways to do this, depending on your requirements for scaling and availability.

For this post, I want to show two approaches for running two Event Gateways, as a way of illustrating the kinds of topologies that are possible.
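Whichever approach you choose, the aim is that client applications connect to the gateways rather than directly to the back-end cluster. As a minimal sketch, assuming two gateways that front the same cluster (all hostnames and credentials here are hypothetical), a consumer might simply list both gateway addresses in its bootstrap configuration:

```python
# Illustrative only: a consumer whose bootstrap list names two Event Gateway
# addresses fronting the same back-end Kafka cluster. Hostnames and
# credentials are hypothetical placeholders.
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "ORDERS.NEW",
    bootstrap_servers=[
        "event-gateway-1.example.com:443",
        "event-gateway-2.example.com:443",
    ],
    security_protocol="SASL_SSL",
    sasl_mechanism="PLAIN",
    sasl_plain_username="eem-generated-username",
    sasl_plain_password="eem-generated-password",
    ssl_cafile="gateway-ca.pem",
    group_id="topology-demo",
    auto_offset_reset="earliest",
)

for message in consumer:
    print(message.topic, message.partition, message.offset)
```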

(more…)

Unleash real-time responsiveness by enriching streams of events

Wednesday, June 19th, 2024

In this demo, I give examples of different ways to enrich a stream of events – and the types of event processing that this can enable.

I presented this yesterday in a webinar with Matt Sunley (a replay recording is available). Matt started with some context, explaining how enriching events with data from external sources can enable event processing solutions that aren’t otherwise possible.

And then I ran through a demo, creating an event processing flow that included four types of enrichment.
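The demo was built in the Event Processing low-code canvas, but to make the idea concrete, here is a plain-Python illustration of just one of those enrichment types: looking up extra data from an external REST API for each event and re-publishing the enriched result. The topic names, API URL, and field names are all made up for the illustration.

```python
# Plain-Python illustration of one kind of event enrichment: adding data from
# an external REST API to each event before re-publishing it. (The webinar demo
# built this kind of flow in IBM Event Processing rather than in code.)
# Topic names, the API URL, and field names are hypothetical.
import json

import requests
from kafka import KafkaConsumer, KafkaProducer

consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    group_id="enrichment-demo",
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

for message in consumer:
    order = message.value
    # enrich the event with customer details from a (hypothetical) external API
    customer = requests.get(
        f"https://api.example.com/customers/{order['customerid']}"
    ).json()
    order["customername"] = customer.get("name")
    order["customerregion"] = customer.get("region")
    producer.send("orders.enriched", value=order)
```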

(more…)

Processing XML with Kafka Connect

Wednesday, May 15th, 2024

I spoke at Devoxx UK last week about how to process XML data using a Kafka Connect pipeline.

This was based on some work I did last year, but it was good to get a chance to share it with a new audience.


Video of the talk: youtu.be/NfYHE2i0-es
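The talk shows how to do this inside a Kafka Connect pipeline. Purely as a stand-alone illustration of the underlying task, here is a small consumer that parses XML message payloads into structured data; the topic name and XML structure are hypothetical.

```python
# Stand-alone illustration of the underlying task: parsing XML Kafka message
# payloads into structured data. (The talk does this inside a Kafka Connect
# pipeline rather than in a hand-written consumer.) The topic name and XML
# structure are hypothetical.
import xml.etree.ElementTree as ET

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "xml.orders",
    bootstrap_servers="localhost:9092",
    group_id="xml-demo",
)

for message in consumer:
    root = ET.fromstring(message.value.decode("utf-8"))
    order = {
        "id": root.findtext("id"),
        "description": root.findtext("description"),
        "price": float(root.findtext("price") or 0),
    }
    print(order)
```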

Processing Apache Avro-serialized messages from Kafka using IBM App Connect Enterprise

Monday, May 13th, 2024

IBM App Connect Enterprise (ACE) is a broker for developing and hosting high-throughput, high-scale integrations between a large number of applications and systems, including Apache Kafka.

In this post, I’ll describe how to use App Connect Enterprise to process Kafka messages that were serialized using Apache Avro schemas.


This is an update to an earlier version of this post, reflecting updates to the sample code.

Background

Best practice when using Apache Kafka is to define Apache Avro schemas that describe the structure of your Kafka messages, and to store those schemas in a central registry that client applications can access at runtime.

If you want to use IBM App Connect Enterprise to develop and host integrations for processing those Kafka messages, you need App Connect to know how to:

  • retrieve the Avro schemas it needs using schema registry REST APIs
  • use the schemas to turn the binary stream of bytes on your Kafka topics into structured objects that ACE can manipulate and process
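App Connect does both of these steps for you once it is configured (which is what the sample code covers). To illustrate what the two steps involve, here is a small Python sketch. It assumes the common Confluent-compatible wire format (a magic byte and a four-byte schema ID in front of the Avro payload) and a Confluent-style registry REST API; the registry URL and topic name are placeholders, and other registries and serialisation setups will differ.

```python
# Sketch of the two steps above, assuming the common Confluent-compatible wire
# format (1 magic byte + 4-byte schema id + Avro payload) and a Confluent-style
# registry REST API. The registry URL and topic name are placeholders.
import io
import json
import struct

import requests
from fastavro import parse_schema, schemaless_reader
from kafka import KafkaConsumer

REGISTRY_URL = "https://my-schema-registry.example.com"
_schema_cache = {}

def fetch_schema(schema_id):
    # step 1: retrieve the Avro schema using the registry's REST API
    if schema_id not in _schema_cache:
        resp = requests.get(f"{REGISTRY_URL}/schemas/ids/{schema_id}")
        resp.raise_for_status()
        _schema_cache[schema_id] = parse_schema(json.loads(resp.json()["schema"]))
    return _schema_cache[schema_id]

def deserialize(raw_bytes):
    # step 2: use the schema to turn the binary payload into a structured object
    magic, schema_id = struct.unpack(">bI", raw_bytes[:5])
    return schemaless_reader(io.BytesIO(raw_bytes[5:]), fetch_schema(schema_id))

consumer = KafkaConsumer("avro.orders", bootstrap_servers="localhost:9092")
for message in consumer:
    print(deserialize(message.value))
```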

(more…)

Using IBM Event Automation with Azure Event Hubs

Friday, April 26th, 2024

IBM Event Automation helps companies accelerate their event-driven projects, wherever they are on their journey. It provides multiple components (Event Streams, Event Endpoint Management, and Event Processing) which together lay the foundation of an event-driven architecture that can unlock the value of the streams of events that businesses have.

A key goal of Event Automation is to be composable. The three components can be used together, or they can each be used to extend and enhance an existing event-driven deployment.

Today, I demonstrated some of the Event Automation components working with Azure Event Hubs for Apache Kafka. As Event Hubs provides a Kafka interface to Azure’s data streaming service, it obviously can be used with Event Automation. But it can be helpful to inspire people by showing it for real, so even demos of obvious things can be valuable.

For example, Event Endpoint Management can enhance the value of topics in Event Hubs by offering management and governance, and by enabling governed reuse of those topics. Event Processing makes it easy to get insights from the events on Event Hubs topics, by providing an intuitive low-code authoring canvas to process them.
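For all of these pieces, connecting to Event Hubs is standard Kafka configuration: the namespace’s Kafka endpoint on port 9093, SASL_SSL with the PLAIN mechanism, the literal username $ConnectionString, and the namespace connection string as the password. A minimal sketch, with a placeholder namespace, topic, and connection string:

```python
# Minimal sketch of a Kafka client connecting to Azure Event Hubs' Kafka
# endpoint: SASL_SSL, the PLAIN mechanism, "$ConnectionString" as the username,
# and the namespace connection string as the password. The namespace, topic,
# and connection string are placeholders.
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="my-namespace.servicebus.windows.net:9093",
    security_protocol="SASL_SSL",
    sasl_mechanism="PLAIN",
    sasl_plain_username="$ConnectionString",
    sasl_plain_password="Endpoint=sb://my-namespace.servicebus.windows.net/;"
                        "SharedAccessKeyName=RootManageSharedAccessKey;"
                        "SharedAccessKey=<key>",
)

producer.send("orders", value=b'{"id": 1}')
producer.flush()
```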

If I were going to be running this for a while and wanted to optimise for my applications in Azure, I would likely have set it up with the Event Gateways deployed close to the Azure Kafka endpoints.

(more…)

Using Mirror Maker 2 with IBM Event Streams to migrate to a new cluster

Thursday, April 18th, 2024

This is the sixth in a series of blog posts sharing examples of ways to use Mirror Maker 2 with IBM Event Streams.

Mirror Maker 2 is a powerful and flexible tool for moving Kafka events between Kafka clusters.

For this sixth post, I’ll look at using Mirror Maker to migrate your Kafka cluster to a new region.

I’ve broken this down into multiple stages. For each stage, I’ll explain the intent and share a demo script I’ve created to let you try this for yourself.
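IBM Event Streams typically runs Mirror Maker 2 through a KafkaMirrorMaker2 custom resource, which is likely how the demo scripts deploy it. Purely to illustrate the kind of configuration involved in a migration, here is a sketch that submits an equivalent MirrorSourceConnector to a Kafka Connect REST API; the cluster addresses and the topic pattern are placeholders.

```python
# Purely illustrative: the kind of Mirror Maker 2 configuration involved in a
# migration, expressed as a MirrorSourceConnector submitted to a Kafka Connect
# REST API. (IBM Event Streams normally deploys this via a KafkaMirrorMaker2
# custom resource.) Cluster addresses and the topic pattern are placeholders.
import requests

mm2_connector = {
    "name": "migrate-old-to-new",
    "config": {
        "connector.class": "org.apache.kafka.connect.mirror.MirrorSourceConnector",
        "source.cluster.alias": "old-cluster",
        "target.cluster.alias": "new-cluster",
        "source.cluster.bootstrap.servers": "old-cluster-kafka:9092",
        "target.cluster.bootstrap.servers": "new-cluster-kafka:9092",
        "topics": ".*",  # mirror every topic
        "sync.topic.configs.enabled": "true",
    },
}

resp = requests.post("http://localhost:8083/connectors", json=mm2_connector)
resp.raise_for_status()
print(resp.json())
```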

(more…)

Using Mirror Maker 2 with IBM Event Streams to restore from a backup cluster

Friday, April 12th, 2024

This is the fifth in a series of blog posts sharing examples of ways to use Mirror Maker 2 with IBM Event Streams.

Mirror Maker 2 is a powerful and flexible tool for moving Kafka events between Kafka clusters.

For this fifth post, I’ll look at using Mirror Maker to maintain a backup of your Kafka events, and to be able to restore from that backup.

This is more complex than the previous posts as there are multiple stages involved. For each stage, I’ll explain the intent and share the demo script I’ve created to let you try this for yourself.

(more…)