IBM Event Automation helps companies accelerate their event-driven projects, wherever they are on that journey. It provides multiple components (Event Streams, Event Endpoint Management, and Event Processing) which together lay the foundation of an event-driven architecture that can unlock the value of the streams of events a business already has.
A key goal of Event Automation is to be composable. The three components can be used together, or they can each be used to extend and enhance an existing event-driven deployment.
Today, I demonstrated some of the Event Automation components working with Azure Event Hubs for Apache Kafka. Because Event Hubs provides a Kafka interface to Azure’s data streaming service, it can naturally be used with Event Automation. But it can be helpful to inspire people by showing it working for real, so even demos of the obvious can be valuable.
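That Kafka interface is the key: any standard Kafka client can talk to Event Hubs, which is what lets the Event Automation components treat it like any other Kafka cluster. As a minimal sketch of what that looks like (the namespace, topic name, and connection string below are placeholders, not values from my demo), a plain Java producer would be configured along these lines:

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class EventHubsKafkaProducer {
    public static void main(String[] args) {
        Properties props = new Properties();

        // Event Hubs exposes its Kafka endpoint on port 9093 of the namespace host
        // ("my-namespace" is a placeholder for your Event Hubs namespace)
        props.put("bootstrap.servers", "my-namespace.servicebus.windows.net:9093");

        // Authentication is SASL PLAIN over TLS, using the literal string
        // "$ConnectionString" as the username and the namespace connection
        // string (a placeholder here) as the password
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"$ConnectionString\" "
            + "password=\"Endpoint=sb://my-namespace.servicebus.windows.net/;"
            + "SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=<key>\";");

        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        // Each event hub in the namespace appears as a Kafka topic
        // ("orders" is a placeholder topic name)
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("orders", "order-123", "{\"status\":\"created\"}"));
            producer.flush();
        }
    }
}
```

The same bootstrap address and SASL settings are what you would give to Event Endpoint Management or Event Processing when pointing them at Event Hubs topics.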
For example, Event Endpoint Management can enhance the value of topics in Event Hubs by offering management and governance, and by enabling governed reuse of those topics. Event Processing makes it easy to get insights from the events on Event Hubs topics, by providing an intuitive low-code authoring canvas to process them.
If I were going to run this for a while and wanted to optimise for my applications in Azure, I would likely have set it up like this, with the Event Gateways deployed close to the Azure Kafka endpoints.