This is the second in a series of blog posts sharing examples of ways to use Mirror Maker 2 with IBM Event Streams.
- Using Mirror Maker 2 to aggregate events from multiple regions
- Using Mirror Maker 2 to broadcast events to multiple regions
- Using Mirror Maker 2 to share topics across multiple regions
- Using Mirror Maker 2 to create a failover cluster
- Using Mirror Maker 2 to restore events from a backup cluster
- Using Mirror Maker 2 to migrate to a different region
Mirror Maker 2 is a powerful and flexible tool for moving Kafka events between Kafka clusters, but sometimes I feel like this can be forgotten if we only talk about it in the context of disaster recovery.
In these posts, I want to inspire you to think about other ways you could use Mirror Maker 2. The best way to learn about what is possible is to play with it for yourself, so with these posts I’ll include a script to create a demonstration of the scenario.
For this second post, I’ll look at using Mirror Maker to broadcast events to clusters in multiple regions.
Where the last post described a “fan in” scenario, this is effectively the opposite use case: a “fan out” scenario.
Imagine that you have a source of events that is useful for a variety of applications running in multiple different regions.
If you serve all of these applications from a single Kafka cluster, every application outside that cluster’s region has to make a remote connection to it – so you have introduced a latency cost for each of those consumer applications.
It also increases the amount of network traffic across your infrastructure, as all events are transferred across the network multiple times – once for each application.
Mirroring is an efficient way of allowing all consumers to have local access to a copy of the events.
Applications in all regions get the benefits of interacting with a local Kafka cluster. Mirror Maker takes the responsibility of broadcasting a copy of the events to each region, ready for use by a local consumer.
This is especially beneficial where consumer applications are likely to rewind, and repeatedly reconsume older events from earlier offsets. The more that they do this, the greater the benefits from having local access to the events.
Using Mirror Maker 2 is a good fit where events are generated in a central location, and need to be processed by applications running in multiple remote locations. This is particularly beneficial where the applications are sensitive to latency or where cross-region network traffic is considered expensive.
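To make that more concrete, here is a rough sketch of the kind of Mirror Maker 2 definition involved for one target region. It isn’t taken from the demo: the resource version, cluster aliases, bootstrap addresses, topic pattern and connector settings below are all placeholders, and the TLS and authentication settings a real cluster needs are left out. But it shows the general shape: one Mirror Maker instance per target region, each reading from the same central cluster.

```yaml
# Illustrative sketch only (not the demo's config): this uses the upstream Strimzi
# KafkaMirrorMaker2 resource with placeholder names, addresses and topic pattern,
# and omits TLS/authentication settings.
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaMirrorMaker2
metadata:
  name: mm2-to-north-america
  namespace: north-america
spec:
  replicas: 1
  # Mirror Maker runs alongside, and writes to, the local target cluster
  connectCluster: "north-america"
  clusters:
    # the central cluster where the events are produced
    - alias: "europe"
      bootstrapServers: europe-kafka-bootstrap.example.com:443
    # the local cluster that consumers in this region read from
    - alias: "north-america"
      bootstrapServers: north-america-kafka-bootstrap.example.com:443
  mirrors:
    # copy matching topics from the central cluster into this region
    - sourceCluster: "europe"
      targetCluster: "north-america"
      topicsPattern: "EVENTS.*"
      sourceConnector:
        config:
          # replication factor for the mirrored topics created on the target cluster
          replication.factor: 3
```

A second, near-identical definition running in the other region is what gives you the “fan out”: every region ends up with its own local copy of the same topics.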
Demo
For a demonstration of this, I created:
- Three Kubernetes namespaces (“north-america”, “south-america”, “europe”) represent three different regions. An Event Streams Kafka cluster is created in each “region”.
- A producer application is started in the “Europe” region, regularly producing randomly generated events themed around a fictional clothing retailer, Loosehanger Jeans.
- Mirror Maker instances running in the “North America” and “South America” regions broadcast these events to topics in the Event Streams clusters there. Consuming applications in each of these regions consume the mirrored events from their local Kafka cluster.
To create the demo for yourself
There is an Ansible playbook here which creates all of this:
github.com/dalelane/eventstreams-mirrormaker2-demos/blob/master/03-broadcast/setup.yaml
An example of how to run it can be found in the script at: setup-03-broadcast.sh
This script will also display the URL and username/password for the Event Streams web UI for all three regions, to make it easier to log in and see the events.
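If you want a sense of what the playbook is doing before you run it, it is built from small Kubernetes-creation tasks along the lines of the hypothetical fragment below. The kubernetes.core.k8s module is standard Ansible, but the task names, namespace and file name here are made up for illustration; the real setup.yaml creates everything described above.

```yaml
# Hypothetical fragment, for illustration only - not an excerpt from the real setup.yaml.
# The pattern: use the kubernetes.core.k8s Ansible module to create a "region" namespace
# and to apply the YAML definition for an Event Streams cluster inside it.
- name: create the namespace for one region
  kubernetes.core.k8s:
    state: present
    definition:
      apiVersion: v1
      kind: Namespace
      metadata:
        name: europe

- name: create an Event Streams cluster in that region
  kubernetes.core.k8s:
    state: present
    namespace: europe
    src: files/eventstreams-cluster.yaml   # placeholder file name
```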
Once you’ve created the demo, you can run the consumer-northamerica.sh and consumer-southamerica.sh scripts to see the events that are received by the consumer applications in the North America and South America “regions”.
(Once you’ve finished, the cleanup.sh script deletes everything that the demo created.)
How the demo is configured
The Mirror Maker configs can be found here:
- mm2-na.yaml (Mirror Maker in the “North American region”)
- mm2-sa.yaml (Mirror Maker in the “South American region”)
The specs are commented, so these are the files to read if you want to see how to configure Mirror Maker to satisfy this kind of scenario.
I’ve only talked here about how MM2 is moving the events between regions, but if you look at the comments in the mm2 specs, you’ll see that it is doing more than that. For example, it is also keeping the topic configuration in sync. Try that for yourself. Modify the configuration of one of the topics in the “Europe region” and then see that change reflected in the corresponding topics in the North America and South America “regions”.
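To give a flavour of the kind of settings involved, this behaviour comes from standard Mirror Maker 2 (MirrorSourceConnector) options on each mirror’s source connector. The fragment below uses real property names but example values; the commented demo files show what is actually set.

```yaml
# Illustrative fragment of a mirror's sourceConnector section: real MirrorSourceConnector
# property names, example values - see the demo's mm2-na.yaml / mm2-sa.yaml for what it
# actually uses.
sourceConnector:
  config:
    # pick up newly created topics that match the mirror's topic pattern
    refresh.topics.enabled: true
    refresh.topics.interval.seconds: 60
    # keep the configuration of mirrored topics in sync with the source topics
    sync.topic.configs.enabled: true
    sync.topic.configs.interval.seconds: 60
```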
More scenarios to come
In the next few posts, I’ll share demos of the other scenarios in this series.
Tags: apachekafka, ibmeventstreams, kafka