Getting demo events onto IBM Event Streams topics in a hurry

Sharing a couple of tips for quick-and-dirty demo setups.

I often need to put together demos of IBM Event Automation without much notice. The starting point is almost always needing to get a bunch of interesting events onto a Kafka topic.

What I need is a jumping-off point to illustrate the benefit of sharing streams of events in Event Endpoint Management, or the types of processing you can do in Event Processing. And to do that, I need a topic with events on it that will look interesting or relevant to whoever I’m demoing to.

If I’ve got time to do this properly, I’ll set up a generator that will give me a continuous stream of randomly-generated events (example). But if I’m in a hurry, I’ll use the REST Producer API and do something like this instead.

Step 1 :
Get the URL for the REST Producer API

You can find this in the status section of the Event Streams custom resource. Or you can find it in the Event Streams admin UI.
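If you want to grab it from the command line rather than the UI, something like this will list the endpoints from the status section of the custom resource. (The instance name and namespace here are made up – adjust them to match your install, and pick the REST producer entry out of the list.)

```sh
# list the endpoint names and URIs from the Event Streams custom resource status
# (assumes an instance called "my-eventstreams" in the "event-automation" namespace)
kubectl get eventstreams my-eventstreams -n event-automation \
    -o jsonpath='{range .status.endpoints[*]}{.name}{"\t"}{.uri}{"\n"}{end}'
```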

Step 2 :
Get credentials to use with the REST Producer API

You can do this by creating a KafkaUser custom resource. Or you can click on the Generate credentials button in the Event Streams UI next to the Producer endpoint URL.

Choose SCRAM username and password.

Choose Produce messages.

Name the topic you will be producing demo messages to.

Copy the username and password that are displayed.
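If you’d rather take the KafkaUser route mentioned above, here’s a rough sketch of the sort of custom resource you’d create. The instance name, namespace, user name and topic name are all made up for illustration, and the exact apiVersion and ACL syntax can vary between Event Streams versions, so treat this as a starting point rather than something to copy verbatim.

```yaml
apiVersion: eventstreams.ibm.com/v1beta2
kind: KafkaUser
metadata:
  name: demo-producer
  namespace: event-automation
  labels:
    # ties the user to your Event Streams instance
    eventstreams.ibm.com/cluster: my-eventstreams
spec:
  authentication:
    # SCRAM username/password credentials
    type: scram-sha-512
  authorization:
    type: simple
    acls:
      # allow producing to the demo topic
      - resource:
          type: topic
          name: DEMO.EVENTS
          patternType: literal
        operations:
          - Describe
          - Write
        host: "*"
```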

Step 3 :
Get some test data – one event per line

There are loads of sites that will generate random data.

Mockaroo is a good example – it generates random values for fields with different types, and will let you download all of that to a file.

What you need to do will depend on which site you pick, but in general you choose a few fields, decide what names to give them, and get the site to generate random values for them.

In this example, I’m generating JSON data, and by unticking the array checkbox I get jsonlines data (that is, one file with lots of JSON objects, one JSON object per line).
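The field names here are just made-up examples, but the downloaded file ends up looking something like this – one complete JSON object per line:

```
{"id":1,"first_name":"Bobbie","product":"laptop","price":1249.99}
{"id":2,"first_name":"Priya","product":"keyboard","price":89.50}
{"id":3,"first_name":"Marcus","product":"monitor","price":310.00}
```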

Step 4 :
Run this shell script

It just needs to read the file a line at a time, using curl to POST each line to the Producer API.

send-file.sh on github
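The real script is on github (link above), but the heart of it is only a few lines. This is a rough sketch of the idea – the endpoint URL, credentials, topic and file name are all placeholders, and check the REST Producer API docs for the content types your version accepts.

```sh
#!/bin/sh
#
# Rough sketch of the idea -- swap in your own producer endpoint,
#  credentials, topic name, and data file.
PRODUCER_API="https://my-producer-endpoint.example.com"
USERNAME="demo-producer"
PASSWORD="passw0rd"
TOPIC="DEMO.EVENTS"
DATAFILE="MOCK_DATA.json"

# read the data file one line at a time, POSTing each line
#  to the topic as a separate event
# (-k skips TLS verification for convenience; use --cacert with the
#  cluster certificate if you want to do it properly)
while IFS= read -r line; do
    curl -k -s -o /dev/null -w "%{http_code}\n" \
        -X POST \
        -u "$USERNAME:$PASSWORD" \
        -H "Content-Type: application/json" \
        --data "$line" \
        "$PRODUCER_API/topics/$TOPIC/records"
done < "$DATAFILE"
```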

Step 5 :
Review the events

Look at your topic in the Event Streams admin UI. You’ve now got a topic with a batch of events on it, ready to use for a demo.

Variation :
Test data – one event per file

Some data generator sites don’t give you a single file, but generate a separate file for each JSON object.

In that case, the shell script is slightly different. Put all of the files in a folder, and use curl to POST each file in the folder to the Producer API.

send-folder on github
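Again, the real script is linked above, but the core of it is just a loop over files instead of a loop over lines (same placeholder variables as the earlier sketch):

```sh
# same placeholder PRODUCER_API / USERNAME / PASSWORD / TOPIC
#  variables as the sketch above
DATAFOLDER="./mock-data"

# POST the contents of each file in the folder as a separate event
for f in "$DATAFOLDER"/*.json; do
    curl -k -s -o /dev/null -w "%{http_code}\n" \
        -X POST \
        -u "$USERNAME:$PASSWORD" \
        -H "Content-Type: application/json" \
        --data-binary "@$f" \
        "$PRODUCER_API/topics/$TOPIC/records"
done
```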

The result is the same – a topic with events ready for a demo.

That’s it.

Super simple, but putting this here will hopefully save people a little time in getting their demo ready.

It’s not a very elegant way of putting data on a Kafka topic. It’s certainly a lot less efficient than using a proper Kafka client. But if you’re in a hurry, it works and it doesn’t need anything other than curl to work.
