IBM Event Streams is IBM’s Kafka offering. Naturally it comes with its own UI and CLI tools, but one of the great things about Apache Kafka is that it’s not just a single thing from a single company – rather, it is an active and diverse ecosystem, which means you’ve got a variety of tools to choose from.
I thought I’d try a couple of open source CLI tools, and share how to connect them and what they can do.
First up, kafkacat.
kafkacat
I’ve used this before, but not for a while.
To install it, I just used:
brew install kafkacat
There are lots of other install options at github.com/edenhill/kafkacat.
To configure it, I created a file at ~/.config/kafkacat.conf with the contents:
bootstrap.servers=9.20.196.31:30885
ssl.ca.location=/Users/dalelane/myfolder/es-cert.pem
security.protocol=sasl_ssl
sasl.mechanisms=PLAIN
sasl.username=token
sasl.password=xxxxx-MY-API-KEY-HERE-xxxxx
(I’ll explain where to get these config values for your cluster at the end of this post.)
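A quick sanity check before doing anything else: you can ask kafkacat for the cluster metadata (the -L mode, which I’ll come back to below). If the config values are right you’ll get a list of brokers back; if not, you’ll get a connection or authentication error straight away.

kafkacat -L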
Once you’ve done that you can use kafkacat to produce and consume messages in a variety of ways.
You can produce messages from stdin using kafkacat -P:
kafkacat -P -t YOUR.TOPIC.NAME
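For example, to send a quick test message (each line you type on stdin is sent as a separate message):

echo "hello from kafkacat" | kafkacat -P -t YOUR.TOPIC.NAME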
Or produce a message with the contents of a file using:
kafkacat -P -t YOUR.TOPIC.NAME your-file-name.txt
And include headers using:
kafkacat -P -t YOUR.TOPIC.NAME -H "headerkey=headervalue" your-file-name.txt
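You can also set a message key, by telling kafkacat which delimiter character separates the key from the value with -K. For example, using : as the delimiter, this sends a message with the key mykey:

echo "mykey:my message value" | kafkacat -P -t YOUR.TOPIC.NAME -K :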
Similarly, you can consume messages using kafkacat -C:
kafkacat -C -G your-consumer-group-id YOUR.TOPIC.NAME
And if you want to see the headers, you can customize the format like:
kafkacat -C -G your-consumer-group-id -f 'headers:\n%h\nmessage:\n%s' YOUR.TOPIC.NAME
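The format string supports a range of other tokens, too – such as %t for the topic, %p for the partition, %o for the offset and %k for the key. For example:

kafkacat -C -G your-consumer-group-id -f 'partition %p, offset %o, key %k:\n%s\n' YOUR.TOPIC.NAME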
There are tons of other options, but they’re well explained at github.com/edenhill/kafkacat so I won’t reproduce them here. Suffice it to say, you can control exactly how you produce or consume messages.
But if all you want is a quick and easy way to send and receive messages, it’s good for that, too.
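For example, to replay everything on a topic from the beginning and exit when you reach the end (rather than waiting for new messages), you can combine the -o and -e flags:

kafkacat -C -t YOUR.TOPIC.NAME -o beginning -e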
And you can query for metadata about topics with kafkacat -L.
$ kafkacat -L -t MY.TOPIC
Metadata for MY.TOPIC (from broker -1: sasl_ssl://9.20.196.31:30885/bootstrap):
 3 brokers:
  broker 0 at 9.20.196.31:31179 (controller)
  broker 2 at 9.20.196.31:30011
  broker 1 at 9.20.196.31:31053
 1 topics:
  topic "MY.TOPIC" with 3 partitions:
    partition 0, leader 1, replicas: 1,0, isrs: 1,0
    partition 1, leader 0, replicas: 0,2, isrs: 0,2
    partition 2, leader 2, replicas: 2,1, isrs: 2,1
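(If you want to use that metadata from a script, I believe newer builds also let you combine -L with -J to get the same information as JSON – although I haven’t checked exactly which version added that.)

kafkacat -L -t MY.TOPIC -J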
Next up, kaf.
kaf
This was a new one on me – I first heard about it at Kafka Summit last month.
To install it, I just used:
curl https://raw.githubusercontent.com/infinimesh/kaf/master/godownloader.sh | BINDIR=$HOME/bin bash
To configure it, I created a file at ~/.kaf/config with the contents:
current-cluster: mycluster
clusters:
- name: mycluster
  brokers:
  - 9.20.196.31:30885
  SASL:
    mechanism: PLAIN
    username: token
    password: xxxxx-MY-API-KEY-HERE-xxxxx
  TLS:
    cafile: /Users/dalelane/myfolder/es-cert.pem
  security-protocol: SASL_SSL
(I’ll explain where to get these config values for your cluster at the end of this post.)
Once you’ve done that you can use kaf to manage your cluster, as well as produce and consume messages.
You can get a list of topics:
$ kaf topics
NAME                 PARTITIONS   REPLICAS
__consumer_offsets   50           3
dale                 1            1
TEST.TOPIC           1            3
You can query the properties of a topic:
$ kaf topic describe dale
Name:      dale
Internal:  false
Compacted: false
Partitions:
 Partition   High Watermark   Leader   Replicas   ISR
 ---------   --------------   ------   --------   ---
 0           16               2        [2]        [2]
Config:
 Name                     Value     ReadOnly   Sensitive
 ----                     -----     --------   ---------
 cleanup.policy           delete    false      false
 message.format.version   2.2-IV1   false      false
You can create a topic with:
$ kaf topic create MY.TOPIC -p 3 -r 2
Created topic MY.TOPIC.
And there are a variety of similar admin commands for topics and consumer groups.
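For example, consumer groups get much the same treatment as topics – if I remember rightly, kaf groups lists them and kaf group describe inspects one:

kaf groups
kaf group describe your-consumer-group-id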
Like kafkacat, you can also use kaf to easily produce and consume messages from the command line.
You can produce messages from stdin using:
$ echo -n "My message" | kaf produce MY.TOPIC
Sent record to partition 0 at offset 0.
Or produce a message with the contents of a file using:
$ cat your-file-name.txt | kaf produce MY.TOPIC
Sent record to partition 0 at offset 1.
Similarly, you can consume messages using:
kaf consume MY.TOPIC
There aren’t as many options as kafkacat has, but I found it a little more intuitive. And it’s got all the options you need to quickly and easily send and receive messages. You can read more about it at github.com/birdayz/kaf.
I regularly work with a variety of different clusters, and I really like the way that you can put the config for multiple clusters in the config file, give each one a friendly name, and then switch between them with kaf config use-cluster mycluster. I think that’ll come in handy.
Getting the config values
Finally, a few quick pointers to where I got the config values that I used above.
Starting from the Event Streams UI…
I clicked on Connect to this cluster on the right to bring up this side panel…
Scrolling down shows the button I used to download the PEM certificate file.
To the right is a small wizard for creating API keys…
It walks you through a bunch of options, so you can decide exactly what actions the API key is allowed to take, and which topics it can use (or you can just keep picking “all”).
Finally, the middle tab was where I got a helpful reminder for the rest of the config options to use.
That was a quick play with kafkacat and kaf. What other tools should I be trying next?
Tags: apachekafka, ibmeventstreams, kafka