{"id":3664,"date":"2019-06-09T17:33:01","date_gmt":"2019-06-09T17:33:01","guid":{"rendered":"http:\/\/dalelane.co.uk\/blog\/?p=3664"},"modified":"2019-07-07T11:39:08","modified_gmt":"2019-07-07T11:39:08","slug":"using-kafkacat-and-kaf-with-ibm-event-streams","status":"publish","type":"post","link":"https:\/\/dalelane.co.uk\/blog\/?p=3664","title":{"rendered":"Using kafkacat and kaf with IBM Event Streams"},"content":{"rendered":"<p><a href=\"https:\/\/www.ibm.com\/cloud\/event-streams\">IBM Event Streams<\/a> is IBM&#8217;s <a href=\"https:\/\/kafka.apache.org\/\">Kafka<\/a> offering. Naturally it comes with it&#8217;s own UI and CLI tools, but one of the great things about Apache Kafka is that it\u2019s not just a single thing from a single company &#8211; rather it is an active and diverse ecosystem, which means you\u2019ve got a variety of tools to choose from. <\/p>\n<p>I thought I&#8217;d try a couple of open source CLI tools, and share how to connect them and what they can do. <\/p>\n<p>First up, <code>kafkacat<\/code>. <\/p>\n<p><!--more--><\/p>\n<h3>kafkacat<\/h3>\n<p>I&#8217;ve used this before, but not for a while. <\/p>\n<p>To <strong>install<\/strong> it, I just used: <\/p>\n<pre style=\"border: thin solid silver; background-color: #eeeeee; padding: 0.7em; font-size: 1.1em; overflow: auto;\">brew install kafkacat<\/pre>\n<p>There are lots of other install options at <a href=\"https:\/\/github.com\/edenhill\/kafkacat\">github.com\/edenhill\/kafkacat<\/a>. 
<\/p>\n<p>To <strong>configure<\/strong> it, I created a file at <code>~\/.config\/kafkacat.conf<\/code> with the contents:<\/p>\n<pre style=\"border: thin solid silver; background-color: #eeeeee; padding: 0.7em; font-size: 1.1em; overflow: auto;\">bootstrap.servers=9.20.196.31:30885\r\nssl.ca.location=\/Users\/dalelane\/myfolder\/es-cert.pem\r\nsecurity.protocol=sasl_ssl\r\nsasl.mechanisms=PLAIN\r\nsasl.username=token\r\nsasl.password=xxxxx-MY-API-KEY-HERE-xxxxx<\/pre>\n<p><em>(I&#8217;ll explain where to get these config values for your cluster at the end of this post.)<\/em><\/p>\n<p>Once you&#8217;ve done that you can <strong>use <code>kafkacat<\/code> to produce and consume messages<\/strong> in a variety of ways. <\/p>\n<p>You can produce messages from stdin using <code>kafkacat -P<\/code>:<\/p>\n<pre style=\"border: thin solid silver; background-color: #eeeeee; padding: 0.7em; font-size: 1.1em; overflow: auto;\">kafkacat -P -t YOUR.TOPIC.NAME<\/pre>\n<p>Or produce a message with the contents of a file using:<\/p>\n<pre style=\"border: thin solid silver; background-color: #eeeeee; padding: 0.7em; font-size: 1.1em; overflow: auto;\">kafkacat -P -t YOUR.TOPIC.NAME your-file-name.txt<\/pre>\n<p>And include headers using:<\/p>\n<pre style=\"border: thin solid silver; background-color: #eeeeee; padding: 0.7em; font-size: 1.1em; overflow: auto;\">kafkacat -P -t YOUR.TOPIC.NAME -H \"headerkey=headervalue\" your-file-name.txt<\/pre>\n<p>Similarly, you can consume messages using <code>kafkacat -C<\/code>:<\/p>\n<pre style=\"border: thin solid silver; background-color: #eeeeee; padding: 0.7em; font-size: 1.1em; overflow: auto;\">kafkacat -C -G your-consumer-group-id YOUR.TOPIC.NAME<\/pre>\n<p>And if you want to see the headers, you can customize the format like:<\/p>\n<pre style=\"border: thin solid silver; background-color: #eeeeee; padding: 0.7em; font-size: 1.1em; overflow: auto;\">kafkacat -C -G your-consumer-group-id -f 'headers:\\n%h\\nmessage:\\n%s' 
YOUR.TOPIC.NAME<\/pre>\n<p>There are tons of other options, but they&#8217;re well explained at <a href=\"https:\/\/github.com\/edenhill\/kafkacat\">github.com\/edenhill\/kafkacat<\/a> so I won&#8217;t reproduce them here. Suffice to say, you can control exactly how you produce or consume messages. <\/p>\n<p>But if all you want is a quick and easy way to send and receive messages, it&#8217;s good for that, too. <\/p>\n<p>And you can query metadata about topics with <code>kafkacat -L<\/code>.<\/p>\n<pre style=\"border: thin solid silver; background-color: #eeeeee; padding: 0.7em; font-size: 1.1em; overflow: auto;\">$ kafkacat -L -t MY.TOPIC\r\nMetadata for MY.TOPIC (from broker -1: sasl_ssl:\/\/9.20.196.31:30885\/bootstrap):\r\n 3 brokers:\r\n  broker 0 at 9.20.196.31:31179 (controller)\r\n  broker 2 at 9.20.196.31:30011\r\n  broker 1 at 9.20.196.31:31053\r\n 1 topics:\r\n  topic \"MY.TOPIC\" with 3 partitions:\r\n    partition 0, leader 1, replicas: 1,0, isrs: 1,0\r\n    partition 1, leader 0, replicas: 0,2, isrs: 0,2\r\n    partition 2, leader 2, replicas: 2,1, isrs: 2,1<\/pre>\n<p>Next up, <code>kaf<\/code>. <\/p>\n<h3>kaf<\/h3>\n<p>This was a new one on me, which I first heard about at <a href=\"https:\/\/kafka-summit.org\/events\/kafka-summit-london-2019\/\">Kafka Summit last month<\/a>. 
<\/p>\n<p>To <strong>install<\/strong> it, I just used: <\/p>\n<pre style=\"border: thin solid silver; background-color: #eeeeee; padding: 0.7em; font-size: 1.1em; overflow: auto;\">curl https:\/\/raw.githubusercontent.com\/infinimesh\/kaf\/master\/godownloader.sh | BINDIR=$HOME\/bin bash<\/pre>\n<p>To <strong>configure<\/strong> it, I created a file at <code>~\/.kaf\/config<\/code> with the contents:<\/p>\n<pre style=\"border: thin solid silver; background-color: #eeeeee; padding: 0.7em; font-size: 1.1em; overflow: auto;\">current-cluster: mycluster\r\nclusters:\r\n- name: mycluster\r\n  brokers:\r\n  - 9.20.196.31:30885\r\n  SASL:\r\n    mechanism: PLAIN\r\n    username: token\r\n    password: xxxxx-MY-API-KEY-HERE-xxxxx\r\n  TLS:\r\n    cafile: \/Users\/dalelane\/myfolder\/es-cert.pem\r\n  security-protocol: SASL_SSL<\/pre>\n<p><em>(I&#8217;ll explain where to get these config values for your cluster at the end of this post.)<\/em><\/p>\n<p>Once you&#8217;ve done that you can <strong>use <code>kaf<\/code> to manage your cluster, as well as produce and consume messages<\/strong>. 
<\/p>\n<p>You can get a list of topics:<\/p>\n<pre style=\"border: thin solid silver; background-color: #eeeeee; padding: 0.7em; font-size: 1.1em; overflow: auto;\">$ kaf topics\r\nNAME                 PARTITIONS   REPLICAS\r\n__consumer_offsets   50           3\r\ndale                 1            1\r\nTEST.TOPIC           1            3<\/pre>\n<p>You can query the properties of a topic:<\/p>\n<pre style=\"border: thin solid silver; background-color: #eeeeee; padding: 0.7em; font-size: 1.1em; overflow: auto;\">$ kaf topic describe dale\r\nName:        dale\r\nInternal:    false\r\nCompacted:   false\r\nPartitions:\r\n  Partition  High Watermark  Leader  Replicas  ISR\r\n  ---------  --------------  ------  --------  ---\r\n  0          16              2       [2]       [2]\r\nConfig:\r\n  Name                    Value    ReadOnly  Sensitive\r\n  ----                    -----    --------  ---------\r\n  cleanup.policy          delete   false     false\r\n  message.format.version  2.2-IV1  false     false<\/pre>\n<p>You can create a topic with:<\/p>\n<pre style=\"border: thin solid silver; background-color: #eeeeee; padding: 0.7em; font-size: 1.1em; overflow: auto;\">$ kaf topic create MY.TOPIC -p 3 -r 2\r\nCreated topic MY.TOPIC.<\/pre>\n<p>And <code>kaf<\/code> supports a variety of similar admin tasks for topics and consumer groups. <\/p>\n<p>As with <code>kafkacat<\/code>, you can also use <code>kaf<\/code> to easily produce and consume messages from the command line. 
<\/p>\n<p>You can produce messages from stdin using:<\/p>\n<pre style=\"border: thin solid silver; background-color: #eeeeee; padding: 0.7em; font-size: 1.1em; overflow: auto;\">$ echo -n \"My message\" | kaf produce MY.TOPIC\r\nSent record to partition 0 at offset 0.<\/pre>\n<p>Or produce a message with the contents of a file using:<\/p>\n<pre style=\"border: thin solid silver; background-color: #eeeeee; padding: 0.7em; font-size: 1.1em; overflow: auto;\">$ cat your-file-name.txt | kaf produce MY.TOPIC\r\nSent record to partition 0 at offset 1.<\/pre>\n<p>Similarly, you can consume messages using:<\/p>\n<pre style=\"border: thin solid silver; background-color: #eeeeee; padding: 0.7em; font-size: 1.1em; overflow: auto;\">kaf consume MY.TOPIC<\/pre>\n<p><code>kaf<\/code> doesn&#8217;t have as many options as <code>kafkacat<\/code>, but I found it a little more intuitive, and it&#8217;s got everything you need to quickly and easily send and receive messages. You can read more about it at <a href=\"https:\/\/github.com\/birdayz\/kaf\">github.com\/birdayz\/kaf<\/a>. <\/p>\n<p>I regularly work with a variety of clusters, and I really like the way that you can put the config for multiple clusters in the config file, give each one a friendly name, and then switch between them with <code>kaf config use-cluster mycluster<\/code>. 
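<\/p>\n<p><em>(For illustration, a sketch of what a two-cluster config might look like &#8211; the second cluster&#8217;s name and broker address here are made up:)<\/em><\/p>\n<pre style=\"border: thin solid silver; background-color: #eeeeee; padding: 0.7em; font-size: 1.1em; overflow: auto;\">current-cluster: mycluster\r\nclusters:\r\n- name: mycluster\r\n  brokers:\r\n  - 9.20.196.31:30885\r\n  SASL:\r\n    mechanism: PLAIN\r\n    username: token\r\n    password: xxxxx-MY-API-KEY-HERE-xxxxx\r\n  TLS:\r\n    cafile: \/Users\/dalelane\/myfolder\/es-cert.pem\r\n  security-protocol: SASL_SSL\r\n- name: othercluster\r\n  brokers:\r\n  - localhost:9092<\/pre>\n<p>Running <code>kaf config use-cluster othercluster<\/code> would then point all of the commands above at the second cluster.<\/p>\n<p>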
I think that&#8217;ll come in handy.<\/p>\n<h3>Getting the config values<\/h3>\n<p>Finally, a few quick pointers to where I got the config values that I used above.<\/p>\n<p>Starting from the Event Streams UI&#8230;<\/p>\n<p><a href=\"\/\/dalelane.co.uk\/blog\/post-images\/190609-kafka-clis\/190609-01-topics-view.png\"><img decoding=\"async\" src=\"\/\/dalelane.co.uk\/blog\/post-images\/190609-kafka-clis\/190609-01-topics-view-small.png\" style=\"border: thin black solid\" border=0\/><\/a><br \/>\n<small>Click on the image for high-res version<\/small><\/p>\n<p>I clicked on <strong>Connect to this cluster<\/strong> on the right to bring up this side panel&#8230;<\/p>\n<p><a href=\"\/\/dalelane.co.uk\/blog\/post-images\/190609-kafka-clis\/190609-02-bootstrap.png\"><img decoding=\"async\" src=\"\/\/dalelane.co.uk\/blog\/post-images\/190609-kafka-clis\/190609-02-bootstrap-small.png\" style=\"border: thin black solid\" border=0\/><\/a><br \/>\n<small>Click on the image for high-res version<\/small><\/p>\n<p>Scrolling down shows the button I used to download the PEM certificate file.<\/p>\n<p><a href=\"\/\/dalelane.co.uk\/blog\/post-images\/190609-kafka-clis\/190609-03-pem-cert.png\"><img decoding=\"async\" src=\"\/\/dalelane.co.uk\/blog\/post-images\/190609-kafka-clis\/190609-03-pem-cert-small.png\" style=\"border: thin black solid\" border=0\/><\/a><br \/>\n<small>Click on the image for high-res version<\/small><\/p>\n<p>To the right is a small wizard for creating API keys&#8230; <\/p>\n<p><a href=\"\/\/dalelane.co.uk\/blog\/post-images\/190609-kafka-clis\/190609-04-apikey.png\"><img decoding=\"async\" src=\"\/\/dalelane.co.uk\/blog\/post-images\/190609-kafka-clis\/190609-04-apikey-small.png\" style=\"border: thin black solid\" border=0\/><\/a><br \/>\n<small>Click on the image for high-res version<\/small><\/p>\n<p>It walks you through a bunch of options, so you can decide exactly what actions the API key is allowed to take, and which topics it can use (or you can 
just keep picking &#8220;all&#8221;).<\/p>\n<p>Finally, the middle tab was where I got a helpful reminder for the rest of the config options to use.<\/p>\n<p><a href=\"\/\/dalelane.co.uk\/blog\/post-images\/190609-kafka-clis\/190609-05-conn-props.png\"><img decoding=\"async\" src=\"\/\/dalelane.co.uk\/blog\/post-images\/190609-kafka-clis\/190609-05-conn-props-small.png\" style=\"border: thin black solid\" border=0\/><\/a><br \/>\n<small>Click on the image for high-res version<\/small><\/p>\n<p>That was a quick play with <code>kafkacat<\/code> and <code>kaf<\/code>. What other tools should I be trying next?<\/p>\n","protected":false},"excerpt":{"rendered":"<p>IBM Event Streams is IBM&#8217;s Kafka offering. Naturally it comes with its own UI and CLI tools, but one of the great things about Apache Kafka is that it\u2019s not just a single thing from a single company &#8211; rather it is an active and diverse ecosystem, which means you\u2019ve got a variety of tools [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[7,4],"tags":[593,583,584],"class_list":["post-3664","post","type-post","status-publish","format-standard","hentry","category-code","category-ibm","tag-apachekafka","tag-ibmeventstreams","tag-kafka"],"_links":{"self":[{"href":"https:\/\/dalelane.co.uk\/blog\/index.php?rest_route=\/wp\/v2\/posts\/3664","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/dalelane.co.uk\/blog\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/dalelane.co.uk\/blog\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/dalelane.co.uk\/blog\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/dalelane.co.uk\/blog\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=3664"}],"version-history":[{"count":0,"href":"https:\/\/d
alelane.co.uk\/blog\/index.php?rest_route=\/wp\/v2\/posts\/3664\/revisions"}],"wp:attachment":[{"href":"https:\/\/dalelane.co.uk\/blog\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=3664"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/dalelane.co.uk\/blog\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=3664"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/dalelane.co.uk\/blog\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=3664"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}