Posts Tagged ‘mq’

Using Apache Kafka with IBM MQ using Kafka Connect

Thursday, April 20th, 2023

A recording of a demo walkthrough I did about using the Kafka Connect MQ connectors to flow messages between IBM MQ and Apache Kafka.

A few weeks ago, I presented a session at TechCon about IBM MQ and Apache Kafka with David Ware. I spent most of my time running through how to use Kafka Connect with IBM MQ, with a few demos showing different ways to set up and run the kafka-connect-mq-source Connector.

My demos start at around 20 minutes in, but you should listen to David give the context first!
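To give a flavour of what the demos cover, a minimal configuration for the kafka-connect-mq-source connector looks something like this. The queue manager name, host, channel, queue and topic here are all illustrative placeholders, not values from the demos:

```properties
name=mq-source
connector.class=com.ibm.eventstreams.connect.mqsource.MQSourceConnector
tasks.max=1

# how to connect to the queue manager (illustrative values)
mq.queue.manager=QM1
mq.connection.name.list=localhost(1414)
mq.channel.name=DEV.APP.SVRCONN

# the MQ queue to get messages from, and the Kafka topic to produce them to
mq.queue=DEV.QUEUE.1
topic=TSOURCE

# treat the body of each MQ message as the value of the Kafka message
mq.message.body.jms=true
mq.record.builder=com.ibm.eventstreams.connect.mqsource.builders.DefaultRecordBuilder
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
```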

How to scale IBM MQ clusters and client applications in OpenShift

Tuesday, July 19th, 2022


You’re running a cluster of IBM MQ queue managers in Red Hat OpenShift, together with a large number of client applications putting messages to them and getting messages from them. This workload will vary over time, so you need flexibility in how you scale all of this.

This tutorial will show how you can easily scale the number of instances of your client applications up and down, without having to reconfigure their connection details and without needing to manually distribute or load balance them.

And it will show how to quickly and easily grow the queue manager cluster, adding a new queue manager without needing complex new custom configuration.


The IBM MQ feature demonstrated in this tutorial is Uniform Clusters. Dave Ware has a great introduction and demo of Uniform Clusters, so if you’re looking for background about how the feature works, I’d highly recommend it.

This tutorial is heavily inspired by that demo (thanks, Dave!), but my focus here is mainly on how to apply the techniques that Dave showed in OpenShift.
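As a rough sketch of the client-side piece: with Uniform Clusters, applications connect using a CCDT that lists every queue manager in the cluster under a single queue manager group, so the cluster can spread (and automatically rebalance) application instances across its members. A JSON CCDT for a two-queue-manager cluster might look like the fragment below; the channel name and hostnames are illustrative assumptions (in OpenShift they would be your queue managers’ Service names):

```json
{
  "channel": [
    {
      "name": "UNIFORM.SVRCONN",
      "type": "clientConnection",
      "clientConnection": {
        "connection": [ { "host": "qm1-ibm-mq.mq.svc", "port": 1414 } ],
        "queueManager": "ANYQM"
      },
      "connectionManagement": { "clientWeight": 1, "affinity": "none" }
    },
    {
      "name": "UNIFORM.SVRCONN",
      "type": "clientConnection",
      "clientConnection": {
        "connection": [ { "host": "qm2-ibm-mq.mq.svc", "port": 1414 } ],
        "queueManager": "ANYQM"
      },
      "connectionManagement": { "clientWeight": 1, "affinity": "none" }
    }
  ]
}
```

Applications then connect to the queue manager group (*ANYQM) rather than to a specific queue manager, so adding a queue manager to the cluster is largely a matter of adding another entry to this file.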


How to use MQ Streaming Queues and Kafka Connect to make an auditable copy of IBM MQ messages

Sunday, July 10th, 2022


You have an IBM MQ queue manager. An application is putting messages to a command queue. Another application gets these messages from the queue and takes actions in response.



You want multiple separate audit applications to be able to review the commands that go through the command queue.

They should be able to replay a history of these command messages as many times as they want.

This must not impact the application that is currently getting the messages from the queue.



You can use Streaming Queues to duplicate every message put to the command queue onto a separate copy queue.
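Streaming Queues are configured on the original queue, so neither of the existing applications needs to change. A sketch of the MQSC for this, using the queue names from this scenario:

```
DEFINE QLOCAL(COMMANDS.COPY)
ALTER QLOCAL(COMMANDS) STREAMQ(COMMANDS.COPY) STRMQOS(MUSTDUP)
```

With STRMQOS(MUSTDUP), a put to COMMANDS only succeeds if the duplicate can also be made; STRMQOS(BESTEF) is the best-effort alternative if problems with the copy queue must never affect the original application.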

This copy queue can feed a connector that produces every message to a Kafka topic, which audit applications can then consume.



The final solution works like this:


  1. A JMS application called Putter puts messages onto an IBM MQ queue called COMMANDS
  2. For the purposes of this demo, a development instance of LDAP is used to authenticate access to IBM MQ
  3. A JMS application called Getter gets messages from the COMMANDS queue
  4. Copies of every message put to the COMMANDS queue will be made to the COMMANDS.COPY queue
  5. A Connector will get every message from the COMMANDS.COPY queue
  6. The Connector transforms each JMS message into a string, and produces it to the MQ.COMMANDS Kafka topic
  7. A Java application called Audit can replay the history of all messages on the Kafka topic
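To enable the replay behaviour in the final step, each Audit application instance can use its own Kafka consumer group and start from the earliest offset, so every instance independently receives the full history without affecting the others. A sketch of the consumer configuration (the bootstrap address and group id are illustrative placeholders):

```properties
bootstrap.servers=my-kafka-bootstrap:9092

# each audit app uses its own group.id, so each gets the full history independently
group.id=audit-app-1

# with no committed offsets, start from the beginning of the topic
auto.offset.reset=earliest

key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
value.deserializer=org.apache.kafka.common.serialization.StringDeserializer
```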


Explaining PowerShell for WebSphere MQ

Tuesday, December 11th, 2007

I’ve made a start on a series of posts designed to introduce how to use Windows PowerShell for WebSphere MQ admin. There is a bit of a learning curve for people new to PowerShell, so rather than try to explain everything in one go, I’m planning on breaking it down into bits, covering one topic a day.

If you’re curious to see all this PowerShell stuff I’ve been working on for months, head on over to the WMQ blog: