Archive for the ‘ibm’ Category

Using client quotas with IBM Event Streams

Sunday, February 26th, 2023

In this post, I want to highlight a feature of IBM Event Streams that I often see under-used, and show how you can easily give it a try.

Kafka can enforce quotas to limit the impact that client applications can have on your cluster. To quote the Kafka documentation:

It is possible for producers and consumers to produce/consume very high volumes of data or generate requests at a very high rate and thus monopolize broker resources, cause network saturation and generally DOS other clients and the brokers themselves.

Having quotas protects against these issues and is all the more important in large multi-tenant clusters where a small set of badly behaved clients can degrade user experience for the well behaved ones.

In fact, when running Kafka as a service this even makes it possible to enforce API limits according to an agreed upon contract.
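In Event Streams, one convenient way to set these quotas is on the KafkaUser custom resource that represents a client's credentials. Here is a minimal sketch, assuming a cluster named kafkadev in a namespace called demo (the user name and rate values are made up, and the apiVersion shown is the Event Streams packaging of the Strimzi CRD, so it may differ in your version):

apiVersion: eventstreams.ibm.com/v1beta2
kind: KafkaUser
metadata:
  name: my-chatty-app
  namespace: demo
  labels:
    eventstreams.ibm.com/cluster: kafkadev
spec:
  authentication:
    type: scram-sha-512
  quotas:
    # max bytes/second the client can produce before the broker throttles it
    producerByteRate: 1048576
    # max bytes/second the client can fetch before the broker throttles it
    consumerByteRate: 2097152
    # share of broker request-handler time the client may use
    requestPercentage: 55

The broker enforces these limits by delaying responses to clients that exceed them, so a misbehaving application is slowed down rather than failed.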

(more…)

What is IBM Client Engineering?

Saturday, December 17th, 2022

I’ve been working in Client Engineering since last summer.

This video is a great description of the team's aspiration, and of the sort of work we want to do with our customers. (Plus you get a few glimpses of our office in York Road, where I worked this year.)

Setting up the Event Streams UI for developer-only use

Friday, December 9th, 2022

A quick tip for how to give a developer access to the IBM Event Streams UI only for the Kafka topics used by their application, and not everything else.

Imagine I’m a Kafka cluster admin. I’m running a cluster with a variety of topics on it.

Only viewing their own topics

One of my developers is responsible for the flight tracking app, and wants to use the Event Streams UI. But I don’t want them to be able to access the other sensitive topics for other applications.

I can give them their own login for the UI, one that only lets them see their own topics.

The permissions I want to give them are:

- operation: Read
  resource:
    name: FLIGHT.
    patternType: prefix
    type: topic

(Remember, managing my Kafka cluster through Kubernetes resources is a good fit with a CI/CD workflow.)
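Putting this together as a complete resource, here is a rough sketch of what the developer's KafkaUser could look like (the user name flights-dev, cluster name kafkadev, and namespace demo are all hypothetical, and the apiVersion may differ in your Event Streams version):

apiVersion: eventstreams.ibm.com/v1beta2
kind: KafkaUser
metadata:
  name: flights-dev
  namespace: demo
  labels:
    eventstreams.ibm.com/cluster: kafkadev
spec:
  # scram-sha-512 authentication gives the developer a username and
  # password that they can use to log in to the Event Streams UI
  authentication:
    type: scram-sha-512
  authorization:
    type: simple
    acls:
      # only topics with names starting with FLIGHT. will be visible
      - operation: Read
        resource:
          name: FLIGHT.
          patternType: prefix
          type: topic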

(more…)

Using IBM DataStage to process JSON events on Apache Kafka topics

Monday, November 28th, 2022

In this post, I share a step-by-step guide to using IBM DataStage to merge JSON messages from multiple Apache Kafka topics into a single joined-up stream of events.

screenshot

(more…)

Geo-steering with IBM Code Engine and Cloud Internet Services

Saturday, September 3rd, 2022

In this post, I want to share a small tip from how I run Machine Learning for Kids: I run instances of the site in different regions, and use geo-steering so that users are directed to the instance nearest to them.

(more…)

How to use MQ Streaming Queues and Kafka Connect to make an auditable copy of IBM MQ messages

Sunday, July 10th, 2022

Scenario

You have an IBM MQ queue manager. An application is putting messages to a command queue. Another application gets these messages from the queue and takes actions in response.

diagram

Objective

You want multiple separate audit applications to be able to review the commands that go through the command queue.

They should be able to replay a history of these command messages as many times as they want.

This must not impact the application that is currently getting the messages from the queue.

diagram

Solution

You can use Streaming Queues to put a duplicate of every message that arrives on the command queue onto a separate copy queue.

This copy queue can be used to feed a connector that produces every message to a Kafka topic. This Kafka topic can then be consumed by the audit applications.
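In MQSC terms, this is just a couple of attributes on the command queue. A sketch, using the queue names from the walkthrough below (STRMQOS(MUSTDUP) tells the queue manager that the duplicate must always be made, rather than best-effort):

* the queue that will receive the duplicate messages
DEFINE QLOCAL('COMMANDS.COPY')
* stream a copy of every message put to COMMANDS onto COMMANDS.COPY
ALTER QLOCAL('COMMANDS') STREAMQ('COMMANDS.COPY') STRMQOS(MUSTDUP)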

diagram

Details

The final solution works like this:

diagram

  1. A JMS application called Putter puts messages onto an IBM MQ queue called COMMANDS
  2. For the purposes of this demo, a development instance of LDAP is used to authenticate access to IBM MQ
  3. A JMS application called Getter gets messages from the COMMANDS queue
  4. Copies of every message put to the COMMANDS queue are made to the COMMANDS.COPY queue
  5. A Connector gets every message from the COMMANDS.COPY queue
  6. The Connector transforms each JMS message into a string, and produces it to the MQ.COMMANDS Kafka topic (a sketch of the connector configuration follows this list)
  7. A Java application called Audit can replay the history of all messages on the Kafka topic (also sketched below)
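For steps 5 and 6, the configuration for the IBM MQ source connector might look something like this sketch (the queue manager name, connection address, and channel name are placeholders for whatever your environment uses):

# read messages from an IBM MQ queue and produce them to a Kafka topic
connector.class=com.ibm.eventstreams.connect.mqsource.MQSourceConnector
tasks.max=1
# connection details for the queue manager - placeholders
mq.queue.manager=MYQMGR
mq.connection.name.list=myqmgr-hostname(1414)
mq.channel.name=KAFKA.SVRCONN
# the queue receiving the duplicated messages
mq.queue=COMMANDS.COPY
# parse the payload as a JMS message and pass the body through as a string
mq.message.body.jms=true
mq.record.builder=com.ibm.eventstreams.connect.mqsource.builders.DefaultRecordBuilder
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
# the Kafka topic to produce to
topic=MQ.COMMANDS

And for step 7, a minimal sketch of what the Audit application could look like. The important details are that each audit application uses its own consumer group, and starts from the earliest offset, which is what lets any number of them replay the history independently without affecting Getter:

import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class Audit {
    public static void main(String[] args) {
        Properties props = new Properties();
        // placeholder bootstrap address - use your own cluster's
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        // a unique group id for each audit app gives each its own replay
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "audit-app-1");
        // start from the beginning of the topic to replay all history
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("MQ.COMMANDS"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println(record.value());
                }
            }
        }
    }
}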

(more…)

Connecting App Connect Enterprise to Event Streams

Sunday, June 19th, 2022

Configuring IBM App Connect Enterprise to produce or consume messages from Kafka topics in IBM Event Streams requires a number of values to be copied precisely. In this post, I'll share the steps I use to avoid missing any of them.

To illustrate this, I’ll create a simple App Connect flow that implements a REST API, where any data I POST to the REST API is sent to a Kafka topic.

The key to getting this to work correctly first time is to make sure that values are accurately copied from Event Streams to App Connect.

To help with this, I use a grid like the one below.

The instructions in this post start with Event Streams, and explain how to populate the grid with the information you need.

Then the instructions will switch to App Connect, and explain how to use the values in the grid to set up your App Connect flow.

What this is, and the values you will see in my screenshots (the grid also has a blank "Your value" column for you to fill in as you go):

A. Topic name:
   THIS.IS.MY.TOPIC
B. Bootstrap address (one of, depending on how you are connecting):
   kafkadev-kafka-bootstrap-demo.itzroks-120000f8p4-f9nd74-6ccd7f378ae819553d37d5f2ee142bd6-0000.eu-gb.containers.appdomain.cloud:443
   kafkadev-kafka-bootstrap.demo.svc:9093
   kafkadev-kafka-bootstrap.demo.svc:9092
C. SASL mechanism:
   SCRAM-SHA-512
D. SASL config:
   org.apache.kafka.common.security.scram.ScramLoginModule required;
E. Security protocol (one of):
   SASL_SSL
   SASL_PLAINTEXT
   SSL
   PLAINTEXT
F. Certificate:
   es-cert.jks
G. Certificate password:
   wo05RndLJQgI
H. Username:
   app-connect-enterprise
I. Password:
   AIYJjrM2bSic
J. Policy project name:
   demo-policies
K. Policy name:
   demo-eventstreams-policy
L. Security identity name:
   kafka-credentials
M. Truststore identity name:
   kafka-truststore
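A useful cross-check: rows B to I map directly onto standard Kafka client properties, so you can verify the values with any ordinary Kafka client before configuring App Connect. A sketch using my screenshot values (substitute your own):

# B: bootstrap address
bootstrap.servers=kafkadev-kafka-bootstrap-demo.itzroks-120000f8p4-f9nd74-6ccd7f378ae819553d37d5f2ee142bd6-0000.eu-gb.containers.appdomain.cloud:443
# E: security protocol
security.protocol=SASL_SSL
# C: SASL mechanism
sasl.mechanism=SCRAM-SHA-512
# D, H, I: the SASL config with the username and password embedded
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="app-connect-enterprise" password="AIYJjrM2bSic";
# F, G: the truststore certificate and its password
ssl.truststore.location=es-cert.jks
ssl.truststore.password=wo05RndLJQgI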

(more…)

Taking your first step towards an event-driven architecture

Friday, December 3rd, 2021

In this post, I want to suggest some approaches for introducing event-driven architecture patterns into your existing application environment. I’ll demonstrate how you can incrementally adopt Apache Kafka without needing to immediately build new applications or rebuild your existing applications, and show how this can be delivered in Red Hat OpenShift.

(more…)