Archive for the ‘ibm’ Category

Setting up the Event Streams UI for developer-only use

Friday, December 9th, 2022

A quick tip for how to give a developer access to the IBM Event Streams UI for only the Kafka topics used by their application, and not everything else.

Imagine I’m a Kafka cluster admin. I’m running a cluster with a variety of topics on it.

Only viewing their own topics

One of my developers is responsible for the flight tracking app, and wants to use the Event Streams UI. But I don’t want them to be able to access the sensitive topics belonging to other applications.

I can create them their own login for the UI, one that lets them see only their own topics.

The permissions I want to give them are:

- operation: Read
  resource:
    name: FLIGHT.
    patternType: prefix
    type: topic

(Remember, managing my Kafka cluster through Kubernetes resources is a good fit with a CI/CD workflow.)
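For example, here is a rough sketch of what a complete KafkaUser custom resource for this could look like. The user name, namespace, cluster label and apiVersion are illustrative, so check them against your own Event Streams install:

apiVersion: eventstreams.ibm.com/v1beta2
kind: KafkaUser
metadata:
  name: flight-app-developer                        # hypothetical login name
  namespace: event-streams                          # hypothetical namespace
  labels:
    eventstreams.ibm.com/cluster: my-kafka-cluster  # must match your cluster name
spec:
  authentication:
    type: scram-sha-512
  authorization:
    type: simple
    acls:
      # let the developer read only topics with names starting FLIGHT.
      - operation: Read
        resource:
          name: FLIGHT.
          patternType: prefix
          type: topic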

(more…)

Using IBM DataStage to process JSON events on Apache Kafka topics

Monday, November 28th, 2022

In this post, I share a step-by-step guide for using IBM DataStage to merge JSON messages from multiple Apache Kafka topics into a single joined-up stream of events.
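DataStage does this with a visual flow rather than code but, to give a flavour of the idea before you click through, here is a very rough sketch of a simple version of it written with Kafka Streams instead (a plain merge rather than the richer joins DataStage can do, and the topic names are made up):

import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class MergeTopics {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "merge-demo");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // read JSON events (as strings) from two hypothetical source topics
        KStream<String, String> orders = builder.stream("ORDERS.NEW");
        KStream<String, String> cancellations = builder.stream("ORDERS.CANCELLED");
        // combine them into a single stream, written to one output topic
        orders.merge(cancellations).to("ORDERS.ALL");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}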

screenshot

(more…)

Geo-steering with IBM Code Engine and Cloud Internet Services

Saturday, September 3rd, 2022

In this post, I want to share a small tip from how I run Machine Learning for Kids: I run instances of the site in different regions, and use geo-steering to direct users to the instance nearest to them.

(more…)

How to use MQ Streaming Queues and Kafka Connect to make an auditable copy of IBM MQ messages

Sunday, July 10th, 2022

Scenario

You have an IBM MQ queue manager. An application is putting messages to a command queue. Another application gets these messages from the queue and takes actions in response.

diagram

Objective

You want multiple separate audit applications to be able to review the commands that go through the command queue.

They should be able to replay a history of these command messages as many times as they want.

This must not impact the application that is currently getting the messages from the queue.

diagram

Solution

You can use Streaming Queues to duplicate every message put to the command queue onto a separate copy queue.

This copy queue can feed a connector that produces every message to a Kafka topic. This Kafka topic can be used by the audit applications.
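As a flavour of the MQ side of this, here is a minimal MQSC sketch. Streaming Queues need a recent version of IBM MQ, and the queue names here just follow this post:

* queue to hold the duplicated messages
DEFINE QLOCAL('COMMANDS.COPY')

* duplicate every message put to COMMANDS onto COMMANDS.COPY
ALTER QLOCAL('COMMANDS') STREAMQ('COMMANDS.COPY') STRMQOS(MUSTDUP)

STRMQOS(MUSTDUP) makes the duplication mandatory, so a put to COMMANDS fails if the copy cannot be made, which suits an audit requirement like this one.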

diagram

Details

The final solution works like this:

diagram

  1. A JMS application called Putter puts messages onto an IBM MQ queue called COMMANDS
  2. For the purposes of this demo, a development instance of LDAP is used to authenticate access to IBM MQ
  3. A JMS application called Getter gets messages from the COMMANDS queue
  4. A copy of every message put to the COMMANDS queue is made to the COMMANDS.COPY queue
  5. A Connector gets every message from the COMMANDS.COPY queue
  6. The Connector transforms each JMS message into a string and produces it to the MQ.COMMANDS Kafka topic (a sketch of this connector configuration is shown below)
  7. A Java application called Audit can replay the history of all messages on the Kafka topic
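As an illustration of steps 5 and 6, the Connector configuration might look something like this. I'm assuming the IBM MQ source connector (kafka-connect-mq-source); the queue manager name, connection details and channel name are made up:

name=mq-commands-audit-source
connector.class=com.ibm.eventstreams.connect.mqsource.MQSourceConnector
tasks.max=1

# hypothetical queue manager connection details
mq.queue.manager=MYQMGR
mq.connection.name.list=queuemanager.demo.svc(1414)
mq.channel.name=KAFKA.SVRCONN

# get messages from the copy queue...
mq.queue=COMMANDS.COPY
# ...and produce them to this Kafka topic
topic=MQ.COMMANDS

# treat the body of each JMS message as a string
mq.message.body.jms=true
mq.record.builder=com.ibm.eventstreams.connect.mqsource.builders.DefaultRecordBuilder
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter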

(more…)

Connecting App Connect Enterprise to Event Streams

Sunday, June 19th, 2022

Configuring IBM App Connect Enterprise to produce or consume messages from Kafka topics in IBM Event Streams requires care: there are several values that need to be copied across exactly. In this post, I’ll share the steps I use to avoid missing any of them.

To illustrate this, I’ll create a simple App Connect flow that implements a REST API, where any data I POST to the REST API is sent to a Kafka topic.

The key to getting this to work correctly first time is to make sure that values are accurately copied from Event Streams to App Connect.

To help with this, I use a grid like the one below.

The instructions in this post start with Event Streams, and explain how to populate the grid with the information you need.

Then the instructions will switch to App Connect, and explain how to use the values in the grid to set up your App Connect flow.

|   | What this is | Values you will see in my screenshots | Your value |
|---|--------------|---------------------------------------|------------|
| A | Topic name | THIS.IS.MY.TOPIC | |
| B | Bootstrap address | kafkadev-kafka-bootstrap-demo.itzroks-120000f8p4-f9nd74-6ccd7f378ae819553d37d5f2ee142bd6-0000.eu-gb.containers.appdomain.cloud:443, kafkadev-kafka-bootstrap.demo.svc:9093, or kafkadev-kafka-bootstrap.demo.svc:9092 | |
| C | SASL mechanism | SCRAM-SHA-512 | |
| D | SASL config | org.apache.kafka.common.security.scram.ScramLoginModule required; | |
| E | Security protocol | SASL_SSL, SASL_PLAINTEXT, SSL, or PLAINTEXT | |
| F | Certificate | es-cert.jks | |
| G | Certificate password | wo05RndLJQgI | |
| H | Username | app-connect-enterprise | |
| I | Password | AIYJjrM2bSic | |
| J | Policy project name | demo-policies | |
| K | Policy name | demo-eventstreams-policy | |
| L | Security identity name | kafka-credentials | |
| M | Truststore identity name | kafka-truststore | |
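App Connect collects these values through its policy and security identity screens rather than a properties file, but if you are more used to plain Kafka clients, this sketch shows how rows B to I map onto standard Kafka client properties (using the example values from my screenshots; the truststore path is a placeholder):

bootstrap.servers=kafkadev-kafka-bootstrap.demo.svc:9093
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
    username="app-connect-enterprise" \
    password="AIYJjrM2bSic";
ssl.truststore.location=/path/to/es-cert.jks
ssl.truststore.password=wo05RndLJQgI

Notice that row D stops at "required;" without embedding the username and password: in App Connect, those come from the security identity (rows H, I and L) instead.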

(more…)

Taking your first step towards an event-driven architecture

Friday, December 3rd, 2021

In this post, I want to suggest some approaches for introducing event-driven architecture patterns into your existing application environment. I’ll demonstrate how you can incrementally adopt Apache Kafka without needing to immediately build new applications or rebuild your existing applications, and show how this can be delivered in Red Hat OpenShift.

(more…)

(nearly) 18 years in IBM

Monday, July 12th, 2021

I started working at IBM on 6th August 2003. I’m feeling nostalgic as my eighteenth anniversary approaches, so wanted to write about what I’ve been doing all this time.

I’ve been a back-end developer, a support engineer, a tester, a consultant, a (terrible) front-end developer, and much more.

I’ve worked on proprietary software, and I’ve worked on open-source software.

I’ve worked in a large open plan floor, I’ve worked in cubicle bays with half-a-dozen people, and I’ve had my own office. 

I’ve had roles that were fully based at Hursley. I’ve worked from other IBM offices in the UK. I’ve been based at customer sites for months. I’ve had overseas assignments. I’ve had roles that meant travelling to somewhere different every month.

I’ve worked in teams so small they all fit around my dining table for dinner. I’ve worked in teams so large that we needed several coaches for the team social trip to London.

I’ve worked in distributed teams with team members around the world in four different time zones. I’ve worked in teams where we were all in the same office together.

I’ve worked on software that was first released in the 1990s, and I’ve worked on the first releases of brand new products.

The point I’m making… it hasn’t felt like the same job for eighteen years.

(more…)

Event Endpoint Management

Sunday, June 27th, 2021

Last week, we released the latest version of Event Endpoint Management in IBM Cloud Pak for Integration 2021.2.1. It allows organisations to share and manage access to their Kafka topics. In this post, I want to share a run-through of how it all works.

I’ll start with a high-level overview, then a walkthrough demo video, and finally share some links to related reading if you’d like more detail.

Overview


diagram – numbers in the diagram are described below

Kafka topic owner

This is someone who has a Kafka topic, and is running an application or system that is producing a stream of events to that topic.

They think this stream of events might be useful to other developers in their organisation, so they describe it (using AsyncAPI) and publish this to a catalog where it can be discovered and managed. (A minimal AsyncAPI sketch follows the numbered steps below.)

  1. creates a Kafka topic and an application that produces events to it
  2. describes and documents their Kafka topic, and the events that are being produced to it
  3. publishes the description of their Kafka topic
  4. pushes the Kafka cluster security info to the Event Gateway service so it can manage access to the topic on the topic owner’s behalf
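To give a flavour of step 2, here is a heavily trimmed sketch of the kind of AsyncAPI document that describes a topic. The topic name, server address and payload fields are all made up:

asyncapi: '2.1.0'
info:
  title: Flight position events
  version: '1.0.0'
  description: Events emitted by the flight tracking app
servers:
  broker:
    url: kafka-bootstrap.demo.example.com:443   # hypothetical address
    protocol: kafka
channels:
  FLIGHT.POSITIONS:                             # hypothetical topic name
    subscribe:
      message:
        contentType: application/json
        payload:
          type: object
          properties:
            flightNumber:
              type: string
            latitude:
              type: number
            longitude:
              type: number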

App developer

This is someone who is building an application that could benefit from a stream of events.

They are able to discover the event sources that have been shared in their organisation, and get access to them through a self-service Developer Portal.

  1. creates credentials for use in their application
  2. registers new application credentials
  3. updates the Event Gateway service with the new application credentials
  4. creates or configures an application with guidance from the Portal
  5. application connects to the Event Gateway service (see the consumer sketch below)
  6. application connection routed securely to the Kafka brokers
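From the application’s point of view (steps 5 and 6), the Event Gateway looks like an ordinary Kafka bootstrap address, so the client is ordinary Kafka client code. Something like this sketch, where the gateway address, credentials and topic name are placeholders for values you would get from the Developer Portal:

import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.CommonClientConfigs;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.config.SaslConfigs;
import org.apache.kafka.common.serialization.StringDeserializer;

public class FlightEventsConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // placeholder gateway address from the Developer Portal
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "event-gateway.demo.example.com:443");
        props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");
        props.put(SaslConfigs.SASL_MECHANISM, "PLAIN");
        // placeholder credentials created in the Developer Portal
        props.put(SaslConfigs.SASL_JAAS_CONFIG,
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"my-app-credentials-key\" password=\"my-app-credentials-secret\";");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "flight-events-demo");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("FLIGHT.POSITIONS"));  // placeholder topic name
            while (true) {
                for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofSeconds(10))) {
                    System.out.println(record.value());
                }
            }
        }
    }
}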

(more…)