Posts Tagged ‘apachekafka’

Running IBM Event Streams on a laptop (sort of)

Friday, December 23rd, 2022

How to run a tiny local Kafka cluster using IBM Event Streams images

For local development on Kafka projects, I always run the public open source builds of ZooKeeper and Kafka as Java processes directly on my laptop (similar to the steps described in the Apache Kafka Quickstart).

But for a project this week, I needed to verify something with the distribution of Kafka that comes with IBM Event Streams.

I used a simple Docker Compose setup for this. I’ll use this post to share how I did it.
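
To give a rough idea of the shape this takes, here is a minimal Docker Compose sketch. The image name, tag, and in-image paths below are placeholders, not the real IBM Event Streams image coordinates (those come from IBM's entitled registry), so treat this as an outline rather than a working file:

# docker-compose.yml - a minimal sketch, NOT the actual Event Streams setup:
# the image name and the script paths inside the image are assumptions
services:
  zookeeper:
    image: <event-streams-kafka-image>   # placeholder: pull from IBM's entitled registry
    command: >
      /opt/kafka/bin/zookeeper-server-start.sh
      /opt/kafka/config/zookeeper.properties
    ports:
      - "2181:2181"

  kafka:
    image: <event-streams-kafka-image>   # placeholder: same image as above
    depends_on:
      - zookeeper
    command: >
      /opt/kafka/bin/kafka-server-start.sh
      /opt/kafka/config/server.properties
      --override zookeeper.connect=zookeeper:2181
      --override listeners=PLAINTEXT://0.0.0.0:9092
      --override advertised.listeners=PLAINTEXT://localhost:9092
    ports:
      - "9092:9092"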

(more…)

Xbox LIVE events for Kafka

Monday, December 19th, 2022

I’ve made a Kafka Connect source connector for sending “real-time” events from Xbox LIVE to Kafka topics.

Quick primer if you’re not a gamer or don’t know what I’m talking about!
The Xbox platform comes with a social aspect: details about the games you play and the achievements you earn playing them are shared with your friends on the Xbox LIVE service. That is the source of data I'm using here.

Create an API key for your Xbox LIVE account and run the Connector with it (there's a sketch of the configuration below); it will start producing two streams of events to your Kafka cluster:


Screenshot from Event Streams – but as this is a Kafka Connect connector, you could use it with any flavour of Kafka, including Apache Kafka.

ACHIEVEMENTS
Events when one of your friends earns an achievement.

PRESENCE
Events when your friends start playing a game, or go online/offline.
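
Running it means giving Kafka Connect a connector configuration along these lines. The connector class and property names in this sketch are hypothetical placeholders; the real configuration options are documented in the README:

# xbox-source.properties - a sketch with HYPOTHETICAL property names;
# see the Connector README for the real configuration options
name=xbox-live-source
connector.class=<connector class from the README>
tasks.max=1
xbox.api.key=<your-xbox-live-api-key>    # hypothetical property name
topic.achievements=ACHIEVEMENTS          # hypothetical property name
topic.presence=PRESENCE                  # hypothetical property name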

Details about the attributes of each of these events can be found in the Connector README, but here are a few screenshots to give you an idea.

An example message on the ACHIEVEMENTS topic.
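
To illustrate, an achievement event might look something like this (all of the field names here are made up for the example; the real attributes are listed in the README):

{
    "gamertag": "SomePlayer",
    "game": "Halo Infinite",
    "achievement": "First Steps",
    "gamerscore": 10,
    "earnedAt": "2022-12-19T18:30:00Z"
}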

(more…)

Setting up the Event Streams UI for developer-only use

Friday, December 9th, 2022

A quick tip for how to give a developer access to the IBM Event Streams UI for only the Kafka topics used by their application, and not everything else.

Imagine I’m a Kafka cluster admin. I’m running a cluster with a variety of topics on it.

Only viewing their own topics

One of my developers is responsible for the flight tracking app, and wants to use the Event Streams UI. But I don’t want them to be able to access the sensitive topics that belong to other applications.

I can create a separate login for the UI just for them, one that only lets them see their own topics.

The permissions I want to give them are:

- operation: Read
  resource:
    name: FLIGHT.
    patternType: prefix
    type: topic

(Remember, managing my Kafka cluster through Kubernetes resources is a good fit with a CI/CD workflow.)
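
Putting that together, a complete KafkaUser resource might look something like this sketch. The apiVersion and the cluster label here are assumptions, so check what your Event Streams instance uses:

apiVersion: eventstreams.ibm.com/v1beta2    # assumption: check the version your operator uses
kind: KafkaUser
metadata:
  name: flights-developer
  labels:
    eventstreams.ibm.com/cluster: my-cluster   # assumption: the name of your Event Streams instance
spec:
  authentication:
    type: scram-sha-512
  authorization:
    type: simple
    acls:
      - operation: Read
        resource:
          name: FLIGHT.
          patternType: prefix
          type: topic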

(more…)

Using IBM DataStage to process JSON events on Apache Kafka topics

Monday, November 28th, 2022

In this post, I share a step-by-step guide for how to use IBM DataStage to merge JSON messages from multiple different Apache Kafka topics, into a single joined-up stream of events.

screenshot

(more…)

Setting up trusted SSL for IBM Event Streams

Thursday, October 27th, 2022

A quick how-to for setting up Event Streams with trusted certificates when running a development project.

Problem

You’re working on a project using IBM Event Streams. It’s just a development project, so you’re not using an SSL certificate signed by your real, trusted, corporate signer.

Everything works, but…

You get errors like these every time you access the web tooling – which you have to click through.

And you get errors like these from your Kafka client applications – which you have to configure with a custom truststore to avoid (although, if you do need to do that, I have a guide to help, and there is a sketch after the log below!):

[2021-06-27 23:19:06,048] ERROR [Consumer clientId=consumer-dalegrp-1, groupId=dalegrp] Connection to node -1 (dale-kafka-saslscram-bootstrap-strimzi.apps.eem-test-fest-6.cp.fyre.ibm.com/9.46.199.58:443) failed authentication due to: SSL handshake failed (org.apache.kafka.clients.NetworkClient)
[2021-06-27 23:19:06,049] WARN [Consumer clientId=consumer-dalegrp-1, groupId=dalegrp] Bootstrap broker dale-kafka-saslscram-bootstrap-strimzi.apps.eem-test-fest-6.cp.fyre.ibm.com:443 (id: -1 rack: null) disconnected (org.apache.kafka.clients.NetworkClient)
[2021-06-27 23:19:06,069] ERROR Error processing message, terminating consumer process:  (kafka.tools.ConsoleConsumer$)
org.apache.kafka.common.errors.SslAuthenticationException: SSL handshake failed
Caused by: javax.net.ssl.SSLHandshakeException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
	at java.base/sun.security.ssl.Alert.createSSLException(Alert.java:131)
	at java.base/sun.security.ssl.TransportContext.fatal(TransportContext.java:326)
	at java.base/sun.security.ssl.TransportContext.fatal(TransportContext.java:269)
	at java.base/sun.security.ssl.TransportContext.fatal(TransportContext.java:264)
	at java.base/sun.security.ssl.CertificateMessage$T13CertificateConsumer.checkServerCerts(CertificateMessage.java:1339)
	at java.base/sun.security.ssl.CertificateMessage$T13CertificateConsumer.onConsumeCertificate(CertificateMessage.java:1214)
	at java.base/sun.security.ssl.CertificateMessage$T13CertificateConsumer.consume(CertificateMessage.java:1157)
	at java.base/sun.security.ssl.SSLHandshake.consume(SSLHandshake.java:392)
	at java.base/sun.security.ssl.HandshakeContext.dispatch(HandshakeContext.java:444)
	at java.base/sun.security.ssl.SSLEngineImpl$DelegatedTask$DelegatedAction.run(SSLEngineImpl.java:1074)
	at java.base/sun.security.ssl.SSLEngineImpl$DelegatedTask$DelegatedAction.run(SSLEngineImpl.java:1061)
	at java.base/java.security.AccessController.doPrivileged(AccessController.java:770)
	at java.base/sun.security.ssl.SSLEngineImpl$DelegatedTask.run(SSLEngineImpl.java:1008)
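
For reference, the client-side workaround looks something like this: a minimal sketch of client properties, assuming a SASL/SCRAM listener and that you have already imported the cluster's CA certificate into a local truststore. Your listener type, addresses, and credentials will differ:

# client.properties - a minimal sketch, assuming a SASL/SCRAM listener;
# the truststore is created first from the cluster CA cert, e.g.:
#   keytool -importcert -alias es-ca -file ca.crt -keystore truststore.jks -storepass changeit -noprompt
bootstrap.servers=<bootstrap-address>:443
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="<user>" password="<password>";
ssl.truststore.location=/path/to/truststore.jks
ssl.truststore.password=changeit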

(more…)

Deploying App Connect Enterprise applications from a CI/CD pipeline

Saturday, October 22nd, 2022

Sharing an example Tekton pipeline for deploying an IBM App Connect Enterprise application to Red Hat OpenShift.

This post is about a repository I’ve shared on GitHub at dalelane/app-connect-tekton-pipeline. It contains an example of how to use Tekton to create a CI/CD pipeline that builds and deploys an App Connect Enterprise application to Red Hat OpenShift.

The pipeline uses the IBM App Connect Operator to build, deploy, and manage your applications in containers. Because it runs on OpenShift, it can be integrated into an automated continuous delivery workflow without needing to build anything locally on a developer’s workstation.
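
As a rough sketch of the sort of resource the pipeline ultimately creates, here is a minimal IntegrationServer outline. The version, license values, and BAR URL below are placeholders that depend on your entitlement and on what the pipeline builds:

apiVersion: appconnect.ibm.com/v1beta1
kind: IntegrationServer
metadata:
  name: my-ace-app
spec:
  version: '12.0'                    # placeholder: match your App Connect version
  license:
    accept: true
    license: <license-id>            # placeholder: depends on version and entitlement
    use: <license-use>               # placeholder: e.g. production vs non-production
  replicas: 1
  barURL: <url-of-the-bar-file-built-by-the-pipeline>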

For background information about the Operator, and the different types of Kubernetes resources that this pipeline will create (e.g. IntegrationServer and Configuration), see these blog posts:

(more…)

Take your first step into Event Driven Architectures

Saturday, October 15th, 2022

Introducing an event-driven architecture into your application can seem like a scary task if you’re only used to synchronous and data-centric technologies. But bringing together data-centric and event-centric approaches means that getting started with technologies like Apache Kafka doesn’t need to be as daunting as you might think.

You don’t have to start from a blank page to adopt an event-driven architecture. You don’t have to replace everything that you already have built. With a few small and easy steps, you can start to introduce elements of event-driven approaches into an existing data-centric landscape.


presentation recording on YouTube

In this session, I showed simple approaches for introducing event-driven architecture patterns into an existing application. I demonstrated how to incrementally adopt Apache Kafka, and start getting benefits without needing to immediately build new applications or rebuild existing applications.

My aim for this session was to give practical ideas for how to take your first steps into an event-driven world and start introducing Apache Kafka into an existing data-centric application environment.

(more…)

How to use MQ Streaming Queues and Kafka Connect to make an auditable copy of IBM MQ messages

Sunday, July 10th, 2022

Scenario

You have an IBM MQ queue manager. An application is putting messages to a command queue. Another application gets these messages from the queue and takes actions in response.

diagram

Objective

You want multiple separate audit applications to be able to review the commands that go through the command queue.

They should be able to replay a history of these command messages as many times as they want.

This must not impact the application that is currently getting the messages from the queue.

diagram

Solution

You can use Streaming Queues to put a duplicate of every message that arrives on the command queue onto a separate copy queue.
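
This is a queue manager configuration change rather than an application change. A minimal MQSC sketch, using the queue names from this scenario; STRMQOS(BESTEF) makes the duplication best-effort, while MUSTDUP would make the original put fail if the copy cannot be made:

DEFINE QLOCAL(COMMANDS.COPY)
ALTER QLOCAL(COMMANDS) STREAMQ(COMMANDS.COPY) STRMQOS(BESTEF)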

This copy queue can be used to feed a connector that produces every message to a Kafka topic. This Kafka topic can then be consumed by the audit applications.
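
The connector configuration for that would look something like this minimal sketch, using the IBM MQ source connector (the host, channel, and credentials are placeholders):

# mq-source.properties - a minimal sketch; connection details are placeholders
name=mq-source-commands
connector.class=com.ibm.eventstreams.connect.mqsource.MQSourceConnector
tasks.max=1
mq.queue.manager=QM1
mq.connection.name.list=<mq-host>(1414)
mq.channel.name=<server-conn-channel>
mq.queue=COMMANDS.COPY
mq.message.body.jms=true
mq.record.builder=com.ibm.eventstreams.connect.mqsource.builders.DefaultRecordBuilder
topic=MQ.COMMANDS
value.converter=org.apache.kafka.connect.storage.StringConverter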

diagram

Details

The final solution works like this:

diagram

  1. A JMS application called Putter puts messages onto an IBM MQ queue called COMMANDS
  2. For the purposes of this demo, a development instance of LDAP is used to authenticate access to IBM MQ
  3. A JMS application called Getter gets messages from the COMMANDS queue
  4. A copy of every message put to the COMMANDS queue is made on the COMMANDS.COPY queue
  5. A Connector gets every message from the COMMANDS.COPY queue
  6. The Connector transforms each JMS message into a string and produces it to the MQ.COMMANDS Kafka topic
  7. A Java application called Audit can replay the history of all messages on the Kafka topic

(more…)