Archive for the ‘ibm’ Category

(nearly) 18 years in IBM

Monday, July 12th, 2021

I started working at IBM on 6th August 2003. I’m feeling nostalgic as my eighteenth anniversary approaches, so I wanted to write about what I’ve been doing all this time.

I’ve been a back-end developer, a support engineer, a tester, a consultant, a (terrible) front-end developer, and much more.

I’ve worked on proprietary software, and I’ve worked on open-source software.

I’ve worked on a large open-plan floor, I’ve worked in cubicle bays with half a dozen people, and I’ve had my own office.

I’ve had roles that were fully based at Hursley. I’ve worked from other IBM offices in the UK. I’ve been based at customer sites for months. I’ve had overseas assignments. I’ve had roles that meant travelling to somewhere different every month.

I’ve worked in teams so small they all fit around my dining table for dinner. I’ve worked in teams so large that we needed several coaches for the team social trip to London.

I’ve worked in distributed teams with team members around the world in four different time zones. I’ve worked in teams where we were all in the same office together.

I’ve worked on software that was first released in the 1990s, and I’ve worked on the first releases of brand new products.

The point I’m making… it hasn’t felt like the same job for eighteen years.

Event Endpoint Management

Sunday, June 27th, 2021

Last week, we released the latest version of Event Endpoint Management in IBM Cloud Pak for Integration 2021.2.1. It allows organisations to share and manage access to their Kafka topics. In this post, I want to share a run-through of how it all works.

I’ll start with a high-level overview, then a walkthrough demo video, and finally share some links to related reading if you’d like more detail.

Overview


Architecture diagram – the numbers in the diagram are described in the sections below

Kafka topic owner

This is someone who has a Kafka topic, and is running an application or system that is producing a stream of events to that topic.

They think this stream of events might be useful to other developers in their organisation, so they describe it (using AsyncAPI) and publish the description to a catalog where it can be discovered and managed.

  1. creates a Kafka topic and an application that produces events to it (sketched in the example after this list)
  2. describes and documents their Kafka topic, and the events that are being produced to it
  3. publishes the description of their Kafka topic
  4. pushes the Kafka cluster security information to the Event Gateway service so it can manage access to the topic on the topic owner’s behalf
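
To make step 1 concrete, here’s a minimal sketch of the kind of producer application a topic owner might already be running. It uses the kafka-python client, and the broker address, topic name, and event payload are illustrative placeholders rather than anything specific to the product:

  # Minimal producer sketch using kafka-python.
  # Broker address, topic name and payload are hypothetical placeholders.
  import json
  from datetime import datetime, timezone

  from kafka import KafkaProducer

  producer = KafkaProducer(
      bootstrap_servers="my-kafka-cluster:9092",  # hypothetical broker address
      value_serializer=lambda v: json.dumps(v).encode("utf-8"),
  )

  # produce a stream of events to the topic that will later be described
  # with AsyncAPI and published to the catalog
  event = {
      "orderId": "A1234",
      "time": datetime.now(timezone.utc).isoformat(),
      "status": "CREATED",
  }
  producer.send("ORDERS.NEW", value=event)  # hypothetical topic name
  producer.flush()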

App developer

This is someone who is building an application that could benefit from a stream of events.

They are able to discover the event sources that have been shared in their organisation, and get access to them through a self-service Developer Portal.

  1. creates credentials for use in their application (these are used in the connection sketch after this list)
  2. registers new application credentials
  3. updates the Event Gateway service with the new application credentials
  4. creates or configures an application with guidance from the Portal
  5. application connects to the Event Gateway service
  6. application connection routed securely to the Kafka brokers
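
To illustrate steps 4 to 6, here’s a hedged sketch of the application side: a Kafka consumer (again using kafka-python) that connects to the Event Gateway service with the credentials created in the Developer Portal. The gateway address, credentials, CA certificate path, and SASL settings are all assumptions made for the sake of the example; the guidance in the Portal gives the exact connection properties to use.

  # Consumer sketch using kafka-python, connecting via the Event Gateway service.
  # The endpoint, credentials, CA certificate and SASL settings are assumptions;
  # use the connection details shown in the Developer Portal.
  import json

  from kafka import KafkaConsumer

  consumer = KafkaConsumer(
      "ORDERS.NEW",                                       # topic discovered in the catalog
      bootstrap_servers="event-gateway.example.com:443",  # hypothetical gateway endpoint
      group_id="my-new-app",
      auto_offset_reset="earliest",
      security_protocol="SASL_SSL",
      sasl_mechanism="PLAIN",
      sasl_plain_username="app-credential-key",           # credentials from the Developer Portal
      sasl_plain_password="app-credential-secret",
      ssl_cafile="gateway-ca.pem",                        # gateway CA certificate
      value_deserializer=lambda v: json.loads(v.decode("utf-8")),
  )

  for message in consumer:
      print(message.topic, message.offset, message.value)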

Talking about IBM Event Streams

Wednesday, September 9th, 2020

We’ve been running a virtual event this week to explain the capabilities of IBM’s Cloud Pak for Integration.

One of those capabilities is Event Streams, so I gave an overview of the Event Streams Operator.

But what it really reminded me of is how much I miss going to conferences and tech events. I don’t want to sound ungrateful for what I’m sure has been a huge amount of work for event organisers in the pivot to online events. It’s great that we can still do events at all, and that organisers are still trying out ways to make them interactive, to enable panels and Q&A sessions.

Installing IBM Event Streams using the kubectl-operator plugin

Thursday, August 13th, 2020

Installing operators in Red Hat OpenShift from the CLI is much easier with the new kubectl-operator plugin. Here’s an example of how you can use it to install the Event Streams Operator.

Installing operators in OpenShift from the CLI is a little fiddly. It’s possible, but you have to create a bunch of custom resources that aren’t entirely intuitive, like Subscriptions and OperatorGroups.

It’s easy if you’re using the OpenShift Console web UI, as it does this all for you so you don’t need to worry about it. But sometimes you want to do things from the command line. And the new kubectl-operator plugin looks like it’ll make that much simpler.

I had a quick play with it this evening, and it let me get the Event Streams operator running with three commands. (Compare this with the OpenShift Console web UI equivalent in my Event Streams demo video).
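
Roughly, the three steps look like this. Treat it as a sketch rather than a copy-and-paste recipe: the catalog image, namespace, and operator package name shown here are assumptions that may differ for your cluster and version.

  # 1. make the IBM operator catalog available to the cluster
  #    (catalog image name is an assumption - check the current docs)
  kubectl operator catalog add ibm-operator-catalog docker.io/ibmcom/ibm-operator-catalog:latest -n openshift-marketplace

  # 2. check that the Event Streams operator package is now discoverable
  kubectl operator list-available | grep eventstreams

  # 3. install the operator, letting the plugin create the OperatorGroup and Subscription for you
  kubectl operator install ibm-eventstreams -n event-streams --create-operator-group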

IBM Event Streams v10

Tuesday, June 30th, 2020

On Friday, we released the latest version of IBM Event Streams. Since then, I’ve been doing a variety of demo sessions to show people what we’ve made and how it works.

Here’s a recording of one of them:

In this session, I did a run-through of the new Event Streams Operator on Red Hat OpenShift, with a very quick intro to some of the features:

00m30s – installing the Operator
02m10s – creating custom Kafka clusters in the OpenShift console
05m10s – creating custom Kafka clusters in IBM Cloud Pak for Integration
08m00s – running the sample Kafka application
08m50s – creating topics
10m20s – creating credentials for client applications
11m45s – automating deployment of event-streaming infrastructure
12m30s – using schemas with the schema registry
13m10s – sending messages with HTTP POST requests
13m45s – viewing messages in the message browser
14m00s – command line administration
14m30s – running Kafka Connect
15m10s – geo-replication for disaster recovery
15m50s – monitoring Kafka clusters in the Event Streams UI
17m10s – monitoring with custom Grafana dashboards
17m30s – alerting using Prometheus

This is IBM

Saturday, June 20th, 2020

The “This is IBM” videos are a nice intro to some of the things that we work on at Hursley.

They’re not too technical, they’re not “sales-y” for IBM products, they’re interesting stories, and each one is only a few minutes long.

I also like them as I’ve worked with all of these awesome people before, so it’s fun to see them being all serious on camera – even if it makes me a little jealous that they’re so much better at it than me!

Bringing AI into the classroom

Friday, February 28th, 2020

IBM and mindSpark are running a series of free webinars for teachers about artificial intelligence.

This evening’s 90-minute webinar was about bringing AI into the classroom, and I contributed some of the content.

The session was very interactive, but there were some pre-prepared presentations in there.

I’ve got a recording of one of the segments below, in which I shared some of my experiences of introducing AI and machine learning in schools, and what I’ve found works well.

A run-through of IBM Event Streams

Thursday, January 16th, 2020

Yesterday I needed to quickly record a demo of what it looks like to get started with Event Streams.

It’s a little rough around the edges (it was only for an internal event, so the production values were essentially me-talking-at-my-laptop without a lot of planning or editing) but I thought I’d share it here in case I need to point anyone else at it.