A run-through of the DEVELOPMENT.md guide.
In this video, I go from zero to a running Machine Learning for Kids website (including installing all the necessary dependencies and building the site from source).
In this post, I want to describe how to use AsyncAPI to document how you’re using Apache Kafka. There are already great AsyncAPI “Getting Started” guides, but AsyncAPI supports a variety of protocols, and I haven’t found an introduction written specifically from the perspective of a Kafka user.
I’ll start with a description of what AsyncAPI is.
“an open source initiative … goal is to make working with Event-Driven Architectures as easy as it is to work with REST APIs … from documentation to code generation, from discovery to event management”
The most obvious initial aspect is that it is a way to document how you’re using Kafka topics, but the impact is broader than that: a consistent approach to documentation enables an ecosystem that includes things like automated code generation and discovery.
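To make that concrete, here is a minimal sketch of what an AsyncAPI document for a Kafka topic can look like. The topic name, title, and payload fields are all made up for illustration; they are not from any real system.

```yaml
# A minimal AsyncAPI 2.0 document describing a single Kafka topic.
# The channel name is the topic name; "subscribe" describes what an
# application consuming from the topic will receive.
asyncapi: '2.0.0'
info:
  title: Orders events        # hypothetical application name
  version: '1.0.0'
channels:
  orders.created:             # hypothetical Kafka topic name
    subscribe:
      message:
        payload:
          type: object
          properties:
            orderId:
              type: string
            amount:
              type: number
```

A document like this is what makes the wider ecosystem possible: the same file that renders as human-readable documentation can also drive code generation and discovery tooling.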
Machine Learning for Kids lets students train their own machine learning models in a simplified, child-friendly training tool. A variety of project types are supported, such as classifying text, images, numeric data, and sound recordings. Under the covers, the machine learning models they train are created and hosted using IBM Watson cloud services, such as Watson Assistant and Watson Visual Recognition.
I’m currently investigating image projects being created and hosted in the browser, without using Watson cloud API calls.
This post is a simple example of how to use a machine learning model to make predictions on a stream of events on a Kafka topic.
It’s more a quick hack than a polished project, with most of this code hacked together from samples and starter code in a single evening. But it’s a fun demo, and could be a jumping-off point for starting a more serious project.
For the purposes of a demo, I wanted to make a simple example of how to implement this pattern, using:
With that goal in mind, I went with:
I’ve got my phone publishing a live stream of raw sensor readings, and passing that stream through an ML model to give me a live stream of events like “phone has been put on a table”, “phone has been picked up and is in my hand”, or “phone has been put in a pocket while I’m sat down”, etc.
Here it is in action. It’s a bit fiddly to demo, and a little awkward to film putting something in your pocket without filming your lap, so bear with me!
The source code is all at github.com/dalelane/machine-learning-kafka-events.
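The overall shape of the pattern can be sketched in a few lines of Python. This is not the code from the repo: the classifier here is a made-up stand-in for a trained model, and plain Python iterables stand in for the Kafka consumer and producer, so the sketch focuses on the sliding-window-plus-inference idea.

```python
from collections import deque
from statistics import mean, stdev

# A stand-in for the trained model: classifies a window of raw sensor
# magnitudes. In the real demo this would be a model trained on recorded
# accelerometer data; the labels and thresholds here are invented.
def classify(window):
    if stdev(window) > 2.0:
        return "phone is in my hand"
    if mean(window) > 9.0:
        return "phone is on a table"
    return "phone is in a pocket"

def predictions(readings, window_size=5):
    """Slide a window over the raw sensor stream, emitting one prediction
    per full window. In the real pattern, readings would be consumed from
    one Kafka topic and each prediction produced to another topic."""
    window = deque(maxlen=window_size)
    for reading in readings:
        window.append(reading)
        if len(window) == window_size:
            yield classify(list(window))
```

The key design point is that the output is itself a stream: raw, high-frequency sensor events go in one side, and lower-frequency, higher-level events come out the other.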
Operators bring a lot of benefits as a way of managing complex software systems in a Kubernetes cluster. In this post, I want to illustrate one in particular: the way that custom resources (and declarative approaches to managing systems in general) enable easy integration with source control and a CI/CD pipeline.
I’ll be using IBM Event Streams as my example here, but the same principles will be true for many Kubernetes Operators, in particular, the open-source Strimzi Kafka Operator that Event Streams is based on.
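As an illustration of why this works so well with source control, here is a sketch of a Strimzi `Kafka` custom resource, the kind of file that can live in a git repository and be applied by a pipeline. The cluster name, Kafka version, and sizing values are placeholders, not a recommended configuration.

```yaml
# The desired state of an entire Kafka cluster as one declarative file.
# Committed to git, it can be code-reviewed, diffed, and applied by a
# CI/CD pipeline (e.g. with kubectl apply -f) - the Operator reconciles
# the running cluster to match it.
apiVersion: kafka.strimzi.io/v1beta2
kind: Kafka
metadata:
  name: my-cluster            # placeholder name
spec:
  kafka:
    replicas: 3
    listeners:
      - name: plain
        port: 9092
        type: internal
        tls: false
    storage:
      type: ephemeral         # not for production use
  zookeeper:
    replicas: 3
    storage:
      type: ephemeral
  entityOperator:
    topicOperator: {}
    userOperator: {}
```

Because the resource is declarative, the git history of this one file becomes an audit trail of every change made to the cluster.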
I’ve started adding pretrained machine learning models to Machine Learning for Kids. In this post, I wanted to describe what I’m doing.
In this post, I want to share a random thing I made in Scratch this week, and ask for suggestions of what I could do with it.
I get a lot of emails from teachers and coding groups asking for help with Scratch projects. They’re normally small, specific questions – asking for help figuring out a bug in a Scratch project, or how to get something working.
But this week I got a more challenging email. It asked for a way to show a map in Scratch, and use a Scratch script to plot points on the map, given coordinates in latitude and longitude.
I agreed to give it a try. (Details for how to access it below.)
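The core of a project like this is converting latitude and longitude into Scratch stage coordinates: the stage is 480×360, with x running from -240 to 240 and y from -180 to 180. Here is a sketch of that conversion in Python (not the Scratch script itself), assuming the backdrop is an equirectangular world map that exactly fills the stage; a Mercator backdrop would need a different y calculation.

```python
# Scratch's stage is 480x360: x runs -240..240, y runs -180..180.
STAGE_HALF_WIDTH = 240
STAGE_HALF_HEIGHT = 180

def latlon_to_stage(lat, lon):
    """Map latitude (-90..90) and longitude (-180..180) onto Scratch
    stage coordinates, assuming an equirectangular map backdrop."""
    x = (lon / 180.0) * STAGE_HALF_WIDTH
    y = (lat / 90.0) * STAGE_HALF_HEIGHT
    return x, y
```

In Scratch itself this is just two operator blocks feeding a "go to x: y:" block, which is what makes the project feasible for students.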
In this post, I want to explain how to get started creating machine learning applications using the data you have on Kafka topics.
I’ve written a sample app, with examples of how you can use Kafka topics as:
I’ll use this post to explain how it works, and how you can use it as the basis of writing your first ML pipeline using the data on your own Kafka topics.
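The roles a topic can play in an ML pipeline can be sketched with a toy, stdlib-only example. This is not the sample app’s code: plain Python lists stand in for records consumed from a topic, and the “model” is just a learned threshold, but the three roles (training data, held-out test data, and a live stream to predict on) are the same.

```python
# Records as they might be consumed from a Kafka topic:
# each one a (feature, label) pair. A list stands in for the consumer.
records = [(x, "big" if x > 50 else "small") for x in range(100)]

# Roles 1 and 2: split the topic's records into training and test data.
train = records[::2]
test = records[1::2]

# A trivial stand-in for model training: learn a decision threshold
# from the training records.
threshold = max(x for x, label in train if label == "small")

def predict(x):
    return "big" if x > threshold else "small"

# Evaluate the model against the held-out test records.
accuracy = sum(predict(x) == label for x, label in test) / len(test)

# Role 3: apply the trained model to a stream of new events, which in
# the real pipeline would be consumed from a Kafka topic as they arrive.
stream = [7, 99, 42]
stream_predictions = [predict(x) for x in stream]
```

Swapping the lists for a real Kafka consumer and the threshold for a real model gives the overall shape of the pipeline described in the post.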