In this post, I want to share a quick demo of using Event Processing to process social media posts.
Background
A fun surprise from Nintendo today: they’ve introduced a new product! “Alarmo” is a game-themed alarm clock, with some interesting gesture recognition features.
I was (unsurprisingly!) tempted…
But that got me wondering how the rest of the Internet was reacting.
In this post, I want to share a (super-simple!) demo for how to look at this – using IBM Event Processing to create an Apache Flink job that looks at the sentiment of social media posts about this unusual new product.
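The demo uses a Flink job for the sentiment analysis, but the core idea can be illustrated with a tiny, self-contained sketch. This is plain Python with made-up word lists, purely to show the concept, not the implementation the demo uses:

```python
# Naive keyword-based sentiment scoring for social media posts.
# The word lists here are illustrative placeholders only.
POSITIVE = {"love", "great", "fun", "awesome", "tempted"}
NEGATIVE = {"hate", "boring", "expensive", "disappointing"}

def sentiment(post: str) -> str:
    """Classify a post as positive, negative, or neutral by keyword counting."""
    words = {w.strip(".,!?").lower() for w in post.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("I love this fun little alarm clock!"))  # positive
print(sentiment("Looks expensive and a bit boring"))     # negative
```

A real event processing flow would apply something like this function to each event on a Kafka topic of social media posts, rather than to strings in memory.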
aka Approaches to managing Kafka topic creation with IBM Event Streams
How can you best operate central Kafka clusters that are shared by multiple development teams?
Administrators talk about wanting to let teams create Kafka topics when they need them, but worry that this will turn their Kafka clusters into a sprawling “Wild West”. At best, they describe a mess of anonymous topics, named and configured inconsistently. At worst, they describe topics being created or configured in ways that degrade the Kafka cluster and impact other users.
With that in mind, I wanted to share a few ideas for how to control the topics that are created in your Event Streams cluster:
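As a flavour of the kind of control an administrator might want, here is a minimal sketch of validating a requested topic against a policy before it is created. This is plain Python with a hypothetical naming convention and made-up limits, not an Event Streams feature:

```python
import re

# Hypothetical policy: topics must be named <team>.<app>.<dataset>,
# and stay within sensible partition and retention limits.
TOPIC_NAME = re.compile(r"^[a-z0-9-]+\.[a-z0-9-]+\.[a-z0-9-]+$")
MAX_PARTITIONS = 50
MAX_RETENTION_MS = 7 * 24 * 60 * 60 * 1000  # one week

def validate_topic(name: str, partitions: int, retention_ms: int) -> list[str]:
    """Return a list of policy violations (an empty list means the request is OK)."""
    problems = []
    if not TOPIC_NAME.match(name):
        problems.append(f"name {name!r} doesn't match <team>.<app>.<dataset>")
    if partitions > MAX_PARTITIONS:
        problems.append(f"{partitions} partitions exceeds limit of {MAX_PARTITIONS}")
    if retention_ms > MAX_RETENTION_MS:
        problems.append(f"retention of {retention_ms}ms exceeds one week")
    return problems

print(validate_topic("payments.checkout.orders", 12, 86_400_000))  # []
print(validate_topic("TEMP-TOPIC", 200, 10**12))  # three violations
```

A check like this could run in an admission step or CI pipeline before topic creation requests reach the cluster, keeping self-service convenient without losing consistency.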
For example, cheat codes. You’d press a specific sequence of buttons on the game controller at a specific time to unlock some “secret” bit of content – like special abilities, special resources, or levels.
Some of these are so ingrained in me now that my fingers just know how to enter them without thinking. The level select cheat for Sonic the Hedgehog is the best example of this: press UP, DOWN, LEFT, RIGHT, START + A during the title screen to access a level select mode that would let you jump immediately to any part of the game.
With this in the back of my head, it’s perhaps no surprise that when I needed to explain pattern recognition in Apache Flink, the metaphor I thought of first was how games of yesteryear could recognize certain button press sequences.
If you think of each button press on the game controller as an event, then recognizing a cheat code is just a pattern of events to recognize.
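In the demo the matching is done in Flink, but the core idea of recognizing a sequence of button-press events can be sketched in a few lines of plain Python (the button names and cheat sequence here are illustrative):

```python
# Recognize a "cheat code" as a contiguous pattern in a stream of button events.
CHEAT_CODE = ["UP", "DOWN", "LEFT", "RIGHT", "A"]

def find_cheat(events: list[str]) -> bool:
    """Return True if the cheat sequence appears contiguously in the event stream."""
    matched = 0
    for button in events:
        if button == CHEAT_CODE[matched]:
            matched += 1
            if matched == len(CHEAT_CODE):
                return True
        else:
            # Restart, allowing this event to begin a new match attempt.
            matched = 1 if button == CHEAT_CODE[0] else 0
    return False

print(find_cheat(["B", "UP", "DOWN", "LEFT", "RIGHT", "A"]))  # True
print(find_cheat(["UP", "DOWN", "LEFT", "A"]))                # False
```

In Flink itself you would express this kind of pattern declaratively, for example with the CEP library or SQL's MATCH_RECOGNIZE, rather than hand-rolling a state machine like this.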
And once I thought of the metaphor – I had to build it. 🙂
Version 1 (virtual controllers)
There is more detail on how I built this in the git repository, but this is the overall idea of what I’ve made.
In this post, we share examples of using quotas with IBM Event Endpoint Management, give you some pointers to help you try them for yourself, and most importantly get you thinking about where this might be useful for your own catalog.
Event Endpoint Management makes it easy for you to share your Kafka topics. Put some of your Kafka topics in the catalog, and allow colleagues and partners to discover the topics, so they can use the self-service catalog page to get started with them immediately.
Increasing reuse of your streams of events lets your business unlock even more value from them. The more widely you share, the more you enable innovative new uses – including some you might not even have thought of.
But before you invite colleagues and partners to start using your topics, you want to make sure that you’re ready. Event Endpoint Management offers a range of tools to make sure that you remain in control. Quotas are just one of these, and we dig into what they offer in this post.
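To give a feel for what a quota does, here is a minimal token-bucket sketch in plain Python. The numbers are made up, and this illustrates the general mechanism only, not Event Endpoint Management's implementation:

```python
import time

class ByteRateQuota:
    """Allow up to `rate` bytes per second, with bursts up to `burst` bytes."""

    def __init__(self, rate: float, burst: float):
        self.rate, self.burst = rate, burst
        self.tokens = burst
        self.last = time.monotonic()

    def try_send(self, nbytes: int) -> bool:
        # Refill tokens in proportion to elapsed time, capped at the burst size.
        now = time.monotonic()
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if nbytes <= self.tokens:
            self.tokens -= nbytes
            return True
        return False  # over quota: the caller should back off or be throttled

quota = ByteRateQuota(rate=1024, burst=4096)  # ~1 KiB/s, 4 KiB burst
print(quota.try_send(3000))  # True: within the initial burst
print(quota.try_send(3000))  # False: bucket nearly empty, must wait
```

In practice a gateway enforces this on behalf of the topic owner, so a single noisy consumer or producer can't monopolise the capacity behind a shared topic.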
You’ve discovered a topic in the IBM Event Endpoint Management catalog that someone in your company has shared. It looks useful, so you want to use that stream of events to maintain a local projection in your database.
Or maybe you’ve discovered a topic in the catalog that is available to produce to, and you want to contribute events to it from your MQ queue.
What are the options for using Kafka Connect to produce to, or consume from, topics that you discover in Event Endpoint Management?
In this post, we’ll share options that you can consider, and briefly outline the pros and cons of each.
In this post, I share a few examples for how to run Event Gateways for Event Endpoint Management.
When we talk about Event Endpoint Management, we often draw logical diagrams like this, with Kafka client applications able to produce and consume events to back-end Kafka clusters via an Event Gateway.
When it comes to planning a deployment, we need to decide on the best way to create that logical Event Gateway layer. This typically means running multiple gateways, but there are many different ways to do this, depending on your requirements for scaling and availability.
For this post, I want to show two approaches for running two Event Gateways, as a way of illustrating the kind of topologies that are possible.
In this demo, I give examples of different ways to enrich a stream of events – and the types of event processing that this can enable.
I presented this in a webinar with Matt Sunley (replay recording available) yesterday. Matt started with some context, explaining how enriching events with data from external sources can enable event processing solutions that aren’t otherwise possible.
And then I ran through a demo, creating an event processing flow that included four types of enrichment.
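As a flavour of the general idea, here is a minimal sketch of one common kind of enrichment: adding fields to each event by looking them up in reference data. This is plain Python with made-up data, standing in for what a Flink flow would do against a real table or topic:

```python
# Reference data, standing in for a database table or compacted Kafka topic.
PRODUCTS = {
    "P-001": {"name": "Alarmo", "category": "alarm clock"},
    "P-002": {"name": "Joy-Con", "category": "controller"},
}

def enrich(event: dict) -> dict:
    """Add product details to a raw order event via a reference-data lookup."""
    details = PRODUCTS.get(event["product_id"], {})
    return {**event, **details}

print(enrich({"order": 42, "product_id": "P-001"}))
# {'order': 42, 'product_id': 'P-001', 'name': 'Alarmo', 'category': 'alarm clock'}
```

The enriched events carry everything downstream processors need, so later steps in the flow don't each have to repeat the lookup.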