In this post, I want to share two examples of how you can use pattern recognition in Apache Flink to turn a noisy stream of events into something useful and actionable.
Imagine that you have a Kafka topic with a stream of events that you sometimes need to respond to.
You can think of this conceptually as a stream of values that could be plotted against time.
To make this less abstract, we can use the events from the sensor readings topic that the “Loosehanger” data generator produces to: a stream of temperature and humidity readings.
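To give a rough idea of the shape of those events, here is a minimal sketch of a Flink SQL table over that topic. The topic name, field names, types, and connector options are assumptions for illustration, not the exact Loosehanger schema, so adjust them to match your own setup.

```sql
-- Hypothetical Flink SQL table over the sensor readings topic.
-- Topic name, field names, and connector options are assumptions.
CREATE TABLE sensor_readings (
    sensortime   TIMESTAMP(3),
    sensorid     STRING,
    temperature  DOUBLE,
    humidity     DOUBLE,
    -- treat the reading time as event time, allowing a little lateness
    WATERMARK FOR sensortime AS sensortime - INTERVAL '10' SECOND
) WITH (
    'connector' = 'kafka',
    'topic' = 'SENSOR.READINGS',
    'properties.bootstrap.servers' = 'localhost:9092',
    'format' = 'json',
    'scan.startup.mode' = 'latest-offset'
);
```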
Imagine that these events represent something that you might need to respond to when a sensor's reading exceeds some given threshold.
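Before getting to pattern recognition, note that the naive way to do this is a plain filter that emits every reading over the threshold. As a rough sketch, using the assumed field names from the table above and an arbitrary example threshold:

```sql
-- Naive approach: emit every individual reading over the threshold.
-- This quickly becomes noisy if a sensor hovers around the limit.
SELECT sensortime, sensorid, temperature
FROM sensor_readings
WHERE temperature > 25.0;
```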
You can think of the stream of readings, and the threshold, visually like this: