In this tutorial, we will go through the different Apache Flink libraries, such as Complex Event Processing (CEP) and Gelly, and see how they are implemented in Flink with use cases.
Apache Flink Complex Event Processing
FlinkCEP is the Complex Event Processing library of Apache Flink. It analyses patterns in continuous streams of data with high throughput and low latency. Today, stream data is growing day by day from sources such as sensors, smartphones, and GPS devices, and the challenge for organizations is to analyse this streaming data so that the right decisions can be taken in time. Apache Flink CEP helps in such use cases: it analyses patterns in real-time stream data and produces real-time output and alerts when a complex pattern is detected, so that corrective action can be taken on the stream data.
The following is an architectural diagram of Apache Flink using FlinkCEP.
In the above figure, sensor stream data arrives from different sources, the Apache Kafka distributed messaging system delivers the stream to Apache Flink, and FlinkCEP then analyses the patterns in the stream.
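As a rough sketch of this ingestion path, the snippet below reads sensor events from a Kafka topic into a Flink DataStream. The topic name "sensor-events", the broker address, and the consumer group id are assumptions for illustration, and the exact connector class name and import paths vary slightly between Flink and Kafka connector versions.
[#]
import java.util.Properties;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

Properties props = new Properties();
props.setProperty("bootstrap.servers", "localhost:9092"); // assumed broker address
props.setProperty("group.id", "flink-cep-demo");          // assumed consumer group id

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

// Read raw sensor events from the hypothetical "sensor-events" Kafka topic;
// FlinkCEP patterns can then be applied to this stream.
DataStream<String> sensorStream = env.addSource(
        new FlinkKafkaConsumer<>("sensor-events", new SimpleStringSchema(), props));
[/#]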
Pattern API in Apache Flink
The Apache Flink Pattern API is used to write FlinkCEP programs. A pattern is a sequence of states, and processing moves from the initial state to the next state according to the conditions defined by the user. Each pattern state is given a unique name, which is later used to identify the events matched by that state. A complete example combining the operations below is shown at the end of this section.
The following are some of the commonly used FlinkCEP patterns.
1. Begin
This operation is used to define the starting state of a pattern. It is written as shown below.
[#]Pattern<Event, ?> start = Pattern.<Event>begin("start");[/#]
2. Next
This operation is used to append a new pattern state; a matching event must directly succeed the previous matching event (strict contiguity). It is defined as shown below.
[#]Pattern<Event, ?> next = start.next("next");[/#]
3. Where
This operation defines the filter condition that an event must satisfy in order to match the current state. It is written as below.
[#]patternState.where(new FilterFunction<Event>() {
    @Override
    public boolean filter(Event value) throws Exception {
        return true; // replace with the actual filter condition for the event
    }
});[/#]
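For instance, assuming a user-defined Event class with a hypothetical getTemperature() accessor, a filter that only admits overheated sensor readings could be sketched as follows.
[#]patternState.where(new FilterFunction<Event>() {
    @Override
    public boolean filter(Event value) throws Exception {
        // getTemperature() is a hypothetical accessor used for illustration.
        return value.getTemperature() > 100.0;
    }
});[/#]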
4. FollowedBy
This operation is used to append a new pattern state with relaxed contiguity: other events may occur between the matching event and the previous matching event. It is written as below.
[#]Pattern<Event, ?> followedBy = start.followedBy("next");[/#]
5. Within
This operation sets the maximum time window within which an event sequence has to match the pattern; if the sequence is not completed within that time, it is discarded. It is written as below.
[#]patternState.within(Time.seconds(10));[/#]
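Putting these operations together, the following is a minimal end-to-end sketch, assuming a user-defined Event class with a hypothetical getTemperature() accessor and an existing DataStream<Event> called inputStream (for example, built from the Kafka source shown earlier). The CEP, PatternStream, and PatternSelectFunction classes come from the flink-cep library; note that newer Flink versions replace FilterFunction with SimpleCondition and pass Map<String, List<Event>> to select().
[#]
import java.util.Map;
import org.apache.flink.api.common.functions.FilterFunction;
import org.apache.flink.cep.CEP;
import org.apache.flink.cep.PatternSelectFunction;
import org.apache.flink.cep.PatternStream;
import org.apache.flink.cep.pattern.Pattern;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.windowing.time.Time;

// Pattern: a warning-level reading, followed (possibly with other events in
// between) by a critical-level reading, both within 10 seconds.
Pattern<Event, ?> alertPattern = Pattern.<Event>begin("warning")
    .where(new FilterFunction<Event>() {
        @Override
        public boolean filter(Event value) throws Exception {
            return value.getTemperature() > 100.0; // hypothetical accessor
        }
    })
    .followedBy("critical")
    .where(new FilterFunction<Event>() {
        @Override
        public boolean filter(Event value) throws Exception {
            return value.getTemperature() > 150.0; // hypothetical accessor
        }
    })
    .within(Time.seconds(10));

// Apply the pattern to the input stream; inputStream is an assumed DataStream<Event>.
PatternStream<Event> patternStream = CEP.pattern(inputStream, alertPattern);

// Matched events are retrieved by the unique state names given above.
DataStream<String> alerts = patternStream.select(new PatternSelectFunction<Event, String>() {
    @Override
    public String select(Map<String, Event> match) throws Exception {
        return "Overheating detected: " + match.get("warning") + " -> " + match.get("critical");
    }
});
[/#]
The alerts stream can then be written to any sink, for example back to Kafka or to a dashboard, so that corrective action can be taken as soon as the pattern is detected.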