2025-01-19 Update From: SLTechnology News&Howtos (Shulou.com)
This article explains how to use Flink CEP to monitor a website and trigger both alarms and alarm-recovery notifications. The content is quite detailed; interested readers can refer to it, and I hope it helps.
Flink CEP Overview
Flink CEP (Complex Event Processing) is a complex-event-processing library built on top of Flink. It lets us detect, in continuous streaming data, the records that match patterns we define ourselves, and then process those records further. By combining patterns, we can define very complex matching conditions for our data.
There are many articles on CEP principles and usage on the Internet. You can refer to https://juejin.im/post/5de1f32af265da05cc3190f9#heading-9
To put it simply, we can think of Flink CEP as the regular expressions we use every day: the Pattern in CEP is the regular expression we define, and the Flink DataStream is the string to be matched. Flink matches the DataStream against the custom Pattern and produces a filtered DataStream.
Based on custom patterns, we can do a lot of work, such as monitoring alarms, risk control, anti-crawling, and so on. Next, we will walk through some practical applications of Flink CEP with a simple alarm example.
Case Details
Let's build a simple alarm based on Flink CEP. First, let's simplify the alarm requirements:
1. Every second, compute the proportion of HTTP status codes other than 200; when it is greater than 0.7, the window counts toward an alarm.
2. When the statistic stays above the threshold (0.7 here; this number was chosen by me for testing purposes, and in a real environment it should be set according to actual experience) for several consecutive windows, send an alarm notification.
3. When the statistic falls to the threshold or below, send an alarm-recovery notification.
In practice, we generally consume Kafka data as the source. Here, for simplicity, we generate some mock data with a custom source.
public static class MySource implements SourceFunction<Tuple4<String, Long, Integer, Integer>> {
    static int[] status = {200, 404, 500, 501, 301};
    private volatile boolean running = true;

    @Override
    public void run(SourceContext<Tuple4<String, Long, Integer, Integer>> sourceContext) throws Exception {
        while (running) {
            Thread.sleep((int) (Math.random() * 100));
            // traceid, timestamp, status, response time
            Tuple4<String, Long, Integer, Integer> log = Tuple4.of(
                    UUID.randomUUID().toString(),
                    System.currentTimeMillis(),
                    // sample across the whole status array, not just the first four entries
                    status[(int) (Math.random() * status.length)],
                    (int) (Math.random() * 100));
            sourceContext.collect(log);
        }
    }

    @Override
    public void cancel() {
        // stop the emit loop when the job is cancelled
        running = false;
    }
}
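The SQL below queries a table named `log` with a processing-time attribute `proctime`, but the article does not show how that table is registered. The following is a minimal sketch of the wiring, assuming a Flink 1.9/1.10-style Table API (consistent with the `tenv.toAppendStream(...)` call used later); the class name `JobSetup` and the field names are my own assumptions.

```java
import org.apache.flink.api.java.tuple.Tuple4;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.java.StreamTableEnvironment;

public class JobSetup {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tenv = StreamTableEnvironment.create(env);

        // MySource is the mock source defined above
        DataStream<Tuple4<String, Long, Integer, Integer>> stream =
                env.addSource(new MySource());

        // Register the stream as table "log"; the trailing "proctime.proctime"
        // declares the processing-time attribute used by TUMBLE(...) in the SQL.
        tenv.registerDataStream("log", stream,
                "traceid, ts, status, responseTime, proctime.proctime");

        env.execute("cep-alert-demo");
    }
}
```

This registers the four tuple fields under column names plus a derived processing-time column, which is what the tumbling-window aggregation groups on.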
Next we define a SQL query that computes the first of our requirements.
String sql = "select pv,errorcount,round(CAST(errorcount AS DOUBLE)/pv,2) as errorRate," +
"(starttime + interval '8' hour ) as stime," +
"(endtime + interval '8' hour ) as etime " +
"from (select count(*) as pv," +
"sum(case when status = 200 then 0 else 1 end) as errorcount, " +
"TUMBLE_START(proctime,INTERVAL '1' SECOND) as starttime," +
"TUMBLE_END(proctime,INTERVAL '1' SECOND) as endtime " +
"from log group by TUMBLE(proctime,INTERVAL '1' SECOND) )";
By executing the SQL, we get a DataStream of Result objects:
Table table = tenv.sqlQuery(sql);
DataStream<Result> ds1 = tenv.toAppendStream(table, Result.class);
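The article never shows the `Result` class, so here is a hedged sketch of the POJO implied by the SQL aliases (`pv`, `errorcount`, `errorRate`, `stime`, `etime`). The field types are assumptions: `count(*)`/`sum(...)` map to `Long`, `round(...)` to `Double`, and the window times to `java.sql.Timestamp`. `toAppendStream(table, Result.class)` requires a public no-arg constructor plus getters and setters.

```java
import java.sql.Timestamp;

// Assumed POJO matching the SQL output columns; not shown in the original article.
public class Result {
    private Long pv;            // requests in the 1-second window
    private Long errorcount;    // non-200 responses in the window
    private Double errorRate;   // errorcount / pv, rounded to 2 decimals
    private Timestamp stime;    // window start (shifted to UTC+8 in the SQL)
    private Timestamp etime;    // window end   (shifted to UTC+8 in the SQL)

    public Result() {}

    public Long getPv() { return pv; }
    public void setPv(Long pv) { this.pv = pv; }
    public Long getErrorcount() { return errorcount; }
    public void setErrorcount(Long errorcount) { this.errorcount = errorcount; }
    public Double getErrorRate() { return errorRate; }
    public void setErrorRate(Double errorRate) { this.errorRate = errorRate; }
    public Timestamp getStime() { return stime; }
    public void setStime(Timestamp stime) { this.stime = stime; }
    public Timestamp getEtime() { return etime; }
    public void setEtime(Timestamp etime) { this.etime = etime; }
}
</imports>
```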
Then we get to the heart of the matter: we need a Pattern.
Pattern<Result, Result> pattern = Pattern.<Result>begin("alert")
        .where(new IterativeCondition<Result>() {
            @Override
            public boolean filter(Result i, Context<Result> context) throws Exception {
                // an over-threshold window counts toward an alarm
                return i.getErrorRate() > 0.7D;
            }
        })
        .times(3).consecutive()      // three consecutive over-threshold windows
        .followedBy("recovery")
        .where(new IterativeCondition<Result>() {
            @Override
            public boolean filter(Result i, Context<Result> context) throws Exception {
                // back at or below the threshold: alarm recovery (requirement 3)
                return i.getErrorRate() <= 0.7D;
            }
        });
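To complete the picture, the pattern still has to be applied to the stream and turned into notifications. The following is a minimal sketch, with assumed class and message names; note that with this single pattern a full match only completes when the "recovery" event arrives, so a production job would typically fire the alarm itself via timeout handling or a separate pattern.

```java
import java.util.List;
import java.util.Map;
import org.apache.flink.cep.CEP;
import org.apache.flink.cep.PatternSelectFunction;
import org.apache.flink.cep.PatternStream;
import org.apache.flink.cep.pattern.Pattern;
import org.apache.flink.streaming.api.datastream.DataStream;

public class AlertJob {
    // Turn matches of the alert/recovery pattern into notification strings.
    public static DataStream<String> toNotifications(
            DataStream<Result> ds1, Pattern<Result, Result> pattern) {
        PatternStream<Result> ps = CEP.pattern(ds1, pattern);
        return ps.select(new PatternSelectFunction<Result, String>() {
            @Override
            public String select(Map<String, List<Result>> match) {
                // "alert" holds the three consecutive over-threshold windows,
                // "recovery" the first window back at or below the threshold.
                Result first = match.get("alert").get(0);
                Result recovered = match.get("recovery").get(0);
                return "alarm started at " + first.getStime()
                        + ", recovered at " + recovered.getEtime()
                        + ", errorRate=" + recovered.getErrorRate();
            }
        });
    }
}
```

In practice the resulting stream would be sinked to a notification channel (mail, SMS, webhook) rather than printed.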