Generating an application survival report from urlooker push data (it feels clumsy anyway)

2025-01-29 Update From: SLTechnology News&Howtos


I. Requirements

Recently, the company boss asked us to output the daily survival status of our applications as a report and e-mail it to them every day. The report should contain: the application name, the IP of the server the application runs on, the time the application went down, and how long it stayed down.

II. Introduction to the environment

Our company's monitoring system is Xiaomi's open-falcon, and app liveness is monitored with urlooker, the probe recommended by Xiaomi, which pushes the application status to open-falcon every minute. On top of that, we use a self-developed ops system that calls open-falcon's API directly. In ops you can see each application's liveness, and the ops.app_detail table records the application id (unique ID) and the application name.

My first idea was therefore to query the tables where ops records application status. It turned out that ops has two such tables: one stores aggregated information about how long each application was down, and the other stores 1-day and 30-day application survival rates. Neither is what I need, so after thinking it over for a while, I decided to pull the data directly from urlooker.

About the urlooker database:

There are two tables related to app liveness: urlooker.strategy and urlooker.item_status00. The first has two useful fields: urlooker.strategy.ops_cp_app_id (corresponding to ops.app_detail.id) and urlooker.strategy.environment (distinguishing the testing, pre-release, and release environments). The second, urlooker.item_status00, stores the status pushed by urlooker every minute and retains 12 hours of data by default.

III. Querying the urlooker database to obtain the data source

The data source is obtained with a cross-database join. The query below returns every row whose result is not 0 (i.e., the application was down at that minute), sorted within each group.

SELECT ops.app_detail.app_name,
       urlooker.item_status00.sid AS sid,
       urlooker.strategy.ops_cp_app_id,
       urlooker.item_status00.ip,
       urlooker.strategy.note,
       urlooker.item_status00.result,
       ROUND(urlooker.item_status00.push_time) AS down_time,
       FROM_UNIXTIME(urlooker.item_status00.push_time) AS new_time
FROM urlooker.item_status00
LEFT JOIN urlooker.strategy
       ON urlooker.item_status00.sid = urlooker.strategy.id
LEFT JOIN ops.app_detail
       ON ops.app_detail.id = urlooker.strategy.ops_cp_app_id
WHERE urlooker.strategy.environment = "release"
  AND urlooker.item_status00.result != "0"
ORDER BY sid DESC, down_time DESC;

Originally I wanted to do everything with a scheduled SQL task, but I had no idea how to do the calculation and processing in SQL, so I export the data source and process it with a shell script instead.

IV. Processing the data source

Write a shell script to process the data source:

# cat send_app_alived.sh
#!/bin/bash
DataInputPath=/home/app-alive/source.txt
DataOutputPath=/home/app-alive/$(date +%F)-output.csv

# export the data source from MySQL
mysql -uops_ro -pxxxx -e "source /home/app-alive/select.sql" > $DataInputPath 2>/dev/null

# distinct application ids (column 2), skipping the header row
app_id=$(awk '{print $2}' $DataInputPath | grep -v sid | uniq)

if [ ! -f "$DataOutputPath" ]; then
    echo "application name,server IP,application stop time,stop duration" > $DataOutputPath
fi

for id in $app_id
do
    down_time=$(awk -v id="$id" '$2==id {print $7}' $DataInputPath)
    app_name=$(awk -v id="$id" '$2==id {print $1}' $DataInputPath | uniq)
    app_ip=$(awk -v id="$id" '$2==id {print $4}' $DataInputPath | uniq)
    down_time_num=$(echo "$down_time" | wc -l)
    if [ "$down_time_num" -eq 1 ]; then
        G_time=$(date -d "1970-01-01 UTC $down_time seconds" +"%Y-%m-%d %H:%M")
        echo "$app_name,$app_ip,$G_time,1 minute" >> $DataOutputPath
    else
        # timestamps are sorted DESC; consecutive down minutes differ by 60s
        prev=0
        count=0
        for i in $down_time
        do
            if [ $count -ne 0 ] && [ $((prev - i)) -ne 60 ]; then
                # gap found: flush the previous outage run
                G_time=$(date -d "1970-01-01 UTC $down_sum seconds" +"%Y-%m-%d %H:%M")
                echo "$app_name,$app_ip,$G_time,${count} minutes" >> $DataOutputPath
                count=0
            fi
            if [ $count -eq 0 ]; then
                down_sum=$i   # first (most recent) minute of the run
            fi
            count=$((count + 1))
            prev=$i
        done
        # flush the last run
        G_time=$(date -d "1970-01-01 UTC $down_sum seconds" +"%Y-%m-%d %H:%M")
        echo "$app_name,$app_ip,$G_time,${count} minutes" >> $DataOutputPath
    fi
done
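The grouping logic in the shell loop can be hard to follow, so here is a rough Python sketch of the same idea (my own illustration, not the author's code): given the down timestamps for one application, sorted descending at one-minute granularity, it splits them into outage runs and reports each run together with its length in minutes.

```python
def group_outages(down_times):
    """Group DESC-sorted epoch timestamps (one push per down minute)
    into outage runs.

    Two timestamps exactly 60 seconds apart belong to the same outage.
    Returns a list of (run_start_epoch, duration_minutes) pairs, where
    run_start_epoch is the earliest minute of the run.
    """
    runs = []
    run = []
    for t in down_times:
        # any gap other than exactly one minute closes the current run
        if run and run[-1] - t != 60:
            runs.append((run[-1], len(run)))
            run = []
        run.append(t)
    if run:
        runs.append((run[-1], len(run)))
    return runs

# e.g. three consecutive down minutes, then a separate single minute:
# group_outages([360, 300, 240, 60]) -> [(240, 3), (60, 1)]
```

Doing this in Python (or any real scripting language) also sidesteps the word-splitting and quoting pitfalls that make the awk-in-shell version fragile.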

After the script runs, the output looks like this:

# cat 2019-10-24-output.csv
application name,server IP,application stop time,stop duration
app01,10.25.100.36,2019-10-24 02:21,1 minute
app01,10.81.126.19,2019-10-24 02:17,1 minute
app02,10.81.126.19,2019-10-24 10:43,...
...

However, each run only covers the last 12 hours of data (urlooker's retention), so add a scheduled task that runs the script twice a day:

59 11,23 * * * /usr/bin/bash /home/app-alive/send_app_alived.sh

V. Sending mail with a Python script

I found a Python script online and modified it:

# cat sendmail.py
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import smtplib
import email.utils
from email import encoders
from email.mime.text import MIMEText
from email.mime.multipart import MIMEMultipart
from email.mime.base import MIMEBase
import os
import time


class send_mail:
    def __init__(self, From, To, pw, file_path, file_header, file_body):
        # sender
        self.From = From
        # recipients, e.g. ['aaa@a.com', 'bbb@a.com']
        self.To = list(To)
        # login password (for QQ mail, an app authorization code)
        self.pw = pw
        # full path of the attachment (path + file name)
        self.file_path = file_path
        # mail subject
        self.file_header = file_header
        # mail body
        self.file_body = file_body

    def login(self):
        server = smtplib.SMTP('smtp.qq.com')
        server.starttls()
        server.login(self.From, self.pw)
        try:
            server.sendmail(self.From, self.To, self.atta())
        finally:
            server.quit()

    def atta(self):
        main_msg = MIMEMultipart()
        # body text
        text_msg = MIMEText(self.file_body)
        main_msg.attach(text_msg)
        try:
            contype = 'application/octet-stream'
            maintype, subtype = contype.split('/', 1)
            data = open(self.file_path, 'rb')
            file_msg = MIMEBase(maintype, subtype)
            file_msg.set_payload(data.read())
            data.close()
            encoders.encode_base64(file_msg)
            basename = os.path.basename(self.file_path)
            file_msg.add_header('Content-Disposition', 'attachment',
                                filename=basename)
            main_msg.attach(file_msg)
        except Exception as ret:
            print(ret)
        main_msg['From'] = self.From
        main_msg['To'] = ", ".join(self.To)
        main_msg['Subject'] = self.file_header
        main_msg['Date'] = email.utils.formatdate()
        return main_msg.as_string()


if __name__ == '__main__':
    fileTime = time.strftime("%Y-%m-%d", time.localtime())
    # sender, recipient and auth code below are placeholders
    s = send_mail('sender@qq.com', ['boss@qq.com'], 'xxxx',
                  "/home/app-alive/" + fileTime + "-output.csv",
                  'Application Survival Record', '')
    s.login()
    print('sent successfully!')
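For what it's worth, on Python 3.6+ the standard library's email.message.EmailMessage builds the same message with much less MIME boilerplate. A minimal sketch of that alternative (my own, not the original script; addresses, subject, and paths are placeholders):

```python
import smtplib
from email.message import EmailMessage


def build_report_mail(sender, recipients, subject, body, csv_path):
    """Build a mail with the CSV report attached."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = ", ".join(recipients)
    msg["Subject"] = subject
    msg.set_content(body)
    with open(csv_path, "rb") as f:
        # attach the report as a generic binary file
        msg.add_attachment(f.read(),
                           maintype="application", subtype="octet-stream",
                           filename=csv_path.rsplit("/", 1)[-1])
    return msg


# sending (placeholder credentials, same SMTP flow as the original script):
# with smtplib.SMTP("smtp.qq.com") as server:
#     server.starttls()
#     server.login("sender@qq.com", "auth-code")
#     server.send_message(build_report_mail(
#         "sender@qq.com", ["boss@qq.com"], "Application Survival Record",
#         "Report attached.", "/home/app-alive/2019-10-24-output.csv"))
```

EmailMessage handles base64 encoding and header assembly itself, so the hand-rolled MIMEBase plumbing disappears.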

All right, it's done. It works, but it feels so clumsy. open-falcon really isn't easy to use; the company should hurry up and hire an ops developer, this is hard on me ~
