MySQL slow query and error log analysis


Analyzing MySQL slow query and error logs and reviewing their alerts is cumbersome: the existing slow query alert only reports the number of slow queries.

So we built a program that analyzes the slow query log and raises alerts on it.

On the back end, the Filebeat log shipper transfers the logs into a Redis database (by default Filebeat ships to Elasticsearch), and a timer runs the collection job every minute.

vi /etc/filebeat/filebeat.yml

filebeat.prospectors:
  - paths:
      - /data/mysql/xxx/tmp/slow.log
    document_type: syslog
    fields:
      app: mysql_slowlog
      port: xxx
      ip: xxxx
    scan_frequency: 30s
    tail_files: true
    multiline.pattern: '^\# Time'
    multiline.negate: true
    multiline.match: after

output.redis:
  enabled: true
  hosts: ["IP:port"]
  port: 2402
  key: filebeat
  keys:
    - key: "%{[fields.app]}"
      mapping:
        "mysql_slowlog": "mysql_slowlog"
        "mysql_errorlog": "mysql_errorlog"
  db: 0
  datatype: list

logging.to_files: true
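To verify that Filebeat is actually pushing entries into Redis, the list can be inspected from the monitoring host. A quick sanity check, assuming redis-cli is available and using the port, db and key from the config above:

# Number of queued slow log entries, and a peek at the first one
redis-cli -h IP -p 2402 -n 0 LLEN mysql_slowlog
redis-cli -h IP -p 2402 -n 0 LRANGE mysql_slowlog 0 0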

On the monitoring side, the data is read from Redis and periodically written into a MySQL database.

vi /data/mysql_slowLog.py

#!/usr/bin/env python
# -*- coding: utf-8 -*-
import redis
import json
import pymysql
import re
import time
import threading

# redis connect info
redisHost = 'xxx'
redisPort = 2402
redisDB = '0'
redisKey = 'mysql_slowlog'

# mysql connect info
mysqlHost = 'xxx'
mysqlPort = 2001
# mysqlPort = 23306
mysqlUser = ''
mysqlPasswd = ''
# mysqlPasswd = 'open'
mysqlDB = ''
mysqlTablePrefix = 'mysql_slowlog_'
collectStep = 60


def time_log():
    return '[' + time.strftime('%Y-%m-%d %H:%M:%S', time.localtime(time.time())) + ']'


def gather_log(redisConn):
    data_list = []
    logList = []
    keyState = redisConn.exists(redisKey)
    if keyState:
        logLen = redisConn.llen(redisKey)
        if logLen > 0:
            # Rename the list first so entries shipped while we read are not lost
            redisKeyNew = redisKey + '-bak'
            redisConn.renamenx(redisKey, redisKeyNew)
            logList = redisConn.lrange(redisKeyNew, 0, logLen)
            redisConn.delete(redisKeyNew)
        else:
            pass
    else:
        pass
    if len(logList) > 0:
        for item in logList:
            data_dict = {}
            slowLogJson = json.loads(item)
            # print(slowLogJson['message'])
            data_dict['hostname'] = slowLogJson['beat']['hostname']
            # print(slowLogJson['beat']['hostname'])
            data_dict['ip'] = slowLogJson['fields']['ip']
            # print(slowLogJson['fields']['ip'])
            data_dict['port'] = slowLogJson['fields']['port']
            # print(slowLogJson['fields']['port'])
            logContent = slowLogJson['message']
            # Regex
            timeRe = r'# Time: (.*)\n# User@Host:'
            userRe = r'# User@Host:.*\[(.*?)\]\s+@'
            hostRe = r'# User@Host:.*\[(.*?)\]\s+Id:'
            schemaRe = r'# Schema:\s+(.*?)\s+Last_errno:'
            queryRe = r'# Query_time:\s+(.*?)\s+Lock_time:'
            locklRe = r'# Query_time:.*?Lock_time:\s+(.*?)\s+Rows_sent:'
            rowsRe = r'# Query_time:.*?Lock_time:.*?Rows_sent:\s+(\d+)\s+Rows_examined:'
            bytesRe = r'# Bytes_sent:\s+(\d+)'
            timestampRe = r'SET\s+timestamp=(.*?);'
            commandRe = r'SET\s+timestamp=.*?;\n(.*?)(?=$)'
            if re.findall(timeRe, logContent):
                # Slow log time is like "170312 10:23:45": prepend century and add dashes
                data_dict['sys_time'] = '20' + re.findall(timeRe, logContent)[0]
                data_dict['sys_time'] = data_dict['sys_time'][:4] + '-' + data_dict['sys_time'][4:6] + '-' + data_dict['sys_time'][6:]
                data_dict['cli_user'] = re.findall(userRe, logContent)[0]
                data_dict['cli_ip'] = re.findall(hostRe, logContent)[0]
                data_dict['schema'] = re.findall(schemaRe, logContent)[0]
                data_dict['query_time'] = re.findall(queryRe, logContent)[0]
                data_dict['lock_time'] = re.findall(locklRe, logContent)[0]
                data_dict['rows_sent'] = re.findall(rowsRe, logContent)[0]
                data_dict['bytes_sent'] = re.findall(bytesRe, logContent)[0]
                data_dict['timestamp'] = re.findall(timestampRe, logContent)[0]
                data_dict['command'] = re.findall(commandRe, logContent, re.M)[0]
                data_list.append(data_dict)
            else:
                pass
                # print('Not slowlog data')
    else:
        pass
        # print('No data')
    return data_list


def send_data(data, mysql_pool):
    # One table per month, e.g. mysql_slowlog_201703
    mysqlTableDate = time.strftime('%Y%m', time.localtime(time.time()))
    mysqlTable = mysqlTablePrefix + mysqlTableDate
    cursor = mysql_pool.cursor()
    data_list = []
    createTableSql = "create table mysql_slowlog_000000 (`id` int(11) NOT NULL AUTO_INCREMENT," \
                     "hostname varchar(64) NOT NULL," \
                     "ip varchar(20) NOT NULL," \
                     "port int(11) NOT NULL," \
                     "sys_time datetime NOT NULL," \
                     "cli_user varchar(32) NOT NULL," \
                     "cli_ip varchar(32) NOT NULL," \
                     "`schema` varchar(32) NOT NULL," \
                     "query_time float(6,3) NOT NULL," \
                     "lock_time float(6,3) NOT NULL," \
                     "rows_sent int(11) NOT NULL," \
                     "bytes_sent int(11) NOT NULL," \
                     "`timestamp` varchar(40) NOT NULL," \
                     "command varchar(2048) DEFAULT NULL," \
                     "PRIMARY KEY (`id`)," \
                     "KEY `cli_user` (`cli_user`)," \
                     "KEY `query_time` (`query_time`)," \
                     "KEY `timestamp` (`timestamp`)) ENGINE=InnoDB AUTO_INCREMENT=0 DEFAULT CHARSET=utf8"
    createTableSql = createTableSql.replace('000000', mysqlTableDate)
    # Create slow log table if not exist
    try:
        cursor.execute("show tables like '%s'" % mysqlTable)
        res = cursor.fetchone()
        if not res:
            cursor.execute(createTableSql)
            mysql_pool.commit()
    except Exception as e:
        print(time_log() + ' Error:', e)
        mysql_pool.rollback()
        mysql_pool.close()
    slowLogInsertSql = "insert into %s" % mysqlTable + " (hostname," \
                       "ip," \
                       "port," \
                       "sys_time," \
                       "cli_user," \
                       "cli_ip," \
                       "`schema`," \
                       "query_time," \
                       "lock_time," \
                       "rows_sent," \
                       "bytes_sent," \
                       "`timestamp`," \
                       "command) values (%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s)"
    if len(data) > 0:
        for item in data:
            row = (item['hostname'].encode('utf-8'),
                   item['ip'].encode('utf-8'),
                   item['port'],
                   item['sys_time'].encode('utf-8'),
                   item['cli_user'].encode('utf-8'),
                   item['cli_ip'].encode('utf-8'),
                   item['schema'].encode('utf-8'),
                   item['query_time'].encode('utf-8'),
                   item['lock_time'].encode('utf-8'),
                   item['rows_sent'].encode('utf-8'),
                   item['bytes_sent'].encode('utf-8'),
                   item['timestamp'].encode('utf-8'),
                   pymysql.escape_string(item['command']).encode('utf-8'))
            data_list.append(row)
        print(len(data_list))
        # Insert slow log data
        try:
            cursor.executemany(slowLogInsertSql, data_list)
            mysql_pool.commit()
            mysql_pool.close()
        except Exception as e:
            print(time_log() + ' Error:', e)
            mysql_pool.rollback()
            mysql_pool.close()
    else:
        print(time_log() + ' No data')


def main():
    try:
        redis_pool = redis.ConnectionPool(host=redisHost, port=redisPort, db=redisDB)
        redisConn = redis.Redis(connection_pool=redis_pool)
    except:
        print(time_log() + ' Error! Can not connect to redis')
    try:
        mysql_pool = pymysql.connect(host=mysqlHost, port=mysqlPort, user=mysqlUser, password=mysqlPasswd, db=mysqlDB)
    except:
        print(time_log() + ' Error! Can not connect to mysql')
    print(time_log())
    data = gather_log(redisConn)
    send_data(data, mysql_pool)
    print(time_log())
    # time scheduler: run again after collectStep seconds
    timeSchedule = collectStep
    global timer
    timer = threading.Timer(timeSchedule, main)
    timer.start()


if __name__ == '__main__':
    timer = threading.Timer(1, main)
    timer.start()
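Because the script reschedules itself with threading.Timer, it only needs to be launched once. One way to keep it running in the background (an assumption on our part, not spelled out in the original) is:

nohup python /data/mysql_slowLog.py >> /data/mysql_slowLog.out 2>&1 &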

The front end uses Django to display the slow query data, and each business line's slow query data is sent to the responsible developers every week.
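As a rough illustration of the display side (a minimal sketch, not the original project's code), a Django view could read the current month's table directly; the view name, template and URL wiring below are assumptions:

# views.py -- illustrative only; the table layout matches the create-table statement above
import time
from django.db import connection
from django.shortcuts import render

def slowlog_list(request):
    table = 'mysql_slowlog_' + time.strftime('%Y%m')  # current month's table
    with connection.cursor() as cursor:
        cursor.execute(
            "SELECT sys_time, cli_user, cli_ip, `schema`, query_time, command "
            "FROM " + table + " ORDER BY query_time DESC LIMIT 100")
        rows = cursor.fetchall()
    # hand the slowest 100 queries of the month to a template for display
    return render(request, 'slowlog_list.html', {'rows': rows})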

The MySQL error log is handled in the same way.
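On the Filebeat side, that only requires another prospector whose fields.app routes entries to the mysql_errorlog key already declared in the output mapping above; the error log path here is a placeholder assumption:

filebeat.prospectors:
  - paths:
      - /data/mysql/xxx/log/error.log   # assumed location; adjust to the real error log path
    document_type: syslog
    fields:
      app: mysql_errorlog               # routed to the mysql_errorlog Redis list via the keys mapping
      port: xxx
      ip: xxxx
    scan_frequency: 30s
    tail_files: true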
