2025-01-17 Update From: SLTechnology News&Howtos
Practice and Application of Log Analysis
Under normal load, the logging system usually runs at the INFO or ERROR level to preserve throughput. When a problem is found, we need to dynamically lower the log level to DEBUG so we can inspect the implementation details. The dynamic change described below is implemented with code, a scheduled task, and a configuration center service.
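As a minimal sketch of this idea, the snippet below polls a configuration source on a schedule and applies the level to a logger. It uses only the JDK's java.util.logging so it runs without extra dependencies; the real setup in this article uses Logback plus a configuration center, and the module name `com.example.module` and the in-memory map standing in for the config service are assumptions for illustration.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.logging.Level;
import java.util.logging.Logger;

public class LevelRefreshDemo {
    // Stand-in for the configuration center described in the article.
    static final Map<String, String> config = new ConcurrentHashMap<>();

    public static void main(String[] args) throws Exception {
        String moduleKey = "com.example.module";   // hypothetical module name
        Logger logger = Logger.getLogger(moduleKey);
        logger.setLevel(Level.INFO);               // normal operation: INFO

        // An operator "changes the config": request debug-level logging.
        config.put("logger.level." + moduleKey, "FINE");

        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        // The article schedules the refresh every 2 minutes; here every 100 ms
        // so the demo finishes quickly.
        scheduler.scheduleAtFixedRate(() -> {
            String level = config.get("logger.level." + moduleKey);
            if (level != null) {
                logger.setLevel(Level.parse(level));
            }
        }, 0, 100, TimeUnit.MILLISECONDS);

        Thread.sleep(300);                          // let the refresh task run at least once
        scheduler.shutdown();
        System.out.println(logger.getLevel());      // the new, dynamically applied level
    }
}
```

The same pattern applies to Logback: only the logger lookup and the config read change.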
In a microservice scenario, logs are scattered across the nodes of each service cluster and are hard to inspect individually, so we need to collect them in one place for storage, viewing, and analysis.
Configuration of logs in the application
logback.xml
1. By adding include elements inside configuration that reference the base defaults.xml, console-appender.xml, and file-appender.xml files, you can reuse Spring Boot's predefined variables, default configuration, and rolling policies.
2. Define an appender for logstash: configure the target host and port and the encoder to use. Through this configuration, logs are sent to the unified log management platform for further analysis and storage.
Note: the official documentation recommends using logback-spring.xml instead of logback.xml, without giving a reason. After testing, the logback.xml configuration still works and is picked up automatically after a change and restart, so it was left unchanged. Just note that the configuration attribute scan cannot be set to true; scanning is handled by Spring.
The specific configuration is as follows:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Note: the surrounding XML structure is reconstructed around the original
     values; the appender layout follows standard Logback conventions. -->
<configuration>
    <include resource="org/springframework/boot/logging/logback/defaults.xml"/>

    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d{yyyy-MM-dd HH:mm:ss} [%level] - %m%n</pattern>
        </encoder>
    </appender>

    <appender name="ERROR_FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>d:/logs/error.log</file>
        <encoder>
            <pattern>%d{yyyy-MM-dd HH:mm:ss} [%class:%line] - %m%n</pattern>
        </encoder>
        <!-- Accept only ERROR events, deny everything else. -->
        <filter class="ch.qos.logback.classic.filter.LevelFilter">
            <level>ERROR</level>
            <onMatch>ACCEPT</onMatch>
            <onMismatch>DENY</onMismatch>
        </filter>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <fileNamePattern>error.%d{yyyy-MM-dd}.log</fileNamePattern>
            <maxHistory>30</maxHistory>
        </rollingPolicy>
    </appender>

    <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <!-- defaults to localhost:4560; ${fluentHost} can override the destination -->
        <destination>${fluentHost:-localhost:4560}</destination>
        <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
    </appender>

    <root level="INFO">
        <appender-ref ref="CONSOLE"/>
        <appender-ref ref="ERROR_FILE"/>
        <appender-ref ref="LOGSTASH"/>
    </root>
</configuration>
```

Code for periodically updating the log level

```java
// Imports needed by this snippet:
import org.springframework.scheduling.annotation.Scheduled;
import com.ctrip.framework.apollo.Config;
import com.ctrip.framework.apollo.ConfigService;
import ch.qos.logback.classic.Level;

/* Log level update schedule: refresh every 2 minutes. */
@Scheduled(fixedRate = 1000 * 60 * 2)
public void refresh() {
    String moduleKey = "com.ftsafe";
    // This method applies only to Logback's logging implementation.
    if (log instanceof ch.qos.logback.classic.Logger) {
        // Read the desired level for the module from the Apollo configuration center.
        Config applicationConfig = ConfigService.getAppConfig();
        String levelConfig = applicationConfig.getProperty("logger.level." + moduleKey, null);
        ch.qos.logback.classic.Logger classicLog = (ch.qos.logback.classic.Logger) log;
        ch.qos.logback.classic.Logger logger = classicLog.getLoggerContext().getLogger(moduleKey);
        logger.setLevel(Level.toLevel(levelConfig));
        log.debug("logger modify level {}", levelConfig);
        log.info("logger modify level {}", levelConfig);
    }
    log.info("logger refresh invoked!");
    log.debug("logger refresh invoked!");
}
```

Centralized log management (Windows environment)

Elasticsearch
Elasticsearch is a search and analysis engine.
Download and unzip:
https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-6.0.0.zip
Start
D:/baiduYun/java/elasticsearch-6.0.0/bin/elasticsearch.bat

Kibana
Kibana allows users to visualize the data in Elasticsearch with charts and graphs.
Download and unzip:
https://artifacts.elastic.co/downloads/kibana/kibana-6.0.0-windows-x86_64.zip
Start
D:/baiduYun/java/kibana-6.0.0-windows-x86_64/bin/kibana.bat

Logstash
Logstash is a server-side data processing pipeline that ingests data from multiple sources simultaneously, transforms it, and then sends it to a "stash" such as Elasticsearch.
Download and unzip:
https://artifacts.elastic.co/downloads/logstash/logstash-6.0.0.zip
Configure the logstash.conf file as follows:
```
input {
  tcp {
    port => 4560
    host => "localhost"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  stdout {
    codec => rubydebug
  }
}
```
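To see what the tcp input on port 4560 receives, here is a minimal sketch using plain JDK sockets. The ephemeral-port server below stands in for Logstash, and the JSON line is a made-up event; in the real setup the sending side is the Logback TCP appender, not hand-written socket code.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.io.UncheckedIOException;
import java.net.ServerSocket;
import java.net.Socket;

public class TcpLogPipeDemo {
    public static void main(String[] args) throws Exception {
        // Stand-in for Logstash's tcp input: listen on an ephemeral port.
        try (ServerSocket server = new ServerSocket(0)) {
            int port = server.getLocalPort();
            // The application side: open a TCP connection and write one
            // JSON event per line, which is what the TCP appender does.
            Thread sender = new Thread(() -> {
                try (Socket s = new Socket("localhost", port);
                     PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                    out.println("{\"level\":\"ERROR\",\"message\":\"disk full\"}");
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            });
            sender.start();
            // The "pipeline" side reads the event line and would forward it
            // to Elasticsearch; here we just print it.
            try (Socket client = server.accept();
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(client.getInputStream()))) {
                System.out.println(in.readLine());
            }
            sender.join();
        }
    }
}
```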
Start
D:/baiduYun/java/logstash-6.0.0/bin/logstash.bat -f d:\baiduYun\java\logstash-6.0.0\bin\logstash.conf

Browse logs
Visit http://localhost:5601 to open the Kibana interface; on the Discover page you can enter search conditions to find the log entries you want.
Appendix:
Spring logging
https://docs.spring.io/spring-boot/docs/current/reference/html/boot-features-logging.html
defaults.xml

console-appender.xml

```xml
<included>
    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>${CONSOLE_LOG_PATTERN}</pattern>
        </encoder>
    </appender>
</included>
```
file-appender.xml

```xml
<included>
    <appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <encoder>
            <pattern>${FILE_LOG_PATTERN}</pattern>
        </encoder>
        <file>${LOG_FILE}</file>
        <rollingPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedRollingPolicy">
            <fileNamePattern>${LOG_FILE}.%d{yyyy-MM-dd}.%i.gz</fileNamePattern>
            <maxFileSize>${LOG_FILE_MAX_SIZE:-10MB}</maxFileSize>
            <maxHistory>${LOG_FILE_MAX_HISTORY:-0}</maxHistory>
        </rollingPolicy>
    </appender>
</included>
```

Reference content:
https://www.jianshu.com/p/c9d9fe37256a
"Spring Cloud Microservices Architecture Development Practice" (51CTO download, PDF)
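The size cap used by file-appender.xml's rolling policy can be seen in miniature with plain JDK file I/O. In this sketch the 64-byte cap, the fixed 10-byte events, and the uncompressed numeric archive names are all simplifying assumptions standing in for ${LOG_FILE_MAX_SIZE:-10MB} and the dated .gz archives.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class RollingSketch {
    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("roll-demo");
        Path active = dir.resolve("app.log");
        long maxBytes = 64;   // stand-in for ${LOG_FILE_MAX_SIZE:-10MB}
        int rolls = 0;
        for (int i = 0; i < 20; i++) {
            // Each event is exactly 10 bytes, so the arithmetic is deterministic.
            Files.writeString(active, "0123456789",
                    StandardOpenOption.CREATE, StandardOpenOption.APPEND);
            if (Files.size(active) >= maxBytes) {
                // Roll: archive the active file under an indexed name
                // (the real policy uses ${LOG_FILE}.%d{yyyy-MM-dd}.%i.gz).
                Files.move(active, dir.resolve("app.log." + rolls++));
            }
        }
        System.out.println("rolled " + rolls + " times");
    }
}
```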
© 2024 shulou.com SLNews company. All rights reserved.