
How to use SARG Log Analyzer to analyze Squid logs

2025-03-29 Update From: SLTechnology News&Howtos


Shulou (Shulou.com) 06/01 Report --

This article explains how to use the SARG log analyzer to analyze Squid logs, walking through installation, configuration, a test run, and scheduled report generation.

In a previous tutorial, we showed how to configure a transparent proxy on CentOS using Squid. SARG (Squid Analysis Report Generator) is a web-based tool that analyzes Squid logs and presents the results in detail. System administrators can use SARG to monitor which sites are being visited and to track the most-visited sites and users. Squid itself provides many useful features, but analyzing a raw Squid log file is not straightforward. For example, how would you interpret the timestamps and numbers in the Squid log below?

The code is as follows:

1404788984.429   1162 172.17.1.23 TCP_MISS/302 436 GET http://facebook.com/ - DIRECT/173.252.110.27 text/html

1404788985.046  12416 172.17.1.23 TCP_MISS/200 4169 CONNECT stats.pusher.com:443 - DIRECT/173.255.223.127 -

1404788986.124 172.17.1.23 TCP_MISS/200 955 POST http://ocsp.digicert.com/ - DIRECT/117.18.237.29 application/ocsp-response

1404788989.738    342 172.17.1.23 TCP_MISS/200 3890 CONNECT www.google.com:443 - DIRECT/74.125.200.106 -

1404788989.757    226 172.17.1.23 TCP_MISS/200 942 POST http://clients1.google.com/ocsp - DIRECT/74.125.200.113 application/ocsp-response

1404788990.839   3939 172.17.1.23 TCP_MISS/200 78944 CONNECT fbstatic-a.akamaihd.net:443 - DIRECT/184.26.162.35 -

1404788990.846   2148 172.17.1.23 TCP_MISS/200 118947 CONNECT fbstatic-a.akamaihd.net:443 - DIRECT/184.26.162.35 -

1404788990.849   2151 172.17.1.23 TCP_MISS/200 76809 CONNECT fbstatic-a.akamaihd.net:443 - DIRECT/184.26.162.35 -

1404788991.140    611 172.17.1.23 TCP_MISS/200 110073 CONNECT fbstatic-a.akamaihd.net:443 - DIRECT/184.26.162.35 -
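As a quick illustration (not part of SARG itself), the first field of each entry is a Unix epoch timestamp with millisecond precision, which GNU date can convert to a readable form:

```shell
# Field 1 of access.log is the request time as a Unix epoch timestamp
# (seconds.milliseconds); GNU date decodes it directly with the @ prefix.
date -u -d @1404788984.429 '+%Y-%m-%d %H:%M:%S'
# → 2014-07-08 03:09:44
```

Field 2 is the response time in milliseconds and field 5 is the reply size in bytes; SARG performs this decoding and aggregation for every entry automatically.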

We use yum to install the necessary dependencies.

The code is as follows:

# yum install gcc make wget httpd crond

Load the necessary services at startup

The code is as follows:

# service httpd start; service crond start

# chkconfig httpd on; chkconfig crond on

Now let's download and extract the SARG

The code is as follows:

# wget http://downloads.sourceforge.net/project/sarg/sarg/sarg-2.3.8/sarg-2.3.8.tar.gz

# tar zxvf sarg-2.3.8.tar.gz

# cd sarg-2.3.8

Note: on 64-bit Linux, log.c in the source code needs to be patched as follows before compiling.

The code is as follows:

1506c1506

< if (fprintf(ufile->file,"%s\t%s\t%s\t%s\t%"PRIi64"\t%s\t%ld\t%s\n",dia,hora,ip,url,nbytes,code,elap_time,smartfilter)<=0) {

---

> if (fprintf(ufile->file,"%s\t%s\t%s\t%s\t%"PRIi64"\t%s\t%ld\t%s\n",dia,hora,ip,url,(int64_t)nbytes,code,elap_time,smartfilter)<=0) {

1564c1564

< printf("LEN=\t%"PRIi64"\n",nbytes);

---

> printf("LEN=\t%"PRIi64"\n",(int64_t)nbytes);

Continue and compile / install SARG as follows

The code is as follows:

# ./configure

# make

# make install

After SARG is installed, the configuration file can be modified according to your requirements. The following is an example of a SARG configuration.

The code is as follows:

# vim /usr/local/etc/sarg.conf

access_log /var/log/squid/access.log

temporary_dir /tmp

output_dir /var/www/html/squid-reports

date_format e ## We use the European DD-MM-YYYY format here ##

## we don't want multiple reports for a single day/week/month ##

overwrite_report yes

Now that it's time for the test run, let's run sarg in debug mode to see if there are any errors.

The code is as follows:

# sarg -x

If everything is fine, sarg will parse the Squid log and create a report under /var/www/html/squid-reports. The report can also be accessed in a browser at http://<server's IP address>/squid-reports/.

SARG can be used to create daily, weekly, and monthly reports. The time range is specified with the "-d" parameter, in the form day-n, week-n, or month-n, where n is the number of days, weeks, or months to go back. For example, with week-1, SARG generates a report for the previous week; with day-2, it generates reports for the previous two days.
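To make the day-n/week-n/month-n convention concrete, here is a small hypothetical shell helper (the function name and the period-to-range mapping are illustrative, not part of SARG) that builds the corresponding sarg invocation:

```shell
# Hypothetical helper: map a report period name to the matching
# "sarg -d" invocation (sarg itself only ever sees the -d argument).
period_to_cmd() {
  case "$1" in
    daily)   echo "sarg -d day-1"   ;;  # report for the previous day
    weekly)  echo "sarg -d week-1"  ;;  # report for the previous week
    monthly) echo "sarg -d month-1" ;;  # report for the previous month
    *)       echo "unknown period: $1" >&2; return 1 ;;
  esac
}

period_to_cmd weekly
# → sarg -d week-1
```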

As a demonstration, we will prepare a scheduled task to run SARG every day.

The code is as follows:

# vim /etc/cron.daily/sarg

#!/bin/sh

/usr/local/bin/sarg -d day-1

The file requires executable permissions.

The code is as follows:

# chmod 755 /etc/cron.daily/sarg

From now on, SARG will prepare daily reports on Squid traffic. These reports can be easily accessed through SARG's web interface.

This concludes "how to use the SARG log analyzer to analyze Squid logs". Thank you for reading.
