
What are the common commands and examples of Shell

2025-01-16 Update, from SLTechnology News & Howtos (Development)


Shulou(Shulou.com)06/03 Report--

This article explains the Shell commands most commonly used in log analysis, along with examples. The methods introduced here are simple, fast, and practical, so follow along and try them.

1. Windows users who want a shell command line should first install Cygwin; search online for installation instructions (for technical questions, Google tends to turn up answers that Baidu cannot).

2. Below is a rough overview of the command-line tools commonly used in SEO log analysis. For the full details of any command, consult its documentation (for example via `man <command>` or a web search).

less filename: view file contents; press "q" to exit

cat filename: print a file; several files can be printed at once, e.g. cat 1.log 2.log or cat *.cat

grep -parameter filename: search a file for lines matching a pattern

-i: ignore case when matching

-v: print all lines that do NOT match

-c: print the number of matching lines
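As a quick sketch of these options (the file name and contents below are made up for illustration):

```shell
# Create a tiny sample log (hypothetical contents, for illustration only).
printf 'a Baiduspider hit\nb Googlebot hit\nc baiduspider hit\n' > sample.log

grep -i -c 'baiduspider' sample.log   # case-insensitive count: prints 2
grep -c 'baiduspider' sample.log      # case-sensitive count: prints 1
grep -v 'Baiduspider' sample.log      # prints only the lines that do not match
```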

egrep is an extended version of grep with fuller regular-expression support; prefer egrep (or grep -E) when working with regular expressions.

head -2 filename: print the first 2 lines

head -100 filename | tail -10 >> a.log: extract lines 91 through 100 of the file (and append them to a.log)
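A minimal sketch of this line-range trick, using seq to generate numbered lines so the result is easy to verify:

```shell
seq 1 200 > nums.txt            # a file whose n-th line is the number n
head -100 nums.txt | tail -10   # prints lines 91 through 100
```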

wc -parameter filename: count the bytes, characters, or lines of a text file

-c: count bytes

-m: count characters

-l: count lines
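For example, on a two-line file (made up here) the counts come out as follows:

```shell
printf 'hello\nworld\n' > t.txt
wc -l < t.txt   # 2 lines
wc -c < t.txt   # 12 bytes: two 5-letter words plus two newline characters
```

Reading via `<` instead of passing the file name keeps the file name out of the output.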

sort -parameter filename: sort a file

-n: sort numerically

-r: reverse the sort order
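The difference between lexicographic and numeric sorting matters for log counts; a tiny sketch:

```shell
printf '10\n2\n33\n' | sort      # lexicographic order: 10, 2, 33
printf '10\n2\n33\n' | sort -n   # numeric order: 2, 10, 33
printf '10\n2\n33\n' | sort -nr  # numeric, reversed: 33, 10, 2
```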

uniq -parameter filename: remove duplicate lines; the input must be sorted first (pipe it through sort), because uniq only collapses adjacent duplicates

-c: prefix each line with the number of times it occurred
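A short sketch showing why the sort step is required before uniq:

```shell
printf 'b\na\nb\na\na\n' | sort | uniq      # a, b (all duplicates removed)
printf 'b\na\nb\na\na\n' | sort | uniq -c   # counts: 3 a, 2 b
printf 'b\na\nb\na\na\n' | uniq             # 4 lines: without sorting, only adjacent duplicates collapse
```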

split -parameter filename: cut a file into pieces

-100 (i.e. -l 100): one output file per 100 lines

-C 25m / 25k / 25b: one output file per 25 megabytes / kilobytes / bytes
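A sketch of line-based splitting (file names are made up; split names its pieces part_aa, part_ab, ... when given a prefix):

```shell
seq 1 250 > big.txt
split -l 100 big.txt part_   # part_aa and part_ab get 100 lines each, part_ac gets the remaining 50
wc -l < part_ac              # prints 50
```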

| (pipe): pass the output of the previous command as input to the next command

">" and ">>" redirect output into a file: ">" truncates the file before writing (like file mode "w"), while ">>" appends to it (like file mode "a").
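The truncate-versus-append distinction in a minimal sketch (out.txt is a throwaway name):

```shell
echo first  > out.txt    # ">" truncates out.txt, then writes
echo second >> out.txt   # ">>" appends a second line
cat out.txt              # prints "first" then "second"
echo third  > out.txt    # ">" truncates again: only "third" remains
```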

awk -F 'delimiter' 'pattern {action}' filename: splits each line into fields on the specified delimiter; the default is whitespace (site logs are space-separated)

-F sets the field delimiter

pattern is the condition under which action runs; regular expressions may be used here

$n is the n-th field; $0 is the entire line

NF is the number of fields in the current record

$NF is the last field

BEGIN and END can both appear as patterns: BEGIN gives the program its initial state, and END performs cleanup after all input has been processed
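These pieces fit together as in this sketch on toy colon-delimited data:

```shell
# -F sets the delimiter, $NF is the last field, NF is the field count,
# and BEGIN/END bracket the per-line work.
printf 'a:b:c\nd:e:f\n' | \
  awk -F ':' 'BEGIN {n = 0} {n += NF; print $NF} END {print "total fields:", n}'
# prints: c, f, then "total fields: 6"
```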

bash shell.sh: run the shell.sh script

dos2unix xxoo.sh: convert "\r\n" to "\n" (Windows -> Linux). Newline conventions differ between Windows and Linux, so a script written on Windows must have its line endings converted with dos2unix, or running it on Linux will produce errors.

unix2dos xxoo.sh: convert "\n" to "\r\n" (Linux -> Windows)

rm xx.txt: delete the file xx.txt

3. Only a few simple commands are introduced here; to understand shell properly, it is worth reading the relevant books.

Now let's use the shell to analyze a log.

1. Extract Baidu's crawl records (cutting a particular crawler's data into its own file makes later processing more efficient)

The code is as follows:

cat log.log | grep -i 'baiduspider' > baidu.log

2. Count the occurrences of each HTTP status code

The code is as follows:

awk '{print $9}' baidu.log | sort | uniq -c | sort -nr
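A quick sanity check of this pipeline on synthetic data (the fields are made up; field 9 plays the role of the status code, as in a space-separated access log):

```shell
printf 'a b c d e f g h 200\na b c d e f g h 404\na b c d e f g h 200\n' > fake.log
awk '{print $9}' fake.log | sort | uniq -c | sort -nr
# shows 200 appearing 2 times, above 404 appearing once
```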

3. Baidu's total crawl count

The code is as follows:

wc -l baidu.log

4. Number of unique URLs crawled by Baidu

The code is as follows:

awk '{print $7}' baidu.log | sort | uniq | wc -l

5. Average data size per Baidu crawl (result in KB)

The code is as follows:

awk '{print $10}' baidu.log | awk 'BEGIN {a = 0} {a += $1} END {print a/NR/1024}'
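To see that the averaging works, here is the same pipeline on two fake hits (fields made up; field 10 plays the role of the response size in bytes):

```shell
# Sizes of 1024 and 3072 bytes average to 2048 bytes, i.e. 2 KB.
printf 'a b c d e f g h i 1024\na b c d e f g h i 3072\n' > fake.log
awk '{print $10}' fake.log | awk 'BEGIN {a = 0} {a += $1} END {print a/NR/1024}'   # prints 2
```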

6. Number of home-page crawls

The code is as follows:

awk '$7 ~ /\.com\/$/' baidu.log | wc -l

7. Number of crawls of a particular directory

The code is as follows:

grep '/news/' baidu.log | wc -l

8. The 10 most-crawled pages

The code is as follows:

awk '{print $7}' baidu.log | sort | uniq -c | sort -nr | head -10

9. Find the crawled pages that returned a 404 error

The code is as follows:

awk '$9 ~ /^404$/ {print $7}' baidu.log | sort | uniq | sort -nr

10. Find which .js files were crawled, and how many times each

The code is as follows:

awk '$7 ~ /\.js$/ {print $7}' baidu.log | sort | uniq -c | sort -nr

By now you should have a deeper understanding of the common Shell commands and examples covered here. The best way to consolidate that understanding is to try them out in practice.
