This article presents a detailed analysis of how to use the awk command in Linux, in the hope of helping readers who want to learn it find a simple and workable approach.
0. Basic usage
awk is a powerful text analysis tool. Put simply, awk reads its input line by line, splits each line into fields using spaces and tabs as the default separators, and then processes the resulting fields.
The awk command format is as follows
awk [-F field-separator] 'commands' input-file(s)
The [-F field-separator] part is optional, because awk uses spaces and tabs as the default field separators; if the fields in your text are separated by spaces or tabs, you do not need to specify this option. But if you want to process a file such as /etc/passwd, whose fields are separated by colons, you must specify the -F option.
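To make the difference concrete, here is a small sketch using a made-up comma-separated line (the sample data is purely illustrative):
echo "alice,23,beijing" | awk '{print $1}'          ## output: alice,23,beijing (the line has no spaces or tabs, so the whole line is one field)
echo "alice,23,beijing" | awk -F ',' '{print $1}'   ## output: alice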
Echo "this is a test" | awk'{print $0}'# # output to this is a test
When the shell reads the command line and sees the |, it knows a pipeline is being built: the text on each side of the | is treated as a simple command, and the standard output of the command on the left is connected to the standard input of the command on the right.
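As a small illustration of this equivalence, the following two commands produce the same output; awk can read a file named on its command line or receive the text through a pipe:
awk '{print $0}' /etc/passwd          ## awk reads the file directly
cat /etc/passwd | awk '{print $0}'    ## the pipe connects cat's standard output to awk's standard input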
Awk divides the line into several fields based on the delimiter. $0 is the whole line, $1 is the first field, $2 is the second field, and so on.
To print one or all fields, use the print command. This is an awk action
Echo "this is a test" | awk'{print $1}'# # output as this echo "this is a test" | awk'{print $1, $2}'# # output as this is
The contents of the /etc/passwd file are as follows
root:x:0:0:root:/root:/bin/bash
bin:x:1:1:bin:/bin:/sbin/nologin
daemon:x:2:2:daemon:/sbin:/sbin/nologin
adm:x:3:4:adm:/var/adm:/sbin/nologin
lp:x:4:7:lp:/var/spool/lpd:/sbin/nologin
Here are a few simple examples.
1. Show only the account names in /etc/passwd
awk -F ':' '{print $1}' /etc/passwd
## output:
root
bin
daemon
adm
lp
2. Display the first and seventh columns of /etc/passwd separated by a comma, print a header line start1,start7 before all rows, and print a final line end1,end7 after all rows
awk -F ':' 'BEGIN {print "start1,start7"} {print $1 "," $7} END {print "end1,end7"}' /etc/passwd
## output:
start1,start7
root,/bin/bash
bin,/sbin/nologin
daemon,/sbin/nologin
adm,/sbin/nologin
lp,/sbin/nologin
end1,end7
The BEGIN block is executed before any input lines are processed, and the END block is executed after all input lines have been processed.
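A typical use of END is to accumulate a value across all lines and print it once at the end. A minimal sketch that counts the lines of /etc/passwd (equivalent to wc -l):
awk 'BEGIN {count = 0} {count++} END {print count}' /etc/passwd   ## output: 5 (for the sample file above)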
3. For each line of /etc/passwd, print the line number, the number of fields, and the complete line content
awk -F ':' '{print NR " " NF " " $0}' /etc/passwd
## output:
1 7 root:x:0:0:root:/root:/bin/bash
2 7 bin:x:1:1:bin:/bin:/sbin/nologin
3 7 daemon:x:2:2:daemon:/sbin:/sbin/nologin
4 7 adm:x:3:4:adm:/var/adm:/sbin/nologin
5 7 lp:x:4:7:lp:/var/spool/lpd:/sbin/nologin
1. Support for built-in variables
In the above example, NR and NF are actually built-in variables for awk. Some of the built-in variables are as follows
Variable name    Description
FILENAME         the name of the file awk is processing
FS               the input field separator, equivalent to the command-line -F option
NF               the number of fields in the current record
NR               the number of records read so far
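FILENAME and FS can be printed just like NR and NF above; a minimal sketch, restricted to the first record so it prints only once:
awk -F ':' 'NR == 1 {print FILENAME, FS}' /etc/passwd
## output:
/etc/passwd :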
2. Support for functions
Output the length of a string
awk 'BEGIN {print length("this is a text")}'
## output:
14
Convert the user names in /etc/passwd to uppercase and print them
awk -F ':' '{print toupper($1)}' /etc/passwd
## output:
ROOT
BIN
DAEMON
ADM
LP
The common functions are as follows
Function name    Description
toupper(s)       returns s converted to uppercase
tolower(s)       returns s converted to lowercase
length(s)        returns the length of s
substr(s, p)     returns the substring of s starting at position p
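tolower and substr work the same way as the length and toupper examples above; a minimal sketch:
awk 'BEGIN {print tolower("HELLO"), substr("this is a test", 6)}'
## output:
hello is a test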
3. Support for conditional operators and regular expression matching
Show the lines in /etc/passwd that contain daemon
awk -F ':' '$0 ~ /daemon/' /etc/passwd
## output:
daemon:x:2:2:daemon:/sbin:/sbin/nologin
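The opposite operator, !~, keeps the lines that do not match. For example, using the sample /etc/passwd above, show the accounts whose login shell (the seventh field) is not nologin:
awk -F ':' '$7 !~ /nologin/' /etc/passwd
## output:
root:x:0:0:root:/root:/bin/bash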
< 小于 < = 小于等于 == 等于 != 不等于 ~ 匹配正则表达式 !~ 不匹配正则表达式4、支持流程控制语句,类C语言if while do/while for break continue 输出第一个字段的第一个字符大于d的行 awk -F ':' '{ if ($1 >"d") {print $1} else {print "-"}'/ etc/passwd
# # output as
Root-daemon-lp
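The loop statements work as in C. A minimal sketch using for to print every field of a line on its own line:
echo "this is a test" | awk '{for (i = 1; i <= NF; i++) print $i}'
## output:
this
is
a
test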
You can also put the awk program into a script file and have awk execute that script. For example, the content of test.sh is
{if ($1 > "d") {print $1} else {print "-"}}
Run it as follows; the effect is the same
awk -F ':' -f test.sh /etc/passwd
## output:
root
-
daemon
-
lp
5. Application scenarios
In practice, the author seldom uses awk for one-off text analysis; it is mainly used when writing scripts.
For example, for an application packaged as weibo-interface-1.0.jar, the startup script start.sh is as follows
nohup java -jar weibo-interface-1.0.jar > out 2>&1 &
The shutdown script kill.sh is as follows
kill -9 `jps -l | grep 'weibo-interface-1.0.jar' | awk '{print $1}'`
The output of jps -l is as follows
70208 com.st.kmp.main.KmpService
31036 com.st.cis.main.BaiduAnalysisService
66813 weibo-interface-1.0.jar
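Walking through the pipeline with the sample output above: grep keeps only the matching line, and awk extracts its first field, which is the pid passed to kill -9:
jps -l | grep 'weibo-interface-1.0.jar'                      ## output: 66813 weibo-interface-1.0.jar
jps -l | grep 'weibo-interface-1.0.jar' | awk '{print $1}'   ## output: 66813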
Another example: to shut down all the DataNode processes of a Hadoop cluster (if you are not familiar with Hadoop, just think of it as a clustered application), logging in to every machine, running jps, looking up the pid and killing it by hand is tedious. It is simpler to write a script that ssh-es to each node in turn and executes the following command (a sketch of such a script is given after the command):
kill `jps | grep 'DataNode' | awk '{print $1}'`
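A minimal sketch of such a script, assuming passwordless ssh is already set up and the DataNode hostnames are listed one per line in a file called nodes.txt (the file name and loop structure are illustrative, not part of the original article):
#!/bin/bash
# stop-datanodes.sh: run the kill command above on every node listed in nodes.txt
# ssh -n keeps ssh from consuming the rest of nodes.txt on standard input
while read -r node; do
    ssh -n "$node" "kill \$(jps | grep 'DataNode' | awk '{print \$1}')"
done < nodes.txt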
The output of jps on one of these machines is as follows
508 DataNode
31481 JournalNode
31973 NodeManager
That is all for this analysis of how to use the awk command in Linux. I hope the above content is of some help to you. If you still have questions, you can follow the industry information channel for more related knowledge.