Many newcomers are not clear about which Hadoop Shell commands are available. To help with that, this article explains each of them in detail; if that is what you need, read on, and hopefully you will gain something.
The file system (FS) shell is invoked as bin/hadoop fs <args>. All FS shell commands take URI paths as arguments. A URI takes the form scheme://authority/path; for HDFS the scheme is hdfs, for the local file system it is file, and when no scheme is given, the default file system from the configuration is used.
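To make the path forms concrete, here is a minimal sketch that lists a location each of the three ways (the namenode host, port, and paths are hypothetical):
# Fully qualified HDFS URI
hadoop fs -ls hdfs://namenode:9000/user/hadoop
# Local file system, via the file scheme
hadoop fs -ls file:///tmp
# No scheme: the default file system from the configuration is used
hadoop fs -ls /user/hadoop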
1、cat
Description: Outputs the contents of the specified files to stdout.
Usage: hadoop fs -cat URI [URI …]
Example:
hadoop fs -cat hdfs://host1:port1/file1 hdfs://host2:port2/file2
hadoop fs -cat file:///file3 /user/hadoop/file4
Return value: 0 on success, -1 on failure.
2、chgrp
Description: Changes the group to which a file belongs. With -R, the change is applied recursively through the directory structure. The user of the command must be the file owner or the superuser.
Usage: hadoop fs -chgrp [-R] GROUP URI [URI …]
Example:
hadoop fs -chgrp -R hadoop /user/hadoop/
3、chmod
Description: Changes the permissions of files. With -R, the change is applied recursively through the directory structure. The user of the command must be the file owner or the superuser.
Usage: hadoop fs -chmod [-R] <MODE[,MODE]... | OCTALMODE> URI [URI …]
Example:
hadoop fs -chmod -R 744 /user/hadoop/
4、chown
Description: Changes the owner of a file. With -R, the change is applied recursively through the directory structure. The user of the command must be the superuser.
Usage: hadoop fs -chown [-R] [OWNER][:[GROUP]] URI [URI …]
Example:
hadoop fs -chown -R hadoop /user/hadoop/
5、copyFromLocal (local to HDFS)
Description: Similar to put except that the source path is a local file.
Usage: hadoop fs -copyFromLocal <localsrc> URI
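Since no example is given for this command, a minimal sketch (the local file name and HDFS directory are hypothetical):
# Copy a file from the local disk into an HDFS directory
hadoop fs -copyFromLocal localfile.txt /user/hadoop/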
6、copyToLocal (HDFS to local)
Description: Similar to get except that the destination path is a local file.
Usage: hadoop fs -copyToLocal [-ignorecrc] [-crc] URI <localdst>
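A minimal sketch, with hypothetical paths:
# Copy an HDFS file down to the current local directory
hadoop fs -copyToLocal /user/hadoop/file1 ./file1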
7、cp
Description: Copies files from a source path to a destination path. This command allows multiple source paths, in which case the destination path must be a directory.
Usage: hadoop fs -cp URI [URI …] <dest>
Example:
hadoop fs -cp /user/hadoop/file1 /user/hadoop/file2
hadoop fs -cp /user/hadoop/file1 /user/hadoop/file2 /user/hadoop/dir
Return value: 0 on success, -1 on failure.
8、du
Description: Displays the sizes of all files in a directory, or, when a single file is specified, the size of that file.
Usage: hadoop fs -du URI [URI …]
Example:
hadoop fs -du /user/hadoop/dir1 /user/hadoop/file1 hdfs://host:port/user/hadoop/dir1
View the size of all HBase files:
hadoop fs -du hdfs://master:54310/hbase
Return value: 0 on success, -1 on failure.
9、dus
Description: Displays a summary of file sizes.
Usage: hadoop fs -dus <args>
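A minimal sketch (the directory is hypothetical):
# Show the aggregate size of everything under a directory
hadoop fs -dus /user/hadoop/dir1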
10、expunge
Description: Empty the Recycle Bin.
Usage: hadoop fs -expunge
11、get (HDFS to local)
Description: Copies files to the local file system. Files that fail the CRC check can be copied using the -ignorecrc option. Use the -crc option to copy the file together with its CRC information.
Usage: hadoop fs -get [-ignorecrc] [-crc] <src> <localdst>
Example:
hadoop fs -get /user/hadoop/file localfile
hadoop fs -get hdfs://host:port/user/hadoop/file localfile
Return value: 0 on success, -1 on failure.
12、getmerge
Description: Takes a source directory and a destination file as input, and concatenates all files in the source directory into the destination file. addnl is optional and specifies that a newline be appended after each file.
Usage: hadoop fs -getmerge <src> <localdst> [addnl]
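A minimal sketch, with hypothetical paths:
# Concatenate all files under an HDFS directory into one local file,
# appending a newline after each file
hadoop fs -getmerge /user/hadoop/dir1 ./merged.txt addnl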
13、ls
Usage: hadoop fs -ls <args>
Description:
(1) For a file, information is returned in the following format:
File name <number of replicas> File size Modification date Modification time Permissions User ID Group ID
(2) For a directory, a list of its direct children is returned, as in Unix. Each directory entry in the list has the following format:
Directory name <dir> Modification date Modification time Permissions User ID Group ID
Example:
hadoop fs -ls /user/hadoop/file1 /user/hadoop/file2 hdfs://host:port/user/hadoop/dir1 /nonexistentfile
Return value: 0 on success, -1 on failure.
14、lsr
Usage: hadoop fs -lsr <args>
Description: Recursive version of ls command. Similar to ls -R in Unix.
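A minimal sketch (the directory is hypothetical):
# Recursively list everything under a directory
hadoop fs -lsr /user/hadoop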
15、mkdir
Description: Takes the path URIs as arguments and creates the directories. The behavior is similar to Unix mkdir -p: parent directories along the path are created as needed.
Usage: hadoop fs -mkdir <paths>
Example:
hadoop fs -mkdir /user/hadoop/dir1 /user/hadoop/dir2
hadoop fs -mkdir hdfs://host1:port1/user/hadoop/dir hdfs://host2:port2/user/hadoop/dir
Return value: 0 on success, -1 on failure.
16、moveFromLocal
Description: Outputs a "not implemented" message.
Usage: dfs -moveFromLocal <src> <dst>
17、mv
Description: Moves files from a source path to a destination path. This command allows multiple source paths, in which case the destination path must be a directory. Moving files between different file systems is not allowed.
Usage: hadoop fs -mv URI [URI …] <dest>
Example:
hadoop fs -mv /user/hadoop/file1 /user/hadoop/file2
hadoop fs -mv hdfs://host:port/file1 hdfs://host:port/file2 hdfs://host:port/file3 hdfs://host:port/dir1
Return value: 0 on success, -1 on failure.
18、put
Description: Copies single or multiple source paths from the local file system to the destination file system. Also supports reading from standard input and writing to the destination file system.
Usage: hadoop fs -put <localsrc> ... <dst>
Example:
hadoop fs -put localfile /user/hadoop/hadoopfile
hadoop fs -put localfile1 localfile2 /user/hadoop/hadoopdir
hadoop fs -put localfile hdfs://host:port/hadoop/hadoopfile
hadoop fs -put - hdfs://host:port/hadoop/hadoopfile
The last form reads the input from standard input.
Return value: 0 on success, -1 on failure.
19、rm
Description: Deletes the specified files. Only files and empty directories can be deleted; refer to the rmr command for recursive deletion.
Usage: hadoop fs -rm URI [URI …]
Example:
hadoop fs -rm hdfs://host:port/file /user/hadoop/emptydir
Return value: 0 on success, -1 on failure.
20、rmr
Description: Recursive version of delete.
Usage: hadoop fs -rmr URI [URI …]
Example:
hadoop fs -rmr /user/hadoop/dir
hadoop fs -rmr hdfs://host:port/user/hadoop/dir
Return value: 0 on success, -1 on failure.
21、setrep
Description: Changes the replication factor of a file. The -R option recursively changes the replication factor of all files under a directory. The -w flag, used in the example below, requests that the command wait for the replication to complete.
Usage: hadoop fs -setrep [-R] <path>
Example:
hadoop fs -setrep -w 3 -R /user/hadoop/dir1
Return value: 0 on success, -1 on failure.
22、stat
Description: Returns statistics for a specified path.
Usage: hadoop fs -stat URI [URI …]
Example:
hadoop fs -stat path
Return value: 0 on success, -1 on failure.
23、tail
Description: Outputs the last 1KB of the file to stdout. The -f option is supported; the behavior is consistent with Unix tail -f.
Usage: hadoop fs -tail [-f] URI
Example:
hadoop fs -tail pathname
Return value: 0 on success, -1 on failure.
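The -f option is documented above but not shown; a minimal sketch with a hypothetical log path:
# Follow a growing file, as with Unix tail -f
hadoop fs -tail -f /user/hadoop/job.log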
24、test
Usage: hadoop fs -test -[ezd] URI
Options:
-e Checks whether the file exists; returns 0 if it does.
-z Checks whether the file is zero bytes long; returns 0 if it is.
-d Returns 1 if the path is a directory, 0 otherwise.
Example:
hadoop fs -test -e filename
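Because test communicates its result through the exit status, it is typically driven from a shell script. A minimal sketch, assuming a hypothetical input path:
#!/bin/sh
# Run the next step only if the HDFS input file already exists
if hadoop fs -test -e /user/hadoop/input.txt; then
    echo "input exists, proceeding"
else
    echo "input missing" >&2
    exit 1
fi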
25、text
Description: Takes a source file and outputs the file in text format. The allowed formats are zip and TextRecordInputStream.
Usage: hadoop fs -text <src>
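A minimal sketch (the zip path is hypothetical):
# Print the contents of a zip-stored file as text
hadoop fs -text /user/hadoop/archive.zip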
26、touchz
Description: Create an empty file of 0 bytes.
Usage: hadoop fs -touchz URI [URI …]
Example:
hadoop fs -touchz pathname
Return value: 0 on success, -1 on failure.
Hopefully reading the above has helped you. If you still want a deeper understanding of the related knowledge, there is more to learn; thank you for your support.