2025-01-17 Update From: SLTechnology News&Howtos
Shulou (Shulou.com) 06/02 Report --
This article shares the commonly used Hadoop FS commands. The editor finds them very practical and shares them here for your study; I hope you get something out of reading it.
1. Overview
The Hadoop file system (FS) shell provides a variety of commands, similar to Unix shell commands, for interacting with the Hadoop Distributed File System (HDFS) and managing files and data in the cluster.
2. Hadoop FS common commands
(1) Create a directory
Usage:
hadoop fs -mkdir <paths>
Example:
Create a single directory:
hadoop fs -mkdir /home/myfile/dir1
Create multiple directories:
hadoop fs -mkdir /home/myfile/dir1 /home/myfile/dir2
(2) List a directory
Usage:
hadoop fs -ls <path>
Example:
Similar to ls in the shell:
hadoop fs -ls /home/myfile/
Note: hadoop fs -ls prints statistics and details for each entry; mind the output format when post-processing the listing in batch scripts.
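Because -ls prints one entry per line in a fixed column layout, its output can be post-processed with standard text tools. A local sketch on a sample listing (the listing below is illustrative, not real cluster output):

```shell
#!/bin/sh
# Pull the path column (the last field) out of `hadoop fs -ls`
# style output, skipping the "Found N items" summary line.
# The sample listing is made up for illustration.
listing='Found 2 items
-rw-r--r--   3 hduser supergroup    1024 2020-01-01 10:00 /home/myfile/a.txt
drwxr-xr-x   - hduser supergroup       0 2020-01-01 10:00 /home/myfile/dir1'

paths=$(printf '%s\n' "$listing" | awk 'NR > 1 { print $NF }')
printf '%s\n' "$paths"
```

On a real cluster you would pipe `hadoop fs -ls /home/myfile/` straight into the same awk program.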
(3) Upload files
Copy one or more files from the local file system to HDFS.
Usage:
hadoop fs -put <localsrc> ... <dst>
Example:
hadoop fs -put Desktop/test.sh /home/myfile/dir1/
(4) Download files
Copy files from HDFS to the local file system.
Usage:
hadoop fs -get <src> <localdst>
Example:
hadoop fs -get /home/myfile/test.sh Downloads/
(5) View a file
Usage:
hadoop fs -cat <path>
Example:
hadoop fs -cat /home/myfile/test.sh
(6) Copy files
Usage:
hadoop fs -cp <src> <dst>
Example:
hadoop fs -cp /home/myfile/test.sh /home/myfile/dir
(7) Move files
Usage:
hadoop fs -mv <src> <dst>
Example:
hadoop fs -mv /home/myfile/test.sh /home/myfile/dir
(8) Delete files
Deletion has two forms: -rm and -rm -r.
Usage:
hadoop fs -rm <path>
Example:
hadoop fs -rm /home/myfile/test.sh
The command above deletes only files. To delete a directory together with the files it contains, as with rm in the shell, use the recursive flag -r.
The recursive form of rm:
Usage:
hadoop fs -rm -r <path>
Example:
hadoop fs -rm -r /home/myfile/dir
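The file-versus-directory behavior mirrors Unix rm exactly, which can be demonstrated on the local file system (a local analogue, not an HDFS command):

```shell
#!/bin/sh
# Local-filesystem analogue of -rm vs -rm -r: plain rm refuses a
# directory, while rm -r removes it and everything inside it.
d=$(mktemp -d)
touch "$d/file"

if rm "$d" 2>/dev/null; then result1="removed"; else result1="refused"; fi
rm -r "$d" && result2="removed"

echo "rm:    $result1"
echo "rm -r: $result2"
```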
(9) View the end of a file
Usage:
hadoop fs -tail <path>
Example:
hadoop fs -tail /home/myfile/test.sh
(10) Show file length
Usage:
hadoop fs -du <path>
Example:
hadoop fs -du /home/myfile/test.sh
(11) Count files and sizes
Usage:
hadoop fs -count <path>
Example:
hadoop fs -count /home/myfile
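In common Hadoop versions, -count prints one line per path with the columns DIR_COUNT, FILE_COUNT, CONTENT_SIZE, and PATHNAME. A sketch of pulling fields out of such a line (the numbers below are made up):

```shell
#!/bin/sh
# `hadoop fs -count` output format (common Hadoop versions):
#   DIR_COUNT FILE_COUNT CONTENT_SIZE PATHNAME
# The sample line is illustrative; awk extracts the file count
# and the total byte size.
sample='           3           12        4096000 /home/myfile'
summary=$(printf '%s\n' "$sample" | awk '{ print "files=" $2 " bytes=" $3 }')
echo "$summary"
```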
(12) Show file system space usage
Usage:
hadoop fs -df <path>
Example:
hadoop fs -df /home/myfile
(13) Merge files
Copy multiple files from HDFS, merging them (in name order) into a single file on the local file system.
Usage:
hadoop fs -getmerge <src> <localdst>
Example:
hadoop fs -getmerge /user/hduser0011/test /home/myfile/dir
FS Shell
File system (FS) shell commands are invoked in the form hadoop fs <args>.
All FS shell commands take URI paths as arguments. The URI format is scheme://authority/path.
For HDFS the scheme is hdfs; for the local file system it is file. The scheme and authority are optional; if unspecified, the default scheme from the configuration is used.
An HDFS file or directory such as /parent/child can be written as hdfs://namenode:namenodeport/parent/child, or simply /parent/child (assuming your configuration's default is namenode:namenodeport).
Most FS shell commands behave like their Unix counterparts; differences are noted in the per-command details below. Error messages go to stderr; all other output goes to stdout.
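The scheme://authority/path split described above can be illustrated with plain shell parameter expansion (a local sketch; the namenode address is hypothetical):

```shell
#!/bin/sh
# Split a URI of the form scheme://authority/path using POSIX
# parameter expansion. The namenode address is a made-up example.
uri="hdfs://namenode:9000/parent/child"

scheme=${uri%%://*}        # text before '://'
rest=${uri#*://}           # namenode:9000/parent/child
authority=${rest%%/*}      # up to the first '/'
path="/${rest#*/}"         # everything after the authority

echo "scheme=$scheme authority=$authority path=$path"
```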
cat
Usage: hadoop fs -cat URI [URI ...]
Writes the contents of the specified files to stdout.
Example:
hadoop fs -cat hdfs://host1:port1/file1 hdfs://host2:port2/file2
hadoop fs -cat file:///file3 /user/hadoop/file4
Return value:
Returns 0 on success and -1 on failure.
chgrp
Usage: hadoop fs -chgrp [-R] GROUP URI [URI ...]
Changes the group of a file. With -R, the change is applied recursively through the directory structure. The caller must be the file's owner or the superuser.
chmod
Usage: hadoop fs -chmod [-R] <MODE[,MODE]... | OCTALMODE> URI [URI ...]
Changes the permissions of a file. With -R, the change is applied recursively through the directory structure. The caller must be the file's owner or the superuser.
chown
Usage: hadoop fs -chown [-R] [OWNER][:[GROUP]] URI [URI ...]
Changes the owner of a file. With -R, the change is applied recursively through the directory structure. The caller must be the superuser.
copyFromLocal
Usage: hadoop fs -copyFromLocal <localsrc> URI
Like put, except that the source is restricted to a local file reference.
copyToLocal
Usage: hadoop fs -copyToLocal [-ignorecrc] [-crc] URI <localdst>
Like get, except that the destination is restricted to a local file reference.
cp
Usage: hadoop fs -cp URI [URI ...] <dest>
Copies files from the source path(s) to the destination path. Multiple sources are allowed, in which case the destination must be a directory.
Example:
hadoop fs -cp /user/hadoop/file1 /user/hadoop/file2
hadoop fs -cp /user/hadoop/file1 /user/hadoop/file2 /user/hadoop/dir
Return value:
Returns 0 on success and -1 on failure.
du
Usage: hadoop fs -du URI [URI ...]
Displays the size of each file in a directory, or the size of a single file when only a file is specified.
Example:
hadoop fs -du /user/hadoop/dir1 /user/hadoop/file1 hdfs://host:port/user/hadoop/dir1
Return value:
Returns 0 on success and -1 on failure.
dus
Usage: hadoop fs -dus <args>
Displays a summary of file sizes.
expunge
Usage: hadoop fs -expunge
Empties the trash.
get
Usage: hadoop fs -get [-ignorecrc] [-crc] <src> <localdst>
Copies files to the local file system. Files that fail the CRC check can still be copied with the -ignorecrc option. Use the -crc option to copy the file along with its CRC information.
Example:
hadoop fs -get /user/hadoop/file localfile
hadoop fs -get hdfs://host:port/user/hadoop/file localfile
Return value:
Returns 0 on success and -1 on failure.
getmerge
Usage: hadoop fs -getmerge <src> <localdst> [addnl]
Takes a source directory and a destination file as input, and concatenates the files in the source directory into the destination file. The optional addnl flag appends a newline character at the end of each file.
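The concatenation getmerge performs can be simulated locally with cat (a sketch of the semantics, not the real command):

```shell
#!/bin/sh
# Simulate what `hadoop fs -getmerge <srcdir> <dst>` produces,
# using only local tools: concatenate every file in a source
# directory, in lexical name order, into one destination file.
srcdir=$(mktemp -d)
printf 'alpha\n' > "$srcdir/part-00000"
printf 'beta\n'  > "$srcdir/part-00001"

dst=$(mktemp)
cat "$srcdir"/* > "$dst"    # the merge step getmerge does for you

merged=$(cat "$dst")
echo "$merged"
rm -r "$srcdir"; rm -f "$dst"
```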
ls
Usage: hadoop fs -ls <args>
For a file, returns file information in the following format:
filename <number of replicas> filesize modification_date modification_time permissions userid groupid
Example:
hadoop fs -ls /user/hadoop/file1 /user/hadoop/file2 hdfs://host:port/user/hadoop/dir1 /nonexistentfile
Return value:
Returns 0 on success and -1 on failure.
lsr
Usage: hadoop fs -lsr <args>
Recursive version of the ls command. Similar to ls -R in Unix.
mkdir
Usage: hadoop fs -mkdir <paths>
Takes path URIs as arguments and creates the directories. Behaves like mkdir -p in Unix, creating parent directories along the path as needed.
Example:
hadoop fs -mkdir /user/hadoop/dir1 /user/hadoop/dir2
hadoop fs -mkdir hdfs://host1:port1/user/hadoop/dir hdfs://host2:port2/user/hadoop/dir
Return value:
Returns 0 on success and -1 on failure.
moveFromLocal
Usage: dfs -moveFromLocal <src> <dst>
Displays a "not implemented" message.
mv
Usage: hadoop fs -mv URI [URI ...] <dest>
Moves files from the source path(s) to the destination path. Multiple sources are allowed, in which case the destination must be a directory. Moving files across file systems is not permitted.
Example:
hadoop fs -mv /user/hadoop/file1 /user/hadoop/file2
hadoop fs -mv hdfs://host:port/file1 hdfs://host:port/file2 hdfs://host:port/file3 hdfs://host:port/dir1
Return value:
Returns 0 on success and -1 on failure.
put
Usage: hadoop fs -put <localsrc> ... <dst>
Copies one or more source paths from the local file system to the destination file system. Also supports reading input from standard input and writing it to the destination file system.
hadoop fs -put localfile /user/hadoop/hadoopfile
hadoop fs -put localfile1 localfile2 /user/hadoop/hadoopdir
hadoop fs -put localfile hdfs://host:port/hadoop/hadoopfile
hadoop fs -put - hdfs://host:port/hadoop/hadoopfile
Reads input from standard input.
Return value:
Returns 0 on success and -1 on failure.
rm
Usage: hadoop fs -rm URI [URI ...]
Deletes the specified files. Only files (not directories) are deleted; refer to the rmr command for recursive deletion.
Example:
hadoop fs -rm hdfs://host:port/file /user/hadoop/emptydir
Return value:
Returns 0 on success and -1 on failure.
rmr
Usage: hadoop fs -rmr URI [URI ...]
Recursive version of rm.
Example:
hadoop fs -rmr /user/hadoop/dir
hadoop fs -rmr hdfs://host:port/user/hadoop/dir
Return value:
Returns 0 on success and -1 on failure.
setrep
Usage: hadoop fs -setrep [-R] <rep> <path>
Changes the replication factor of a file. With -R, recursively changes the replication factor of all files under a directory.
Example:
hadoop fs -setrep -w 3 -R /user/hadoop/dir1
Return value:
Returns 0 on success and -1 on failure.
stat
Usage: hadoop fs -stat URI [URI ...]
Returns statistics on the specified path.
Example:
hadoop fs -stat path
Return value:
Returns 0 on success and -1 on failure.
tail
Usage: hadoop fs -tail [-f] URI
Writes the last kilobyte of the file to stdout. The -f option is supported and behaves as in Unix.
Example:
hadoop fs -tail pathname
Return value:
Returns 0 on success and -1 on failure.
test
Usage: hadoop fs -test -[ezd] URI
Options:
-e  check whether the file exists; returns 0 if it does.
-z  check whether the file is zero length; returns 0 if it is.
-d  returns 1 if the path is a directory, otherwise 0.
Example:
hadoop fs -test -e filename
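The exit-status convention is what makes -test useful in scripts; the same branching pattern with the local test command (a local stand-in for the HDFS check):

```shell
#!/bin/sh
# Branch on an existence check via exit status, mirroring how
# `hadoop fs -test -e <path>` is used in scripts. The local
# `test` utility stands in for the HDFS command here.
f=$(mktemp)

if test -e "$f"; then first="exists"; else first="missing"; fi
rm "$f"
if test -e "$f"; then second="exists"; else second="missing"; fi

echo "$first $second"
```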
text
Usage: hadoop fs -text <src>
Outputs the source file in text format. The allowed formats are zip and TextRecordInputStream.
touchz
Usage: hadoop fs -touchz URI [URI ...]
Creates an empty file of zero length.
Example:
hadoop fs -touchz pathname
Returns 0 on success and -1 on failure.
Those are the commonly used Hadoop FS commands; the editor believes many of these points come up in day-to-day work. I hope this article has taught you something.
© 2024 shulou.com SLNews company. All rights reserved.