
A Complete Collection of Hadoop Commands

2025-01-19 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/03 Report--

Adapted from the official Hadoop documentation:

http://hadoop.apache.org/docs/r1.0.4/cn/hdfs_shell.html#cp

FS Shell

cat

chgrp

chmod

chown

copyFromLocal

copyToLocal

cp

du

dus

expunge

get

getmerge

ls

lsr

mkdir

moveFromLocal

mv

put

rm

rmr

setrep

stat

tail

test

text

touchz

FS Shell

The file system (FS) shell commands are invoked in the form bin/hadoop fs <args>. All FS shell commands take URI paths as arguments. The URI format is scheme://authority/path. For HDFS, the scheme is hdfs; for the local file system, the scheme is file. The scheme and authority are optional; if unspecified, the default scheme from the configuration is used. An HDFS file or directory such as /parent/child can be specified as hdfs://namenode:namenodeport/parent/child, or simply as /parent/child (assuming your configuration points at namenode:namenodeport). Most FS shell commands behave like their corresponding Unix shell commands; differences are noted in the per-command details below. Error messages are written to stderr, and other output to stdout.

cat

Usage: hadoop fs -cat URI [URI ...]

Outputs the contents of the specified files to stdout.

Example:

hadoop fs -cat hdfs://host1:port1/file1 hdfs://host2:port2/file2

hadoop fs -cat file:///file3 /user/hadoop/file4

Return value:

Returns 0 on success and -1 on failure.

chgrp

Usage: hadoop fs -chgrp [-R] GROUP URI [URI ...]

Changes the group association of files. With -R, the change is applied recursively through the directory structure. The user must be the owner of the files or a superuser. See the HDFS Permissions User Guide for more information.

chmod

Usage: hadoop fs -chmod [-R] <MODE[,MODE]... | OCTALMODE> URI [URI ...]

Changes the permissions of files. With -R, the change is applied recursively through the directory structure. The user must be the owner of the files or a superuser. See the HDFS Permissions User Guide for more information.
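The -R semantics mirror the Unix chmod. A minimal local sketch, using plain chmod on the local file system (hadoop fs -chmod needs a running cluster; the demo paths here are made up):

```shell
# Local analog of the -R flag; hadoop fs -chmod [-R] applies the same
# pattern to HDFS paths.
mkdir -p demo/sub
touch demo/sub/file

chmod 700 demo        # non-recursive: only the directory itself changes
chmod -R 755 demo     # recursive: every entry under demo changes too

ls -ld demo demo/sub/file
```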

chown

Usage: hadoop fs -chown [-R] [OWNER][:[GROUP]] URI [URI ...]

Changes the owner of files. With -R, the change is applied recursively through the directory structure. The user must be a superuser. See the HDFS Permissions User Guide for more information.

copyFromLocal

Usage: hadoop fs -copyFromLocal <localsrc> URI

Similar to the put command, except that the source is restricted to a local file path.

copyToLocal

Usage: hadoop fs -copyToLocal [-ignorecrc] [-crc] URI <localdst>

Similar to the get command, except that the destination is restricted to a local file path.

cp

Usage: hadoop fs -cp URI [URI ...] <dest>

Copies files from the source path(s) to the destination path. The command allows multiple source paths, in which case the destination must be a directory.

Example:

hadoop fs -cp /user/hadoop/file1 /user/hadoop/file2

hadoop fs -cp /user/hadoop/file1 /user/hadoop/file2 /user/hadoop/dir

Return value:

Returns 0 on success and -1 on failure.

du

Usage: hadoop fs -du URI [URI ...]

Displays the sizes of all files in a directory, or the size of a single file when only one file is specified.

Example:

hadoop fs -du /user/hadoop/dir1 /user/hadoop/file1 hdfs://host:port/user/hadoop/dir1

Return value:

Returns 0 on success and -1 on failure.

dus

Usage: hadoop fs -dus <args>

Displays a summary of the file sizes.

expunge

Usage: hadoop fs -expunge

Empties the trash. See the HDFS Design documentation for more information on the trash feature.

get

Usage: hadoop fs -get [-ignorecrc] [-crc] <src> <localdst>

Copies files to the local file system. Files that fail the CRC check may be copied with the -ignorecrc option. Use the -crc option to copy the file along with its CRC information.

Example:

hadoop fs -get /user/hadoop/file localfile

hadoop fs -get hdfs://host:port/user/hadoop/file localfile

Return value:

Returns 0 on success and -1 on failure.

getmerge

Usage: hadoop fs -getmerge <src> <localdst> [addnl]

Takes a source directory and a destination file as input, and concatenates all files in the source directory into the local destination file. addnl is optional and specifies that a newline character be added after each file.
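Since getmerge simply concatenates the source files, optionally adding a newline after each, its effect can be sketched locally with cat. The directory and file names below are hypothetical; on a cluster, a single hadoop fs -getmerge invocation does the same over HDFS:

```shell
# Local sketch of what getmerge with addnl produces.
mkdir -p srcdir
printf 'first'  > srcdir/part-00000
printf 'second' > srcdir/part-00001

: > merged.txt                      # start with an empty target file
for f in srcdir/*; do
  cat "$f" >> merged.txt            # append each source file in order
  printf '\n' >> merged.txt         # the addnl newline after each file
done

cat merged.txt                      # "first" and "second" on separate lines
```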

ls

Usage: hadoop fs -ls <args>

For a file, returns file information in the following format:

file name, file size, modification date, modification time, permissions, user ID, group ID

For a directory, returns a list of its immediate children, as in Unix. A directory listing entry has the following format:

directory name, modification date, modification time, permissions, user ID, group ID

Example:

hadoop fs -ls /user/hadoop/file1 /user/hadoop/file2 hdfs://host:port/user/hadoop/dir1 /nonexistentfile

Return value:

Returns 0 on success and -1 on failure.

lsr

Usage: hadoop fs -lsr <args>

Recursive version of the ls command. Similar to Unix's ls -R.

mkdir

Usage: hadoop fs -mkdir <paths>

Takes the path URIs as arguments and creates the directories. The behavior is similar to Unix's mkdir -p, creating parent directories along the path as needed.

Example:

hadoop fs -mkdir /user/hadoop/dir1 /user/hadoop/dir2

hadoop fs -mkdir hdfs://host1:port1/user/hadoop/dir hdfs://host2:port2/user/hadoop/dir

Return value:

Returns 0 on success and -1 on failure.
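The mkdir -p comparison can be checked locally (the path a/b/c is made up; hadoop fs -mkdir applies the same parent-creation rule to HDFS paths):

```shell
# Local analog of the parent-creating behavior; on HDFS this would be
# something like: hadoop fs -mkdir /user/hadoop/a/b/c
mkdir -p a/b/c     # creates a, a/b, and a/b/c in one call
ls -d a/b/c        # prints a/b/c
```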

moveFromLocal

Usage: dfs -moveFromLocal <src> <dst>

Outputs a "not implemented" message.

mv

Usage: hadoop fs -mv URI [URI ...] <dest>

Moves files from the source path(s) to the destination path. The command allows multiple source paths, in which case the destination must be a directory. Moving files across file systems is not allowed.

Example:

hadoop fs -mv /user/hadoop/file1 /user/hadoop/file2

hadoop fs -mv hdfs://host:port/file1 hdfs://host:port/file2 hdfs://host:port/file3 hdfs://host:port/dir1

Return value:

Returns 0 on success and -1 on failure.

put

Usage: hadoop fs -put <localsrc> ... <dst>

Copies one or more source paths from the local file system to the destination file system. Also supports reading input from stdin and writing it to the destination file system.

hadoop fs -put localfile /user/hadoop/hadoopfile

hadoop fs -put localfile1 localfile2 /user/hadoop/hadoopdir

hadoop fs -put localfile hdfs://host:port/hadoop/hadoopfile

hadoop fs -put - hdfs://host:port/hadoop/hadoopfile

Reads input from stdin.

Return value:

Returns 0 on success and -1 on failure.

rm

Usage: hadoop fs -rm URI [URI ...]

Deletes the specified files. Only non-empty directories and files are deleted. Refer to the rmr command for recursive deletes.

Example:

hadoop fs -rm hdfs://host:port/file /user/hadoop/emptydir

Return value:

Returns 0 on success and -1 on failure.

rmr

Usage: hadoop fs -rmr URI [URI ...]

Recursive version of delete.

Example:

hadoop fs -rmr /user/hadoop/dir

hadoop fs -rmr hdfs://host:port/user/hadoop/dir

Return value:

Returns 0 on success and -1 on failure.

setrep

Usage: hadoop fs -setrep [-R] <path>

Changes the replication factor of a file. The -R option recursively changes the replication factor of all files within a directory.

Example:

hadoop fs -setrep -w 3 -R /user/hadoop/dir1

Return value:

Returns 0 on success and -1 on failure.

stat

Usage: hadoop fs -stat URI [URI ...]

Returns statistics for the specified path.

Example:

hadoop fs -stat path

Return value:

Returns 0 on success and -1 on failure.

tail

Usage: hadoop fs -tail [-f] URI

Outputs the last kilobyte of the file to stdout. The -f option is supported and behaves as in Unix.

Example:

hadoop fs -tail pathname

Return value:

Returns 0 on success and -1 on failure.
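The 1 KB window corresponds to the byte-count form of Unix tail; a local sketch (big.txt is a made-up example file):

```shell
# hadoop fs -tail prints the last 1 KB of a file; the local byte-count
# equivalent is tail -c.
printf '%2000s' x > big.txt       # a 2000-byte file (1999 spaces + "x")
tail -c 1024 big.txt | wc -c      # prints 1024
```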

test

Usage: hadoop fs -test -[ezd] URI

Options:

-e checks whether the file exists; returns 0 if it does.

-z checks whether the file is zero bytes long; returns 0 if it is.

-d returns 0 if the path is a directory, and 1 otherwise.

Example:

hadoop fs -test -e filename
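The exit-code convention is the same as the POSIX test utility (exit 0 means the condition holds); a local sketch with made-up file names:

```shell
# POSIX analog of hadoop fs -test's exit codes.
mkdir -p somedir
touch empty.txt

test -e empty.txt && echo "exists"       # -e: exit 0 because the file exists
test -d somedir   && echo "directory"    # -d: exit 0 because it is a directory
test -s empty.txt || echo "zero bytes"   # -s is false for a zero-byte file
```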

text

Usage: hadoop fs -text <src>

Outputs the source file in text format. The allowed formats are zip and TextRecordInputStream.

touchz

Usage: hadoop fs -touchz URI [URI ...]

Creates a zero-byte empty file.

Example:

hadoop fs -touchz pathname

Return value:

Returns 0 on success and -1 on failure.


© 2024 shulou.com SLNews company. All rights reserved.
