Shulou (Shulou.com), SLTechnology News & Howtos — updated 2025-02-22
This article walks through the common fs (file system) shell commands in Hadoop. The content is simple and clear, and I hope it helps answer your questions about working with HDFS from the command line.
1 Introduction
The Hadoop FileSystem (FS) shell provides a variety of commands that interact with the Hadoop Distributed File System (HDFS) or any other file system Hadoop supports. The most common commands cover operations such as creating directories, copying files, viewing file contents, and changing file ownership or permissions.
When operating on HDFS, hadoop fs is equivalent to hdfs dfs.
[hadoop@hadoop002 hadoop]$ hdfs dfs
Usage: hadoop fs [generic options]
2 Common commands
2.1 Create a directory: hdfs dfs -mkdir
[hadoop@hadoop002 hadoop]$ hdfs dfs -mkdir /20180523
[hadoop@hadoop002 hadoop]$
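By default, -mkdir fails if a parent directory in the path does not exist. Like the Unix mkdir, it accepts a -p flag that creates missing parents. A sketch against a running HDFS cluster (the nested path is just an illustration):

```shell
# -p creates any missing intermediate directories along the path
hdfs dfs -mkdir -p /20180523/logs/raw
```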
2.2 List directory contents: hdfs dfs -ls
[hadoop@hadoop002 hadoop]$ hdfs dfs -ls /
Found 5 items
drwxr-xr-x   - hadoop supergroup          0 2018-05-23 14:48 /20180523
-rw-r--r--   1 hadoop supergroup         25 2018-05-23 13:04 /gw_test.log3
drwxr-xr-x   - root   root                0 2018-05-23 13:16 /root
drwx------   - hadoop supergroup          0 2018-05-22 11:23 /tmp
drwxr-xr-x   - hadoop supergroup          0 2018-05-22 11:22 /user
[hadoop@hadoop002 hadoop]$
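To walk an entire directory tree rather than a single level, -ls accepts a -R flag; on recent Hadoop versions -h additionally prints sizes in human-readable units. A sketch (requires a running cluster):

```shell
# Recursively list everything under /user, with human-readable sizes
hdfs dfs -ls -R -h /user
```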
2.3 Upload files to HDFS: hdfs dfs -put
[hadoop@hadoop002 hadoop]$ cd ~
[hadoop@hadoop002 ~]$ ll
total 4
-rw-rw-r--. 1 hadoop hadoop  0 May 21 17:09 authorized_key
-rw-r--r--. 1 hadoop hadoop 25 May 23 12:17 gw_test.log3
[hadoop@hadoop002 ~]$ hdfs dfs -put gw_test.log3 /gw_test.log4
[hadoop@hadoop002 ~]$ hdfs dfs -ls /
Found 6 items
drwxr-xr-x   - hadoop supergroup          0 2018-05-23 14:48 /20180523
-rw-r--r--   1 hadoop supergroup         25 2018-05-23 13:04 /gw_test.log3
-rw-r--r--   1 hadoop supergroup         25 2018-05-23 14:50 /gw_test.log4
drwxr-xr-x   - root   root                0 2018-05-23 13:16 /root
drwx------   - hadoop supergroup          0 2018-05-22 11:23 /tmp
drwxr-xr-x   - hadoop supergroup          0 2018-05-22 11:22 /user
2.4 Download files from HDFS: hdfs dfs -get
[hadoop@hadoop002 ~]$ ll
total 4
-rw-rw-r--. 1 hadoop hadoop  0 May 21 17:09 authorized_key
-rw-r--r--. 1 hadoop hadoop 25 May 23 12:17 gw_test.log3
[hadoop@hadoop002 ~]$ hdfs dfs -get /gw_test.log4
[hadoop@hadoop002 ~]$ ll
total 8
-rw-rw-r--. 1 hadoop hadoop  0 May 21 17:09 authorized_key
-rw-r--r--. 1 hadoop hadoop 25 May 23 12:17 gw_test.log3
-rw-r--r--. 1 hadoop hadoop 25 May 23 14:52 gw_test.log4
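With only a source argument, -get keeps the original file name in the current local directory. A second argument names the local destination; -copyToLocal behaves the same way. A sketch reusing the file from the example above (the local name is just an illustration):

```shell
# Download /gw_test.log4 to a different local file name
hdfs dfs -get /gw_test.log4 ./local_copy.log
```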
2.5 View file contents: hdfs dfs -cat
[hadoop@hadoop002 ~]$ hdfs dfs -cat /gw_test.log4
1111
2222
3333
4444
5555
[hadoop@hadoop002 ~]$
2.6 Copy files: hdfs dfs -cp
[hadoop@hadoop002 ~]$ hdfs dfs -cp /gw_test.log3 /20180523
[hadoop@hadoop002 ~]$ hdfs dfs -ls /20180523
Found 1 items
-rw-r--r--   1 hadoop supergroup         25 2018-05-23 14:55 /20180523/gw_test.log3
[hadoop@hadoop002 ~]$
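By default -cp refuses to overwrite an existing destination file; the -f flag forces the overwrite. A sketch reusing the paths above (requires a running cluster):

```shell
# Copy again, overwriting the existing destination copy
hdfs dfs -cp -f /gw_test.log3 /20180523
```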
2.7 Move files from source to destination: hdfs dfs -mv
[hadoop@hadoop002 ~]$ hdfs dfs -ls /
Found 6 items
drwxr-xr-x   - hadoop supergroup          0 2018-05-23 14:55 /20180523
-rw-r--r--   1 hadoop supergroup         25 2018-05-23 13:04 /gw_test.log3
-rw-r--r--   1 hadoop supergroup         25 2018-05-23 14:50 /gw_test.log4
drwxr-xr-x   - root   root                0 2018-05-23 13:16 /root
drwx------   - hadoop supergroup          0 2018-05-22 11:23 /tmp
drwxr-xr-x   - hadoop supergroup          0 2018-05-22 11:22 /user
[hadoop@hadoop002 ~]$ hdfs dfs -mv /gw_test.log4 /20180523
[hadoop@hadoop002 ~]$ hdfs dfs -ls /
Found 5 items
drwxr-xr-x   - hadoop supergroup          0 2018-05-23 14:57 /20180523
-rw-r--r--   1 hadoop supergroup         25 2018-05-23 13:04 /gw_test.log3
drwxr-xr-x   - root   root                0 2018-05-23 13:16 /root
drwx------   - hadoop supergroup          0 2018-05-22 11:23 /tmp
drwxr-xr-x   - hadoop supergroup          0 2018-05-22 11:22 /user
[hadoop@hadoop002 ~]$ hdfs dfs -ls /20180523
Found 2 items
-rw-r--r--   1 hadoop supergroup         25 2018-05-23 14:55 /20180523/gw_test.log3
-rw-r--r--   1 hadoop supergroup         25 2018-05-23 14:50 /20180523/gw_test.log4
[hadoop@hadoop002 ~]$
2.8 Delete files or directories from HDFS:
hdfs dfs -rm — delete a file
hdfs dfs -rm -r — delete a directory
[hadoop@hadoop002 ~]$ hdfs dfs -ls /20180523
Found 2 items
-rw-r--r--   1 hadoop supergroup         25 2018-05-23 14:55 /20180523/gw_test.log3
-rw-r--r--   1 hadoop supergroup         25 2018-05-23 14:50 /20180523/gw_test.log4
[hadoop@hadoop002 ~]$ hdfs dfs -rm /20180523/gw_test.log3
Deleted /20180523/gw_test.log3
[hadoop@hadoop002 ~]$ hdfs dfs -ls /20180523
Found 1 items
-rw-r--r--   1 hadoop supergroup         25 2018-05-23 14:50 /20180523/gw_test.log4
[hadoop@hadoop002 ~]$ hdfs dfs -rm -r /20180523
Deleted /20180523
[hadoop@hadoop002 ~]$ hdfs dfs -ls /
Found 4 items
-rw-r--r--   1 hadoop supergroup         25 2018-05-23 13:04 /gw_test.log3
drwxr-xr-x   - root   root                0 2018-05-23 13:16 /root
drwx------   - hadoop supergroup          0 2018-05-22 11:23 /tmp
drwxr-xr-x   - hadoop supergroup          0 2018-05-22 11:22 /user
[hadoop@hadoop002 ~]$
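When the HDFS trash feature is enabled, -rm moves files into the user's .Trash directory rather than deleting them immediately; the -skipTrash flag bypasses that and deletes permanently. A sketch (irreversible, requires a running cluster):

```shell
# Permanently delete the directory, bypassing the trash
hdfs dfs -rm -r -skipTrash /20180523
```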
2.9 Display the end of a file: hdfs dfs -tail
The -tail command prints the last kilobyte of a file to stdout; here the whole file appears because it is only 25 bytes.
[hadoop@hadoop002 ~]$ hdfs dfs -tail /gw_test.log3
1111
2222
3333
4444
5555
[hadoop@hadoop002 ~]$
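For files that are still being written, such as logs, -tail also supports a follow mode. A sketch reusing the file above (requires a running cluster):

```shell
# -f keeps the stream open and prints new data as the file grows,
# similar to tail -f on Linux; stop with Ctrl-C
hdfs dfs -tail -f /gw_test.log3
```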
2.10 Display the size of a file: hdfs dfs -du
[hadoop@hadoop002 ~]$ hdfs dfs -du /gw_test.log3
25  /gw_test.log3
[hadoop@hadoop002 ~]$
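When run on a directory, -du lists the size of each entry; two extra flags make it more useful for capacity checks. A sketch (requires a running cluster):

```shell
# -s prints a single summarized total for the path,
# -h prints it in human-readable units (K, M, G)
hdfs dfs -du -s -h /user
```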
2.11 Count directories and files: hdfs dfs -count
The output columns are DIR_COUNT, FILE_COUNT, CONTENT_SIZE, and PATHNAME.
[hadoop@hadoop002 ~]$ hdfs dfs -count /gw_test.log3
           0            1                 25 /gw_test.log3
2.12 Show free space in the file system: hdfs dfs -df
[hadoop@hadoop002 ~]$ hdfs dfs -df /
Filesystem                    Size    Used    Available  Use%
hdfs://hadoop002:9000  40028807168  704512  30802395136    0%
[hadoop@hadoop002 ~]$
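The raw byte counts above are hard to read at a glance; -df also accepts -h for human-readable units. A sketch (requires a running cluster):

```shell
# Report cluster capacity, usage, and free space in human-readable units
hdfs dfs -df -h /
```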
The above covers all of the common fs commands in Hadoop discussed in this article. Thank you for reading; I hope this content helps, and if you want to learn more, you are welcome to explore further.