2025-01-19 Update From: SLTechnology News&Howtos
Shulou(Shulou.com) 06/02 Report--
This article introduces how to use the hadoop fs commands. It should serve as a useful reference for interested readers; I hope you learn a lot from it. Now let the editor take you through it.
1. hadoop fs -fs [local | <file system URI>]: declares the file system hadoop uses. If it is not declared, the current configuration files are consulted in the following order: hadoop-default.xml inside the hadoop jar > hadoop-default.xml under $HADOOP_CONF_DIR > hadoop-site.xml under $HADOOP_CONF_DIR. Passing local makes the local file system hadoop's DFS; passing a URI makes that specific file system the DFS.
2. hadoop fs -ls <path>: equivalent to ls on the local system; lists the files in the specified directory and supports pattern matching. Output format: filename(full path) <r n> size, where n is the number of replicas and size is in bytes.
3. hadoop fs -lsr <path>: recursively lists files matching the pattern; similar to -ls, but descends into all subdirectories.
4. hadoop fs -du <path>: shows the total space (in bytes) used by files matching the pattern; equivalent to du -sb <path>/* for directories and du -b <path> for files under Unix. Output format: name(full path) size (in bytes).
5. hadoop fs -dus <path>: similar to -du, with the same output format, but equivalent to Unix du -sb.
6. hadoop fs -mv <src> <dst>: moves files to the specified target location. When src names multiple files, dst must be a directory.
7. hadoop fs -cp <src> <dst>: copies files to the target location. When src names multiple files, dst must be a directory.
8. hadoop fs -rm [-skipTrash] <src>: deletes the files matching the pattern; equivalent to rm under Unix.
9. hadoop fs -rmr [-skipTrash] <src>: recursively deletes all matching files and directories; equivalent to rm -rf under Unix.
10. hadoop fs -rmi [-skipTrash] <src>: equivalent to Unix rm -rfi.
11. hadoop fs -put <localsrc> ... <dst>: copies files from the local system to the DFS.
12. hadoop fs -copyFromLocal <localsrc> ... <dst>: equivalent to -put.
13. hadoop fs -moveFromLocal <localsrc> ... <dst>: equivalent to -put, except that the source file is deleted after copying.
14. hadoop fs -get [-ignoreCrc] [-crc] <src> <localdst>: copies files matching the pattern from the DFS to the local file system. If there are multiple files, localdst must be a directory.
15. hadoop fs -getmerge <src> <localdst>: as the name implies, copies multiple files from the DFS, merges them in sorted order, and writes them as a single file on the local file system.
16. hadoop fs -cat <src>: displays the contents of the file.
17. hadoop fs -copyToLocal [-ignoreCrc] [-crc] <src> <localdst>: equivalent to -get.
18. hadoop fs -mkdir <path>: creates a directory at the specified location.
19. hadoop fs -setrep [-R] [-w] <rep> <path>: sets the replication factor of files; the -R flag applies the setting recursively to subdirectories and files.
20. hadoop fs -chmod [-R] <MODE[,MODE]... | OCTALMODE> PATH...: changes file permissions; the -R flag applies the change recursively. MODE is a symbolic mode such as a+r, g-w, or u+rwx; OCTALMODE is a numeric mode such as 755.
21. hadoop fs -chown [-R] [OWNER][:[GROUP]] PATH...: changes the owner and group of files; -R applies the change recursively.
22. hadoop fs -chgrp [-R] GROUP PATH...: equivalent to -chown ... :GROUP ...
23. hadoop fs -count [-q] <path>: counts the number of files and the space they occupy. The output columns are DIR_COUNT, FILE_COUNT, CONTENT_SIZE, FILE_NAME; with -q, the quota columns QUOTA, REMAINING_QUOTA, SPACE_QUOTA, REMAINING_SPACE_QUOTA are added.
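The list above repeatedly maps hadoop fs commands to Unix counterparts (du -sb, rm -r, chmod 755, cat-style merging). Since hadoop fs itself needs a running HDFS cluster, the sketch below exercises only those local Unix analogues in a throwaway directory; the hadoop fs commands in the comments are the corresponding calls from the list, and all paths are purely illustrative.

```shell
#!/bin/sh
# Local sketch of the Unix counterparts cited in the command list.
# The hadoop fs lines in the comments would need a live cluster.
set -e
tmp=$(mktemp -d)

mkdir -p "$tmp/data"                      # ~ hadoop fs -mkdir /data
printf 'hello\n' > "$tmp/data/a.txt"      # ~ hadoop fs -put a.txt /data
cp "$tmp/data/a.txt" "$tmp/data/b.txt"    # ~ hadoop fs -cp /data/a.txt /data/b.txt
chmod 755 "$tmp/data"                     # ~ hadoop fs -chmod 755 /data

# ~ hadoop fs -dus /data (GNU du -sb; fall back to -sk on non-GNU systems)
du -sb "$tmp/data" 2>/dev/null || du -sk "$tmp/data"

# ~ hadoop fs -getmerge /data merged.txt: concatenate in sorted order
cat "$tmp/data"/*.txt > "$tmp/merged.txt"
cat "$tmp/merged.txt"

rm -r "$tmp"                              # ~ hadoop fs -rmr /data
```

On a real cluster the same sequence would be issued with the hadoop fs commands shown in the comments, against HDFS paths instead of the local temporary directory.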
Thank you for reading this article carefully. I hope "how to use the hadoop fs commands", shared by the editor, is helpful to everyone. Please continue to follow our industry information channel; more related knowledge is waiting for you!
© 2024 shulou.com SLNews company. All rights reserved.