2025-02-24 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)06/01 Report--
This article introduces the common Hadoop shell commands. Each section below names a command, explains what it does, and gives an example of its use.
1. start-all.sh starts the Hadoop daemons
hadoop fs -ls hdfs://cloud4:9000/user (cloud4 is the hostname, 9000 is the port number, / is the root directory, /user is the user directory)
hadoop fs -ls /user (the hdfs://cloud4:9000 prefix can be omitted)
2. -ls / -lsr lists the files in a directory
hadoop fs -ls / lists the files and folders in the root directory
hadoop fs -lsr / recursively lists all files and folders under the root directory
hadoop fs -ls with no path defaults to the current user's home directory on HDFS, e.g. /user/root (very convenient)
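Since `hadoop fs -ls` and `-lsr` mirror the Linux `ls` and `ls -R`, the plain-versus-recursive distinction can be sketched locally. This is a local-filesystem analogue, not the HDFS command itself, and the /tmp/demo_ls path is an arbitrary demo directory:

```shell
# Build a small tree: one top-level file, one file inside a subdirectory
rm -rf /tmp/demo_ls
mkdir -p /tmp/demo_ls/sub
touch /tmp/demo_ls/top.txt /tmp/demo_ls/sub/inner.txt

ls /tmp/demo_ls       # immediate children only: sub, top.txt
ls -R /tmp/demo_ls    # recursive, like hadoop fs -lsr: also shows sub/inner.txt
```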
3. -touchz creates a file
hadoop fs -touchz /hello (creates an empty file named hello)
4. -mkdir creates a directory
(In Linux, creating a multi-level directory requires mkdir -p, where -p creates the parent directories; hadoop's mkdir can create one or more directories without -p)
hadoop fs -mkdir /user (creates a user directory)
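The Linux behaviour mentioned above, where nested directories need the -p flag, can be shown locally (a sketch with an arbitrary /tmp/demo_mkdir path, not an HDFS command):

```shell
# Without -p, "mkdir /tmp/demo_mkdir/a/b/c" would fail when the parents are missing;
# with -p, all intermediate directories are created in one call
rm -rf /tmp/demo_mkdir
mkdir -p /tmp/demo_mkdir/a/b/c
```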
5. -text / -cat views the contents of a file
hadoop fs -text /hello
hadoop fs -cat /hello
6. -mv moves / renames
Moves a file to the specified HDFS directory. It takes two paths: the first is the source file, the second is the destination.
hadoop fs -mv /hello /user (file to directory: a move)
hadoop fs -mv /hello /user/hello (file to file: a rename)
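The move-versus-rename distinction works the same way as the Linux mv command, which can be demonstrated locally (arbitrary /tmp/demo_mv paths, not HDFS):

```shell
rm -rf /tmp/demo_mv
mkdir -p /tmp/demo_mv/user
touch /tmp/demo_mv/hello

# Destination is an existing directory: the file is moved into it, keeping its name
mv /tmp/demo_mv/hello /tmp/demo_mv/user

# Destination is a file path: the file is renamed
mv /tmp/demo_mv/user/hello /tmp/demo_mv/user/hello2
```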
7. -cp copies
Copies the specified HDFS file to the specified HDFS directory. It takes two paths: the first is the file to copy, the second is the destination.
hadoop fs -cp /user/hello /user/root
8. -rm / -rmr deletes files
-rm: deletes a file or an empty directory
hadoop fs -rm /user/hello
-rmr: deletes recursively
Recursively deletes the specified directory together with all its subdirectories and files
hadoop fs -rmr /user
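The same file-versus-recursive split exists in the Linux rm command, sketched here locally (arbitrary /tmp/demo_rm paths, not HDFS):

```shell
rm -rf /tmp/demo_rm
mkdir -p /tmp/demo_rm/dir
touch /tmp/demo_rm/file.txt /tmp/demo_rm/dir/inner.txt

rm /tmp/demo_rm/file.txt   # plain rm removes files; it refuses non-empty directories
rm -r /tmp/demo_rm/dir     # -r removes the directory and everything inside it
```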
9. -put uploads files
Copies a file from the local Linux filesystem to HDFS
hadoop fs -put hadoop-env.sh /user
Uploads the hadoop-env.sh file from the current Linux directory into /user on the HDFS server
The destination can also be a new file name
For example: hadoop fs -put hadoop-env.sh /hello uploads the file and stores it as /hello
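The two destination forms (into a directory, or to a new name) behave like the Linux cp command, shown here locally (arbitrary /tmp/demo_put paths, not HDFS):

```shell
rm -rf /tmp/demo_put
mkdir -p /tmp/demo_put/user
printf 'export X=1\n' > /tmp/demo_put/env.sh

# Destination is a directory: the copy keeps its original name
cp /tmp/demo_put/env.sh /tmp/demo_put/user/

# Destination is a file path: the copy is stored under the new name
cp /tmp/demo_put/env.sh /tmp/demo_put/hello
```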
10. -copyFromLocal copies from local
Same usage as -put
hadoop fs -copyFromLocal hadoop-env.sh /user
11. -moveFromLocal moves from local
Moves a file from the local Linux filesystem to HDFS (the local copy is removed)
hadoop fs -moveFromLocal /home/repine/hehe.txt /user
12. -getmerge merges and downloads to local
Merges the contents of all files in the specified HDFS directory into a single local Linux file
hadoop fs -getmerge /user /home/repine/abc.txt copies the contents of all files under /user into the local file /home/repine/abc.txt
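The merge step is essentially a concatenation, which can be sketched locally with cat (arbitrary /tmp/demo_merge paths; this is an analogue of getmerge, not the HDFS command):

```shell
rm -rf /tmp/demo_merge
mkdir -p /tmp/demo_merge/user
printf 'one\n' > /tmp/demo_merge/user/a.txt
printf 'two\n' > /tmp/demo_merge/user/b.txt

# Concatenate every file in the directory into one local file, like getmerge
cat /tmp/demo_merge/user/*.txt > /tmp/demo_merge/abc.txt
```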
13. -setrep / -setrep -R / -setrep -R -w sets the replication factor
Changes the number of replicas of a stored file; the replica count comes first, followed by the file path
hadoop fs -setrep 2 /user/hehe.txt sets the replication factor of /user/hehe.txt to 2
If the path is a directory, add the -R option to change the replicas of all files under it
hadoop fs -setrep -R 2 /user sets the replication factor of all files (not directories) under /user to 2
The -w option waits for the replication to finish before the command returns
hadoop fs -setrep -R -w 1 /user/hehe.txt
14. -du reports file sizes under a directory
hadoop fs -du / shows the size of each file under the root directory
hadoop fs -dus / shows the total size of all files under the directory (i.e. the size of the directory itself)
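The per-file versus single-total distinction parallels local size tools, sketched here with wc and du (arbitrary /tmp/demo_du paths, not HDFS):

```shell
rm -rf /tmp/demo_du
mkdir -p /tmp/demo_du
printf 'hello' > /tmp/demo_du/a.txt   # 5 bytes

wc -c /tmp/demo_du/a.txt   # per-file byte count, like hadoop fs -du
du -s /tmp/demo_du         # -s prints one total for the directory, like hadoop fs -dus
```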
15. -count counts files and directories
hadoop fs -count /usr recursively counts everything under the path; the output numbers are (total directories, total files, total size in bytes)
hadoop fs -lsr /usr can be used to verify the counts
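The same directory/file tallies can be reproduced locally with find, as a rough analogue of -count (arbitrary /tmp/demo_count paths, not HDFS):

```shell
rm -rf /tmp/demo_count
mkdir -p /tmp/demo_count/sub
touch /tmp/demo_count/f1.txt /tmp/demo_count/sub/f2.txt

find /tmp/demo_count -type d | wc -l   # directory count: 2 (the root of the tree and sub)
find /tmp/demo_count -type f | wc -l   # file count: 2
```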
16. -chmod / -chmod -R modifies file permissions
Works like chmod in the Linux shell; it modifies a file's permissions.
hadoop fs -chmod 777 /user/hehe.txt modifies the file's permissions
With the -R option it modifies the permissions of all files in a directory
hadoop fs -chmod -R 777 /user modifies the permissions of all files in the directory
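Since the option is modelled on the Linux chmod, the effect of the numeric 777 mode can be verified locally (arbitrary /tmp/demo_chmod.txt path, not HDFS):

```shell
rm -f /tmp/demo_chmod.txt
touch /tmp/demo_chmod.txt

chmod 777 /tmp/demo_chmod.txt   # 7 = read(4) + write(2) + execute(1), for owner, group, others
ls -l /tmp/demo_chmod.txt       # mode column now shows -rwxrwxrwx
```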
17. hadoop jar runs a jar on the command line
// essential for packaged and runnable programs
job.setJarByClass(WordCountApp.class);
On Linux:
hadoop jar <local path to XXX.jar> <HDFS input file or directory> <HDFS output directory>
18. Report basic HDFS statistics
bin/hadoop dfsadmin -report
19. Safe mode
bin/hadoop dfsadmin -safemode leave|enter|get|wait
20. Copy a file from HDFS to the local system
bin/hadoop dfs -get in getin
Copies the file in from HDFS to the local system and names it getin
This concludes the introduction to common Hadoop commands; pairing the descriptions above with hands-on practice is the best way to learn them.