Hadoop [basic commands for HDFS]


1. Switch to the hadoop user first, then: cd /usr/local/hadoop-0.20.2-cdh4u5/

Run ls.

You will see a bin directory; the commands live in that bin directory. Just as the JDK keeps commands such as java and javac in its bin directory, the hadoop command is also in the bin directory.
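As a quick sketch of step 1, assuming the account is literally named hadoop (an assumption, since the article only says "Hadoop user"):

su - hadoop                              # switch to the hadoop user (username assumed)
cd /usr/local/hadoop-0.20.2-cdh4u5/      # the install path used in this article
ls                                       # a bin directory shows up here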

2. cd bin

Run ls again.

You will see start-all.sh (starts the whole cluster), stop-all.sh (stops the whole cluster), start-dfs.sh (starts all HDFS processes), stop-dfs.sh (stops all HDFS processes), hadoop-daemon.sh (starts a daemon on a single, specified machine), and hadoop-daemons.sh (starts daemons on multiple machines).
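For example, from the bin directory, a start/stop round trip might look like this (a sketch; which daemons come up depends on your cluster configuration):

./start-dfs.sh                           # bring up the HDFS daemons (NameNode, DataNodes, SecondaryNameNode)
./stop-dfs.sh                            # shut them down again
./hadoop-daemon.sh start datanode        # start just one daemon on the current machine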

3. To run a Hadoop command you must be in Hadoop's bin directory. The format is: hadoop fs followed by a parameter (the sub-command).
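Running it with no parameter prints the usage list of sub-commands, which is a handy reference:

./hadoop fs                              # prints usage: -ls, -mkdir, -put, -get, -cat, -rm, -rmr, ...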

4. To execute a script in the bin directory, it must be prefixed with ./ to run; don't forget this. Also note that ./ and hadoop are written together as ./hadoop, not separated by a space.
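In other words, from inside the bin directory:

./hadoop fs -ls /                        # works: ./ tells the shell to run the script in this directory
hadoop fs -ls /                          # only works if this bin directory happens to be on your PATH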

5. Now for the actual command:

./hadoop fs -ls hdfs://h203:9000/

(fs stands for file system, -ls is followed by the directory whose contents you want to view, and the slash after 9000 is the root directory.)

It can also be written another way: ./hadoop fs -ls /

(The output is the same: my hostname is h203, and on my cluster the / after -ls defaults to the full path hdfs://h203:9000/, so the two commands are equivalent.)
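Where does that default come from? It is the fs.default.name property in conf/core-site.xml; a minimal sketch for this cluster, with the hostname and port taken from the command above, would be:

<configuration>
  <!-- default filesystem URI; a bare / in hadoop fs commands resolves against this -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://h203:9000</value>
  </property>
</configuration>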

6. Someone might object that the tmp shown in this listing is your local one, but it is not; only the plain command: ls / lists local filesystem entries such as srv, tmp, and so on.
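To see the difference side by side:

ls /                                     # local Linux root: bin, etc, srv, tmp, usr, ...
./hadoop fs -ls /                        # HDFS root: only what has been created in HDFS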

7. Create a directory: ./hadoop fs -mkdir /user (this creates a directory named user)

./hadoop fs -ls / (check that it is there)

Remember, this is in Hadoop, so it is not plain ls but ./hadoop fs -ls /.

8. With user created, I want to create another directory inside it. The command is: ./hadoop fs -mkdir /user/hadoop (created level by level).

Note that if I run ./hadoop fs -ls / to look for it, I will not see it; that only shows the user directory. I have to run ./hadoop fs -ls /user to see it.

How do you know the hadoop entry just created is a directory? Because the listing shows "drwxr-xr-x", and the leading d means directory.
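The listing looks roughly like this (owner, group, size, and timestamp here are only illustrative):

./hadoop fs -ls /user
Found 1 items
drwxr-xr-x   - hadoop supergroup          0 2014-06-03 10:00 /user/hadoop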

9. Upload files (a consolidated sketch follows the sub-steps below):

(1) First create a file locally: vi a.txt

(2) Write some content and save it.

(3) cat a.txt to confirm the content.

(4) Now upload it to the hadoop directory under user on the cluster.

(5) Upload command: ./hadoop fs -put a.txt /user/hadoop

(6) Now check again: ./hadoop fs -ls /user/hadoop/

(7) It has been uploaded. How do I know it is a file? Look at the front of the listing, "-rw-r--r--"; the leading - means a file.

(8) Next, view the contents of the file: ./hadoop fs -cat /user/hadoop/a.txt

(9) If someone claims that this a.txt is the local one, delete the local file with rm -rf a.txt. The local file is gone, but ./hadoop fs -cat /user/hadoop/a.txt still shows its contents, which proves that a.txt was uploaded to the cluster.
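Putting the sub-steps of 9 together (echo is used here instead of vi just to keep the sketch short; the file content is arbitrary):

echo "hello hdfs" > a.txt                # create a small local file
./hadoop fs -put a.txt /user/hadoop      # upload it into HDFS
./hadoop fs -ls /user/hadoop/            # the file shows up with -rw-r--r--
./hadoop fs -cat /user/hadoop/a.txt      # prints: hello hdfs
rm -rf a.txt                             # delete the local copy
./hadoop fs -cat /user/hadoop/a.txt      # still prints the content, read from HDFS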

10. The local a.txt has been deleted. To download the file from the cluster back to the local machine: ./hadoop fs -get /user/hadoop/a.txt ./ Then cat a.txt shows that it is back on the local disk.

11. Delete the file: ./hadoop fs -rm /user/hadoop/a.txt

The command prints Deleted hdfs: followed by the file's path, which means it has been deleted.

12. Create a directory under the home directory: ./hadoop fs -mkdir /user/hadoop/aaaa

Check the directory: ./hadoop fs -ls /user/hadoop/

13. Delete the directory: ./hadoop fs -rmr /user/hadoop/aaaa

14. Delete my hadoop directory: ./hadoop fs -rmr /user/hadoop

(-rm deletes files, while -rmr deletes files and directories recursively, so it covers both.)
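For example (a sketch; in this Hadoop version -rm refuses to remove a directory and tells you to use -rmr):

./hadoop fs -rm /user/hadoop             # fails: /user/hadoop is a directory, -rm is for files
./hadoop fs -rmr /user/hadoop            # removes the directory and everything under it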

15. Create it again: ./hadoop fs -mkdir /user/hadoop/aaaa

Check that it has been created: ./hadoop fs -ls /user/hadoop (note that this single mkdir created two levels at once, since /user/hadoop had just been deleted).
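To confirm that the single mkdir brought both levels back:

./hadoop fs -ls /user                    # /user/hadoop is there again
./hadoop fs -ls /user/hadoop             # and /user/hadoop/aaaa inside it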
