How to configure Hdfs in Hadoop

2025-02-24 Update From: SLTechnology News&Howtos

Shulou (Shulou.com) 06/02 Report --

In this article, the editor shares how to configure HDFS in Hadoop. Most people are not very familiar with it, so it is shared here for your reference; I hope you learn a lot from reading it. Let's get started!

1. Configure the slaves file. Edit conf/slaves and list the worker (DataNode) hosts, one per line, for example:

data-dfs
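The conf/slaves file is nothing more than a list of worker hosts, one per line. A minimal sketch, assuming the two DataNode IP addresses used later in this walkthrough:

```shell
# conf/slaves -- one DataNode hostname or IP address per line
192.168.1.12
192.168.1.13
```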

Copy the hadoop-2.20.1 directory and all of its files to the DataNode and Client machines:

scp -rp /data/soft/hadoop-2.20.1 192.168.1.12:/data/soft

scp -rp /data/soft/hadoop-2.20.1 192.168.1.13:/data/soft
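With more than a couple of workers, the scp commands above are easier to generate in a loop. A minimal sketch, using the same source path as the walkthrough; `gen_scp_cmds` is a hypothetical helper (not part of Hadoop) that only prints the commands, so you can review them before running anything:

```shell
# Print one scp command per worker host; pipe the output to `sh` to execute.
# The source path and host IPs below are the walkthrough's examples.
gen_scp_cmds() {
    for host in "$@"; do
        echo "scp -rp /data/soft/hadoop-2.20.1 $host:/data/soft"
    done
}

gen_scp_cmds 192.168.1.12 192.168.1.13
```

Running `gen_scp_cmds 192.168.1.12 192.168.1.13 | sh` would perform the actual copies.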

2. Format the HDFS file system's NameNode. Enter the hadoop directory and run:

[master@hadoop]$ bin/hadoop namenode -format

When prompted, enter Y. (Pay attention to case: it must be an uppercase Y, or formatting the file system will not succeed.)

Barring surprises, it should report that the format succeeded. If it does not, check the log files in the hadoop/logs/ directory.

3. Start the Hdfs distributed file system

With Hadoop HDFS configured, the next step is to start the distributed file system. Go to the hadoop directory; there are several startup scripts under bin/ that you can run as needed.

* start-all.sh starts all Hadoop daemons, including the namenode, datanodes, jobtracker, and tasktrackers

* stop-all.sh stops all Hadoop daemons

* start-mapred.sh starts the Map/Reduce daemons: the jobtracker and tasktrackers

* stop-mapred.sh stops the Map/Reduce daemons

* start-dfs.sh starts the Hadoop DFS daemons: the namenode and datanodes

* stop-dfs.sh stops the DFS daemons

To start all daemons:

[master@hadoop]$ bin/start-all.sh  # automatically finds the IPs or hostnames in the conf/slaves file and starts the corresponding slave servers

Similarly, to stop hadoop:

[master@hadoop]$ bin/stop-all.sh
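After start-all.sh, each daemon should show up in the output of jps, the JDK's Java process lister. A small sketch of a health check; `check_daemons` is a hypothetical helper (not part of Hadoop) that reads a jps-style listing on stdin, so it can be tried without a live cluster:

```shell
# Report which of the expected Hadoop daemons appear in a jps-style listing.
# Typical use on the master node:  jps | check_daemons
check_daemons() {
    listing=$(cat)
    for d in NameNode DataNode SecondaryNameNode JobTracker TaskTracker; do
        # -w matches whole words, so "NameNode" does not match "SecondaryNameNode"
        if printf '%s\n' "$listing" | grep -qw "$d"; then
            echo "$d: running"
        else
            echo "$d: MISSING"
        fi
    done
}

# Example with a canned listing, similar to what jps might print on a master:
printf '1234 NameNode\n1300 SecondaryNameNode\n1350 JobTracker\n' | check_daemons
```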

4. HDFS operations

Let's look at what you can do with HDFS now that it is configured. Go to the Hadoop directory and run the hadoop command in bin/ to see all of the operations Hadoop supports and their usage. Here are a few simple examples.

Create a directory

[master@hadoop]$ bin/hadoop dfs -mkdir testdir

This creates a directory called testdir in HDFS.

Copy a file

[master@hadoop]$ bin/hadoop dfs -put /home/dbrg/large.zip testfile.zip

This copies the local file large.zip to the HDFS home directory /user/dbrg/, where it is named testfile.zip.

View existing files

[master@hadoop]$ bin/hadoop dfs -ls

If you terminate a running hadoop program partway through, subsequent attempts to add or delete files in HDFS may fail with a "Name node is in safe mode" error:

rmr: org.apache.hadoop.dfs.SafeModeException: Cannot delete /user/hadoop/input. Name node is in safe mode

The command to resolve this:

bin/hadoop dfsadmin -safemode leave  # leave safe mode

This concludes the introduction to configuring HDFS in Hadoop.
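Besides leave, dfsadmin -safemode takes a few other subcommands. A quick sketch of the usual ones, assuming the same Hadoop 0.20-era syntax and running from the Hadoop install directory:

```shell
# Safe mode controls on the NameNode:
bin/hadoop dfsadmin -safemode get    # report whether safe mode is currently on
bin/hadoop dfsadmin -safemode wait   # block until the NameNode leaves safe mode on its own
bin/hadoop dfsadmin -safemode leave  # force the NameNode out of safe mode
bin/hadoop dfsadmin -safemode enter  # manually put the NameNode into safe mode
```

`wait` is often preferable to `leave` after a restart, since the NameNode normally exits safe mode by itself once enough block reports have arrived.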

That covers everything in "How to configure Hdfs in Hadoop". Thank you for reading! I hope this content has been helpful.
