

Hadoop Installation and File Configuration Tutorial

2025-02-01 Update From: SLTechnology News&Howtos


Shulou (Shulou.com) 06/01 Report --

This article explains the "hadoop installation file configuration tutorial". The content is simple, clear, and easy to follow; work through the steps below with the editor to study the tutorial in depth.

1. This is currently a stand-alone environment: the NameNode and DataNode run on the same machine. The Hadoop version is 2.7.2 and the JDK package is jdk-8u131-linux-x64.rpm.

2. Install the JDK

rpm -ivh jdk-8u111-linux-x64.rpm
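You can optionally confirm that the JDK is available before continuing:

java -version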

3. Generate an SSH key

ssh-keygen -t rsa

The .ssh directory is automatically generated under the root user's home directory.

4. Write the public key into authorized_keys
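A minimal example, assuming the default RSA key path from step 3 (on a single machine it is enough to append the local public key):

cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys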

5. Modify the permissions of the .ssh directory and authorized_keys
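sshd expects restrictive permissions for passwordless login; a common choice is:

chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys

You can then test with ssh localhost, which should no longer prompt for a password.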

6. Turn off the firewall
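The command depends on the distribution; assuming a CentOS/RHEL 7 system (the RPM-based JDK install suggests an RPM-based distribution), this would be:

systemctl stop firewalld
systemctl disable firewalld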

7. Extract the hadoop installation package

tar zxf hadoop-2.7.2.tar.gz
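The environment variables in the next step assume the extracted directory sits under /hadoop_soft, so if you unpacked it elsewhere, move it there first, for example:

mkdir -p /hadoop_soft
mv hadoop-2.7.2 /hadoop_soft/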

8. Modify /etc/profile

# java
export JAVA_HOME=/usr/java/default
export PATH=$PATH:$JAVA_HOME/bin

# hadoop
export HADOOP_HOME=/hadoop_soft/hadoop-2.7.2
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/hadoop_soft/hadoop-2.7.2/lib/native"
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib:$HADOOP_COMMON_LIB_NATIVE_DIR"
## export LD_LIBRARY_PATH=/hadoop_soft/hadoop-2.7.2/lib/native/:$LD_LIBRARY_PATH
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
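Reload the profile so the new variables take effect in the current shell:

source /etc/profile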

9. Modify the configuration files under hadoop-2.7.2/etc/hadoop/

(1) core-site.xml (fs.defaultFS is the node name and address of the NameNode):

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://192.168.1.120:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/hadoop_soft/hadoop-2.7.2/current/tmp</value>
  </property>
  <property>
    <name>fs.trash.interval</name>
    <value>4320</value>
  </property>
</configuration>
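Note that fs.trash.interval is given in minutes, so 4320 keeps deleted files in the HDFS trash for three days before they are purged.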

(2) hdfs-site.xml:

<configuration>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>/hadoop_soft/hadoop-2.7.2/current/dfs/name</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>/hadoop_soft/hadoop-2.7.2/current/data</value>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.webhdfs.enabled</name>
    <value>true</value>
  </property>
  <property>
    <name>dfs.permissions.superusergroup</name>
    <value>staff</value>
  </property>
  <property>
    <name>dfs.permissions.enabled</name>
    <value>false</value>
  </property>
</configuration>
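dfs.replication is set to 1 because this is a single-node setup (see step 1): there is only one DataNode available to hold each block replica.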

(3) yarn-site.xml:

<configuration>
  <property>
    <name>yarn.resourcemanager.hostname</name>
    <value>192.168.1.115</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
    <value>org.apache.hadoop.mapred.ShuffleHandler</value>
  </property>
  <property>
    <name>yarn.resourcemanager.address</name>
    <value>192.168.1.120:18040</value>
  </property>
  <property>
    <name>yarn.resourcemanager.scheduler.address</name>
    <value>192.168.1.120:18030</value>
  </property>
  <property>
    <name>yarn.resourcemanager.resource-tracker.address</name>
    <value>192.168.1.120:18025</value>
  </property>
  <property>
    <name>yarn.resourcemanager.admin.address</name>
    <value>192.168.1.120:18141</value>
  </property>
  <property>
    <name>yarn.resourcemanager.webapp.address</name>
    <value>192.168.1.120:18088</value>
  </property>
  <property>
    <name>yarn.log-aggregation-enable</name>
    <value>true</value>
  </property>
  <property>
    <name>yarn.log-aggregation.retain-seconds</name>
    <value>86400</value>
  </property>
  <property>
    <name>yarn.log-aggregation.retain-check-interval-seconds</name>
    <value>86400</value>
  </property>
  <property>
    <name>yarn.nodemanager.remote-app-log-dir</name>
    <value>/tmp/logs</value>
  </property>
  <property>
    <name>yarn.nodemanager.remote-app-log-dir-suffix</name>
    <value>logs</value>
  </property>
</configuration>

(4) Copy mapred-site.xml.template to mapred-site.xml and edit it as follows:
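For example, assuming the current directory is hadoop-2.7.2/etc/hadoop:

cp mapred-site.xml.template mapred-site.xml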

<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
  <property>
    <name>mapreduce.jobtracker.http.address</name>
    <value>192.168.1.120:50030</value>
  </property>
  <property>
    <name>mapreduce.jobhistory.address</name>
    <value>192.168.1.120:10020</value>
  </property>
  <property>
    <name>mapreduce.jobhistory.webapp.address</name>
    <value>192.168.1.120:19888</value>
  </property>
  <property>
    <name>mapreduce.jobhistory.done-dir</name>
    <value>/jobhistory/done</value>
  </property>
  <property>
    <name>mapreduce.jobhistory.intermediate-done-dir</name>
    <value>/jobhistory/done_intermediate</value>
  </property>
  <property>
    <name>mapreduce.job.ubertask.enable</name>
    <value>true</value>
  </property>
</configuration>

(5) Edit the slaves file and add the IP of the host:

192.168.1.120

(6) In hadoop-env.sh, find the line that sets JAVA_HOME and set it explicitly.
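For example, matching the JAVA_HOME already used in /etc/profile above:

export JAVA_HOME=/usr/java/default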

10. Format the file system

hdfs namenode -format

11. Start Hadoop: hadoop-2.7.2/sbin/start-all.sh

12. Verify with jps:

6433 NameNode

6532 DataNode

7014 NodeManager

6762 SecondaryNameNode

6910 ResourceManager

7871 Jps

13. Basic hadoop commands

hadoop fs -mkdir /hadoop-test

hadoop fs -find / -name hadoop-test

hadoop fs -put NOTICE.txt /hadoop-test/

hadoop fs -rm -R
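To confirm that the put succeeded, you can list the test directory (the output will vary with your files):

hadoop fs -ls /hadoop-test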

Thank you for reading. The above is the content of the "hadoop installation file configuration tutorial". After studying this article, I believe you have a deeper understanding of the topic, though the specific usage still needs to be verified in practice. The editor will continue to push more articles on related knowledge points; welcome to follow!



