
Installing a Hadoop Cluster on CentOS 6.5


Environment preparation: 4 machines

192.168.217.174 node1 (namenode)
192.168.217.175 node2 (secondary namenode)
192.168.217.176 node3 (datanode)
192.168.217.177 node4 (datanode)

1. Passwordless login among the 4 machines

1.1 - Set up passwordless login on node1 first

On every machine, add the IP address and hostname of all four nodes to /etc/hosts (vim /etc/hosts):

192.168.217.174 node1

192.168.217.175 node2

192.168.217.176 node3

192.168.217.177 node4
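
Before generating keys, it is worth confirming that the new names actually resolve. A minimal check (not in the original write-up; it assumes the four /etc/hosts entries above):

# for h in node1 node2 node3 node4; do ping -c 1 $h > /dev/null && echo "$h ok"; done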

# ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa           # 1. generate the key pair

# cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys    # 2. append the public key to authorized_keys

1.2 - Set up passwordless login on the other 3 machines

Copy the public key generated on node1 to the other three machines, then append it to each machine's authorized_keys file.

# scp ~/.ssh/id_dsa.pub root@node2:/opt/             # run on node1; repeat for node3 and node4

# ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa           # run on each of the other 3 machines

# cat /opt/id_dsa.pub >> ~/.ssh/authorized_keys      # run on all 3 machines
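
With the keys in place, a loop like this (run from node1; a sketch assuming the hostnames configured above) should print each remote hostname without asking for a password:

# for h in node2 node3 node4; do ssh $h hostname; done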

A possible problem with the copy: if scp fails with "bash: scp: command not found", install the OpenSSH client tools on all 4 machines:

# yum install openssh-clients

2. Start the installation

2.1 - Install the JDK (on all 4 machines)

# upload jdk-7u79-linux-x64.rpm to the /home directory

# rpm -ivh jdk-7u79-linux-x64.rpm

# cd /home/hadoop-2.5.1
# vim ~/.bash_profile                                # add the environment variables

export JAVA_HOME=/usr/java/jdk1.7.0_79
export PATH=$PATH:$JAVA_HOME/bin
export HADOOP_HOME=/home/hadoop-2.5.1
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

# source ~/.bash_profile
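
After sourcing the profile, two quick checks (standard verification, not part of the original steps) confirm the variables took effect:

# java -version                                      # should report java version "1.7.0_79"
# echo $JAVA_HOME $HADOOP_HOME                       # should print the two paths set above

Note that $HADOOP_HOME points at a directory that only exists once the tarball in step 2.2 is unpacked.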

2.2 - Install Hadoop

# upload hadoop-2.5.1_x64.tar.gz to the /home directory (operate on node1)

# tar zxvf hadoop-2.5.1_x64.tar.gz

# vim /home/hadoop-2.5.1/etc/hadoop/hadoop-env.sh    # set the JDK path for the Hadoop daemons

export JAVA_HOME=/usr/java/jdk1.7.0_79               (add this line)

# vim /home/hadoop-2.5.1/etc/hadoop/core-site.xml

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://node1:9000</value>        <!-- namenode host and port for data upload -->
    </property>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/opt/hadoop-2.5</value>          <!-- storage path -->
    </property>
</configuration>

# vim /home/hadoop-2.5.1/etc/hadoop/hdfs-site.xml

<configuration>
    <property>
        <name>dfs.namenode.secondary.http-address</name>
        <value>node2:50090</value>              <!-- secondary namenode -->
    </property>
    <property>
        <name>dfs.namenode.secondary.https-address</name>
        <value>node2:50091</value>
    </property>
</configuration>
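
With $HADOOP_HOME/bin on the PATH from step 2.1, the getconf tool gives a quick sanity check that both XML files parse and the values took effect (a verification step added here, not in the original article):

# hdfs getconf -confKey fs.defaultFS                 # should print hdfs://node1:9000
# hdfs getconf -confKey dfs.namenode.secondary.http-address   # should print node2:50090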

# vim /home/hadoop-2.5.1/etc/hadoop/slaves           # configure the datanode hostnames

node2
node3
node4

# vim masters                                        # create and configure the secondary namenode hostname

node2

# rm -rf hadoop-2.5.1/share/doc/                     # delete the docs directory to speed up copying to the other machines

2.3 - Copy to the other machines

In the /home directory:

# scp -r hadoop-2.5.1 root@node2:/home/              # repeat for node3 and node4

In /home/hadoop-2.5.1:

# scp ~/.bash_profile root@node2:/root/              # repeat for node3 and node4
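
The two scp steps can be collapsed into one loop (a sketch, assuming root SSH access to node2-node4 as set up in step 1):

# for h in node2 node3 node4; do scp -r /home/hadoop-2.5.1 root@$h:/home/; scp ~/.bash_profile root@$h:/root/; done

Remember to run source ~/.bash_profile on each remote machine afterwards so the new variables take effect there too.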

Format the HDFS file system before starting:

# hdfs namenode -format                              # run only on the designated namenode machine (node1)

# start-dfs.sh                                       # can be run from any directory
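
Once start-dfs.sh returns, jps on each node should show the expected daemon (a verification step not in the original; roles follow the layout at the top, with node2 also in the slaves file):

# jps                                                # on node1: NameNode
# ssh node2 jps                                      # SecondaryNameNode, DataNode
# ssh node3 jps                                      # DataNode
# ssh node4 jps                                      # DataNode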

Visit in a browser: http://192.168.217.174:50070

(If the page does not open, stop the firewall: # service iptables stop)
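
Note that service iptables stop only lasts until the next reboot; on CentOS 6 the firewall can also be kept off permanently (acceptable for a lab cluster, not for production):

# chkconfig iptables off                             # keep iptables off across reboots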

At this point, the Hadoop cluster installation is complete.
