Hadoop installation and configuration

2025-02-24 Update From: SLTechnology News&Howtos

Shulou(Shulou.com)06/03 Report--

1. Check the JDK version
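A quick way to check the JDK on each node is the sketch below; it only reports what is installed (the later steps assume a JDK at /usr/java/jdk1.7.0_79):

```shell
# Report the installed JDK version, or a clear message if none is found.
if command -v java >/dev/null 2>&1; then
    java -version 2>&1 | head -n 1
else
    echo "no JDK found - install one before continuing"
fi
```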

2. Time synchronization
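One common way to synchronize clocks across the nodes is a one-shot NTP sync; this is a sketch, assuming the ntpdate tool is installed and the nodes can reach a public NTP pool (neither is stated in the original):

```shell
# Show the current local time on this node, so clock drift between
# nodes can be compared before syncing.
date
# One-shot sync against a public NTP pool (requires root and network access):
#   ntpdate pool.ntp.org
# For continuous synchronization, run the NTP daemon instead:
#   service ntpd start
```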

3. Set up passwordless login

Generate a key on each node (node1, node2, node3, node4):

    ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
    cd ~/.ssh
    ls
    # id_dsa      private key
    # id_dsa.pub  public key

Append the public key to the local authentication file on each node, then verify:

    cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
    ssh 192.168.2.136
    # Last login: Sun Jul 9 14:14:35 2017 from 192.168.2.136
    # (a login without a password prompt means it worked)
    exit
    # logout

Copy node1's public key to node2, node3 and node4:

    scp ./id_dsa.pub root@192.168.2.137:/opt/
    scp ./id_dsa.pub root@192.168.2.138:/opt/
    scp ./id_dsa.pub root@192.168.2.139:/opt/

On node2, node3 and node4, append /opt/id_dsa.pub to the authentication file:

    cat /opt/id_dsa.pub >> ~/.ssh/authorized_keys

4. Set node1 as the NameNode and node2, node3, node4 as DataNodes

Upload the Hadoop package to the /root directory of node1 and unpack it:

    tar -zxvf hadoop-2.5.1_x64.tar.gz
    mv hadoop-2.5.1 /home/
    cd /home/hadoop-2.5.1
    cd etc/hadoop

Configuration 1: modify hadoop-env.sh, adding the JDK installation directory:

    vi hadoop-env.sh
    export JAVA_HOME=/usr/java/jdk1.7.0_79

Configuration 2: modify core-site.xml:

    vi core-site.xml

    <configuration>
        <property>
            <name>fs.defaultFS</name>
            <value>hdfs://192.168.2.136:9000</value>
        </property>
        <property>
            <name>hadoop.tmp.dir</name>
            <value>/opt/hadoop-2.5</value>
        </property>
    </configuration>

Configuration 3: modify hdfs-site.xml:

    vi hdfs-site.xml

    <configuration>
        <property>
            <name>dfs.namenode.secondary.http-address</name>
            <value>192.168.2.137:50090</value>
        </property>
        <property>
            <name>dfs.namenode.secondary.https-address</name>
            <value>192.168.2.137:50091</value>
        </property>
    </configuration>

Configuration 4: modify slaves:

    vi slaves
    192.168.2.137
    192.168.2.138
    192.168.2.139

Configuration 5: modify masters:

    vi masters
    192.168.2.137

5. Copy the Hadoop directory to the other nodes

    scp -r hadoop-2.5.1/ root@192.168.2.137:/home/
    scp -r hadoop-2.5.1/ root@192.168.2.138:/home/
    scp -r hadoop-2.5.1/ root@192.168.2.139:/home/

6. Configure the Hadoop environment variables

    vi ~/.bash_profile
    export HADOOP_HOME=/home/hadoop-2.5.1
    export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

7. Copy the profile to the other nodes

    scp ~/.bash_profile root@192.168.2.137:/root/
    scp ~/.bash_profile root@192.168.2.138:/root/
    scp ~/.bash_profile root@192.168.2.139:/root/

8. Reload ~/.bash_profile

    source ~/.bash_profile

9. Format the HDFS file system, on the NameNode (node1) only:

    hdfs namenode -format

10. View the generated fsimage file

    cd /opt/hadoop-2.5/dfs/name/current
    ls -l

11. Start the nodes

    start-dfs.sh
    # or: start-all.sh

12. Turn off the firewall

    service iptables stop

13. Monitoring page: visit http://192.168.2.136:50070/dfshealth.html#tab-overview in a browser

14. Note: make sure the entries under /etc/hosts are correct on every node.
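For example, /etc/hosts on every node might map hostnames to the four IPs used throughout this tutorial. The hostnames node1 through node4 are assumptions for illustration; only the IP addresses appear in the original steps:

```
192.168.2.136   node1
192.168.2.137   node2
192.168.2.138   node3
192.168.2.139   node4
```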
