2025-03-01 Update From: SLTechnology News&Howtos
Shulou (Shulou.com) 05/31 Report --
This article walks through deploying a Hadoop 2.2.0 cluster environment step by step. It should be a useful reference for anyone setting up a small test cluster; I hope you learn a lot from reading it.
1. On Windows, install VMware Workstation 9.
2. In the VM Virtual Network Editor, add network VMnet0 (NAT) with subnet IP 192.168.2.0.
3. Install Debian 6 with the most minimal installation; do not install extra packages from the image.
4. After the installation completes, log in to the Debian system.
5. apt-cache search openssh
6. apt-get install openssh-server
7. Connect to the Debian guest with PuTTY.
8. apt-get install gcc
9. apt-get install g++
10. apt-get install sudo
11. groupadd hadoop
12. useradd -g hadoop hadoop
13. passwd hadoop
14. As root, run vi /etc/sudoers and add the line: hadoop ALL=(ALL:ALL) ALL
15. mkdir /home/hadoop
16. chown hadoop:hadoop /home/hadoop
17. su - hadoop
18. ssh-keygen -t rsa -P ""
19. cat /home/hadoop/.ssh/id_rsa.pub >> /home/hadoop/.ssh/authorized_keys
20. chown hadoop:hadoop /home/hadoop/.ssh/authorized_keys
21. chmod 600 /home/hadoop/.ssh/authorized_keys
22. ssh localhost (it should now log in without prompting for a password)
23. Upload jdk-7u45-linux-i586.tar.gz to the root directory.
24. Upload hadoop-2.2.0.tar.gz to the root directory.
25. tar -zxvf /root/jdk-7u45-linux-i586.tar.gz -C /usr/java/
26. tar -zxvf /root/hadoop-2.2.0.tar.gz -C /opt/
27. mv /opt/hadoop-2.2.0/ /opt/hadoop
28. In vi .bashrc, remove the # before the export LS_OPTIONS...-lA' lines, then run su - root again so the settings take effect.
29. vi /etc/profile and add:
export JAVA_HOME=/usr/java/jdk1.7.0_45
export JRE_HOME=${JAVA_HOME}/jre
export HADOOP_HOME=/opt/hadoop
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=${JAVA_HOME}/bin:$HADOOP_HOME/bin:$PATH
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export HADOOP_CONF_DIR=$HADOOP_HOME/conf
export YARN_CONF_DIR=$HADOOP_HOME/conf
Then run su - root again so the settings take effect, and run java to check that they did.
30. chown hadoop:hadoop /opt/hadoop
31. mkdir /opt/hadoop/conf
32. cp /opt/hadoop/etc/hadoop/* /opt/hadoop/conf
33. vi /opt/hadoop/conf/mapred-env.sh and add: export JAVA_HOME=/usr/java/jdk1.7.0_45
34. vi /opt/hadoop/conf/yarn-env.sh and add: export JAVA_HOME=/usr/java/jdk1.7.0_45
35. vi /opt/hadoop/conf/hadoop-env.sh and add: export JAVA_HOME=/usr/java/jdk1.7.0_45
36. vi /opt/hadoop/conf/slaves and add slave1, slave2, and slave3, one hostname per line (note the line breaks).
37. Edit the four XML configuration files under conf: core-site.xml, hdfs-site.xml, mapred-site.xml, and yarn-site.xml.
38. vi /etc/hosts and add:
192.168.2.128 master
192.168.2.129 slave1
192.168.2.130 slave2
192.168.2.131 slave3
39. /opt/hadoop/bin/hdfs namenode -format
40. In VMware, clone master into multiple slaves; in this example, there are three.
41. Start all four virtual machines: master and slave1-3.
42. /opt/hadoop/sbin/start-dfs.sh (run as the hadoop user)
43. /opt/hadoop/sbin/start-yarn.sh (run as the hadoop user). Then check "Live Nodes" in the web UI; the cluster started successfully if it shows 3.
Thank you for reading this article carefully. I hope this walkthrough of deploying a Hadoop 2.2.0 cluster environment is helpful to you.
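Step 37 names the four XML files but not their contents. A minimal sketch of values that work for a small Hadoop 2.2.0 test cluster follows; the hostname master and port 9000 are assumptions chosen to match the /etc/hosts entries in step 38, and dfs.replication is set to 3 to match the three slaves.

```xml
<!-- core-site.xml : where clients find the NameNode -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://master:9000</value>
  </property>
</configuration>

<!-- hdfs-site.xml : one replica per slave in this 3-node example -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
</configuration>

<!-- mapred-site.xml : run MapReduce jobs on YARN -->
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>

<!-- yarn-site.xml : ResourceManager location and the shuffle service -->
<configuration>
  <property>
    <name>yarn.resourcemanager.hostname</name>
    <value>master</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
</configuration>
```

Each fragment replaces the empty `<configuration>` element in the corresponding file under /opt/hadoop/conf.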
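The passwordless-SSH setup in steps 18-21 is the part most often gotten wrong. The sketch below reproduces those commands in a throwaway temporary directory so it can be run safely anywhere; on the real cluster the same commands operate on /home/hadoop/.ssh as the hadoop user.

```shell
# Reproduce the key setup from steps 18-21 in a temporary directory
# (the real cluster uses /home/hadoop/.ssh as the hadoop user).
KEYDIR=$(mktemp -d)

# -P '' sets an empty passphrase so ssh never prompts; -q suppresses chatter.
ssh-keygen -t rsa -P '' -f "$KEYDIR/id_rsa" -q

# Authorize our own public key, then lock the file down to the owner:
# sshd refuses authorized_keys files that are group- or world-readable.
cat "$KEYDIR/id_rsa.pub" >> "$KEYDIR/authorized_keys"
chmod 600 "$KEYDIR/authorized_keys"

ls "$KEYDIR"
```

After repeating this for real under /home/hadoop/.ssh, step 22's `ssh localhost` should log in without a password prompt.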
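Steps 36 and 38 must agree with each other: the slaves file needs exactly one worker hostname per line, and /etc/hosts must map every one of those names (plus master) to an IP on the VMnet0 subnet. A small sketch that writes both fragments with the addresses from the article:

```shell
# Write the slaves file read by start-dfs.sh / start-yarn.sh:
# one worker hostname per line, nothing else.
printf 'slave1\nslave2\nslave3\n' > slaves

# Matching /etc/hosts entries (VMnet0 subnet 192.168.2.0, as in step 2).
cat > hosts.fragment <<'EOF'
192.168.2.128 master
192.168.2.129 slave1
192.168.2.130 slave2
192.168.2.131 slave3
EOF

wc -l < slaves   # number of worker nodes listed
```

The hosts.fragment contents get appended to /etc/hosts on every node, so master and all slaves resolve each other by name.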
© 2024 shulou.com SLNews company. All rights reserved.