
CentOS 6.5: install Hadoop 2.7.6 (1 master, 2 data nodes)

2025-02-24 Update From: SLTechnology News&Howtos

Shulou(Shulou.com) 06/03 Report--

The environment is a CentOS 6.5 minimal installation. Hadoop comes from the hadoop-2.7.6.tar.gz package downloaded from the Hadoop website, and Java from the JDK 1.8.0_172 package downloaded from the official Oracle website.

I followed other people's guides but still ran into several problems, which are noted below.

1. Three CentOS 6.5 virtual machines on VMware

192.168.17.205 Master

192.168.17.206 hadoop1

192.168.17.207 hadoop2

Add the same user to the three hosts and set the password:

Add user

useradd hadoop

Set the password

passwd hadoop

Install and configure passwordless SSH login

Ensure that the master host can log in to the datanode machines without a password.

ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
ssh-copy-id -i ~/.ssh/id_rsa.pub 192.168.17.206

Test:

[root@Master ~]# ssh 192.168.17.206
Last login: Mon Apr 23 12:56:33 2018 from 192.168.17.1
[root@hadoop1 ~]#
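The guide only copies the key to 192.168.17.206, but the same must be done for the second datanode. A minimal sketch, assuming the two datanode IPs listed above and that the remote password is still known at this point:

```shell
# copy the master's public key to every datanode (IPs are from this guide)
for node in 192.168.17.206 192.168.17.207; do
    ssh-copy-id -i ~/.ssh/id_rsa.pub "$node"
done
```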

Installation and uninstallation of JDK

Uninstall JDK

# check the current JDK installation
rpm -qa | grep gcj
rpm -qa | grep jdk
rpm -e --nodeps java-1.7.0-openjdk-1.7.0.45-2.4.3.3.el6.x86_64
rpm -e --nodeps java-1.6.0-openjdk-1.6.0.0-1.66.1.13.0.el6.x86_64
# query the java-related directories in the current system
whereis java
# java: /etc/java /usr/lib/java /usr/share/java
# delete the directories returned by the query
rm -fr /usr/share/java
rm -fr /usr/lib/java
rm -fr /etc/java

Install JDK

Create a java directory under /usr, download the JDK, and extract it into the /usr/java directory:

cd /usr
mkdir java
cd java
wget http://download.oracle.com/otn-pub/java/jdk/8u102-b14/jdk-8u102-linux-x64.tar.gz
tar -zxvf jdk-8u172-linux-x64.tar.gz

Note that the wget download from Oracle may come back as an HTML page instead of the tarball, so you may have to download the JDK manually in a browser and then upload it to the server.

Edit the /etc/profile file with vim and append at the end:

JAVA_HOME=/usr/java/jdk1.8.0_172
PATH=$JAVA_HOME/bin:$PATH
CLASSPATH=$JAVA_HOME/jre/lib/ext:$JAVA_HOME/lib/tools.jar
export PATH JAVA_HOME CLASSPATH

Make the /etc/profile file take effect without rebooting the operating system:

source /etc/profile

Check the java installation:

java -version

2. Install Hadoop 2.7.6

Download hadoop-2.7.6

cd /tmp

wget http://apache.fayea.com/hadoop/common/hadoop-2.7.6/hadoop-2.7.6.tar.gz

Decompress hadoop-2.7.6.tar.gz

tar -zxvf hadoop-2.7.6.tar.gz

Copy the file to the / usr directory

cp -R /tmp/hadoop-2.7.6 /usr/hadoop

Configure the hadoop environment variables by appending to /etc/profile:

vi /etc/profile

Append the following environment variable settings

export HADOOP_HOME=/usr/hadoop
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
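After appending these lines, the installation can be sanity-checked from a fresh shell; a sketch, assuming the copy to /usr/hadoop above succeeded:

```shell
# reload the profile and confirm the hadoop binaries are on the PATH
source /etc/profile
hadoop version
```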

Modify the etc/hadoop/hadoop-env.sh file

vi etc/hadoop/hadoop-env.sh

# The java implementation to use.
#export JAVA_HOME=${JAVA_HOME}
export JAVA_HOME=/usr/java/jdk1.8.0_172

Modify the etc/hadoop/core-site.xml file

<configuration>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>file:/usr/hadoop/tmp</value>
        <description>Abase for other temporary directories.</description>
    </property>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://Master:9000</value>
    </property>
</configuration>

Modify the etc/hadoop/hdfs-site.xml file

<configuration>
    <property>
        <name>dfs.namenode.secondary.http-address</name>
        <value>Master:50090</value>
    </property>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <property>
        <name>dfs.namenode.name.dir</name>
        <value>file:/usr/hadoop/tmp/dfs/name</value>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <value>file:/usr/hadoop/tmp/dfs/data</value>
    </property>
</configuration>

Modify the etc/hadoop/yarn-site.xml file

<configuration>
    <property>
        <name>yarn.resourcemanager.hostname</name>
        <value>Master</value>
    </property>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
</configuration>

Modify the etc/hadoop/mapred-site.xml file

<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
    <property>
        <name>mapreduce.jobhistory.address</name>
        <value>Master:10020</value>
    </property>
    <property>
        <name>mapreduce.jobhistory.webapp.address</name>
        <value>Master:19888</value>
    </property>
</configuration>

Modify the etc/hadoop/slaves file and add:

192.168.17.206

192.168.17.207

Package the /usr/hadoop folder and copy it to the datanode machines, making sure that each node's environment configuration matches the master's. Then format the file system:
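The package-and-copy step can be sketched as follows; this is an assumption-laden sketch that uses the hostnames from /etc/hosts and relies on the passwordless SSH configured earlier:

```shell
# package /usr/hadoop on the master and ship it to both datanodes
cd /usr
tar -czf hadoop.tar.gz hadoop
for node in hadoop1 hadoop2; do
    scp hadoop.tar.gz "$node":/usr/
    ssh "$node" "cd /usr && tar -xzf hadoop.tar.gz && rm hadoop.tar.gz"
done
```

Remember that /etc/profile, the JDK, and the hadoop user must also already be in place on each datanode.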

hdfs namenode -format

Start the file service

start-dfs.sh

Make the HDFS directories required to execute MapReduce jobs:

Turn off the file service

stop-dfs.sh

Create a user file system folder

hdfs dfs -mkdir /user

hdfs dfs -mkdir /user/hadoop

Copy local files to the input directory on the distributed file system:

hdfs dfs -mkdir /input

hdfs dfs -put etc/hadoop/*.xml /input

3. Problems encountered

3.1 After installation, the warning "WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable" is always printed. To fix it, append the following line to the end of the hadoop-env.sh file:

export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib:$HADOOP_COMMON_LIB_NATIVE_DIR"
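Whether the native library is actually being picked up can be checked with Hadoop's built-in diagnostic command (the exact output depends on the build):

```shell
# list which native libraries this Hadoop build can load
hadoop checknative -a
```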

3.2 Hadoop shows Live Nodes: 0. A lot has been written about this online, mainly:

1. Turn off safemode: bin/hadoop dfsadmin -safemode leave
2. Check whether the storage space is sufficient: df -hl
3. Reformat HDFS: a) delete the logs directory in the hadoop directory on the master and all slaves, then recreate it; b) delete the HDFS storage directory on the master and all slaves (the directory pointed to by the hadoop.tmp.dir property in the core-site.xml configuration file), then recreate it
4. Check whether the firewall is off: service iptables status; to close it: service iptables stop
5. Re-check the configuration files.

I tried all of the methods above and the problem remained. What finally solved it: check the /etc/hosts file on each machine and delete any useless or ambiguous ip:name pairs, leaving only:

[root@hadoop1 hadoop]# cat /etc/hosts
127.0.0.1 localhost
192.168.17.205 Master
192.168.17.206 hadoop1
192.168.17.207 hadoop2
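After cleaning /etc/hosts, the datanode count can be verified from the master; a sketch, assuming HDFS is restarted first:

```shell
# restart HDFS and check how many datanodes registered;
# "Live datanodes (2)" is the expected result for this cluster
stop-dfs.sh
start-dfs.sh
hdfs dfsadmin -report | grep -i "live datanodes"
```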
