CentOS 6.4: install Hadoop 2.4.1


# turn off the firewall

service iptables stop
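"service iptables stop" only lasts until the next reboot; on CentOS 6 the firewall can also be kept off across reboots with chkconfig (an optional extra step, not in the original notes):

chkconfig iptables off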

1: install JDK

1.1 upload jdk-7u65-linux-i586.tar.gz

1.2 decompress the jdk

# create a folder

mkdir /home/hadoop/app

# decompress the archive

tar -zxvf jdk-7u65-linux-i586.tar.gz -C /home/hadoop/app

1.3 add java to the environment variables

sudo vim /etc/profile

# add at the end of the file

export JAVA_HOME=/home/hadoop/app/jdk1.7.0_65

export PATH=$PATH:$JAVA_HOME/bin

# refresh the configuration

source /etc/profile
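A quick optional check that the JDK is now on the PATH (it should report version 1.7.0_65):

java -version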

2: install hadoop2.4.1
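The original notes jump straight to configuration; assuming the hadoop-2.4.1.tar.gz archive has already been uploaded like the JDK, unpacking it follows the same pattern as step 1.2 (a sketch, not part of the original listing):

tar -zxvf hadoop-2.4.1.tar.gz -C /home/hadoop/app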

Note: the hadoop 2.x configuration files live in $HADOOP_HOME/etc/hadoop

Pseudo-distributed mode requires modifying five configuration files.

2.1 configure hadoop (the files below are under /home/hadoop/app/hadoop-2.4.1/etc/hadoop)

The first one: hadoop-env.sh

vim hadoop-env.sh

# line 27

export JAVA_HOME=/home/hadoop/app/jdk1.7.0_65

The second one: core-site.xml (weekend-1206-01 below is this machine's hostname)

<property>
    <name>fs.defaultFS</name>
    <value>hdfs://weekend-1206-01:9000</value>
</property>
<property>
    <name>hadoop.tmp.dir</name>
    <value>/home/hadoop/app/hadoop-2.4.1/data</value>
</property>

Third: hdfs-site.xml (hdfs-default.xml sets dfs.replication to 3 by default; use 1 for pseudo-distributed mode)

<property>
    <name>dfs.replication</name>
    <value>1</value>
</property>

Fourth: mapred-site.xml (created by renaming mapred-site.xml.template)

mv mapred-site.xml.template mapred-site.xml

vim mapred-site.xml

<property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
</property>

The fifth: yarn-site.xml (again using this machine's hostname, weekend-1206-01)

<property>
    <name>yarn.resourcemanager.hostname</name>
    <value>weekend-1206-01</value>
</property>
<property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
</property>

3: start hadoop

3.1 add hadoop to the environment variables: vim /etc/profile

export JAVA_HOME=/home/hadoop/app/jdk1.7.0_65

export HADOOP_HOME=/home/hadoop/app/hadoop-2.4.1

export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

# source /etc/profile
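An optional sanity check that the new PATH entries took effect:

# hadoop version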

3.2 format the namenode (initializes the namenode)

# hadoop namenode -format

3.3 start the daemons (the start scripts are in /home/hadoop/app/hadoop-2.4.1/sbin)

Start HDFS first

# start-dfs.sh

Then start YARN

# start-yarn.sh

Use the jps command to verify

# jps
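If all of the daemons came up in this pseudo-distributed setup, jps should list roughly the following processes (process IDs omitted; they will differ on each run):

NameNode
DataNode
SecondaryNameNode
ResourceManager
NodeManager
Jps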

[Note: when setting up an actual cluster, add the corresponding slave hostnames to /home/hadoop/app/hadoop-2.4.1/etc/hadoop/slaves (vim /home/hadoop/app/hadoop-2.4.1/etc/hadoop/slaves).]

4: browser access:

http://ip:50070 (HDFS management interface)
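Not mentioned above, but hadoop 2.x also serves a YARN ResourceManager web UI, by default at:

http://ip:8088 (YARN management interface)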
