How to install hadoop 2.5.1 on a virtual machine


This article shows how to install hadoop 2.5.1 on a virtual machine. The content is easy to understand and clearly laid out; I hope it helps resolve any doubts you have about the process.

1. Virtual machine installation

System version: RHEL 6.3

2. Virtual machine IP configuration

Using shared mode (nat):

Default:

Using vmnet8, the virtual machine gets Internet access in DHCP mode: select "obtain an IP address automatically" for the vmnet8 adapter under Windows, and enable the DHCP client on the Linux guest.
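For reference, a minimal sketch of the DHCP variant of the interface file (assuming the same eth0 interface edited in the static setup below):

DEVICE=eth0
BOOTPROTO=dhcp
ONBOOT=yes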

Set up manually:

IP: same network segment as vmnet8

Gateway: the vmnet8 gateway address

DNS server: same as the host

This enables communication between the virtual machines, the host, and the Internet.

vim /etc/sysconfig/network-scripts/ifcfg-eth0

...
BOOTPROTO=static
ONBOOT="yes"
IPADDR=192.168.234.100      # same network segment as the vmnet8 gateway
NETMASK=255.255.255.0
GATEWAY=192.168.234.2       # the vmnet8 gateway
...

Finally, don't forget to set DNS: echo "nameserver 8.8.8.8" > /etc/resolv.conf

Restart the machine: reboot

Once set up, you can connect with SecureCRT and ping external hosts.
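A quick way to check each hop once the machine is back up (a sketch; any external host works for the last test):

ping -c 3 192.168.234.2     # the vmnet8 gateway
ping -c 3 8.8.8.8           # an external IP, checks routing
ping -c 3 www.baidu.com     # also checks DNS resolution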

Yum source configuration

Per the documentation, use CentOS's yum repositories for upgrades and software installation.

rpm -qa | grep yum                            # check which yum packages are installed

rpm -qa | grep yum | xargs rpm -e --nodeps    # uninstall RHEL's original yum packages

Install the yum packages:

1. # rpm -ivh python-iniparse-0.3.1-2.1.el6.noarch.rpm

2. # rpm -ivh yum-metadata-parser-1.1.2-16.el6.x86_64.rpm

3. # rpm -ivh yum-3.2.29-30.el6.centos.noarch.rpm yum-plugin-fastestmirror-1.1.30-14.el6.noarch.rpm

Since CentOS 6.3 is no longer updated on the official site and its files no longer exist, the yum packages from CentOS 6.5 are used instead.

Also download the NetEase (163) mirror repo file from http://mirrors.163.com/.help/CentOS6-Base-163.repo and change $releasever in it to 6.5.

Delete /etc/yum.repos.d/rhel-source.repo

Replace with centos-163.repo
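A minimal sketch of that swap (the exact download, sed, and cache-refresh commands are my own additions, not from the original steps):

wget http://mirrors.163.com/.help/CentOS6-Base-163.repo -O /etc/yum.repos.d/centos-163.repo
sed -i 's/\$releasever/6.5/g' /etc/yum.repos.d/centos-163.repo
rm -f /etc/yum.repos.d/rhel-source.repo
yum clean all && yum makecache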

scp centos-163.repo root@192.168.234.112:/etc/yum.repos.d/

Install rz/sz (lrzsz) to make it easy to upload and download files to and from the local machine:

yum -y install lrzsz
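Once installed, uploads and downloads work directly from the SecureCRT session (somefile below is just a placeholder name):

rz            # opens a dialog to upload a local file into the current directory
sz somefile   # sends a file from the server down to the local machine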

Transfer all dependent files to other servers

scp -r dep root@192.168.234.112:/home/hadoop/

Install java and configure environment variables

java -version                                  # check the existing java version

rpm -qa | grep java                            # list the installed java packages

rpm -qa | grep java | xargs rpm -e --nodeps    # uninstall the existing java packages
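The JDK installation step itself is not shown in the original; a minimal sketch, assuming the Oracle JDK 7u45 RPM named jdk-7u45-linux-x64.rpm (it installs into the /usr/java/jdk1.7.0_45 path used in the profile below):

rpm -ivh jdk-7u45-linux-x64.rpm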

vim /etc/profile    # add the java environment variable configuration

# set java path

JAVA_HOME=/usr/java/jdk1.7.0_45

JRE_HOME=/usr/java/jdk1.7.0_45/jre

PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin

CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$JRE_HOME/lib

export JAVA_HOME JRE_HOME PATH CLASSPATH

source /etc/profile    # apply the configuration
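To confirm the new settings are picked up:

java -version       # should now report 1.7.0_45
echo $JAVA_HOME     # /usr/java/jdk1.7.0_45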

Other machines use the same configuration

scp /etc/profile root@192.168.234.112:/etc/profile

Configure hosts

192.168.234.114 namenode

192.168.234.111 datanode1

192.168.234.112 datanode2

192.168.234.113 datanode3

The configuration is the same on the namenode and every datanode:

scp /etc/hosts root@192.168.234.112:/etc/hosts

Configure ssh

Reference: http://www.cnblogs.com/xia520pi/archive/2012/05/16/2503949.html

ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa

cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys

scp the public key to each datanode, for example as sketched below.
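A minimal sketch, assuming root logins and the hostnames from /etc/hosts (repeat for datanode2 and datanode3; ssh-copy-id would also work where available):

scp ~/.ssh/id_dsa.pub root@datanode1:~/
ssh root@datanode1 "mkdir -p ~/.ssh && cat ~/id_dsa.pub >> ~/.ssh/authorized_keys && chmod 600 ~/.ssh/authorized_keys"

Afterwards, ssh datanode1 should log in without prompting for a password.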

Install hadoop

tar -xvf dep/hadoop/hadoop-2.5.1.tar.gz

Configure hadoop

Reference: http://blog.csdn.net/greensurfer/article/details/39450369

Add the data nodes to the slaves file in /home/hadoop/hadoop-2.5.1/etc/hadoop:

datanode1

datanode2

datanode3
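The original relies on the linked post for the rest of the configuration; as a rough sketch (the values here are assumptions for this four-node layout, not taken from the original), core-site.xml and hdfs-site.xml in the same etc/hadoop directory might contain something like:

<!-- core-site.xml -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://namenode:9000</value>
  </property>
</configuration>

<!-- hdfs-site.xml -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
</configuration>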

Configure hostname

Handling errors when formatting the namenode:

Reference: http://blog.csdn.net/shirdrn/article/details/6562292

Modify the hostname in /etc/sysconfig/network, then restart the network service for it to take effect:

service network restart

There are three ways to modify hostname: http://www.cnblogs.com/yuechaotian/archive/2013/02/20/2918175.html

Format the file system

bin/hdfs namenode -format

To access http://192.168.234.114:50070/ you must remember to turn off the firewall on the namenode, otherwise the page cannot be reached:

service iptables stop    (check with: service iptables status)
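Note that service iptables stop only lasts until the next reboot; to keep the firewall off across reboots on RHEL 6:

chkconfig iptables off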

The same applies to http://192.168.234.114:8088/ (the YARN ResourceManager web UI).

However, starting ./start-dfs.sh always prints a warning:

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... Using builtin-java classes where applicable

Starting namenodes on [namenode]

The hadoop fs -mkdir /input command also reported an error.

Turn on debug mode

export HADOOP_ROOT_LOGGER=DEBUG,console

The debug output shows that the cause is a glibc version that is too old.

Reference: http://f.dataguru.cn/thread-376971-1-1.html

There are two solutions.

1. Recompile hadoop in a glibc 2.12 environment

2. Install glibc 2.14 or above

I'll use the second one here.
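Before touching glibc, it is worth confirming what the system currently has (a quick check, not part of the original steps; RHEL 6.3 ships glibc 2.12):

ldd --version | head -1                    # prints the glibc version
strings /lib64/libc.so.6 | grep ^GLIBC_    # lists the version symbols the library provides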

Upgrade glibc to 2.14 or later. Do not uninstall glibc before upgrading: once it is removed, every command stops working and the machine fails to boot with a "Kernel panic - not syncing: Attempted to kill init" error, and I spent a long time reinstalling the system.

After reinstalling the system on the namenode, copy hadoop back over from a datanode.

Reboot after formatting the file system, otherwise the data nodes will not be found.

1. Download:

glibc-2.15-60.el6.x86_64.rpm

glibc-common-2.15-60.el6.x86_64.rpm

2. Install:

rpm -ivh glibc-2.15-60.el6.x86_64.rpm glibc-common-2.15-60.el6.x86_64.rpm --replacefiles

Without --replacefiles, rpm reports a file conflict error.

Rerun ./start-dfs.sh and the warning no longer appears.

hadoop fs -mkdir /input    # create the directory

hadoop fs -ls /            # list the files

Hadoop error: NoRouteToHostException: No route to host

Reference: http://blog.csdn.net/duguduchong/article/details/8471314

Turn off the firewall on the datanode machines.

Test mapreduce

Reference http://www.cnblogs.com/xia520pi/archive/2012/05/16/2504205.html

Create a folder input, then create the test files test1 and test2:

mkdir input && cd input

echo "a b c d e" > test1

echo "a b c d f" > test2

hadoop fs -put input /

You can see that the files have been added to HDFS.

Run WordCount

cd /home/hadoop/hadoop-2.5.1/share/hadoop/mapreduce

hadoop jar hadoop-mapreduce-examples-2.5.1.jar wordcount /input /output

Or, with the jar I packaged myself:

hadoop jar wordct.jar org.WordCount /input /output

hadoop fs -cat /output/part-r-00000
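For the two test files above ("a b c d e" and "a b c d f"), the output is simply the merged word counts:

a	2
b	2
c	2
d	2
e	1
f	1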

That is everything in "how to install hadoop 2.5.1 on a virtual machine". Thank you for reading! I hope the content shared here helps you.
