
Example Analysis of yarn and mapreduce in Hadoop


This article walks through an example setup of YARN and MapReduce in Hadoop, from preparing the Linux environment through running a WordCount job. I hope you find it a useful reference.

1. Prepare the Linux environment

Right-click the VMware shortcut and open the file location -> double-click vmnetcfg.exe -> select VMnet1 (host-only) -> modify the subnet IP, setting the address range to 192.168.1.0 and the subnet mask to 255.255.255.0 -> Apply -> OK

Back in Windows -> open Network and Sharing Center -> Change adapter settings -> right-click VMnet1 -> Properties -> double-click IPv4 -> set the Windows-side IP to 192.168.1.100 and the subnet mask to 255.255.255.0 -> click OK

In the virtualization software: My Computer -> select the virtual machine -> right-click -> Settings -> Network Adapter -> Host-only -> OK

1.1 modify hostname

vim /etc/sysconfig/network

NETWORKING=yes

HOSTNAME=itcast
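
To make the new hostname take effect in the current session without waiting for the reboot in step 1.5, it can also be set at runtime (a small optional addition, not in the original steps):

# apply the hostname immediately for the current session
hostname itcast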

1.2 modify IP

There are two ways:

First: modify it through the Linux graphical interface (highly recommended)

In the Linux graphical interface -> right-click the two small computer icons in the top-right corner -> click Edit Connections -> select the current connection System eth0 -> click Edit -> select the IPv4 tab -> set Method to Manual -> click Add -> enter IP 192.168.1.101, subnet mask 255.255.255.0, gateway 192.168.1.1 -> Apply

Second: modify the configuration file directly (the hardcore programmer way)

vim /etc/sysconfig/network-scripts/ifcfg-eth0

DEVICE="eth0"
BOOTPROTO="static"
HWADDR="00:0C:29:3C:BF:E7"
IPV6INIT="yes"
NM_CONTROLLED="yes"
ONBOOT="yes"
TYPE="Ethernet"
UUID="ce22eeca-ecde-4536-8cc2-ef0dc36d4a8c"
IPADDR="192.168.1.101"
NETMASK="255.255.255.0"
GATEWAY="192.168.1.1"
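
The new static address only takes effect once the network service is reloaded. A minimal check, assuming the classic CentOS 6 service tools used throughout this guide:

# restart the network service so the static IP takes effect
service network restart
# confirm eth0 now carries 192.168.1.101
ifconfig eth0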

1.3 modify the mapping relationship between hostname and IP

vim /etc/hosts

192.168.1.101 hadoop02
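
A quick sanity check that the mapping works (assuming the line above has been saved):

# the hostname should now resolve to the static IP
ping -c 3 hadoop02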

1.4 turn off the firewall

# view the firewall status
service iptables status

# turn off the firewall
service iptables stop

# check whether the firewall starts at boot
chkconfig iptables --list

# prevent the firewall from starting at boot
chkconfig iptables off

1.5 restart Linux

reboot

2. Install JDK

2.1 Upload the JDK package: press alt+p in the SSH client to open the sftp window, then run: put d:\xxx\yy\ll\jdk-7u_65-i585.tar.gz

2.2 decompress jdk

# create a folder
mkdir /home/hadoop/app

# extract the archive
tar -zxvf jdk-7u55-linux-i586.tar.gz -C /home/hadoop/app
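
To see which directory the archive actually extracted to (its name is needed for JAVA_HOME in the next step):

ls /home/hadoop/app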

2.3 add java to the environment variable

vim /etc/profile

# add at the end of the file
export JAVA_HOME=/home/hadoop/app/jdk-7u_65-i585
export PATH=$PATH:$JAVA_HOME/bin

# refresh the configuration
source /etc/profile
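
A quick verification, assuming the JAVA_HOME path above matches the directory the JDK actually extracted to:

# both should succeed if the environment variables are correct
java -version
echo $JAVA_HOME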

3. Install Hadoop 2.4.1

First upload the Hadoop installation package to the server under /home/hadoop/ (note the difference between the 32-bit and 64-bit builds).

cp hadoop-2.4.1.tar.gz /home/hadoop/
cd /home/hadoop/
tar -zxvf hadoop-2.4.1.tar.gz
[root@hadoop02 hadoop]# cd hadoop-2.4.1
[root@hadoop02 hadoop-2.4.1]# cd etc
[root@hadoop02 etc]# cd hadoop

Pseudo-distributed mode requires modifying five configuration files.

3.1 Configure hadoop

The first file: hadoop-env.sh

vim hadoop-env.sh

# line 27: point JAVA_HOME at the JDK installed in step 2
export JAVA_HOME=/home/hadoop/app/jdk-7u_65-i585

The second file: core-site.xml

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://hadoop02:9000</value>
    </property>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/home/hadoop/hadoop-2.4.1/tmp</value>
    </property>
</configuration>

The third file: hdfs-site.xml (the default replication in hdfs-default.xml is 3)

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>

The fourth file: mapred-site.xml (created by renaming the template)

mv mapred-site.xml.template mapred-site.xml
vim mapred-site.xml

<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
</configuration>

The fifth file: yarn-site.xml

<configuration>
    <property>
        <name>yarn.resourcemanager.hostname</name>
        <value>hadoop02</value>
    </property>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
</configuration>

3.2 add hadoop to the environment variable

vim /etc/profile

export JAVA_HOME=/home/hadoop/app/jdk-7u_65-i585
export HADOOP_HOME=/home/hadoop/hadoop-2.4.1
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

source /etc/profile
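
To confirm the new variables took effect in the current shell:

# hadoop should now be resolvable from any directory
hadoop version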

3.3 Format the namenode (initialize the namenode)

hdfs namenode -format (or the older form: hadoop namenode -format)
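
If the format succeeds, the namenode metadata is written under the hadoop.tmp.dir configured in core-site.xml (assuming the default name-directory layout):

# the formatted metadata should now exist
ls /home/hadoop/hadoop-2.4.1/tmp/dfs/name/current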

3.4 start hadoop

Start HDFS first

[root@hadoop02 hadoop]# cd /home/hadoop/hadoop-2.4.1
sbin/start-dfs.sh

Then start YARN

sbin/start-yarn.sh
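
One practical note (an optional addition, not part of the original steps): the start scripts ssh into the local machine to launch each daemon, so without passwordless SSH you will be prompted for the root password several times. A minimal sketch:

# optional: set up passwordless SSH to avoid repeated password prompts
ssh-keygen -t rsa
ssh-copy-id hadoop02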

3.5 verify that the startup is successful

Use the jps command to verify

27408 NameNode
28218 Jps
27643 SecondaryNameNode
28066 NodeManager
27803 ResourceManager
27512 DataNode

http://192.168.1.101:50070 (HDFS management interface)

http://192.168.1.101:8088 (MR/YARN management interface)

Testing the pseudo-distributed cluster

[root@hadoop02 hadoop-2.6.0]# hadoop fs -ls hdfs://hadoop02:9000/
[root@hadoop02 hadoop]# cd /home/hadoop
[root@hadoop02 hadoop]# hadoop fs -put hadoop-2.6.0-x64.tar.gz hdfs://hadoop02:9000/
[root@hadoop02 hadoop]# hadoop fs -ls hdfs://hadoop02:9000/

-rw-r--r-- 1 root supergroup 180197035 2015-12-01 05:06 hdfs://hadoop02:9000/hadoop-2.6.0-x64.tar.gz

File storage directory on the datanode:

[root@hadoop02 subdir0]# pwd

/ home/hadoop/hadoop-2.6.0/tmp/dfs/data/current/BP-493792533-192.168.88.155-1448973412341/current/finalized/subdir0/subdir0

Download the file back from HDFS:

hadoop fs -get /hadoop-2.6.0-x64.tar.gz

MapReduce program testing

[root@hadoop02 subdir0]# cd /home/hadoop/hadoop-2.6.0/share/hadoop/mapreduce
[root@hadoop02 mapreduce]# vi test.data

The contents of the file are as follows:

Hello tom
Hello jim
Hello kitty
Hello angelababy
Hello dingcheng
How are you .baby .do you have kong today wangshan?
Can you have supper with me?
If you can't
Na sure you are gen xiaoming haave supper
I will shengqi de

[root@hadoop02 mapreduce]# hadoop fs -mkdir /wc
[root@hadoop02 mapreduce]# hadoop fs -mkdir /wc/srcdata
[root@hadoop02 mapreduce]# hadoop fs -put test.data /wc/srcdata
[root@hadoop02 mapreduce]# hadoop fs -ls /
[root@hadoop02 mapreduce]# hadoop fs -ls /wc/srcdata
[root@hadoop02 mapreduce]# hadoop jar hadoop-mapreduce-examples-2.6.0.jar wordcount /wc/srcdata /wc/output
[root@hadoop02 mapreduce]# hadoop fs -ls /wc/output
hadoop fs -cat /wc/output/part-r-00000
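
For reference, wordcount writes tab-separated word/count pairs. With the test.data above (case-sensitive, whitespace-delimited tokens), the output should include lines such as:

Hello	5
...
you	5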

hadoop fs    # running hadoop fs with no arguments prints the usage help for the fs subcommands

Thank you for reading this article carefully. I hope this "Example Analysis of yarn and mapreduce in Hadoop" has been helpful to you.
