Detailed explanation of the process of installing Hadoop

2025-02-28 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/03 Report--

Download the Hadoop tarball:

wget http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-1.2.1/hadoop-1.2.1.tar.gz

Install the JDK (see http://www.linuxidc.com/Linux/2014-08/105906.htm).

Install Hadoop. Enter the configuration directory:

/root/zby/hadoop/hadoop-1.2.1/conf

Configuring Hadoop mainly means editing three configuration files: core-site.xml, hdfs-site.xml, and mapred-site.xml. Counting hadoop-env.sh, four files need to be edited.

The first file, hadoop-env.sh, needs the JDK path set:

export HADOOP_HEAPSIZE=256   # limits the memory Hadoop uses
# export JAVA_HOME=/usr/lib/jvm/jdk7   # uncomment this line and point it at your JDK

If you don't know the path, you can find it with the following command:

[root@iZ28c21psoeZ conf]# echo $JAVA_HOME
/usr/lib/jvm/jdk7
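It is worth confirming that the directory in JAVA_HOME really contains a java binary, since a stale value here is exactly what causes the "No such file or directory" errors shown further down. A minimal sketch, using the /usr/lib/jvm/jdk7 path from this post:

```shell
# Check that the JAVA_HOME we are about to put into hadoop-env.sh is real.
JH=/usr/lib/jvm/jdk7   # path from this post; substitute your own
if [ -x "$JH/bin/java" ]; then
  echo "ok: $JH/bin/java exists"
else
  echo "missing: $JH/bin/java"
fi
```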

The second file: open core-site.xml and replace its contents directly (the Chinese comments in the original were deleted before pasting):

cd /opt/hadoop-1.2.1/conf
vim core-site.xml

<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/hadoop</value>
  </property>
  <property>
    <name>dfs.name.dir</name>
    <value>/hadoop/name</value>
  </property>
</configuration>

The third file, hdfs-site.xml (again with the Chinese comments deleted):

vim hdfs-site.xml

<configuration>
  <property>
    <name>dfs.data.dir</name>
    <value>/hadoop/data</value>
  </property>
</configuration>

The fourth file, mapred-site.xml:

vim mapred-site.xml

<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>ldy:9001</value>
  </property>
</configuration>
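Before formatting, the three XML files above can be checked for well-formedness; this catches the "must be terminated by the matching end-tag" failure shown later without starting Hadoop at all. A sketch using python3's bundled XML parser, run from the conf directory:

```shell
# Report a parse error for any config file that is not well-formed XML.
for f in core-site.xml hdfs-site.xml mapred-site.xml; do
  if python3 -c "import sys, xml.dom.minidom as m; m.parse(sys.argv[1])" "$f" 2>/dev/null; then
    echo "$f: well-formed"
  else
    echo "$f: XML error"
  fi
done
```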

Next, edit the profile with vim /etc/profile. Put the following at the end; the first few JDK-related lines can be skipped if they already took effect when the JDK was installed:

export JAVA_HOME=/usr/lib/jvm/jdk7
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=${JAVA_HOME}/bin:$PATH
export HADOOP_HOME=/opt/hadoop-1.2.1
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$HADOOP_HOME/bin:$PATH
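After reloading the profile, you can confirm the Hadoop bin directory actually landed on PATH. A small sketch:

```shell
# Reload the profile in the current shell, then check PATH for the Hadoop bin dir.
if [ -f /etc/profile ]; then . /etc/profile; fi
case ":$PATH:" in
  *:/opt/hadoop-1.2.1/bin:*) echo "hadoop on PATH" ;;
  *)                         echo "hadoop missing from PATH" ;;
esac
```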

Next, enter the directory /opt/hadoop-1.2.1/bin and perform the format operation on Hadoop:

hadoop -namenode -format

If you encounter the following error:

Warning: $HADOOP_HOME is deprecated.
/opt/hadoop-1.2.1/bin/hadoop: line 350: /usr/lib/jdk7/bin/java: No such file or directory
/opt/hadoop-1.2.1/bin/hadoop: line 434: /usr/lib/jdk7/bin/java: No such file or directory
/opt/hadoop-1.2.1/bin/hadoop: line 434: exec: /usr/lib/jdk7/bin/java: cannot execute: No such file or directory

check whether the first file (hadoop-env.sh) is correct:

[root@iZ28c21psoeZ conf]# echo $JAVA_HOME
/usr/lib/jvm/jdk7
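The mismatch is visible here: the launcher was looking for /usr/lib/jdk7/bin/java while the shell's JAVA_HOME is /usr/lib/jvm/jdk7, so the value edited into hadoop-env.sh was wrong. A sketch that compares the two directly (it assumes the JAVA_HOME line in hadoop-env.sh has been uncommented, and is run from the conf directory):

```shell
# Extract JAVA_HOME from hadoop-env.sh and compare it with the shell's value.
CONF_JH=$(sed -n 's/^export JAVA_HOME=//p' hadoop-env.sh 2>/dev/null || true)
if [ "$CONF_JH" = "$JAVA_HOME" ]; then
  echo "hadoop-env.sh matches: $CONF_JH"
else
  echo "mismatch: hadoop-env.sh has '$CONF_JH', shell has '$JAVA_HOME'"
fi
```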

After fixing that and running again, it errored out once more:

[root@iZ28c21psoeZ bin]# hadoop -namenode -format
Warning: $HADOOP_HOME is deprecated.
Unrecognized option: -namenode
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
[root@iZ28c21psoeZ bin]#

Two things can be checked here.

First (secondary): in /opt/hadoop/conf/hadoop-env.sh, the parameter export HADOOP_HEAPSIZE=256 caps the memory the Java virtual machine occupies (the default is 2000m).

Second (main): look at the following section near the bottom of the /opt/hadoop/bin/hadoop launcher script:

#
if [ $EUID -eq 0 ]; then
  HADOOP_OPTS="$HADOOP_OPTS -jvm server $HADOOP_DATANODE_OPTS"
else
  HADOOP_OPTS="$HADOOP_OPTS -server $HADOOP_DATANODE_OPTS"
fi
#

Re-executing (this time with the correct syntax) fails yet again, now with an XML parse error:

[root@iZ28c21psoeZ bin]# ./hadoop namenode -format
Warning: $HADOOP_HOME is deprecated.

16-07-04 18:49:04 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG: host = iZ28c21psoeZ/10.251.57.77
STARTUP_MSG: args = [-format]
STARTUP_MSG: version = 1.2.1
STARTUP_MSG: build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.2 -r 1503152; compiled by 'mattf' on Mon Jul 22 15:23:09 PDT 2013
STARTUP_MSG: java = 1.7.0_60
************************************************************/

[Fatal Error] core-site.xml:11:3: The element type "property" must be terminated by the matching end-tag "</property>".
16-07-04 18:49:04 FATAL conf.Configuration: error parsing conf file: org.xml.sax.SAXParseException; systemId: file:/opt/hadoop-1.2.1/conf/core-site.xml; lineNumber: 11; columnNumber: 3; The element type "property" must be terminated by the matching end-tag "</property>".
16-07-04 18:49:04 ERROR namenode.NameNode: java.lang.RuntimeException: org.xml.sax.SAXParseException; systemId: file:/opt/hadoop-1.2.1/conf/core-site.xml; lineNumber: 11; columnNumber: 3; The element type "property" must be terminated by the matching end-tag "</property>".
at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1249)
at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:1107)
at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:1053)
at org.apache.hadoop.conf.Configuration.set(Configuration.java:420)
at org.apache.hadoop.hdfs.server.namenode.NameNode.setStartupOption(NameNode.java:1374)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1463)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1488)
Caused by: org.xml.sax.SAXParseException; systemId: file:/opt/hadoop-1.2.1/conf/core-site.xml; lineNumber: 11; columnNumber: 3; The element type "property" must be terminated by the matching end-tag "</property>".
at com.sun.org.apache.xerces.internal.parsers.DOMParser.parse(DOMParser.java:257)
at com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderImpl.parse(DocumentBuilderImpl.java:347)
at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:177)
at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1156)
... 6 more

16-07-04 18:49:04 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at iZ28c21psoeZ/10.251.57.77
************************************************************/
[root@iZ28c21psoeZ bin]#

As the log says, one of the three configuration files is malformed. Sure enough, core-site.xml had a <property> element that was never closed. After fixing it, run again and see:

[root@iZ28c21psoeZ bin]# ./hadoop namenode -format
Warning: $HADOOP_HOME is deprecated.
16-07-04 18:55:26 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG: host = iZ28c21psoeZ/10.251.57.77
STARTUP_MSG: args = [-format]
STARTUP_MSG: version = 1.2.1
STARTUP_MSG: build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.2 -r 1503152; compiled by 'mattf' on Mon Jul 22 15:23:09 PDT 2013
STARTUP_MSG: java = 1.7.0_60
************************************************************/

16-07-04 18:55:27 INFO util.GSet: Computing capacity for map BlocksMap
16-07-04 18:55:27 INFO util.GSet: VM type = 64-bit
16-07-04 18:55:27 INFO util.GSet: 2.0% max memory = 259522560
16-07-04 18:55:27 INFO util.GSet: capacity = 2^19 = 524288 entries
16-07-04 18:55:27 INFO util.GSet: recommended=524288, actual=524288
16-07-04 18:55:32 INFO namenode.FSNamesystem: fsOwner=root
16-07-04 18:55:33 INFO namenode.FSNamesystem: supergroup=supergroup
16-07-04 18:55:33 INFO namenode.FSNamesystem: isPermissionEnabled=true
16-07-04 18:55:42 INFO namenode.FSNamesystem: dfs.block.invalidate.limit=100
16-07-04 18:55:42 INFO namenode.FSNamesystem: isAccessTokenEnabled=false accessKeyUpdateInterval=0 min(s), accessTokenLifetime=0 min(s)
16-07-04 18:55:42 INFO namenode.FSEditLog: dfs.namenode.edits.toleration.length = 0
16-07-04 18:55:42 INFO namenode.NameNode: Caching file names occuring more than 10 times
16-07-04 18:55:45 INFO common.Storage: Image file /hadoop/dfs/name/current/fsimage of size 110 bytes saved in 0 seconds.
16-07-04 18:55:47 INFO namenode.FSEditLog: closing editlog: position=4, editlog=/hadoop/dfs/name/current/edits
16-07-04 18:55:47 INFO namenode.FSEditLog: close success: truncate to 4, editlog=/hadoop/dfs/name/current/edits
16-07-04 18:55:48 INFO common.Storage: Storage directory /hadoop/dfs/name has been successfully formatted.
16-07-04 18:55:48 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at iZ28c21psoeZ/10.251.57.77
************************************************************/

Perfect. Moving on:

cd /opt/hadoop-1.2.1/bin
[root@iZ28c21psoeZ bin]# start-all.sh
Warning: $HADOOP_HOME is deprecated.
Starting namenode, logging to /opt/hadoop-1.2.1/libexec/../logs/hadoop-root-namenode-iZ28c21psoeZ.out
localhost: socket: Address family not supported by protocol
localhost: ssh: connect to host localhost port 22: Address family not supported by protocol
localhost: socket: Address family not supported by protocol
localhost: ssh: connect to host localhost port 22: Address family not supported by protocol
Starting jobtracker, logging to /opt/hadoop-1.2.1/libexec/../logs/hadoop-root-jobtracker-iZ28c21psoeZ.out
localhost: socket: Address family not supported by protocol
localhost: ssh: connect to host localhost port 22: Address family not supported by protocol
[root@iZ28c21psoeZ bin]#

Translated: Hadoop warns that $HADOOP_HOME is deprecated, logs the namenode and jobtracker startups, and then every attempt to ssh to localhost on port 22 fails with "Address family not supported by protocol".

According to the log, the problem is the ssh port: this server's sshd does not listen on the default port 22, so make Hadoop's ssh use the server's actual ssh port. Add a new line in conf/hadoop-env.sh:

export HADOOP_SSH_OPTS="-p 1234"
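To keep this edit idempotent (repeated runs will not stack duplicate lines), the override can be appended only when it is not already present. A sketch, run from the conf directory; the port 1234 is just the example value from this post:

```shell
# Append the ssh port override to hadoop-env.sh unless it is already set.
ENV_FILE=hadoop-env.sh
touch "$ENV_FILE"
grep -q '^export HADOOP_SSH_OPTS=' "$ENV_FILE" \
  || echo 'export HADOOP_SSH_OPTS="-p 1234"' >> "$ENV_FILE"
```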

Let's run it again:

[root@ldy bin]# sh start-all.sh
Warning: $HADOOP_HOME is deprecated.
Starting namenode, logging to /opt/hadoop-1.2.1/libexec/../logs/hadoop-root-namenode-ldy.out
localhost: starting datanode, logging to /opt/hadoop-1.2.1/libexec/../logs/hadoop-root-datanode-ldy.out
localhost: starting secondarynamenode, logging to /opt/hadoop-1.2.1/libexec/../logs/hadoop-root-secondarynamenode-ldy.out
Starting jobtracker, logging to /opt/hadoop-1.2.1/libexec/../logs/hadoop-root-jobtracker-ldy.out
localhost: starting tasktracker, logging to /opt/hadoop-1.2.1/libexec/../logs/hadoop-root-tasktracker-ldy.out
[root@ldy bin]# jps

27054 DataNode

26946 NameNode

27374 TaskTracker

27430 Jps

27250 JobTracker

27165 SecondaryNameNode

jps now shows all five daemons (NameNode, DataNode, SecondaryNameNode, JobTracker, TaskTracker) plus Jps itself, six processes in total, so the installation succeeded.
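As a quick health check, the jps output can be scanned for each expected daemon; anything reported DOWN means its log under $HADOOP_HOME/logs needs a look. A sketch:

```shell
# Flag any expected Hadoop 1.x daemon missing from the jps listing.
out=$(jps 2>/dev/null || true)
for d in NameNode DataNode SecondaryNameNode JobTracker TaskTracker; do
  if echo "$out" | grep -qw "$d"; then echo "$d up"; else echo "$d DOWN"; fi
done
```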
