How to build a hadoop environment for ubuntu

2025-04-06 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/01 Report--

This article introduces how to build a hadoop environment on ubuntu. Many people run into trouble with these steps in practice, so they are walked through here one by one; read carefully and try them yourself.

1 Reference blog posts

http://blog.sina.com.cn/s/blog_75f4e17c0101h6x4.html

http://www.cnblogs.com/end/archive/2012/08/13/2636645.html

2 installation preparation

1. Ubuntu 14.04

2. JDK version 1.8.0_25

3. Hadoop version Hadoop-2.6.0

3 Installation steps

3.1 Linux installation

Because of limited hardware, a 32-bit desktop edition of Ubuntu running in a VMware Workstation virtual machine is used here. There is plenty of material online about installing Ubuntu, so the details are not repeated; just allow some time for it.

3.2 JDK installation

3.2.1 JDK download

JDK 1.8.0_25 is used here, downloaded from: http://www.oracle.com/technetwork/java/javase/downloads/index.html

3.2.2 JDK installation

After downloading, put the archive in the target location (such as /usr/java) and unpack it with the following command

tar zxvf jdk-xxx-linux-xxx.tar.gz

(Reference: http://docs.oracle.com/javase/7/docs/webnotes/install/linux/linux-jdk.html)
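The download-and-extract step can be sketched end to end. The tarball name jdk-8u25-linux-i586.tar.gz below is an assumption (match it to the file you actually downloaded), and a scratch directory stands in for /usr/java so the commands can be tried without sudo:

```shell
# Stand-in demo of the extract step: build a dummy JDK tarball in a
# scratch directory, then unpack it the same way the real one would be
# unpacked into /usr/java (which requires sudo).
WORK=$(mktemp -d)
mkdir -p "$WORK/jdk1.8.0_25/bin"                  # dummy JDK tree
tar czf "$WORK/jdk-8u25-linux-i586.tar.gz" -C "$WORK" jdk1.8.0_25
mkdir -p "$WORK/usr/java"                         # stands in for /usr/java
tar zxvf "$WORK/jdk-8u25-linux-i586.tar.gz" -C "$WORK/usr/java"
ls "$WORK/usr/java"                               # prints: jdk1.8.0_25
```

On the real system, substitute /usr/java for the scratch path and run the `mkdir` and `tar` steps with sudo.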

3.2.3 JDK configuration

1. Enter the /etc directory

2. Add the JDK configuration to the profile file

# The following configuration is for Java
export JAVA_HOME=/usr/java/jdk1.8.0_25
export PATH=$PATH:$JAVA_HOME/bin
export CLASSPATH=.:$JAVA_HOME/lib:$JAVA_HOME/jre/lib

3. Use the following combination command to make the configuration effective

chmod +x profile (this command requires superuser permissions)

source profile

4. Use the following to verify the JDK installation

java -version

The installation is successful if the installation version information is displayed
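The profile edit and its effect can be sketched as below; a temporary file stands in for /etc/profile so the snippet can be tried without superuser rights:

```shell
# Write the Java settings to a stand-in profile file, source it, and
# confirm the variables took effect. On the real system the target is
# /etc/profile, followed by `source /etc/profile`.
PROFILE=$(mktemp)
cat >> "$PROFILE" <<'EOF'
# The following configuration is for Java
export JAVA_HOME=/usr/java/jdk1.8.0_25
export PATH=$PATH:$JAVA_HOME/bin
export CLASSPATH=.:$JAVA_HOME/lib:$JAVA_HOME/jre/lib
EOF
. "$PROFILE"
echo "$JAVA_HOME"     # prints: /usr/java/jdk1.8.0_25
```

If `java -version` still fails after sourcing, the usual causes are a typo in JAVA_HOME or a missing `$` in the PATH line.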

3.3 Hadoop installation

3.3.1 Hadoop download

Address: http://www.apache.org/dyn/closer.cgi/hadoop/common/

3.3.2 Hadoop installation

Put the downloaded Hadoop archive into the target directory (/usr/hadoop) and unpack it with the following command

tar xzf hadoop-2.6.0.tar.gz

3.3.3 Hadoop configuration

1. Linux profile (path: /etc) configuration, as shown below

# The following configuration is for hadoop
export HADOOP_INSTALL=/usr/hadoop/hadoop-2.6.0
export PATH=$PATH:$HADOOP_INSTALL/bin

2. In hadoop-env.sh (path: $HADOOP_HOME/etc/hadoop), point JAVA_HOME to the root of the JDK installation, as follows

# The java implementation to use.
export JAVA_HOME=/usr/java/jdk1.8.0_25

3. In yarn-env.sh (path: $HADOOP_HOME/etc/hadoop), point JAVA_HOME to the root of the JDK installation, as follows

# some Java parameters
export JAVA_HOME=/usr/java/jdk1.8.0_25

4. Open the core-site.xml file (path: $HADOOP_HOME/etc/hadoop) and edit it as follows:

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

5. Open the mapred-site.xml file (path: $HADOOP_HOME/etc/hadoop; if only mapred-site.xml.template exists, copy it to mapred-site.xml first) and edit it as follows:

<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>

6. Open the hdfs-site.xml file (path: $HADOOP_HOME/etc/hadoop) and edit it as follows:

<configuration>
  <property>
    <name>dfs.name.dir</name>
    <value>/usr/local/hadoop/datalog1,/usr/local/hadoop/datalog2</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>/usr/local/hadoop/data1,/usr/local/hadoop/data2</value>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>2</value>
  </property>
</configuration>

4 Hadoop launch

For the new version, the two startup commands below are the ones Apache recommends.

Enter $HADOOP_HOME/sbin first.

1. ./start-dfs.sh

2. ./start-yarn.sh

In addition, ./start-all.sh can still be used, but it prints a "not recommended" warning; internally it just redirects to the two scripts above.
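The launch sequence can be sketched as a guarded script. One step worth calling out: on a brand-new install, HDFS usually has to be formatted once before the first start. The path follows the HADOOP_INSTALL setting from /etc/profile above, and the guard lets the script degrade gracefully on a machine without Hadoop:

```shell
# Guarded launch sketch (the default path is the install location used
# earlier in this article; adjust HADOOP_INSTALL if yours differs).
HADOOP_INSTALL=${HADOOP_INSTALL:-/usr/hadoop/hadoop-2.6.0}
if [ -x "$HADOOP_INSTALL/sbin/start-dfs.sh" ]; then
  "$HADOOP_INSTALL/bin/hdfs" namenode -format   # first start only; wipes HDFS metadata
  "$HADOOP_INSTALL/sbin/start-dfs.sh"
  "$HADOOP_INSTALL/sbin/start-yarn.sh"
else
  echo "Hadoop not found at $HADOOP_INSTALL (set HADOOP_INSTALL and retry)"
fi
```

Run the `namenode -format` line only the first time; re-running it erases existing HDFS metadata.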

After startup, running jps shows the services below. Note the difference from version 1.x: JobTracker and TaskTracker are gone, replaced by ResourceManager and NodeManager, and there are additional yarn scripts under sbin.

If the following five services appear after running jps, the hadoop single-node environment has been configured successfully.
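A quick way to confirm the five services is a small jps loop; the guard keeps the check usable even when jps is not on PATH:

```shell
# Check for the five expected daemons in the jps output.
SERVICES="NameNode DataNode SecondaryNameNode ResourceManager NodeManager"
if command -v jps >/dev/null 2>&1; then
  for svc in $SERVICES; do
    jps | grep -q "$svc" && echo "$svc: up" || echo "$svc: MISSING"
  done
else
  echo "jps not found; check that \$JAVA_HOME/bin is on PATH"
fi
```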

5 Test installation

Testing with Hadoop's built-in examples

5.1 Verification 1

After startup, check with jps that the services have started.

Start the WebAppProxyServer service and enter the following command on the command line:

$HADOOP_HOME/sbin/yarn-daemon.sh start proxyserver --config $HADOOP_CONF_DIR

Start the JobHistoryServer service and enter the following command on the command line:

$HADOOP_PREFIX/sbin/mr-jobhistory-daemon.sh start historyserver --config $HADOOP_CONF_DIR

5.2 Verification 2

After startup, the following three URLs can be used to check the services

A. http://host:50070/dfshealth.jsp opens the NameNode web interface

B. http://host:19888/jobhistory opens the JobHistory web interface

C. http://host:8088/cluster opens the cluster web interface
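The three interfaces can also be probed from a shell, which is handy on a headless VM; host and ports are the ones listed above, and 19888 only answers once the JobHistoryServer is running:

```shell
# Probe each web UI port; a short timeout keeps the check fast when a
# service is down.
HOST=${HOST:-localhost}
RESULT=""
for port in 50070 19888 8088; do
  if curl -s -o /dev/null --max-time 2 "http://$HOST:$port/"; then
    RESULT="$RESULT $port:up"
  else
    RESULT="$RESULT $port:down"
  fi
done
echo "$RESULT"
```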

5.3 Verification 3

Run the map reduce example program that comes with Hadoop.

Enter $HADOOP_HOME/share/hadoop/mapreduce and run the following command

hadoop jar hadoop-mapreduce-examples-2.6.0.jar pi 10 100

If the job runs to completion, the installation is working.
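The run can be wrapped in a small guard so it degrades gracefully on a machine without Hadoop; the jar path assumes the install layout used above, and the grep surfaces the estimate line the pi example prints at the end:

```shell
# Run the bundled pi estimator if Hadoop is available, and show just
# the final estimate line.
JAR="${HADOOP_INSTALL:-/usr/hadoop/hadoop-2.6.0}/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar"
if command -v hadoop >/dev/null 2>&1 && [ -f "$JAR" ]; then
  hadoop jar "$JAR" pi 10 100 | grep "Estimated value of Pi"
else
  echo "hadoop or examples jar not available; skipped"
fi
```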

This concludes "how to build a hadoop environment for ubuntu". Thank you for reading.
