This article explains how to compile Hadoop 2.8.1 from source on CentOS 6.5 and deploy HDFS in pseudo-distributed mode. It should be a useful reference for anyone setting up a development environment; I hope you learn a lot from reading it.
Compiling Hadoop 2.8.1 and deploying pseudo-distributed HDFS on CentOS 6.5
The three components of Hadoop 2.x:
MapReduce (and other computing frameworks) - computation
YARN - resource and job scheduling platform
HDFS - storage
Environment preparation:
[root@hadoop001 software] # pwd
/opt/software
-rw-r--r--. 1 root root 8617253 May 14 07:05 apache-maven-3.3.9-bin.zip
-rw-r--r--. 1 root root 7546219 May 14 07:05 findbugs-1.3.9.zip
-rw-r--r--. 1 root root 34523353 May 14 07:05 hadoop-2.8.1-src.tar.gz
-rw-r--r--. 1 root root 424555111 May 14 07:09 hadoop-2.8.1.tar.gz
-rw-r--r--. 1 root root 173271626 May 14 07:09 jdk-8u45-linux-x64.gz
-rw-r--r--. 1 root root 96721446 May 14 07:07 .m2.tar.gz
-rw-r--r--. 1 root root 2401901 May 14 07:04 protobuf-2.5.0.tar.gz
1.Hadoop source code download
[root@hadoop001 software] # tar -xzvf hadoop-2.8.1-src.tar.gz
# check the source package environment
[root@hadoop001 hadoop-2.8.1-src] # cat BUILDING.txt
Requirements:
* Unix System
* JDK 1.7+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code), must be 3.0 or newer on Mac
* Zlib devel (if compiling native code)
* openssl devel (if compiling native hadoop-pipes and to get the best HDFS encryption performance)
* Linux FUSE (Filesystem in Userspace) version 2.6 or above (if compiling fuse_dfs)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
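As an aside, if you still need to fetch the source tarball listed above, it can be downloaded from the Apache archive; a minimal sketch (the URL assumes the standard archive layout for this release, verify against the Apache release page):
[root@hadoop001 software] # wget https://archive.apache.org/dist/hadoop/common/hadoop-2.8.1/hadoop-2.8.1-src.tar.gz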
2.Java installation
[root@hadoop001 ~] # mkdir -p /usr/java
[root@hadoop001 ~] # mv jdk-8u45-linux-x64.gz /usr/java
[root@hadoop001 ~] # cd /usr/java
[root@hadoop001 java] # tar -xzvf jdk-8u45-linux-x64.gz
# fix the owner and group of the extracted JDK
[root@hadoop001 java] # ll
total 169388
drwxr-xr-x. 8 uucp 143 4096 Apr 10 2015 jdk1.8.0_45
-rw-r--r--. 1 root root 173271626 May 14 07:09 jdk-8u45-linux-x64.gz
[root@hadoop001 java] # chown -R root:root jdk1.8.0_45
[root@hadoop001 java] # ll
total 169388
drwxr-xr-x. 8 root root 4096 Apr 10 2015 jdk1.8.0_45
-rw-r--r--. 1 root root 173271626 May 14 07:09 jdk-8u45-linux-x64.gz
# JAVA global variable configuration
[root@hadoop001 java] # vi /etc/profile
export JAVA_HOME=/usr/java/jdk1.8.0_45
export PATH=$JAVA_HOME/bin:$PATH
Note: there is no need to uninstall the original JDK; because $JAVA_HOME/bin comes first on PATH, the new JDK takes precedence.
[root@hadoop001 java] # source /etc/profile
[root@hadoop001 java] # which java
/usr/java/jdk1.8.0_45/bin/java
[root@hadoop001 java] # java -version
java version "1.8.0_45"
Java(TM) SE Runtime Environment (build 1.8.0_45-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.45-b02, mixed mode)
3.Maven installation and deployment
[root@hadoop001 software] # ll
total 466948
-rw-r--r--. 1 root root 8617253 May 14 07:05 apache-maven-3.3.9-bin.zip
[root@hadoop001 software] # unzip apache-maven-3.3.9-bin.zip
[root@hadoop001 software] # ll
total 466952
drwxr-xr-x. 6 root root 4096 Nov 10 2015 apache-maven-3.3.9
-rw-r--r--. 1 root root 8617253 May 14 07:05 apache-maven-3.3.9-bin.zip
# Maven global environment variable
[root@hadoop001 software] # vi /etc/profile
export MAVEN_HOME=/opt/software/apache-maven-3.3.9
export PATH=$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH
[root@hadoop001 software] # source /etc/profile
[root@hadoop001 software] # which mvn
/opt/software/apache-maven-3.3.9/bin/mvn
[root@hadoop001 software] # mvn -version
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-10T11:41:47-05:00)
Maven home: /opt/software/apache-maven-3.3.9
Java version: 1.8.0_45, vendor: Oracle Corporation
Java home: /usr/java/jdk1.8.0_45/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-431.el6.x86_64", arch: "amd64", family: "unix"
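The full Hadoop build is long and memory-hungry, so it is a common precaution to give Maven a larger heap before compiling; a sketch (the heap sizes are illustrative, tune them to the machine):
[root@hadoop001 software] # export MAVEN_OPTS="-Xms256m -Xmx1536m"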
# manually import the Maven local repository files (a pre-populated repository speeds up the first build)
[root@hadoop001 software] # mv .m2.tar.gz ~
[root@hadoop001 software] # cd ~
[root@hadoop001 ~] # tar -xzvf .m2.tar.gz
# default path of the Maven local repository, and where to change it
[root@hadoop001 conf] # pwd
/opt/software/apache-maven-3.3.9/conf
# the repository location is configured in settings.xml; mvn compiles, packages, and tests the project according to its pom.xml, resolving dependencies from this repository
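To point Maven at the imported ~/.m2 repository explicitly, or to relocate it, the localRepository element in settings.xml can be set; a sketch (the path shown assumes the archive extracted above into root's home directory):
[root@hadoop001 conf] # vi /opt/software/apache-maven-3.3.9/conf/settings.xml
<localRepository>/root/.m2/repository</localRepository>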
4.protobuf source code installation
[root@hadoop001 software] # pwd
/opt/software
[root@hadoop001 software] # tar -xzvf protobuf-2.5.0.tar.gz
[root@hadoop001 software] # ll
total 466956
drwxr-xr-x. 10 109965 5000 4096 Feb 26 2013 protobuf-2.5.0
-rw-r--r--. 1 root root 2401901 May 14 07:04 protobuf-2.5.0.tar.gz
[root@hadoop001 software] # chown -R root:root protobuf-2.5.0
[root@hadoop001 software] # cd protobuf-2.5.0
[root@hadoop001 protobuf-2.5.0] # yum install -y gcc gcc-c++ make cmake
[root@hadoop001 protobuf-2.5.0] # ./configure --prefix=/usr/local/protobuf
[root@hadoop001 protobuf-2.5.0] # make && make install
# protobuf global environment variable
[root@hadoop001 protobuf-2.5.0] # vi /etc/profile
export PROTOC_HOME=/usr/local/protobuf
export PATH=$PROTOC_HOME/bin:$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH
[root@hadoop001 protobuf-2.5.0] # source /etc/profile
[root@hadoop001 protobuf-2.5.0] # protoc --version
libprotoc 2.5.0
[root@hadoop001 protobuf-2.5.0] #
5.Findbugs installation
[root@hadoop001 software] # pwd
/opt/software
[root@hadoop001 software] # unzip findbugs-1.3.9.zip
[root@hadoop001 software] # ll
total 466960
drwxr-xr-x. 7 root root 4096 Aug 21 2009 findbugs-1.3.9
-rw-r--r--. 1 root root 7546219 May 14 07:05 findbugs-1.3.9.zip
# Findbugs global environment variable
[root@hadoop001 software] # vi /etc/profile
export FINDBUGS_HOME=/opt/software/findbugs-1.3.9
export PATH=$FINDBUGS_HOME/bin:$PROTOC_HOME/bin:$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH
[root@hadoop001 software] # source /etc/profile
[root@hadoop001 software] # findbugs -version
1.3.9
6. Other dependencies
yum install -y openssl openssl-devel svn ncurses-devel zlib-devel libtool
yum install -y snappy snappy-devel bzip2 bzip2-devel lzo lzo-devel lzop autoconf automake
7.Hadoop source code compilation
[root@hadoop001 hadoop-2.8.1-src] # mvn clean package -Pdist,native -DskipTests -Dtar
[INFO] Apache Hadoop Extras ............................... SUCCESS [  3.050 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [  7.795 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  5.657 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [  6.914 s]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [  5.264 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [  7.341 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  1.698 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [  6.050 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [  6.549 s]
[INFO] Apache Hadoop Azure Data Lake support .............. SUCCESS [  3.666 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.016 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 47.066 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12:39 min
[INFO] Finished at: 2018-05-14T22:30:38-04:00
[INFO] Final Memory: 190M/454M
[INFO] ------------------------------------------------------------------------
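After a successful build, the binary distribution should be generated under hadoop-dist/target inside the source tree (the standard layout for a -Dtar build of this version):
[root@hadoop001 hadoop-2.8.1-src] # ls hadoop-dist/target/hadoop-2.8.1.tar.gz
hadoop-dist/target/hadoop-2.8.1.tar.gz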
Hadoop deployment
Stand-alone mode - no daemon processes run
Pseudo-distributed mode - daemons run, all on 1 node (development)
Cluster mode - daemons run on n nodes (development / production)
[root@hadoop001 software] # tar -xzvf hadoop-2.8.1.tar.gz
[root@hadoop001 software] # chown -R root:root hadoop-2.8.1
[root@hadoop001 hadoop-2.8.1] # ll
drwxrwxr-x. 2 root root 4096 Jun 2 2017 bin -- executable command scripts
drwxrwxr-x. 3 root root 4096 Jun 2 2017 etc -- configuration files
drwxrwxr-x. 2 root root 4096 Jun 2 2017 include
drwxrwxr-x. 3 root root 4096 Jun 2 2017 lib -- libraries
drwxrwxr-x. 2 root root 4096 Jun 2 2017 libexec
-rw-rw-r--. 1 root root 99253 Jun 2 2017 LICENSE.txt
-rw-rw-r--. 1 root root 15915 Jun 2 2017 NOTICE.txt
-rw-r--r--. 1 root root 1366 Jun 2 2017 README.txt
drwxrwxr-x. 2 root root 4096 Jun 2 2017 sbin -- scripts that start and stop Hadoop
drwxrwxr-x. 4 root root 4096 Jun 2 2017 share -- jars
# hadoop global environment variable
[root@hadoop001 hadoop-2.8.1] # vim /etc/profile
export HADOOP_HOME=/opt/software/hadoop-2.8.1
export PATH=$HADOOP_HOME/bin:$PROTOC_HOME/bin:$FINDBUGS_HOME/bin:$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH
[root@hadoop001 hadoop-2.8.1] # source /etc/profile
[root@hadoop001 hadoop-2.8.1] # which hadoop
/opt/software/hadoop-2.8.1/bin/hadoop
# remove the Windows-only .cmd scripts from bin and sbin (optional cleanup)
[root@hadoop001 bin] # rm -f *.cmd
[root@hadoop001 sbin] # rm -f *.cmd
# configure core-site.xml (and hdfs-site.xml, where dfs.replication belongs)
[root@hadoop001 hadoop] # vim core-site.xml
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value> <!-- HDFS address and port -->
  </property>
</configuration>
[root@hadoop001 hadoop] # vim hdfs-site.xml
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value> <!-- block replication factor: 1, since there is only one node -->
  </property>
</configuration>
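One optional addition, not part of the original walkthrough: with no hadoop.tmp.dir set, HDFS metadata and blocks default to /tmp/hadoop-${user.name}, which may be cleared on reboot. A hedged sketch of pinning it in core-site.xml (the directory path is an example; create it first):
<property>
  <name>hadoop.tmp.dir</name>
  <value>/opt/software/hadoop-2.8.1/data/tmp</value> <!-- example path -->
</property>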
# local passwordless ssh configuration
[root@hadoop001 hadoop] # cd ~
[root@hadoop001 ~] # cd .ssh
[root@hadoop001 .ssh] # ll
total 0
[root@hadoop001 .ssh] # cd ~
[root@hadoop001 ~] # rm -rf .ssh
[root@hadoop001 ~] # ssh-keygen
[root@hadoop001 ~] # cd .ssh
[root@hadoop001 .ssh] # ll
total 8
-rw-------. 1 root root 1675 May 14 23:38 id_rsa
-rw-r--r--. 1 root root 396 May 14 23:38 id_rsa.pub
[root@hadoop001 .ssh] # cat id_rsa.pub >> authorized_keys
[root@hadoop001 .ssh] # ll
total 12
-rw-r--r--. 1 root root 396 May 14 23:40 authorized_keys
-rw-------. 1 root root 1675 May 14 23:38 id_rsa
-rw-r--r--. 1 root root 396 May 14 23:38 id_rsa.pub
# the first ssh connection must be confirmed interactively; note that sshd on this machine has been moved from port 22 to 2222, so plain ssh is refused
[root@hadoop001 ~] # ssh localhost date
ssh: connect to host localhost port 22: Connection refused
[root@hadoop001 ~] # ssh -p 2222 localhost date
The authenticity of host '[localhost]:2222 ([::1]:2222)' can't be established.
RSA key fingerprint is 09:b9:67:65:cb:e6:ca:31:5d:33:6c:3b:92:9e:c2:1a.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added '[localhost]:2222' (RSA) to the list of known hosts.
Mon May 14 23:42:37 EDT 2018
# formatting the HDFS file system
[root@hadoop001 ~] # which hdfs
/opt/software/hadoop-2.8.1/bin/hdfs
[root@hadoop001 ~] # hdfs namenode -format
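If formatting succeeds, the output includes a "successfully formatted" message for the NameNode storage directory. Do not re-run the format against a filesystem that already holds data: it wipes the NameNode metadata, so it is a one-time operation.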
# start the HDFS file system
[root@hadoop001 sbin] # /opt/software/hadoop-2.8.1/sbin/start-dfs.sh
Starting namenodes on [localhost]
localhost: ssh: connect to host localhost port 22: Connection refused
localhost: ssh: connect to host localhost port 22: Connection refused
Starting secondary namenodes [0.0.0.0]
0.0.0.0: ssh: connect to host 0.0.0.0 port 22: Connection refused
# fix the port 22 error
[root@hadoop001 hadoop] # vi hadoop-env.sh
Add: export HADOOP_SSH_OPTS="-p 2222"
# fix the JAVA_HOME is not set error
[root@hadoop001 hadoop] # vi hadoop-env.sh
export JAVA_HOME=/usr/java/jdk1.8.0_45
# start OK
[root@hadoop001 sbin] # /opt/software/hadoop-2.8.1/sbin/start-dfs.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /opt/software/hadoop-2.8.1/logs/hadoop-root-namenode-hadoop001.out
localhost: starting datanode, logging to /opt/software/hadoop-2.8.1/logs/hadoop-root-datanode-hadoop001.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /opt/software/hadoop-2.8.1/logs/hadoop-root-secondarynamenode-hadoop001.out
# verify the daemons started
[root@hadoop001 hadoop-2.8.1] # jps
16243 Jps
15943 DataNode
5127 Launcher
16139 SecondaryNameNode
15853 NameNode
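Beyond jps, a quick smoke test confirms HDFS is actually serving requests (the file and directory names below are arbitrary examples); the NameNode web UI is also reachable on port 50070 by default in Hadoop 2.x:
[root@hadoop001 ~] # echo "hello hdfs" > /tmp/hello.txt
[root@hadoop001 ~] # hdfs dfs -mkdir -p /user/root/smoke
[root@hadoop001 ~] # hdfs dfs -put /tmp/hello.txt /user/root/smoke/
[root@hadoop001 ~] # hdfs dfs -cat /user/root/smoke/hello.txt
hello hdfs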