-- Software versions --
RHEL 6.8, hadoop-2.8.1, apache-maven-3.3.9, findbugs-1.3.9, protobuf-2.5.0, jdk-8u45
-- Software versions --
The overview of big data can be summarized as "5V + 1C":
Variety (diverse), Volume (massive), Velocity (fast), Vitality (flexible), Value (valuable), Complexity (complex)
1.Hadoop ecology
In the broad sense: the whole ecosystem built around Hadoop (Hadoop, Flume, and so on).
In the narrow sense: Apache Hadoop itself (hadoop.apache.org).
2.Hadoop (Storage + Computing + Resource and Job scheduling)
Hadoop1.x
HDFS storage
MapReduce Computing + Resource and Job scheduling
Hadoop2.x (what enterprises currently use)
HDFS storage
MapReduce computing
YARN resource and job scheduling platform; the computing frameworks all run on YARN
Hadoop3.x?
EC technology: Erasure Coding (EC) is a new HDFS feature in Hadoop 3 that cuts the storage overhead of replicated data (a short command sketch follows this list).
YARN: provides YARN timeline service v.2 so that users and developers can test it and provide feedback.
Optimized Hadoop shell scripts
Refactored Hadoop client jar packages
Support for opportunistic containers
MapReduce task-level native optimization
Support for more than two NameNodes
Some default service ports have been changed
Support for additional filesystem connectors (such as Microsoft Azure Data Lake)
An intra-DataNode load balancer has been added
Reworked daemon and task heap management
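As a brief illustration of the EC feature above (a sketch for a Hadoop 3 cluster only, not part of the Hadoop 2.8.1 build in this post; the directory /data/cold and the RS-6-3-1024k policy are examples):
hdfs ec -listPolicies                                      # list the erasure coding policies the cluster supports
hdfs ec -enablePolicy -policy RS-6-3-1024k                 # enable a Reed-Solomon 6+3 policy
hdfs ec -setPolicy -path /data/cold -policy RS-6-3-1024k   # apply it to a directory of cold data
hdfs ec -getPolicy -path /data/cold                        # confirm which policy is now in effect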
2.Maven deployment
2.1 decompression
[root@hadoop1 software]# pwd
/opt/software
[root@hadoop1 software]# ls
apache-maven-3.3.9-bin.zip  hadoop-2.8.1-src.tar.gz  jdk-8u45-linux-x64.gz
findbugs-1.3.9.zip  hadoop-2.8.1.tar.gz  protobuf-2.5.0.tar.gz
[root@hadoop1 software]# unzip apache-maven-3.3.9-bin.zip
-- configure environment variables
[root@hadoop1 software]# vi /etc/profile
export MAVEN_HOME=/opt/software/apache-maven-3.3.9
export MAVEN_OPTS="-Xms256m -Xmx512m"
export PATH=$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH
[root@hadoop1 software]# source /etc/profile
[root@hadoop1 apache-maven-3.3.9]# mvn --version
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-11T00:41:47+08:00)
Maven home: /opt/software/apache-maven-3.3.9
Java version: 1.8.0_45, vendor: Oracle Corporation
Java home: /usr/java/jdk1.8.0_45/jre
Default locale: zh_CN, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-358.el6.x86_64", arch: "amd64", family: "unix"
Maven is a binary deployment: just unpack it and configure the environment variables.
2.2 configure the Maven directory
2.3 View the configuration files and extract the local repository archive we have prepared
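As a minimal sketch of these two steps (assuming the stock Maven configuration is kept): the default localRepository is ${user.home}/.m2/repository, which is exactly where the prepared .m2 archive is extracted later in section 3.9, so normally nothing needs to change; the command below just inspects the settings.
less /opt/software/apache-maven-3.3.9/conf/settings.xml    # check the localRepository and mirror settings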
3.Hadoop compilation
From the Requirements section of BUILDING.txt we can see that software such as JDK, Maven, Findbugs, ProtocolBuffer, CMake, Zlib and openssl-devel must be installed before compiling Hadoop.
3.1 decompression
3.2 View pom.xml
3.3 View BUILDING.txt
Requirements: compiling software environment requirements
* Unix System
* JDK 1.7 +
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code), must be 3.0 or newer on Mac
* Zlib devel (if compiling native code)
* openssl devel (if compiling native hadoop-pipes and to get the best HDFS encryption performance)
* Linux FUSE (Filesystem in Userspace) version 2.6 or above (if compiling fuse_dfs)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
3.4 JDK deployment
[root@hadoop1 software]# tar -zxvf jdk-8u45-linux-x64.gz -C /usr/java
[root@hadoop1 software]# ls -ld /usr/java/*
drwxr-xr-x 8 root root 4096 December 15 2016 /usr/java/jdk1.7.0_79
drwxr-xr-x 8 uucp 143 4096 April 11 2015 /usr/java/jdk1.8.0_45
drwxr-xr-x 8 uucp 143 4096 October 7 2015 /usr/java/jdk1.8.0_65
[root@hadoop1 software]# vim /etc/profile
export JAVA_HOME=/usr/java/jdk1.8.0_45
export JRE_HOME=/usr/java/jdk1.8.0_45/jre
[root@hadoop1 software]# source /etc/profile
[root@hadoop1 software]# java -version
java version "1.8.0_45"
Java(TM) SE Runtime Environment (build 1.8.0_45-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.45-b02, mixed mode)
Decompress the binary version and configure the environment variables.
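Since several JDKs end up under /usr/java, a quick check (a sketch, not part of the original transcript) confirms that the intended 1.8.0_45 build is the one picked up:
which java      # should resolve to a path under /usr/java/jdk1.8.0_45
echo $JAVA_HOME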
3.5 MAVEN
Maven and the JDK are now installed; check whether their versions meet the requirements for compiling the software.
[root@hadoop1 software]# mvn --version
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-11T00:41:47+08:00)
Maven home: /opt/software/apache-maven-3.3.9
Java version: 1.8.0_45, vendor: Oracle Corporation
Java home: /usr/java/jdk1.8.0_45/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-431.el6.x86_64", arch: "amd64", family: "unix"
[root@hadoop1 software]#
3.6 FINDBUGS
[root@hadoop1 software]# unzip findbugs-1.3.9.zip
[root@hadoop1 software]# vi /etc/profile
export FINDBUGS_HOME=/opt/software/findbugs-1.3.9
export PATH=$FINDBUGS_HOME/bin:$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH
[root@hadoop1 software]# source /etc/profile
[root@hadoop1 software]# findbugs -version
1.3.9
Findbugs is also a binary install: unzip it and configure the environment variables.
3.7 PROTOCBUF compilation and installation
[root@hadoop1 software]# tar xvf protobuf-2.5.0.tar.gz
[root@hadoop1 yum.repos.d]# yum install -y gcc gcc-c++ make cmake    -- compilation environment
[root@hadoop1 yum.repos.d]# cd /opt/software/protobuf-2.5.0
[root@hadoop1 protobuf-2.5.0]# ./configure --prefix=/usr/local/protobuf
[root@hadoop1 protobuf-2.5.0]# make && make install
[root@hadoop1 protobuf-2.5.0]# vim /etc/profile
export PROTOC_HOME=/usr/local/protobuf
export PATH=$PROTOC_HOME/bin:$FINDBUGS_HOME/bin:$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH
[root@hadoop1 software]# source /etc/profile
[root@hadoop1 software]# protoc --version
libprotoc 2.5.0
[root@hadoop1 software]#
Protobuf is compiled and installed from source, then the environment variables are configured.
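Optionally (a sketch, not part of the original run), the protobuf build can be sped up on a multi-core machine by running make in parallel:
make -j"$(nproc)" && make install    # or a fixed number such as -j4 if nproc is not available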
3.8 OTHER dependency packages
yum install -y openssl openssl-devel svn ncurses-devel zlib-devel libtool
yum install -y snappy snappy-devel bzip2 bzip2-devel lzo lzo-devel lzop autoconf automake
The other dependency packages are all on the system installation ISO, so a plain yum install is enough.
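A quick way to confirm that the packages landed (an optional sketch; the list simply mirrors the yum commands above):
rpm -q gcc gcc-c++ cmake openssl-devel zlib-devel snappy snappy-devel bzip2-devel lzo-devel lzop autoconf automake libtool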
----------------------------------------
At this point, all the tools required for compiling Hadoop are ready, so let's start the compilation itself.
----------------------------------------
3.9 compilation
[root@hadoop1 software]# tar xvf hadoop-2.8.1-src.tar.gz
[root@hadoop1 software]# cd hadoop-2.8.1-src
[root@hadoop1 hadoop-2.8.1-src]# pwd
/opt/software/hadoop-2.8.1-src
[root@hadoop1 hadoop-2.8.1-src]# mvn clean package -Pdist,native -DskipTests -Dtar
/opt/software/hadoop-2.8.1-src/hadoop-dist/target/hadoop-2.8.1.tar.gz    -- the location where the compiled package is saved
Compiling directly like this needs an Internet connection to fetch the dependency packages; even with a good network it usually takes several hours.
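Because the build can run for hours, it is convenient (an optional sketch, not part of the original run) to keep it alive across a dropped SSH session and capture the output in a log file:
nohup mvn clean package -Pdist,native -DskipTests -Dtar > /opt/software/hadoop-build.log 2>&1 &
tail -f /opt/software/hadoop-build.log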
Compiling with the dependency packages prepared in advance
[root@hadoop1 software]# tar xvf hadoop-2.8.1-src.tar.gz
[root@hadoop1 software]# cd hadoop-2.8.1-src
[root@hadoop1 hadoop-2.8.1-src]# pwd
/opt/software/hadoop-2.8.1-src
[root@hadoop1 hadoop-2.8.1-src]# ls pom.xml    -- Maven downloads the jars declared in this file
pom.xml
[root@hado
-rw-r--r-- 1 root root 96721446 April 25 09:57 .m2.tar.gz    -- local Maven repository archive prepared in advance
[root@hadoop1 software]# tar xvf .m2.tar.gz -C /root    -- instead of downloading, extract the prepared repository into root's home directory
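With the repository pre-populated, Maven can optionally be told to resolve everything locally and never touch the network (a sketch; the original run below uses the plain command):
mvn clean package -Pdist,native -DskipTests -Dtar -o    # -o / --offline: fail fast instead of downloading if anything is missing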
[root@hadoop1 hadoop-2.8.1-src]# mvn clean package -Pdist,native -DskipTests -Dtar
......
[INFO] Apache Hadoop Azure support ........................ SUCCESS [  5.562 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [ 13.396 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  1.969 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [  6.050 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 11.110 s]
[INFO] Apache Hadoop Azure Data Lake support .............. SUCCESS [  3.862 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.029 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [01:01 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:45 min
[INFO] Finished at: 2018-04-25T14:03:20+08:00
[INFO] Final Memory: 161M/479M
[INFO] ------------------------------------------------------------------------
-- the location where the software is saved after a successful build:
[root@hadoop1 hadoop-2.8.1-src]# ls /opt/software/hadoop-2.8.1-src/hadoop-dist/target/hadoop-*
/opt/software/hadoop-2.8.1-src/hadoop-dist/target/hadoop-2.8.1.tar.gz
/opt/software/hadoop-2.8.1-src/hadoop-dist/target/hadoop-dist-2.8.1.jar
/opt/software/hadoop-2.8.1-src/hadoop-dist/target/hadoop-dist-2.8.1-javadoc.jar
/opt/software/hadoop-2.8.1-src/hadoop-dist/target/hadoop-dist-2.8.1-sources.jar
/opt/software/hadoop-2.8.1-src/hadoop-dist/target/hadoop-dist-2.8.1-test-sources.jar
/opt/software/hadoop-2.8.1-src/hadoop-dist/target/hadoop-2.8.1:
bin  etc  include  lib  libexec  LICENSE.txt  NOTICE.txt  README.txt  sbin  share
[root@hadoop1 hadoop-2.8.1-src]# du -sh /opt/software/hadoop-2.8.1-src/hadoop-dist/target/hadoop-2.8.1.tar.gz
186M    /opt/software/hadoop-2.8.1-src/hadoop-dist/target/hadoop-2.8.1.tar.gz
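To confirm that the native libraries requested with -Pnative really made it into the build, the fresh tarball can be unpacked and checked (a sketch, not part of the original transcript; /tmp is just an example target directory):
tar -xzvf /opt/software/hadoop-2.8.1-src/hadoop-dist/target/hadoop-2.8.1.tar.gz -C /tmp
/tmp/hadoop-2.8.1/bin/hadoop checknative -a    # reports whether hadoop, zlib, snappy, lz4, bzip2 and openssl native support was built in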
Conclusion: the source code has been compiled and packaged and we now have a Hadoop installation package. The hard part is preparing the software packages and the local Maven repository in advance; with that done, we are ready to set out on the road of installing Hadoop.