2025-02-28 Update From: SLTechnology News&Howtos
Shulou (Shulou.com) 06/03 Report --
The Hadoop tarball that ships with CDH contains no native libraries, so compression is not supported at any task stage. Production jobs require compression, so the CDH Hadoop source must be compiled to enable it.
The software versions used in this build:
CDH: hadoop-2.6.0-cdh6.7.0-src.tar.gz
JDK: 1.7 (many people have hit problems with 1.8; do not use it)
Maven: apache-maven-3.3.9-bin.tar.gz
Protobuf: protobuf-2.5.0.tar.gz
System: CentOS 7.5
CDH official download address: http://archive.cloudera.com/cdh6/cdh/5
Baidu netdisk addresses for the required software:
repo.tar.gz, link: https://pan.baidu.com/s/1wGCgV_3R3VUm2ka_aVA8GQ extraction code: lrej
hadoop-2.6.0-cdh6.7.0-src.tar.gz, link: https://pan.baidu.com/s/1uRMGIhLSL9QHT-Ee4F16jw extraction code: jb1d
jdk-7u80-linux-x64.tar.gz, link: https://pan.baidu.com/s/1xSCQ8rjABVI-zDFQS5nCPA extraction code: lfze
apache-maven-3.3.9-bin.tar.gz, link: https://pan.baidu.com/s/1ddkdkLW7r7ahFZmgACGkVw extraction code: fdfz
protobuf-2.5.0.tar.gz, link: https://pan.baidu.com/s/1RSNZGd_ThwknMB3vDkEfhQ extraction code: hvc2
First install the various dependencies:
yum install -y svn ncurses-devel
yum install -y gcc gcc-c++ make cmake
yum install -y openssl openssl-devel svn ncurses-devel zlib-devel libtool
yum install -y snappy snappy-devel bzip2 bzip2-devel lzo lzo-devel lzop autoconf automake cmake
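Before starting the build, it can help to verify that the toolchain commands pulled in by the packages above are actually on PATH. A minimal sketch; the helper name and the list of commands checked are my assumptions, not from the original article:

```shell
# Collect any build tools that are not on PATH (hypothetical helper).
missing=""
need() {
  command -v "$1" >/dev/null 2>&1 || missing="$missing $1"
}
# Toolchain commands provided by the yum packages above (assumed list).
for c in gcc g++ make cmake autoconf automake libtool; do
  need "$c"
done
if [ -n "$missing" ]; then
  echo "missing:$missing"
else
  echo "all build tools found"
fi
```

Running this before the multi-hour Maven build catches a forgotten yum package early instead of mid-compile.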
Configure the environment variables for jdk:
[root@hadoop004 soft]# cat /etc/profile.d/java.sh
export JAVA_HOME=/usr/java/jdk
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$JAVA_HOME/bin:$PATH
[root@hadoop004 soft]# java -version
java version "1.7.0_79"
Java(TM) SE Runtime Environment (build 1.7.0_79-b15)
Java HotSpot(TM) 64-Bit Server VM (build 24.79-b02, mixed mode)
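Since JDK 1.8 is reported to break this build, a small guard that parses the `java -version` banner and warns on anything other than 1.7 can save a wasted compile. A sketch, assuming the standard banner format; the function name is hypothetical:

```shell
# Extract the major version (e.g. 1.7) from a `java -version` banner line.
java_major() {
  # Banner looks like: java version "1.7.0_79"
  echo "$1" | sed -n 's/.*version "\([0-9]*\.[0-9]*\).*/\1/p'
}
banner=$(java -version 2>&1 | head -n 1)
if [ "$(java_major "$banner")" = "1.7" ]; then
  echo "JDK 1.7 detected, OK to build"
else
  echo "WARNING: this build is known to fail on JDKs other than 1.7"
fi
```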
Configure the environment variables for maven:
# configure environment variables
[root@hadoop004 soft]# cat /etc/profile.d/maven.sh
export MAVEN_HOME=/usr/local/maven
export PATH=$MAVEN_HOME/bin:$PATH
[root@hadoop004 local]# mvn --version
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-11T00:41:47+08:00)
Maven home: /usr/local/maven
Java version: 1.7.0_79, vendor: Oracle Corporation
Java home: /usr/java/jdk1.7.0_79/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "3.10.0-862.3.2.el7.x86_64", arch: "amd64", family: "unix"
Next, set Maven's local repository to /usr/local/maven/repo and unpack the downloaded repo.tar.gz into that directory, so the CDH build can use the cached dependencies directly; otherwise downloading them all wastes a lot of time. Also configure Maven to download from the Aliyun mirror (id alimaven, name "aliyun maven", url http://maven.aliyun.com/nexus/content/groups/public/, mirroring central).
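The local-repository and Aliyun-mirror settings described above live in Maven's settings.xml. A minimal sketch of that file, written to a temporary path here for illustration so a real configuration is not clobbered; the file layout is the standard Maven settings format, not taken from the article:

```shell
# Write a minimal Maven settings.xml with the local repo and Aliyun mirror.
# A temp file is used for illustration; in practice this would be
# ~/.m2/settings.xml or $MAVEN_HOME/conf/settings.xml.
SETTINGS=$(mktemp)
cat > "$SETTINGS" <<'EOF'
<settings>
  <!-- local repository pre-seeded from repo.tar.gz -->
  <localRepository>/usr/local/maven/repo</localRepository>
  <mirrors>
    <mirror>
      <id>alimaven</id>
      <name>aliyun maven</name>
      <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
      <mirrorOf>central</mirrorOf>
    </mirror>
  </mirrors>
</settings>
EOF
echo "wrote $SETTINGS"
```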
Compile and install protobuf and configure environment variables:
# compile and install
tar -zxvf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure --prefix=/usr/local/protobuf
make
make install
# configure environment variables
[root@hadoop004 soft]# cat /etc/profile.d/protobuf.sh
export PROTOC_HOME=/usr/local/protobuf
export PATH=$PROTOC_HOME/bin:$PATH
# check that the installation succeeded
source /etc/profile.d/protobuf.sh
[root@hadoop004 soft]# protoc --version
libprotoc 2.5.0
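The Hadoop 2.6 build checks for protoc 2.5.0 specifically, so a quick version guard before kicking off the long Maven build is worthwhile. A sketch assuming the `libprotoc 2.5.0` output format shown above; the function name is hypothetical:

```shell
# Extract the version number from `protoc --version` output.
proto_ver() {
  # Output looks like: libprotoc 2.5.0
  echo "$1" | awk '{print $2}'
}
v=$(protoc --version 2>/dev/null | head -n 1)
if [ "$(proto_ver "$v")" = "2.5.0" ]; then
  echo "protoc 2.5.0 found"
else
  echo "WARNING: protoc is '$v'; the build expects 2.5.0"
fi
```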
Compile the hadoop source code for cdh:
tar -xzvf hadoop-2.6.0-cdh6.7.0-src.tar.gz
cd hadoop-2.6.0-cdh6.7.0
mvn clean package -Pdist,native -DskipTests -Dtar
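The native build is memory-hungry; if Maven dies with heap or PermGen errors during the command above, raising its JVM memory first can help. The sizes below are illustrative assumptions, not values from the article:

```shell
# Give Maven a larger heap and PermGen for the native Hadoop build
# (sizes are an assumption; tune for the build machine; MaxPermSize
# applies to the JDK 1.7 used here).
export MAVEN_OPTS="-Xmx1024m -XX:MaxPermSize=256m"
echo "MAVEN_OPTS=$MAVEN_OPTS"
```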
Error encountered:
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-pipes: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ... @ 5:126 in /data/soft/hadoop-2.6.0-cdh6.7.0/hadoop-tools/hadoop-pipes/target/antrun/build-main.xml
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following article:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-pipes
Cause of the problem: openssl-devel was missed when installing the dependencies. Install it with:
yum install -y openssl-devel
Execute the build commands again:
tar -xzvf hadoop-2.6.0-cdh6.7.0-src.tar.gz
cd hadoop-2.6.0-cdh6.7.0
mvn clean package -Pdist,native -DskipTests -Dtar
Note: because the dependencies CDH needs were placed in Maven's local repository in advance, a lot of download time is saved.
Compiled file:
Location: ./hadoop-dist/target directory
File: hadoop-2.6.0-cdh6.7.0
Compressed files: hadoop-2.6.0-cdh6.7.0.tar.gz
Test to see if various compressions are supported:
[root@hadoop004 hadoop-2.6.0-cdh6.7.0]# ./bin/hadoop checknative
19/04/18 15:09:34 INFO bzip2.Bzip2Factory: Successfully loaded & initialized native-bzip2 library system-native
19/04/18 15:09:34 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop:  true /data/soft/hadoop-2.6.0-cdh6.7.0/hadoop-dist/target/hadoop-2.6.0-cdh6.7.0/lib/native/libhadoop.so.1.0.0
zlib:    true /lib64/libz.so.1
snappy:  true /lib64/libsnappy.so.1
lz4:     true revision:99
bzip2:   true /lib64/libbz2.so.1
openssl: true /lib64/libcrypto.so
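To script this check, for example in a deployment pipeline, the checknative table can be parsed for any `false` entries. A sketch assuming the output format shown above; the function name is hypothetical:

```shell
# List libraries reported as "false" in `hadoop checknative` output.
find_false() {
  # Table lines look like: "snappy:  true /lib64/libsnappy.so.1"
  echo "$1" | awk '$2 == "false" {print $1}'
}
out=$(./bin/hadoop checknative 2>&1 || true)
bad=$(find_false "$out")
if [ -n "$bad" ]; then
  echo "missing native support: $bad"
else
  echo "all native libraries loaded"
fi
```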
Errors and solutions when moving the build to other machines:
If you copy the compiled tar package to a new machine and run checknative there, you may see:
openssl: false Cannot load libcrypto.so (libcrypto.so: cannot open shared object file: No such file or directory)!
On CentOS, fix it with: yum install -y openssl-devel