0. Environment description:
The operating system used for compilation:
[root@host11 hadoop-2.7.1-src]# cat /etc/redhat-release
CentOS release 6.5 (Final)
The Hadoop version is 2.7.1.
1. Install the dependency packages:
yum install svn autoconf automake libtool cmake ncurses-devel openssl-devel gcc*
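Before moving on, you can spot-check that the toolchain actually landed (output is illustrative and will vary by mirror):
gcc --version
cmake --version
rpm -q autoconf automake libtool openssl-devel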
2. Configure the Java and Maven environment:
wget 'http://download.oracle.com/otn-pub/java/jdk/8u60-b27/jdk-8u60-linux-x64.tar.gz?AuthParam=1443446776_174368b9ab1a6a92468aba5cd4d092d0' -O jdk-8u60-linux-x64.tar.gz
tar -zxvf jdk-8u60-linux-x64.tar.gz -C /usr/local
cd /usr/local
ln -s jdk1.8.0_60 jdk
echo 'export JAVA_HOME=/usr/local/jdk' >> /etc/profile
echo 'export PATH=$JAVA_HOME/bin:$PATH' >> /etc/profile
wget http://mirrors.hust.edu.cn/apache/maven/maven-3/3.3.3/binaries/apache-maven-3.3.3-bin.tar.gz
tar -zxvf apache-maven-3.3.3-bin.tar.gz -C /usr/local
cd /usr/local
ln -s apache-maven-3.3.3 maven
echo 'export PATH=/usr/local/maven/bin:$PATH' > /etc/profile.d/maven.sh
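To make the new variables take effect in the current shell and verify both tools, a quick sanity check (the exact version output will differ):
source /etc/profile
source /etc/profile.d/maven.sh
java -version
mvn -version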
3. Download and install protobuf (version 2.5.0 is required):
wget https://codeload.github.com/google/protobuf/zip/v2.5.0 -O protobuf-2.5.0.zip
unzip protobuf-2.5.0.zip
wget http://googletest.googlecode.com/files/gtest-1.5.0.tar.bz2
tar -jxvf gtest-1.5.0.tar.bz2
mv gtest-1.5.0 protobuf-2.5.0/gtest
cd protobuf-2.5.0
./autogen.sh
./configure
make
make check
make install
[root@host11 protobuf-master]# which protoc
/usr/local/bin/protoc
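Also confirm the version Maven will find, since the Hadoop 2.7.1 build insists on exactly 2.5.0:
protoc --version
This should print libprotoc 2.5.0. If protoc instead fails with a missing libprotoc shared-library error, refreshing the linker cache usually fixes it (assuming the default /usr/local/lib install prefix used above):
echo '/usr/local/lib' > /etc/ld.so.conf.d/protobuf.conf
ldconfig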
4. Download and install Ant:
wget http://mirrors.cnnic.cn/apache//ant/binaries/apache-ant-1.9.6-bin.zip
unzip apache-ant-1.9.6-bin.zip
mv apache-ant-1.9.6 /usr/local/ant
echo 'export PATH=/usr/local/ant/bin:$PATH' > /etc/profile.d/ant.sh
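A quick check that the new Ant is the one resolved from PATH:
source /etc/profile.d/ant.sh
ant -version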
5. Compile Hadoop:
tar -zxvf hadoop-2.7.1-src.tar.gz
cd hadoop-2.7.1-src
mvn package -Pdist,native -DskipTests -Dtar
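For reference, here is what each option does, per the BUILDING.txt shipped in the Hadoop source tree:
# -Pdist,native : activate the dist and native profiles, i.e. build the binary
#                 distribution together with the native C/C++ libraries
# -DskipTests   : skip the (very long) unit-test run
# -Dtar         : also package the distribution as hadoop-2.7.1.tar.gz
mvn package -Pdist,native -DskipTests -Dtar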
6. Troubleshooting:
First compilation failure:
[ERROR] Failed to execute goal on project hadoop-auth: Could not resolve dependencies for project org.apache.hadoop:hadoop-auth:jar:2.7.1: The following artifacts could not be resolved: org.mockito:mockito-all:jar:1.8.5, org.mortbay.jetty:jetty-util:jar:6.1.26, org.mortbay.jetty:jetty:jar:6.1.26, org.apache.tomcat.embed:tomcat-embed-core:jar:7.0.55, org.apache.httpcomponents:httpclient:jar:4.2.5, org.apache.zookeeper:zookeeper:jar:3.4.6: Could not transfer artifact org.mockito:mockito-all:jar:1.8.5 from/to central (https://repo.maven.apache.org/maven2): GET request of: org/mockito/mockito-all/1.8.5/mockito-all-1.8.5.jar from central failed: SSL peer shut down incorrectly -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-auth
Solution:
This failure is common and simply means some dependencies could not be downloaded (here, the connection to Maven Central was dropped mid-transfer). Re-run the build command until all dependencies have been fetched:
mvn package -Pdist,native -DskipTests -Dtar
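If the network is unreliable enough that many manual retries are needed, a small retry loop can automate this (a hypothetical convenience, not part of the original procedure):
# Keep re-running the build until Maven exits successfully; Ctrl-C aborts.
until mvn package -Pdist,native -DskipTests -Dtar; do
    echo "Build failed, retrying in 10 seconds..."
    sleep 10
done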
Second compilation failure:
[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.7.1:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: protoc version is 'libprotoc 3.0.0', expected version is '2.5.0' -> [Help 1]
The installed protobuf is too new. Hadoop 2.7.1 requires exactly protoc 2.5.0, so remove the newer build and install version 2.5.0 as described in step 3.
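A quick way to see which protoc the build is picking up (paths shown are illustrative):
which protoc        # should point at /usr/local/bin/protoc from step 3
protoc --version    # must print libprotoc 2.5.0
If a newer protoc still shadows 2.5.0 on PATH, remove it or reorder PATH, then re-run the build.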
7. Log of a successful build:
[INFO] Apache Hadoop Main ............................. SUCCESS [7.502 s]
[INFO] Apache Hadoop Project POM ...................... SUCCESS [4.844 s]
[INFO] Apache Hadoop Annotations ...................... SUCCESS [10.274 s]
[INFO] Apache Hadoop Assemblies ....................... SUCCESS [0.477 s]
[INFO] Apache Hadoop Project Dist POM ................. SUCCESS [4.568 s]
[INFO] Apache Hadoop Maven Plugins .................... SUCCESS [11.000 s]
[INFO] Apache Hadoop MiniKDC .......................... SUCCESS [9.870 s]
[INFO] Apache Hadoop Auth ............................. SUCCESS [9.003 s]
[INFO] Apache Hadoop Auth Examples .................... SUCCESS [9.321 s]
[INFO] Apache Hadoop Common ........................... SUCCESS [03:21 min]
[INFO] Apache Hadoop NFS .............................. SUCCESS [20.029 s]
[INFO] Apache Hadoop KMS .............................. SUCCESS [21.350 s]
[INFO] Apache Hadoop Common Project ................... SUCCESS [0.079 s]
[INFO] Apache Hadoop HDFS ............................. SUCCESS [10:57 min]
[INFO] Apache Hadoop HttpFS ........................... SUCCESS [01:15 min]
[INFO] Apache Hadoop HDFS BookKeeper Journal .......... SUCCESS [46.255 s]
[INFO] Apache Hadoop HDFS-NFS ......................... SUCCESS [21.495 s]
[INFO] Apache Hadoop HDFS Project ..................... SUCCESS [0.242 s]
[INFO] hadoop-yarn .................................... SUCCESS [0.137 s]
[INFO] hadoop-yarn-api ................................ SUCCESS [01:34 min]
[INFO] hadoop-yarn-common ............................. SUCCESS [01:31 min]
[INFO] hadoop-yarn-server ............................. SUCCESS [0.291 s]
[INFO] hadoop-yarn-server-common ...................... SUCCESS [35.037 s]
[INFO] hadoop-yarn-server-nodemanager ................. SUCCESS [44.224 s]
[INFO] hadoop-yarn-server-web-proxy ................... SUCCESS [4.315 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ... SUCCESS [17.461 s]
[INFO] hadoop-yarn-server-resourcemanager ............. SUCCESS [46.435 s]
[INFO] hadoop-yarn-server-tests ....................... SUCCESS [10.698 s]
[INFO] hadoop-yarn-client ............................. SUCCESS [8.976 s]
[INFO] hadoop-yarn-server-sharedcachemanager .......... SUCCESS [10.343 s]
[INFO] hadoop-yarn-applications ....................... SUCCESS [0.113 s]
[INFO] hadoop-yarn-applications-distributedshell ...... SUCCESS [7.395 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher . SUCCESS [4.006 s]
[INFO] hadoop-yarn-site ............................... SUCCESS [0.108 s]
[INFO] hadoop-yarn-registry ........................... SUCCESS [12.317 s]
[INFO] hadoop-yarn-project ............................ SUCCESS [18.781 s]
[INFO] hadoop-mapreduce-client ........................ SUCCESS [0.396 s]
[INFO] hadoop-mapreduce-client-core ................... SUCCESS [46.350 s]
[INFO] hadoop-mapreduce-client-common ................. SUCCESS [34.772 s]
[INFO] hadoop-mapreduce-client-shuffle ................ SUCCESS [8.779 s]
[INFO] hadoop-mapreduce-client-app .................... SUCCESS [22.440 s]
[INFO] hadoop-mapreduce-client-hs ..................... SUCCESS [12.865 s]
[INFO] hadoop-mapreduce-client-jobclient .............. SUCCESS [01:45 min]
[INFO] hadoop-mapreduce-client-hs-plugins ............. SUCCESS [6.051 s]
[INFO] Apache Hadoop MapReduce Examples ............... SUCCESS [8.077 s]
[INFO] hadoop-mapreduce ............................... SUCCESS [12.782 s]
[INFO] Apache Hadoop MapReduce Streaming .............. SUCCESS [24.680 s]
[INFO] Apache Hadoop Distributed Copy ................. SUCCESS [50.965 s]
[INFO] Apache Hadoop Archives ......................... SUCCESS [6.861 s]
[INFO] Apache Hadoop Rumen ............................ SUCCESS [12.928 s]
[INFO] Apache Hadoop Gridmix .......................... SUCCESS [6.784 s]
[INFO] Apache Hadoop Data Join ........................ SUCCESS [3.629 s]
[INFO] Apache Hadoop Ant Tasks ........................ SUCCESS [7.135 s]
[INFO] Apache Hadoop Extras ........................... SUCCESS [6.233 s]
[INFO] Apache Hadoop Pipes ............................ SUCCESS [31.548 s]
[INFO] Apache Hadoop OpenStack support ................ SUCCESS [10.084 s]
[INFO] Apache Hadoop Amazon Web Services support ...... SUCCESS [35:23 min]
[INFO] Apache Hadoop Azure support .................... SUCCESS [36.126 s]
[INFO] Apache Hadoop Client ........................... SUCCESS [24.463 s]
[INFO] Apache Hadoop Mini-Cluster ..................... SUCCESS [0.353 s]
[INFO] Apache Hadoop Scheduler Load Simulator ......... SUCCESS [12.506 s]
[INFO] Apache Hadoop Tools Dist ....................... SUCCESS [34.475 s]
[INFO] Apache Hadoop Tools ............................ SUCCESS [0.159 s]
[INFO] Apache Hadoop Distribution ..................... SUCCESS [02:37 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:12 h
[INFO] Finished at: 2015-10-03T03:54:29+08:00
[INFO] Final Memory: 91M/237M
[INFO] ------------------------------------------------------------------------
[root@host11 hadoop-2.7.1-src]#
8. Check the generated package:
cd /tmp/hadoop-2.7.1-src/hadoop-dist/target
[root@host11 target]# ls -ld hadoop*
drwxr-xr-x 9 root root      4096 Oct  3 03:51 hadoop-2.7.1
-rw-r--r-- 1 root root 194796372 Oct  3 03:52 hadoop-2.7.1.tar.gz
-rw-r--r-- 1 root root      2823 Oct  3 03:52 hadoop-dist-2.7.1.jar
-rw-r--r-- 1 root root 390430395 Oct  3 03:54 hadoop-dist-2.7.1-javadoc.jar
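Since the native libraries are the whole point of the native profile, it is worth confirming they were actually produced (a quick sanity check; the exact file list may vary):
file hadoop-2.7.1/lib/native/libhadoop.so.1.0.0
After the build is deployed, "hadoop checknative -a" will also report which native codecs (zlib, snappy, openssl, etc.) the binaries can use.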
9. At this point, the compilation is complete.