1. Install JDK
Download JDK 7u55 and install it.
Both the JDK and the JRE are needed; tools.jar ships with the JDK and is required for the build.
Install it under /java.
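A quick way to confirm the JDK is in place before continuing (the /java path follows the layout above; adjust it if your install directory differs):
# check the installed Java version
/java/bin/java -version
# tools.jar must be present for the Maven build to work
ls /java/lib/tools.jar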
2. Download the Hadoop 2.2.0 source code
wget http://apache.dataguru.cn/hadoop/common/stable/hadoop-2.2.0-src.tar.gz
Decompress it:
tar -zxvf hadoop-2.2.0-src.tar.gz -C /tmp
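Before moving on, it does not hurt to confirm the download and the extraction; the checksum to compare against is the one published on the Apache download page for this release:
# compute the archive's checksum and compare it with the published value
md5sum hadoop-2.2.0-src.tar.gz
# the source tree should now be under /tmp
ls /tmp/hadoop-2.2.0-src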
3. Install the dependency packages
yum -y install lzo-devel zlib-devel gcc autoconf automake libtool gcc-c++ openssl-devel
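To confirm the packages actually landed (same names as in the yum line above):
# query each build dependency installed above
rpm -q lzo-devel zlib-devel gcc gcc-c++ autoconf automake libtool openssl-devel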
4. Install the related build tools
Protobuf (compile and install):
tar -zxvf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure --prefix=/usr/local/protobuf
make
make install
Ant installation:
tar -zxvf apache-ant-1.9.2-bin.tar.gz
mv apache-ant-1.9.2 /usr/local/ant
Maven installation:
tar -zxvf apache-maven-3.0.5-bin.tar.gz
mv apache-maven-3.0.5 /usr/local/maven
Findbugs installation:
tar -zxvf findbugs-2.0.2.tar.gz
mv findbugs-2.0.2 /usr/local/findbugs
CMake (compile and install):
tar -zxvf cmake-2.8.8.tar.gz
cd cmake-2.8.8
./bootstrap
make
make install
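A quick sanity check of the tools that were just built (protoc is not on the PATH yet, so call it by its full path; the expected versions follow from the tarballs above):
# protoc should report libprotoc 2.5.0
/usr/local/protobuf/bin/protoc --version
# cmake installs into /usr/local/bin by default
cmake --version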
5. Configure the path
vim /etc/profile
# java
export JAVA_HOME=/java
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=.:$CLASSPATH:$JAVA_HOME/lib:$JRE_HOME/lib
export PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin
# maven
export MAVEN_HOME=/usr/local/maven
export MAVEN_OPTS="-Xms256m -Xmx512m"
export CLASSPATH=.:$CLASSPATH:$MAVEN_HOME/lib
export PATH=$PATH:$MAVEN_HOME/bin
# protobuf
export PROTOBUF_HOME=/usr/local/protobuf
export CLASSPATH=.:$CLASSPATH:$PROTOBUF_HOME/lib
export PATH=$PATH:$PROTOBUF_HOME/bin
# ant
export ANT_HOME=/usr/local/ant
export CLASSPATH=.:$CLASSPATH:$ANT_HOME/lib
export PATH=$PATH:$ANT_HOME/bin
# findbugs
export FINDBUGS_HOME=/usr/local/findbugs
export CLASSPATH=.:$CLASSPATH:$FINDBUGS_HOME/lib
export PATH=$PATH:$FINDBUGS_HOME/bin
source /etc/profile
The settings take effect in the current shell immediately.
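A quick check that the new environment resolves as intended (every path here comes from the profile entries above):
# each command should resolve to the directories configured above
which java mvn ant protoc findbugs
mvn -version
ant -version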
9. Fix a dependency bug
vim hadoop-2.2.0-src/hadoop-common-project/hadoop-auth/pom.xml
In the <dependencies> section, add:
<dependency>
    <groupId>org.mortbay.jetty</groupId>
    <artifactId>jetty</artifactId>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.mortbay.jetty</groupId>
    <artifactId>jetty-util</artifactId>
    <scope>test</scope>
</dependency>
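Before kicking off the long build, it can be worth checking that the edit left the POM well-formed; a small sketch, assuming xmllint (part of libxml2, usually preinstalled) is available:
# verify the edited pom.xml is still well-formed XML
xmllint --noout hadoop-2.2.0-src/hadoop-common-project/hadoop-auth/pom.xml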
10. Compilation
cd hadoop-2.2.0-src
mvn clean package -Pdist,native -DskipTests -Dtar
(-Pdist,native builds the full distribution with the native libraries, -DskipTests skips the unit tests, and -Dtar produces a tarball of the result.)
[INFO]
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main.. SUCCESS [10.796s]
[INFO] Apache Hadoop Project POM. SUCCESS [8.171s]
[INFO] Apache Hadoop Annotations. SUCCESS [18.306s]
[INFO] Apache Hadoop Assemblies.. SUCCESS [1.704s]
[INFO] Apache Hadoop Project Dist POM. SUCCESS [8.222s]
[INFO] Apache Hadoop Maven Plugins... SUCCESS [17.120s]
[INFO] Apache Hadoop Auth.. SUCCESS [15.952s]
[INFO] Apache Hadoop Auth Examples... SUCCESS [12.085s]
[INFO] Apache Hadoop Common.. SUCCESS [4:57.617s]
[INFO] Apache Hadoop NFS.. SUCCESS [25.393s]
[INFO] Apache Hadoop Common Project.. SUCCESS [0.231s]
[INFO] Apache Hadoop HDFS.. SUCCESS [5:51.635s]
[INFO] Apache Hadoop HttpFS.. SUCCESS [1:27.220s]
[INFO] Apache Hadoop HDFS BookKeeper Journal. SUCCESS [59.011s]
[INFO] Apache Hadoop HDFS-NFS.. SUCCESS [11.979s]
[INFO] Apache Hadoop HDFS Project.... SUCCESS [0.195s]
[INFO] hadoop-yarn.. SUCCESS [1:41.292s]
[INFO] hadoop-yarn-api.. SUCCESS [1:53.028s]
[INFO] hadoop-yarn-common.. SUCCESS [1:47.889s]
[INFO] hadoop-yarn-server.. SUCCESS [0.712s]
[INFO] hadoop-yarn-server-common. SUCCESS [38.517s]
[INFO] hadoop-yarn-server-nodemanager. SUCCESS [53.352s]
[INFO] hadoop-yarn-server-web-proxy.. SUCCESS [13.733s]
[INFO] hadoop-yarn-server-resourcemanager. SUCCESS [49.935s]
[INFO] hadoop-yarn-server-tests.. SUCCESS [3.230s]
[INFO] hadoop-yarn-client.. SUCCESS [23.036s]
[INFO] hadoop-yarn-applications.. SUCCESS [0.690s]
[INFO] hadoop-yarn-applications-distributedshell. SUCCESS [7.623s]
[INFO] hadoop-mapreduce-client.. SUCCESS [0.581s]
[INFO] hadoop-mapreduce-client-core.. SUCCESS [1:26.644s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher.... SUCCESS [8.783s]
[INFO] hadoop-yarn-site.. SUCCESS [1.217s]
[INFO] hadoop-yarn-project.. SUCCESS [30.587s]
[INFO] hadoop-mapreduce-client-common. SUCCESS [1:19.185s]
[INFO] hadoop-mapreduce-client-shuffle. SUCCESS [17.693s]
[INFO] hadoop-mapreduce-client-app... SUCCESS [41.929s]
[INFO] hadoop-mapreduce-client-hs.... SUCCESS [18.209s]
[INFO] hadoop-mapreduce-client-jobclient. SUCCESS [24.663s]
[INFO] hadoop-mapreduce-client-hs-plugins. SUCCESS [7.631s]
[INFO] Apache Hadoop MapReduce Examples. SUCCESS [22.663s]
[INFO] hadoop-mapreduce.. SUCCESS [10.093s]
[INFO] Apache Hadoop MapReduce Streaming. SUCCESS [19.489s]
[INFO] Apache Hadoop Distributed Copy. SUCCESS [51.046s]
[INFO] Apache Hadoop Archives.. SUCCESS [7.621s]
[INFO] Apache Hadoop Rumen.. SUCCESS [20.543s]
[INFO] Apache Hadoop Gridmix.. SUCCESS [15.156s]
[INFO] Apache Hadoop Data Join.. SUCCESS [9.968s]
[INFO] Apache Hadoop Extras.. SUCCESS [9.504s]
[INFO] Apache Hadoop Pipes.. SUCCESS [15.708s]
[INFO] Apache Hadoop Tools Dist.. SUCCESS [5.261s]
[INFO] Apache Hadoop Tools.. SUCCESS [0.268s]
[INFO] Apache Hadoop Distribution.... SUCCESS [1:15.418s]
[INFO] Apache Hadoop Client.. SUCCESS [29.025s]
[INFO] Apache Hadoop Mini-Cluster.... SUCCESS [0.735s]
[INFO]
[INFO] BUILD SUCCESS
[INFO]
[INFO] Total time: 34:15.365s
[INFO] Finished at: Fri May 16 16:15:37 CST 2014
[INFO] Final Memory: 101M/385M
[INFO]
After the compilation completes, a package named hadoop-2.2.0.tar.gz is generated under
hadoop-2.2.0-src/hadoop-dist/target/
This package is the final, deployable Hadoop distribution.
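To double-check that the native profile really produced native libraries, you can inspect the unpacked copy that the build leaves next to the tarball (paths assume the output location above):
# the native libraries should be real ELF shared objects, not empty stubs
ls hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0/lib/native
file hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0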
The build may fail partway through for a number of reasons, most often because a dependency download from the remote repositories did not succeed.
If that happens, rerun: mvn clean package -Pdist,native -DskipTests -Dtar
A few retries are usually enough.
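If you prefer to automate the retries, a small shell loop does the job (three attempts is an arbitrary choice; dependencies that have already been downloaded stay cached in ~/.m2 between runs):
# retry the build a few times; transient download failures usually clear up
for i in 1 2 3; do
    mvn clean package -Pdist,native -DskipTests -Dtar && break
done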