
How to recompile hadoop source code under CentOS

2025-03-29 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/01 Report--

This article explains how to recompile the Hadoop source code under CentOS. Many people have questions about this, so I consulted various materials and organized them into a simple, repeatable procedure. I hope it helps answer the question; please follow along and study.

Searching online for the warning "Unable to load native-hadoop library for your platform... using builtin-java classes where applicable" shows that the cause is that some of Hadoop's bundled native libraries were compiled against a different version of the C library than the one on the local machine, so the fix is to recompile Hadoop in the native environment.

However, this warning has little practical impact on using Hadoop.
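Before recompiling, it can help to confirm the mismatch. The snippet below is a small diagnostic sketch: it prints the local glibc version for comparison, and mentions (commented out, since it needs a Hadoop install on the PATH) the `hadoop checknative -a` command, which reports which native libraries actually load.

```shell
# Print the local glibc version; the prebuilt libhadoop.so shipped in the
# binary tarball was linked against a (possibly different) glibc.
ldd --version | head -n 1

# With a Hadoop installation on PATH, this lists which native libraries
# load successfully (commented out here because it needs Hadoop available):
# hadoop checknative -a
```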

Still, being an obsessive-compulsive programmer, I tried several workarounds to no avail, so I ended up compiling the source code myself.

Switch to root user

Download the tar packages for Ant, Maven, Protocol Buffers, FindBugs, and CMake, and put them in the /hadoop directory.

The versions I used are:

[hadoop@vm1 Downloads]$ ls
apache-ant-1.9.5.tar.gz    findbugs-2.0.2.tar.gz    jdk-8u45-linux-x64.gz
apache-maven-3.0.5.tar.gz  hadoop-2.7.0-src.tar.gz  protobuf-2.5.0
cmake-2.8.6                hadoop-2.7.0.tar.gz      protobuf-2.5.0.tar.gz
cmake-2.8.6.tar.gz         jdk-7u79-linux-x64.gz

yum -y install lzo-devel zlib-devel gcc autoconf automake libtool
tar zxf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure

Because protobuf needs C++ support, the following error occurs if no C++ compiler is installed on the machine:
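A quick pre-flight check (a sketch, not part of the original procedure) avoids running into the cryptic preprocessor error at all:

```shell
# Check for a C++ compiler before running protobuf's ./configure.
if command -v g++ >/dev/null 2>&1; then
    echo "g++ found: $(command -v g++)"
else
    echo "g++ missing: install it with 'yum install gcc-c++ glibc-headers'"
fi
```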

checking whether to enable maintainer-specific portions of Makefiles... yes
checking build system type... x86_64-unknown-linux-gnu
checking host system type... x86_64-unknown-linux-gnu
checking target system type... x86_64-unknown-linux-gnu
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for a thread-safe mkdir -p... /bin/mkdir -p
checking for gawk... gawk
checking whether make sets $(MAKE)... yes
checking for gcc... gcc
checking whether the C compiler works... yes
checking for C compiler default output file name... a.out
checking for suffix of executables...
checking whether we are cross compiling... no
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether gcc accepts -g... yes
checking for gcc option to accept ISO C89... none needed
checking for style of include used by make... GNU
checking dependency style of gcc... gcc3
checking for g++... no
checking for c++... no
checking for gpp... no
checking for aCC... no
checking for CC... no
checking for cxx... no
checking for cc++... no
checking for cl.exe... no
checking for FCC... no
checking for KCC... no
checking for RCC... no
checking for xlC_r... no
checking for xlC... no
checking whether we are using the GNU C++ compiler... no
checking whether g++ accepts -g... no
checking dependency style of g++... none
checking how to run the C++ preprocessor... /lib/cpp
configure: error: in `/hadoop/protobuf-2.5.0':
configure: error: C++ preprocessor "/lib/cpp" fails sanity check
See `config.log' for more details


The following packages are needed at this point:

yum install glibc-headers
yum install gcc-c++

Then run ./configure again in the protobuf directory.

This time it succeeds, so continue:

make
make check
make install
tar zxf apache-ant-1.9.2-bin.tar.gz
mv apache-ant-1.9.2 /hadoop/ant192
tar zxf apache-maven-3.0.5-bin.tar.gz
mv apache-maven-3.0.5 /hadoop/maven305
tar zxf findbugs-2.0.2.tar.gz
mv findbugs-2.0.2 /hadoop/findbugs202
tar zxf cmake-2.8.6.tar.gz
cd cmake-2.8.6
./bootstrap; make; make install
cd ..
tar zxf hadoop-2.7.0-src.tar.gz
mv hadoop-2.7.0-src /hadoop/hadoop270_src
chown -R hadoop:hadoop /hadoop/hadoop270_src
vi /etc/profile

export ANT_HOME=/hadoop/ant192
export MAVEN_HOME=/hadoop/maven305
export FINDBUGS_HOME=/hadoop/findbugs202
export PATH=${ANT_HOME}/bin:${MAVEN_HOME}/bin:${FINDBUGS_HOME}/bin:$PATH

source /etc/profile
su - hadoop
cd /hadoop/hadoop270_src
mvn clean package -DskipTests -Pdist,native,docs -Dtar

If this is the first time Maven runs, downloading dependencies takes a long time; it is best to configure a Maven mirror first.
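One way to configure a mirror is a `<mirrors>` entry in ~/.m2/settings.xml. The fragment below is a sketch; the Aliyun URL is only an example of a commonly used mirror, so substitute whichever repository is fastest from your network.

```xml
<!-- ~/.m2/settings.xml -->
<settings>
  <mirrors>
    <mirror>
      <id>central-mirror</id>
      <name>Mirror of Maven Central</name>
      <!-- Example mirror URL; replace with one close to you -->
      <url>https://maven.aliyun.com/repository/public</url>
      <mirrorOf>central</mirrorOf>
    </mirror>
  </mirrors>
</settings>
```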

This error may occur at the end of the compilation:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-pipes: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part... @ 5:124 in /home/hadoop/app/hadoop270_src/hadoop-tools/hadoop-pipes/target/antrun/build-main.xml

This happens because the zlib and OpenSSL development headers are missing; compiling the native libraries requires both. (On Debian-based systems the packages are zlib1g-dev and libssl-dev; on CentOS they are zlib-devel and openssl-devel.)

Solution:

yum install openssl-devel

Then rerun:

mvn clean package -DskipTests -Pdist,native,docs -Dtar

Note: under a JDK 1.8 environment, these errors may occur:

[WARNING] The requested profile "native" could not be activated because it does not exist.
[WARNING] The requested profile "docs" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (dist) on project hadoop-dist: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part... @ 38:100 in /home/hadoop/app/hadoop270_src/hadoop-dist/target/antrun/build-main.xml

Solution: switch the build from JDK 1.8 to JDK 1.7.
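In practice that means pointing JAVA_HOME at a JDK 7 installation before re-running Maven. The path below is an assumption; adjust it to wherever JDK 1.7 (for example, the jdk-7u79 tarball downloaded earlier) was installed.

```shell
# Point the build environment at JDK 1.7 (the path is an assumption; adjust it).
export JAVA_HOME=/usr/java/jdk1.7.0_79
export PATH=$JAVA_HOME/bin:$PATH
# java -version    # should now report 1.7.x; then re-run the mvn command
```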

Then the compilation is successful:

[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 25:22.002s
[INFO] Finished at: Tue Jul 07 21:20:38 PDT 2015
[INFO] Final Memory: 131M/405M
[INFO] ------------------------------------------------------------------------
[hadoop@vm1 hadoop270_src]$ ls
BUILDING.txt           hadoop-dist               hadoop-project       NOTICE.txt
dev-support            hadoop-hdfs-project       hadoop-project-dist  pom.xml
hadoop-assemblies      hadoop-mapreduce-project  hadoop-tools         README.txt
hadoop-client          hadoop-maven-plugins      hadoop-yarn-project
hadoop-common-project  hadoop-minicluster        LICENSE.txt
[hadoop@vm1 hadoop270_src]$ cd hadoop-dist/
[hadoop@vm1 hadoop-dist]$ ls
pom.xml  target
[hadoop@vm1 hadoop-dist]$ cd target/
[hadoop@vm1 target]$ ls
antrun                    hadoop-2.7.0           hadoop-dist-2.7.0-javadoc.jar  test-dir
dist-layout-stitching.sh  hadoop-2.7.0.tar.gz    javadoc-bundle-options
dist-tar-stitching.sh     hadoop-dist-2.7.0.jar  maven-archiver
[hadoop@vm1 target]$ pwd
/hadoop/app/hadoop270_src/hadoop-dist/target

Configure your environment to use the hadoop package you just compiled, and HDFS now starts without the "Unable to load native-hadoop library for your platform... using builtin-java classes where applicable" warning:
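If you would rather keep an existing installation than redeploy the whole tarball, another option is to copy only the freshly built native libraries over it. This is a sketch using temporary directories as stand-ins for the real paths (the dist output at /hadoop/hadoop270_src/hadoop-dist/target/hadoop-2.7.0/lib/native and the install's $HADOOP_HOME/lib/native).

```shell
# Stand-in directories so the commands can be tried anywhere; in real use,
# src would be /hadoop/hadoop270_src/hadoop-dist/target/hadoop-2.7.0/lib/native
# and dest would be $HADOOP_HOME/lib/native.
src=$(mktemp -d)
dest=$(mktemp -d)
touch "$src/libhadoop.so.1.0.0" "$src/libhdfs.so.0.0.0"   # dummy library files
cp -r "$src"/. "$dest"/                                   # overlay the built libs
ls "$dest"
```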

[hadoop@vm1 hadoop-2.7.0]$ ./sbin/start-dfs.sh
Starting namenodes on [vm1]
vm1: starting namenode, logging to /home/hadoop/app/hadoop-2.7.0/logs/hadoop-hadoop-namenode-vm1.out
vm1: starting datanode, logging to /home/hadoop/app/hadoop-2.7.0/logs/hadoop-hadoop-datanode-vm1.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/hadoop/app/hadoop-2.7.0/logs/hadoop-hadoop-secondarynamenode-vm1.out
[hadoop@vm1 hadoop-2.7.0]$ ./sbin/start-yarn.sh
starting yarn daemons
starting resourcemanager, logging to /home/hadoop/app/hadoop-2.7.0/logs/yarn-hadoop-resourcemanager-vm1.out
vm1: starting nodemanager, logging to /home/hadoop/app/hadoop-2.7.0/logs/yarn-hadoop-nodemanager-vm1.out
[hadoop@vm1 hadoop-2.7.0]$ jps
3251 NodeManager
3540 Jps
3145 ResourceManager
2699 NameNode
2828 DataNode
2991 SecondaryNameNode

This concludes the study of "how to recompile hadoop source code under CentOS". I hope it has resolved your doubts; combining theory with practice is the best way to learn, so go and try it yourself!

