This article explains in detail how to compile Spark 1.5.2. The editor finds it very practical and shares it here as a reference; I hope you get something out of it after reading.
While compiling Spark 1.5.2, the build failed because the installed Maven version was too old. (The domestic oschina mirror also seemed to cause errors, so it was not used; downloads are slower without it, but that does not matter as long as the build succeeds.)
Upgrade maven to the latest version:
(1) Uninstall the previous Maven: sudo apt-get autoremove maven
(2) Download the latest Maven release, apache-maven-3.3.9-bin.tar.gz
(3) sudo tar -zxvf apache-maven-3.3.9-bin.tar.gz -C /usr
(4) sudo gedit ~/.bashrc
(5) # MAVEN
export MAVEN_HOME=/usr/apache-maven-3.3.9
export PATH=${MAVEN_HOME}/bin:$PATH
(6) source ~/.bashrc
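To confirm that the upgraded Maven is the one picked up on the PATH, a quick sanity check (the exact output wording depends on your JDK and platform):
mvn -version
# should report Apache Maven 3.3.9 along with the JDK that Maven will use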
Set the Maven memory options:
export MAVEN_OPTS="-Xmx4g -XX:MaxPermSize=1g -XX:ReservedCodeCacheSize=512m"
(MaxPermSize must be set to 1g here, otherwise an OutOfMemoryError will occur during compilation.)
Start the compilation: mvn -Pyarn -Dyarn.version=2.6.0 -Dhadoop.version=2.6.0 -DskipTests -U clean package
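When the build finishes successfully, the assembly jar is normally produced under the assembly module. The exact path and file name depend on the Scala and Hadoop versions; the example below assumes the default Scala 2.10 build of Spark 1.5.2 and is only illustrative:
ls assembly/target/scala-2.10/
# look for something like spark-assembly-1.5.2-hadoop2.6.0.jar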
For reference, a typical set of JVM memory arguments looks like: -vmargs -Xms128M -Xmx512M -XX:PermSize=64M -XX:MaxPermSize=128M, where:
-Xms: the initial heap size allocated by the JVM; defaults to 1/64 of physical memory.
-Xmx: the maximum heap size allocated by the JVM; defaults to 1/4 of physical memory.
-XX:PermSize: the initial non-heap (permanent generation) memory allocated by the JVM; defaults to 1/64 of physical memory.
-XX:MaxPermSize: the maximum non-heap (permanent generation) memory allocated by the JVM; defaults to 1/4 of physical memory.
-XX:NewSize: the initial size of the young-generation heap area allocated by the JVM.
-XX:MaxNewSize: the maximum size of the young-generation heap area allocated by the JVM.
-XX:ReservedCodeCacheSize: the size of the cache reserved for JIT-compiled code.
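As a minimal sketch of how these flags fit together (assuming a Java 7 era JVM, since the PermGen options were removed in later JDKs; the values are illustrative, not tuned for any particular machine):
# print the JVM version after applying explicit heap, PermGen, young-generation and code-cache sizes
java -Xms128M -Xmx512M \
     -XX:PermSize=64M -XX:MaxPermSize=128M \
     -XX:NewSize=64M -XX:MaxNewSize=128M \
     -XX:ReservedCodeCacheSize=256M \
     -version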
This concludes the article on how to compile Spark 1.5.2. I hope the content above has been of some help and that you have learned something from it; if you found the article useful, please share it so more people can see it.