1. Download Spark
http://mirrors.cnnic.cn/apache/spark/spark-1.3.0/spark-1.3.0-bin-hadoop2.3.tgz
2. Download Scala
http://www.scala-lang.org/download/2.10.5.html
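Both archives can also be fetched straight from the command line. A minimal sketch, assuming wget is installed on hd1; note that the Scala link above is a download page, so the direct tarball URL below is an assumption and may differ:
wget http://mirrors.cnnic.cn/apache/spark/spark-1.3.0/spark-1.3.0-bin-hadoop2.3.tgz
wget http://www.scala-lang.org/files/archive/scala-2.10.5.tgz   # assumed direct link to the 2.10.5 tarball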
3. Install Scala
mkdir /usr/lib/scala
tar -zxvf scala-2.10.5.tgz
mv scala-2.10.5 /usr/lib/scala
4. Set the Scala path
vim /etc/bashrc
export SCALA_HOME=/usr/lib/scala/scala-2.10.5
export PATH=$SCALA_HOME/bin:$PATH
source /etc/bashrc
scala -version
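If the path took effect, scala -version should report the 2.10.5 runner, along the lines of the following (copyright years may differ):
Scala code runner version 2.10.5 -- Copyright 2002-2013, LAMP/EPFL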
5. Distribute Scala to the other nodes
scp -r /usr/lib/scala/ hd2:/usr/lib/scala
scp -r /usr/lib/scala/ hd3:/usr/lib/scala
scp -r /usr/lib/scala/ hd4:/usr/lib/scala
scp -r /usr/lib/scala/ hd5:/usr/lib/scala
scp /etc/bashrc hd2:/etc/bashrc
scp /etc/bashrc hd3:/etc/bashrc
scp /etc/bashrc hd4:/etc/bashrc
scp /etc/bashrc hd5:/etc/bashrc
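The eight scp commands above can be collapsed into one loop. A minimal sketch, assuming passwordless SSH from hd1 to the other nodes:
for h in hd2 hd3 hd4 hd5; do
  scp -r /usr/lib/scala/ $h:/usr/lib/scala   # copy the Scala install
  scp /etc/bashrc $h:/etc/bashrc             # copy the environment settings
done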
6. Install Spark
tar -zxvf spark-1.3.0-bin-hadoop2.3.tgz
mkdir /usr/local/spark
mv spark-1.3.0-bin-hadoop2.3 /usr/local/spark
vim /etc/bashrc
export SPARK_HOME=/usr/local/spark/spark-1.3.0-bin-hadoop2.3
export PATH=$SCALA_HOME/bin:$SPARK_HOME/bin:$PATH
source /etc/bashrc
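A quick sanity check that the updated PATH took effect in the current shell:
which spark-submit   # should resolve to /usr/local/spark/spark-1.3.0-bin-hadoop2.3/bin/spark-submit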
cd /usr/local/spark/spark-1.3.0-bin-hadoop2.3/conf/
cp spark-env.sh.template spark-env.sh
vim spark-env.sh
export JAVA_HOME=/java
export SCALA_HOME=/usr/lib/scala/scala-2.10.5
export SPARK_HOME=/usr/local/spark/spark-1.3.0-bin-hadoop2.3
export SPARK_MASTER_IP=192.168.137.101   # address the standalone master binds to
export SPARK_WORKER_MEMORY=10g           # total memory each worker may allocate to executors
export SPARK_DRIVER_MEMORY=9g            # default memory for the driver process
export HADOOP_CONF_DIR=/home/hadoop/hadoop/etc/hadoop   # lets Spark find the HDFS/YARN configuration
export SPARK_LIBRARY_PATH=$SPARK_HOME/lib
export SCALA_LIBRARY_PATH=$SPARK_LIBRARY_PATH
cp slaves.template slaves
vim slaves
hd1
hd2
hd3
hd4
hd5
The slaves file lists the hosts that run Worker processes; since hd1 appears as well, the master node also runs a Worker.
7. Distribute Spark to the other nodes
scp /etc/bashrc hd2:/etc
scp /etc/bashrc hd3:/etc
scp /etc/bashrc hd4:/etc
scp /etc/bashrc hd5:/etc
scp -r /usr/local/spark/spark-1.3.0-bin-hadoop2.3 hd2:/usr/local/spark/
scp -r /usr/local/spark/spark-1.3.0-bin-hadoop2.3 hd3:/usr/local/spark/
scp -r /usr/local/spark/spark-1.3.0-bin-hadoop2.3 hd4:/usr/local/spark/
scp -r /usr/local/spark/spark-1.3.0-bin-hadoop2.3 hd5:/usr/local/spark/
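scp only copies into a target directory that already exists, so if /usr/local/spark is missing on the other nodes, create it first. A minimal sketch, again assuming passwordless SSH:
for h in hd2 hd3 hd4 hd5; do
  ssh $h "mkdir -p /usr/local/spark"   # ensure the target directory exists before copying
done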
8. Start the cluster
On hd1:
cd $SPARK_HOME/sbin
./start-all.sh
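Once start-all.sh returns, a few checks confirm the cluster is up: jps should show a Master (plus a Worker, since hd1 is in slaves) on hd1 and a Worker on each other node, and the standalone master serves a web UI on port 8080 by default. A quick smoke test, assuming the default master port 7077:
jps                                               # expect Master and Worker on hd1
# the master web UI at http://192.168.137.101:8080 should list 5 registered workers
spark-shell --master spark://192.168.137.101:7077
# in the shell, sc.parallelize(1 to 1000).count() should return res0: Long = 1000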