# Spark standalone installation and deployment

## 1. Install Scala

Extract and rename:

```bash
tar -zxvf soft/scala-2.10.5.tgz -C app/
mv scala-2.10.5/ scala
```

Add it to the environment variables (although Spark ships with its own Scala, installing it separately is recommended):

```bash
export SCALA_HOME=/home/uplooking/app/scala
export PATH=$PATH:$SCALA_HOME/bin
```

## 2. Install standalone Spark

Extract and rename:

```bash
tar -zxvf soft/spark-1.6.2-bin-hadoop2.6.tgz -C app/
mv spark-1.6.2-bin-hadoop2.6/ spark
```

Add it to the environment variables:

```bash
export SPARK_HOME=/home/uplooking/app/spark
export PATH=$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin
```

Test by running a simple word-count program in `spark-shell`:

```scala
sc.textFile("./hello").flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _).foreach(println)
```

# Fully distributed installation

Modify spark-env.sh:

1. `cd /home/uplooking/app/spark/conf`
2. `cp spark-env.sh.template spark-env.sh`
3. `vi spark-env.sh`

```bash
export JAVA_HOME=/opt/jdk
export SCALA_HOME=/home/uplooking/app/scala
export SPARK_MASTER_IP=uplooking01
export SPARK_MASTER_PORT=7077
export SPARK_WORKER_CORES=1
export SPARK_WORKER_INSTANCES=1
export SPARK_WORKER_MEMORY=1g
export HADOOP_CONF_DIR=/home/uplooking/app/hadoop/etc/hadoop
```

Modify the slaves configuration file, adding two lines:

```
uplooking02
uplooking03
```

Deploy to the uplooking02 and uplooking03 machines (both need Scala installed in advance):

```bash
scp -r /home/uplooking/app/scala uplooking@uplooking02:/home/uplooking/app
scp -r /home/uplooking/app/scala uplooking@uplooking03:/home/uplooking/app

scp -r /home/uplooking/app/spark uplooking@uplooking02:/home/uplooking/app
scp -r /home/uplooking/app/spark uplooking@uplooking03:/home/uplooking/app
```

Copy the environment variables to uplooking02 and uplooking03 (they must be `source`d there to take effect):

```bash
scp /home/uplooking/.bash_profile uplooking@uplooking02:/home/uplooking
scp /home/uplooking/.bash_profile uplooking@uplooking03:/home/uplooking
```

Startup. To avoid conflicts with Hadoop's start-all.sh/stop-all.sh scripts, first rename the ones in spark/sbin:

```bash
mv start-all.sh start-spark-all.sh
mv stop-all.sh stop-spark-all.sh
```

Start the cluster:

```bash
sbin/start-spark-all.sh
```

This starts a Master process on the configured master node, uplooking01, and a Worker process on each configured slave node, uplooking02 and uplooking03.

Simple verification with spark-shell:

```bash
bin/spark-shell
```

```scala
scala> sc.textFile("hdfs://ns1/data/hello").flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _).foreach(println)
```

Spark executes this program very quickly and computes the expected result.

Ports: 8080 / 4040 / 7077

- 8080 --> access port of the Spark cluster web UI, similar to 50070 and 8088 in Hadoop
- 4040 --> access address of the Spark application UI
- 7077 --> the master port (`export SPARK_MASTER_PORT=7077`), analogous to port 9000 in Hadoop

# ZooKeeper-based HA configuration

First, with the cluster stopped, comment out two lines in spark-env.sh:

```bash
# export SPARK_MASTER_IP=uplooking01
# export SPARK_MASTER_PORT=7077
```

Second, add one line to spark-env.sh:

```bash
export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER -Dspark.deploy.zookeeper.url=uplooking01:2181,uplooking02:2181,uplooking03:2181 -Dspark.deploy.zookeeper.dir=/spark"
```

Explanation:

- spark.deploy.recoveryMode: set to ZOOKEEPER
- spark.deploy.zookeeper.url: the ZooKeeper URL
- spark.deploy.zookeeper.dir: the directory in which ZooKeeper saves the recovery state; the default is /spark.
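With both masters registered in ZooKeeper, a client does not need to know which one is currently ALIVE: it can list them all in the master URL. A minimal sketch (not in the original; the hostnames are the ones configured above):

```bash
# Point spark-shell at both masters; Spark tries them in order and
# re-registers with whichever master becomes ALIVE after a failover.
bin/spark-shell --master spark://uplooking01:7077,uplooking02:7077
```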
Restart the cluster: run `start-spark-all.sh` on any Spark node, then manually start a Master process on another node in the cluster:

```bash
sbin/start-master.sh
```

Verify HA through the browser:

- uplooking01:8080 --> Status: ALIVE
- uplooking02:8080 --> Status: STANDBY

Now manually stop the Master process on the master node; after a moment, the Master process on uplooking02 changes from STANDBY to ALIVE.

Note: if you had run the startup on uplooking02 instead, uplooking02 would be the master and uplooking01 would not be, because the configuration file specifies only the slaves, not the master; both start-spark-all.sh and start-master.sh start a Master on the machine they are run from.

# Spark source code compilation

After Maven is installed, configure a local repository for Spark's dependencies (otherwise downloading them from the Internet during compilation is slow). Then execute the following command in the Spark source directory:

```bash
mvn -Pyarn -Dhadoop.version=2.6.4 -Dyarn.version=2.6.4 -DskipTests clean package
```
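As an aside (not in the original), Spark 1.6 also ships a make-distribution.sh script at the source root that wraps this same Maven build and packages a deployable distribution; the flags below are an assumed mirror of the mvn options above:

```bash
# Sketch: produce a distributable .tgz rather than bare build artifacts
# (profile and version flags assumed to match the mvn command above).
./make-distribution.sh --name hadoop2.6.4 --tgz -Pyarn -Dhadoop.version=2.6.4 -Dyarn.version=2.6.4
```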
The output of the `mvn` build above after successful compilation is as follows:
```
...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [  3.617 s]
[INFO] Spark Project Test Tags ............................ SUCCESS [ 17.419 s]
[INFO] Spark Project Launcher ............................. SUCCESS [ 12.102 s]
[INFO] Spark Project Networking ........................... SUCCESS [ 11.878 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  7.324 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [ 16.326 s]
[INFO] Spark Project Core ................................. SUCCESS [04:31 min]
[INFO] Spark Project Bagel ................................ SUCCESS [ 11.671 s]
[INFO] Spark Project GraphX ............................... SUCCESS [ 55.420 s]
[INFO] Spark Project Streaming ............................ SUCCESS [02:03 min]
[INFO] Spark Project Catalyst ............................. SUCCESS [02:40 min]
[INFO] Spark Project SQL .................................. SUCCESS [03:38 min]
[INFO] Spark Project ML Library ........................... SUCCESS [03:56 min]
[INFO] Spark Project Tools ................................ SUCCESS [ 15.726 s]
[INFO] Spark Project Hive ................................. SUCCESS [02:30 min]
[INFO] Spark Project Docker Integration Tests ............. SUCCESS [ 11.961 s]
[INFO] Spark Project REPL ................................. SUCCESS [ 42.913 s]
[INFO] Spark Project YARN Shuffle Service ................. SUCCESS [  8.391 s]
[INFO] Spark Project YARN ................................. SUCCESS [ 42.013 s]
[INFO] Spark Project Assembly ............................. SUCCESS [02:06 min]
[INFO] Spark Project External Twitter ..................... SUCCESS [ 19.155 s]
[INFO] Spark Project External Flume Sink .................. SUCCESS [ 22.164 s]
[INFO] Spark Project External Flume ....................... SUCCESS [ 26.228 s]
[INFO] Spark Project External Flume Assembly .............. SUCCESS [  3.838 s]
[INFO] Spark Project External MQTT ........................ SUCCESS [ 33.132 s]
[INFO] Spark Project External MQTT Assembly ............... SUCCESS [  7.937 s]
[INFO] Spark Project External ZeroMQ ...................... SUCCESS [ 17.900 s]
[INFO] Spark Project External Kafka ....................... SUCCESS [ 37.597 s]
[INFO] Spark Project Examples ............................. SUCCESS [02:39 min]
[INFO] Spark Project External Kafka Assembly .............. SUCCESS [ 10.556 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 31:22 min
[INFO] Finished at: 2018-04-24T18:33:58+08:00
[INFO] Final Memory: 89M/1440M
[INFO] ------------------------------------------------------------------------
```
You can then see the compiled files in the following directory:
```
[uplooking@uplooking01 scala-2.10]$ pwd
/home/uplooking/compile/spark-1.6.2/assembly/target/scala-2.10
[uplooking@uplooking01 scala-2.10]$ ls -lh
total 135M
-rw-rw-r-- 1 uplooking uplooking 135M Apr 24 18:28 spark-assembly-1.6.2-hadoop2.6.4.jar
```
You can also see a similar assembly jar (the shipped one, built against Hadoop 2.6.0) in the lib directory of the installed Spark:
```
[uplooking@uplooking01 lib]$ ls -lh
total 291M
-rw-r--r-- 1 uplooking uplooking 332K Jun 22  2016 datanucleus-api-jdo-3.2.6.jar
-rw-r--r-- 1 uplooking uplooking 1.9M Jun 22  2016 datanucleus-core-3.2.10.jar
-rw-r--r-- 1 uplooking uplooking 1.8M Jun 22  2016 datanucleus-rdbms-3.2.9.jar
-rw-r--r-- 1 uplooking uplooking 6.6M Jun 22  2016 spark-1.6.2-yarn-shuffle.jar
-rw-r--r-- 1 uplooking uplooking 173M Jun 22  2016 spark-assembly-1.6.2-hadoop2.6.0.jar
-rw-r--r-- 1 uplooking uplooking 108M Jun 22  2016 spark-examples-1.6.2-hadoop2.6.0.jar
```
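A common next step, though the original stops here, is to swap the freshly compiled assembly (built against Hadoop 2.6.4) in for the shipped one. A hypothetical sketch, assuming the paths from the listings above:

```bash
# Hypothetical: move the shipped assembly out of lib/ (so the bin scripts,
# which locate the assembly jar by name pattern, find only one) and drop in
# the recompiled jar.
cd /home/uplooking/app/spark/lib
mv spark-assembly-1.6.2-hadoop2.6.0.jar ~/spark-assembly-1.6.2-hadoop2.6.0.jar.bak
cp /home/uplooking/compile/spark-1.6.2/assembly/target/scala-2.10/spark-assembly-1.6.2-hadoop2.6.4.jar .
```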