

Spark Installation

2025-03-28 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/03 Report--

Installation packages used:

scala-2.10.3.tgz

spark-0.9.0-incubating-bin-hadoop2.tgz

hadoop-2.3.0-cdh6.0.0.tar.gz

jdk1.7.0_45

Download address: http://mirror.bit.edu.cn/apache/spark/spark-0.9.0-incubating/

spark-0.9.0-incubating-bin-hadoop2.tgz

Download Scala and Hadoop from http://archive.cloudera.com/cdh6/cdh/5/

Install the software:

tar zxvf spark-0.9.0-incubating-bin-hadoop2.tgz

tar zxvf scala-2.10.3.tgz

Configure user environment variables in ~/.bashrc:

export JAVA_HOME=/hadoop/jdk1.7.0_45

export HADOOP_HOME=/hadoop/hadoop

export HADOOP_CONF_DIR=/hadoop/hadoop-config

export SCALA_HOME=/hadoop/scala-2.10.3

export SPARK_HOME=/hadoop/spark-0.9.0-incubating-bin-hadoop2

export SPARK_EXAMPLES_JAR=/hadoop/spark/examples/target/spark-examples_2.10-0.9.0-incubating.jar

export PATH=$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$SCALA_HOME/bin:$SPARK_HOME/bin:$PATH

export CLASSPATH=$JAVA_HOME/lib:$JAVA_HOME/jre/lib:$CLASSPATH

export HADOOP_HOME_WARN_SUPPRESS=1

(SCALA_HOME and SPARK_HOME are defined before PATH references them, so that PATH expands correctly.)
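As a quick sanity check, the exports can be written to a file and sourced, then PATH inspected. A minimal sketch, assuming the /hadoop install locations used throughout this guide (the directories need not exist for PATH to be set):

```shell
# Write the profile fragment to a temp file, source it, and confirm PATH now
# contains the Scala and Spark bin directories.
envfile=$(mktemp)
cat > "$envfile" <<'EOF'
export JAVA_HOME=/hadoop/jdk1.7.0_45
export SCALA_HOME=/hadoop/scala-2.10.3
export SPARK_HOME=/hadoop/spark-0.9.0-incubating-bin-hadoop2
export PATH=$JAVA_HOME/bin:$SCALA_HOME/bin:$SPARK_HOME/bin:$PATH
EOF
. "$envfile"
echo "$PATH" | grep -q '/hadoop/scala-2.10.3/bin' && echo "PATH ok"
rm -f "$envfile"
```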

Configure Spark's own environment in /hadoop/spark-0.9.0-incubating-bin-hadoop2/conf

Modify the spark-env.sh file to add the configuration Spark needs:

export SCALA_HOME=/hadoop/scala-2.10.3

export JAVA_HOME=/hadoop/jdk1.7.0_45

export SPARK_HOME=/hadoop/spark

Copy the environment variable file to the other nodes, and apply it immediately with source ~/.bash_profile

Test the environment variables: scala -version

Configure the conf file:

vi slaves

bigdata-2

bigdata-4

Distribute the files to the other nodes.
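Copying to each worker can be scripted; a sketch using scp, with the bigdata-2/bigdata-4 hostnames from the slaves file above. It only echoes the commands by default (set SCP_CMD=scp to actually copy):

```shell
# Dry-run by default: prints the scp commands it would run rather than
# executing them, so the list can be reviewed first.
distribute() {
  scp_cmd="${SCP_CMD:-echo scp}"
  for node in bigdata-2 bigdata-4; do
    $scp_cmd -r /hadoop/scala-2.10.3 "$node":/hadoop/
    $scp_cmd -r /hadoop/spark-0.9.0-incubating-bin-hadoop2 "$node":/hadoop/
    $scp_cmd ~/.bashrc "$node":
  done
}
distribute
```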

Start Spark:

cd /hadoop/spark-0.9.0-incubating-bin-hadoop2/sbin

./start-all.sh

Check the processes with jps. Master node:

22580 NameNode
25767 Master
27758 Jps
23024 ResourceManager
22812 SecondaryNameNode

Worker node processes:

70869 Worker
70150 NodeManager
71462 Jps
70023 DataNode

Installation is complete.

Use Spark: run SparkPi:

cd /hadoop/spark/bin
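Instead of eyeballing the jps listing, the check can be scripted. A sketch (the helper name check_daemons is mine, not part of Spark or Hadoop) that reads jps output on stdin and verifies each required daemon is present:

```shell
# Reads `jps` output (lines of "<pid> <ProcessName>") on stdin; arguments are
# the required process names. Prints the first missing daemon and fails, or
# reports success.
check_daemons() {
  out=$(cat)
  for proc in "$@"; do
    printf '%s\n' "$out" | grep -q "[0-9] $proc\$" || { echo "missing: $proc"; return 1; }
  done
  echo "all daemons running"
}
# Example (on the master): jps | check_daemons NameNode Master ResourceManager
```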

ll /hadoop/spark/bin

-rw-rw-r--. 1 hadoop hadoop 2601 February 3 03:13 compute-classpath.cmd

-rwxrwxr-x. 1 hadoop hadoop 3330 February 3 03:13 compute-classpath.sh

-rwxrwxr-x. 1 hadoop hadoop 2070 February 3 03:13 pyspark

-rw-rw-r--. 1 hadoop hadoop 1827 February 3 03:13 pyspark2.cmd

-rw-rw-r--. 1 hadoop hadoop 1000 February 3 03:13 pyspark.cmd

-rwxrwxr-x. 1 hadoop hadoop 3055 February 3 03:13 run-example

-rw-rw-r--. 1 hadoop hadoop 2047 February 3 03:13 run-example2.cmd

-rw-rw-r--. 1 hadoop hadoop 1012 February 3 03:13 run-example.cmd

-rwxrwxr-x. 1 hadoop hadoop 5151 February 3 03:13 spark-class

-rwxrwxr-x. 1 hadoop hadoop 3212 February 3 03:13 spark-class2.cmd

-rw-rw-r--. 1 hadoop hadoop 1010 February 3 03:13 spark-class.cmd

-rwxrwxr-x. 1 hadoop hadoop 3038 February 3 03:13 spark-shell

-rwxrwxr-x. 1 hadoop hadoop 941 February 3 03:13 spark-shell.cmd

./run-example org.apache.spark.examples.SparkPi spark://master:7077
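The SparkPi example prints a line of the form "Pi is roughly 3.14..." among its job logs. A small helper (extract_pi is my name, not a Spark tool) to pull the estimate out of the output:

```shell
# Filters the "Pi is roughly <value>" line out of the job's stdout and prints
# just the numeric estimate.
extract_pi() {
  grep -o 'Pi is roughly [0-9.]*' | awk '{print $4}'
}
# Usage: ./run-example org.apache.spark.examples.SparkPi spark://master:7077 | extract_pi
```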
