
How to install Spark on Master

2025-01-19 Update From: SLTechnology News&Howtos


This article explains how to install Spark on the Master node. The content is straightforward and easy to follow; please work through the steps below to learn how to install Spark on Master.

Install Spark

Spark needs to be installed on the Master, Slave1, and Slave2 machines.

First install Spark on Master with the following steps:

Step 1: extract the Spark package on Master:

We decompress it directly to the current directory:
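
As a minimal sketch, assuming the downloaded package is the spark-1.0.0-bin-hadoop1 archive referenced below (the exact file name is an assumption), the extraction would look like:

tar -zxvf spark-1.0.0-bin-hadoop1.tgz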

At this point, we create the Spark directory "/usr/local/spark":
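
For example (sudo may be needed depending on permissions):

mkdir -p /usr/local/spark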

Copy the extracted "spark-1.0.0-bin-hadoop1" to "/usr/local/spark":
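
A corresponding copy command, assuming we are still in the directory where the package was extracted, would be:

cp -rf spark-1.0.0-bin-hadoop1 /usr/local/spark/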

Step 2: configure environment variables

Open the environment configuration file:
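
The article does not name the file; assuming ~/.bashrc is used, opening it would look like:

vim ~/.bashrc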

Add "SPARK_HOME" to the configuration file and add the bin directory of spark to PATH:

Save and exit after configuring, then make the configuration take effect:
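
Assuming the ~/.bashrc file from the previous step, reloading it applies the change:

source ~/.bashrc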

Step 3: configure Spark

Enter the conf directory of Spark:
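
With the installation path assumed above, this is:

cd /usr/local/spark/spark-1.0.0-bin-hadoop1/conf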

Add "SPARK_HOME" to the configuration file and add the bin directory of spark to PATH:

Copy spark-env.sh.template to spark-env.sh:
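
For example:

cp spark-env.sh.template spark-env.sh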

Add the following configuration information to the configuration file:
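
A sketch of the entries in spark-env.sh; the Java, Scala, and Hadoop paths and the Master address below are placeholder assumptions, not values taken from this article:

export JAVA_HOME=/usr/lib/java/jdk1.7.0_60        # assumed Java installation path
export SCALA_HOME=/usr/lib/scala/scala-2.10.4     # assumed Scala installation path
export SPARK_MASTER_IP=Master                     # hostname/IP of the Master node
export SPARK_WORKER_MEMORY=2g                     # memory each Worker can give to Executors (2g per the text)
export HADOOP_CONF_DIR=/usr/local/hadoop/conf     # assumed Hadoop configuration directory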

Where:

JAVA_HOME: specifies the installation directory of Java

SCALA_HOME: specifies the installation directory of Scala

SPARK_MASTER_IP: specifies the IP address of the Master node of the Spark cluster

SPARK_WORKER_MEMORY: the maximum amount of memory a Worker node may allocate to Executors. Because each of our three machines has 2 GB of memory, it is set to 2g here to make the best use of it.

HADOOP_CONF_DIR: specifies the directory of the configuration files of our original Hadoop cluster

Save and exit.

Next, configure the slaves file under Spark's conf directory to add all the Worker nodes:
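
Assuming vim as the editor and that we are still in the conf directory:

vim slaves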

The contents of the file after opening:
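
The article does not reproduce the default contents; in a fresh Spark distribution the slaves file typically contains just:

localhost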

We need to modify the content to:
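
Based on the three machines listed at the start, the file should read:

Master
Slave1
Slave2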

You can see that we set all three machines up as Worker nodes; that is, our master machine acts as both the Master and a Worker node.

Save and exit.

Thank you for reading. That concludes "how to install Spark on Master". After studying this article, you should have a deeper understanding of how to install Spark on Master; the specifics still need to be verified in practice. The editor will continue to push more related articles for you, so welcome to follow!
