How to analyze the parameters of the spark-submit tool

2025-03-05 Update. From: SLTechnology News & Howtos > Servers


Shulou (Shulou.com), 05/31 report

This article explains the parameters of the spark-submit tool in detail. It is intended as a reference; after reading it, you should have a working understanding of each option.

The parameters that can be passed at submission time:

Usage: spark-submit [options] <app jar | python file> [app options]

--master MASTER_URL
    The cluster manager to connect to: spark://host:port, mesos://host:port, yarn, yarn-cluster, yarn-client, or local.

--deploy-mode DEPLOY_MODE
    Where the driver program runs: client or cluster.

--class CLASS_NAME
    The main class of the application, including the package name.

--name NAME
    The application name.

--jars JARS
    Comma-separated list of third-party jars the driver depends on.

--py-files PY_FILES
    Comma-separated list of .zip, .egg, or .py files to place on the PYTHONPATH of a Python application.

--files FILES
    Comma-separated list of files to place in the working directory of each executor.

--properties-file FILE
    Path to a file from which to load application properties. Defaults to conf/spark-defaults.conf.
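For reference, the file that --properties-file points at is a plain Java-properties file. A minimal conf/spark-defaults.conf might look like the following (the property names are real Spark configuration keys, but the values here are only illustrative):

```
spark.master             yarn
spark.executor.memory    2g
spark.serializer         org.apache.spark.serializer.KryoSerializer
```

Properties set on the spark-submit command line take precedence over values loaded from this file.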

--driver-memory MEM
    Amount of memory used by the driver program.

--driver-java-options
    Extra Java options to pass to the driver.

--driver-library-path
    Extra library path entries for the driver.

--driver-class-path
    Extra classpath entries for the driver.

--executor-memory MEM
    Amount of memory per executor. Defaults to 1G.

--driver-cores NUM
    Number of CPU cores used by the driver. Spark standalone mode only.

--supervise
    Restart the driver automatically if it fails. Spark standalone mode only.

--total-executor-cores NUM
    Total number of cores across all executors. Spark standalone and Spark on Mesos modes only.

--executor-cores NUM
    Number of cores per executor. Defaults to 1. Spark on YARN mode only.

--queue QUEUE_NAME
    The YARN queue to submit the application to. Defaults to the "default" queue. Spark on YARN mode only.

--num-executors NUM
    Number of executors to launch. Defaults to 2. Spark on YARN mode only.

--archives ARCHIVES
    Comma-separated list of archives to extract into the working directory of each executor. Spark on YARN mode only.
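To make the option layout concrete, here is a small Python sketch that assembles a spark-submit command line in the order the tool expects: [options] first, then the application jar, then the application's own arguments. The class name com.example.WordCount, the jar name, and the input path are made-up placeholders, not values from this article.

```python
def build_spark_submit(master, deploy_mode, main_class, app_jar,
                       executor_memory="1g", num_executors=2, app_args=()):
    """Return a spark-submit argv list mirroring the options described above."""
    cmd = [
        "spark-submit",
        "--master", master,                    # e.g. yarn, spark://host:port, local
        "--deploy-mode", deploy_mode,          # client or cluster
        "--class", main_class,                 # main class, including package name
        "--executor-memory", executor_memory,  # per-executor memory (default 1G)
        "--num-executors", str(num_executors), # YARN only (default 2)
        app_jar,                               # the application jar follows all [options]
    ]
    cmd.extend(app_args)                       # [app options] are passed to main()
    return cmd

cmd = build_spark_submit("yarn", "cluster", "com.example.WordCount",
                         "wordcount.jar", executor_memory="2g",
                         num_executors=4, app_args=["hdfs:///input.txt"])
print(" ".join(cmd))
```

Note the ordering: anything placed after the application jar is treated as an argument to the application itself, not as a spark-submit option.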

That covers the parameters of the spark-submit tool. I hope the content above is helpful; if you found the article useful, feel free to share it with others.
