2025-04-01 Update From: SLTechnology News&Howtos
This article introduces the two ways of submitting tasks in YARN mode in Spark: yarn-client and yarn-cluster.
1. Submitting tasks in yarn-client mode
Configuration
On the client node used to submit YARN tasks, configure the Hadoop configuration directory. Specifically, add the following to the client's environment (for example in spark-env.sh or the shell profile):

export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
Note that the client only needs the Spark installation package to submit tasks; no other configuration (such as the slaves file) is required!
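Before submitting, it can help to verify that HADOOP_CONF_DIR actually points at a directory containing yarn-site.xml, since spark-submit reads the YARN address from there. A small illustrative Python helper (not part of Spark; the function name is made up for this sketch):

```python
import os

def check_yarn_client_config(env=os.environ):
    """Verify that HADOOP_CONF_DIR is set and points at a directory
    containing yarn-site.xml, which spark-submit needs to locate YARN."""
    conf_dir = env.get("HADOOP_CONF_DIR")
    if not conf_dir:
        return "HADOOP_CONF_DIR is not set"
    if not os.path.isfile(os.path.join(conf_dir, "yarn-site.xml")):
        return "yarn-site.xml not found in " + conf_dir
    return "ok"

# With an empty environment, the check reports the missing variable.
print(check_yarn_client_config({}))  # HADOOP_CONF_DIR is not set
```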
Submit command (three equivalent forms):

/opt/spark/bin/spark-submit --master yarn --class org.apache.spark.examples.SparkPi /opt/spark/examples/jars/spark-examples_2.11-2.2.0.jar 100

/opt/spark/bin/spark-submit --master yarn-client --class org.apache.spark.examples.SparkPi /opt/spark/examples/jars/spark-examples_2.11-2.2.0.jar 100

/opt/spark/bin/spark-submit --master yarn --deploy-mode client --class org.apache.spark.examples.SparkPi /opt/spark/examples/jars/spark-examples_2.11-2.2.0.jar 100
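The SparkPi example submitted above estimates Pi with a Monte Carlo method: sample random points in the unit square and count how many fall inside the quarter circle. A minimal standalone Python sketch of that logic (plain Python without Spark, for illustration only):

```python
import random

def estimate_pi(num_samples, seed=42):
    """Estimate Pi by sampling points in the unit square and counting
    how many land inside the quarter circle of radius 1."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(num_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # area ratio: (pi/4) of the square's points fall inside the arc
    return 4.0 * inside / num_samples

print(estimate_pi(100_000))  # roughly 3.14
```

SparkPi does the same computation, but parallelizes the sampling across the Executors that YARN allocates.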
Execution process
1. The client submits an Application and starts a Driver process on the client.
2. The Driver process sends a request to the ResourceManager (RM) for resources to start an ApplicationMaster (AM).
3. The RM receives the request and randomly selects a NodeManager (NM) to launch the AM. The NM here is equivalent to a Worker node in Standalone mode.
4. After the AM starts, it requests a batch of container resources from the RM for starting Executors.
5. The RM finds a batch of NMs and returns them to the AM for starting Executors.
6. The AM sends commands to the NMs to start the Executors.
7. After the Executors start, they reverse-register with the Driver; the Driver then sends tasks to the Executors, and the execution results are returned to the Driver side.
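The steps above can be sketched as a toy simulation. All class and method names here are illustrative stand-ins for the roles described, not real YARN or Spark APIs:

```python
class Driver:
    def __init__(self):
        self.executors = []          # Executors that have reverse-registered
        self.results = []

    def register(self, executor):    # step 7: reverse registration
        self.executors.append(executor)

    def run(self, tasks):            # Driver sends tasks, collects results
        for i, task in enumerate(tasks):
            ex = self.executors[i % len(self.executors)]
            self.results.append(ex.execute(task))

class Executor:
    def __init__(self, nm, driver):
        self.nm = nm
        driver.register(self)        # reverse-register with the Driver
    def execute(self, task):
        return task()

class ApplicationMaster:
    def request_containers(self, rm, n):     # step 4: ask the RM for containers
        return rm.allocate(n)
    def start_executors(self, nms, driver):  # step 6: command NMs to start Executors
        return [Executor(nm, driver) for nm in nms]

class ResourceManager:
    def __init__(self, node_managers):
        self.node_managers = node_managers
    def launch_am(self):                     # step 3: pick an NM to host the AM
        return ApplicationMaster()
    def allocate(self, n):                   # step 5: return a batch of NMs
        return self.node_managers[:n]

# steps 1-2: the client starts a Driver and asks the RM to launch an AM
rm = ResourceManager(["nm1", "nm2", "nm3"])
driver = Driver()
am = rm.launch_am()
nms = am.request_containers(rm, 2)
am.start_executors(nms, driver)
driver.run([lambda: 1 + 1, lambda: 2 * 2])
print(driver.results)  # [2, 4]
```

The key point the toy mirrors: resource acquisition flows client → RM → AM → NM, while tasks and results flow directly between the Driver and the Executors.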
Summary
1. Yarn-client mode is also suitable for testing, because the Driver runs locally. The Driver communicates heavily with the Executors in the YARN cluster, which increases traffic on the client's network card.
2. The role of the ApplicationMaster:
Requests resources for the current Application.
Sends messages to the NodeManagers to start Executors.
Note: the ApplicationMaster can launch Executors and request resources, but it does not do job scheduling.
2. Submitting tasks in yarn-cluster mode
Submit command
/opt/spark/bin/spark-submit --master yarn --deploy-mode cluster --class org.apache.spark.examples.SparkPi /opt/spark/examples/jars/spark-examples_2.11-2.2.0.jar 100

/opt/spark/bin/spark-submit --master yarn-cluster --class org.apache.spark.examples.SparkPi /opt/spark/examples/jars/spark-examples_2.11-2.2.0.jar 100
Because the Driver runs inside the cluster in this mode, the result appears in YARN's logs rather than on the client console; it can be retrieved with yarn logs -applicationId <application id>.