2025-02-24 Update From: SLTechnology News&Howtos
This article introduces how to submit a Spark application from Java code using the SparkLauncher API. Many people run into this problem in practice, so let's walk through two ways to handle it. I hope you read it carefully and get something out of it!
The first way
First, open a file named MyLauncher.java with vim. The code is as follows:
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

import java.util.HashMap;

public class MyLauncher {
    public static void main(String[] args) throws Exception {
        HashMap<String, String> map = new HashMap<>();
        map.put("HADOOP_CONF_DIR", "/home/hadoop/conf");
        map.put("YARN_CONF_DIR", "/home/hadoop/conf");
        map.put("SPARK_CONF_DIR", "/home/hadoop/spark/conf");
        SparkAppHandle handle = new SparkLauncher(map)
                .setAppResource("/data/newStreaming/uesc-analyzer.jar")
                .setMainClass("ucloud.UESBash.testSchema")
                .setMaster("yarn-cluster")
                .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
                .setVerbose(true)
                .startApplication();
        // Use the handle API to monitor/control the application.
        // Keep the JVM alive while the application is being submitted.
        Thread.sleep(100000);
    }
}
Next, compile
javac -cp /home/hadoop/spark/lib/spark-assembly-1.6.0-hadoop2.6.0-cdh6.4.9.jar MyLauncher.java
Then submit it for execution.
java -cp /home/hadoop/spark/lib/spark-assembly-1.6.0-hadoop2.6.0-cdh6.4.9.jar:. MyLauncher
You can then see the application running in the YARN web UI.
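Compiling against the whole assembly jar works, but if you use a build tool you only need the launcher module, which Spark publishes as its own artifact. A sketch of the Maven dependency, assuming Spark 1.6.0 built for Scala 2.10 (adjust the version and Scala suffix to match your cluster):

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-launcher_2.10</artifactId>
  <version>1.6.0</version>
</dependency>
```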
Note: you may wonder why there is a pause at the end. Normally we want to stay alive and monitor the Spark application rather than exit immediately; startApplication() returns right away, so if the JVM exits at once, the launcher can be killed before the submission finishes and the application may never actually be submitted.
There is also a second way, but it is not recommended: the approach above gives much better visibility into the state of our Spark program.
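Rather than sleeping for a fixed time, the launcher API lets you block until the application reaches a terminal state: SparkAppHandle supports handle.addListener(...), the listener's stateChanged callback fires on every transition, and SparkAppHandle.State has an isFinal() check. The sketch below imitates that pattern with plain JDK classes so it runs without a Spark installation; the StateListener interface and the state strings are stand-ins for the real API, not part of it.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

public class WaitForFinalState {
    // Hypothetical stand-in for SparkAppHandle.Listener's stateChanged callback.
    interface StateListener {
        void stateChanged(String newState);
    }

    // Mirrors the idea of SparkAppHandle.State.isFinal().
    static boolean isTerminal(String state) {
        return state.equals("FINISHED") || state.equals("FAILED") || state.equals("KILLED");
    }

    public static void main(String[] args) throws Exception {
        CountDownLatch done = new CountDownLatch(1);

        // Release the main thread once a terminal state is seen.
        StateListener listener = state -> {
            System.out.println("state: " + state);
            if (isTerminal(state)) {
                done.countDown();
            }
        };

        // Simulate the launcher firing state transitions from another thread.
        new Thread(() -> {
            listener.stateChanged("SUBMITTED");
            listener.stateChanged("RUNNING");
            listener.stateChanged("FINISHED");
        }).start();

        // Block instead of Thread.sleep(100000): wake up as soon as the app ends.
        boolean finished = done.await(10, TimeUnit.SECONDS);
        System.out.println("terminal state reached: " + finished);
    }
}
```

With the real API you would call done.countDown() inside stateChanged when handle.getState().isFinal() returns true, and the launcher JVM exits as soon as the application does.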
The second way
import org.apache.spark.launcher.SparkLauncher;

import java.util.HashMap;

public class MyLauncher {
    public static void main(String[] args) throws Exception {
        HashMap<String, String> map = new HashMap<>();
        map.put("HADOOP_CONF_DIR", "/home/hadoop/conf");
        map.put("YARN_CONF_DIR", "/home/hadoop/conf");
        map.put("SPARK_CONF_DIR", "/home/hadoop/spark/conf");
        Process spark = new SparkLauncher(map)
                .setAppResource("/data/newStreaming/uesc-analyzer.jar")
                .setMainClass("ucloud.UESBash.testSchema")
                .setMaster("yarn-cluster")
                .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
                .setVerbose(true)
                .launch();
        // launch() returns a plain java.lang.Process; block until the child exits.
        spark.waitFor();
    }
}
The second way starts a child process (a spark-submit invocation) that performs the submission. You only get a plain Process handle, so you cannot observe the application's state the way the first approach allows.
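With launch() you are responsible for draining the child's output (otherwise its buffers can fill and stall it) and waiting for it to exit. A minimal sketch of that pattern using a plain ProcessBuilder as a stand-in for the launcher's child process; it assumes a POSIX sh is available:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class ChildProcessDemo {
    // Launch a command, stream its combined stdout/stderr, and return its exit code.
    static int runAndPrint(String... cmd) throws Exception {
        Process p = new ProcessBuilder(cmd).redirectErrorStream(true).start();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println("[child] " + line);
            }
        }
        // Block until the child exits, like spark.waitFor() above.
        return p.waitFor();
    }

    public static void main(String[] args) throws Exception {
        int code = runAndPrint("sh", "-c", "echo submitted");
        System.out.println("exit code: " + code);
    }
}
```

The same draining-then-waitFor() loop applies to the Process returned by SparkLauncher.launch(); a nonzero exit code there means spark-submit itself failed.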
That's all for "how to submit a Spark application with Java". Thank you for reading.