
Spark History Server Configuration and Deployment

Introduction

The Spark WebUI lets you inspect the details of a running application, but once the application has finished, the console page can no longer show its monitoring information, so the execution details of a completed program cannot be reviewed. Enabling the spark.history service solves this problem: the Spark History Server reconstructs the WebUI of finished applications from their event logs.

Configuration file location:

The settings go in the spark-defaults.conf file in the $SPARK_HOME/conf directory. spark-defaults.conf does not exist by default; it can be created from the template file that Spark provides.
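
As a minimal sketch, assuming a standard Spark distribution where the bundled template is named spark-defaults.conf.template, the file can be created like this:

    # Create an editable copy of the bundled template
    cd $SPARK_HOME/conf
    cp spark-defaults.conf.template spark-defaults.conf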

The configuration parameters are described below:

    spark.master                         # Addresses of the Master nodes of the Spark cluster, e.g. spark://172.20.101.157:7070,172.20.101.164:7070,172.20.101.165:7070
    spark.history.updateInterval         # Default: 10. Interval, in seconds, at which the log information is refreshed.
    spark.history.retainedApplications   # Default: 50. Number of Application history records kept in memory. When this value is exceeded, the oldest application information is removed, and its page has to be rebuilt when that application is accessed again.
    spark.history.ui.port                # Default: 18080. Web UI port of the HistoryServer.
    spark.history.ui.acls.enable         # Default: false. Whether to check ACLs when a user views application information. If enabled, only the application owner and the users specified by spark.ui.view.acls may view the application information; otherwise no check is performed.
    spark.eventLog.enabled               # Default: false. Whether to log Spark events, which are used to reconstruct the WebUI after the application has finished.
    spark.eventLog.dir                   # Default: file:///tmp/spark-events. Path where the event log information is saved; it can be an HDFS path starting with hdfs:// or a local path starting with file://, and the directory must be created in advance.
    spark.eventLog.compress              # Default: false. Whether to compress the recorded Spark events (only takes effect when spark.eventLog.enabled is true); the default codec is snappy.

The following is an example configuration that stores the event logs on the local filesystem:

spark-defaults.conf:

    spark.master                          spark://172.20.101.157:7070,172.20.101.164:7070,172.20.101.165:7070
    spark.history.ui.port                 18080
    spark.history.retainedApplications    10
    spark.eventLog.compress               true
    spark.eventLog.enabled                true
    spark.eventLog.dir                    file:/data/sparkhistory
    spark.history.fs.logDirectory         file:/data/sparkhistory

Start the service:

    $SPARK_HOME/sbin/start-history-server.sh

Then access it in a browser: http://nodeIP:18080
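
As a rough end-to-end sketch of the steps above (the SparkPi example class and the examples jar path are assumptions about a standard Spark distribution; /data/sparkhistory is the illustrative path from the configuration):

    # The event log directory must exist before applications write to it
    mkdir -p /data/sparkhistory

    # Start the History Server; it serves the logs under spark.history.fs.logDirectory
    $SPARK_HOME/sbin/start-history-server.sh

    # Run any application with event logging enabled so something appears in the UI
    # (spark.master and spark.eventLog.* are picked up from spark-defaults.conf)
    $SPARK_HOME/bin/spark-submit \
      --class org.apache.spark.examples.SparkPi \
      $SPARK_HOME/examples/jars/spark-examples_*.jar 100

    # After the application exits, its details remain visible at http://nodeIP:18080

Note that because a local file: path is used here, the directory has to exist both on the node where the driver writes its event log and on the node running the History Server; an hdfs:// path avoids that restriction in a multi-node cluster.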
