
What is the log file generated by submitting jobs to the cluster on the Spark platform?


Today I will walk you through the log output generated when submitting jobs to a cluster on the Spark platform. Many people may not be familiar with these logs, so the following annotated walkthrough of a worker's startup and registration against a standalone master is summarized for you. I hope you get something out of this article.

Created by Wang, Jerry, last modified on Aug 16, 2015

NKGV50849583FV1:~/devExpert/spark-1.4.1/bin # ./spark-class org.apache.spark.deploy.worker.Worker spark://NKGV50849583FV1:7077
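The command above invokes spark-class by hand. As a sketch (assuming the same Spark 1.4.1 layout, with the master started first), the helper scripts shipped in sbin/ wrap the same launch:

```shell
# Sketch: launching the standalone master and worker via the sbin/
# helper scripts instead of calling spark-class directly; daemon logs
# land in $SPARK_HOME/logs rather than the terminal.
cd /root/devExpert/spark-1.4.1
./sbin/start-master.sh                              # runs org.apache.spark.deploy.master.Master
./sbin/start-slave.sh spark://NKGV50849583FV1:7077  # runs a Worker pointed at that master
```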

Added by Jerry: loading load-spark-env.sh! 1

Added by Jerry: …

/root/devExpert/spark-1.4.1/conf

Added by Jerry, number of Jars: 1

Added by Jerry, launch_classpath: /root/devExpert/spark-1.4.1/assembly/target/scala-2.10/spark-assembly-1.4.1-hadoop2.4.0.jar

Added by Jerry, RUNNER: /usr/jdk1.7.0_79/bin/java

Added by Jerry, printf argument list: org.apache.spark.deploy.worker.Worker spark://NKGV50849583FV1:7077

Added by Jerry, I am in if-else branch: /usr/jdk1.7.0_79/bin/java -cp /root/devExpert/spark-1.4.1/conf/:/root/devExpert/spark-1.4.1/assembly/target/scala-2.10/spark-assembly-1.4.1-hadoop2.4.0.jar:/root/devExpert/spark-1.4.1/lib_managed/jars/datanucleus-rdbms-3.2.9.jar:/root/devExpert/spark-1.4.1/lib_managed/jars/datanucleus-core-3.2.10.jar:/root/devExpert/spark-1.4.1/lib_managed/jars/datanucleus-api-jdo-3.2.6.jar -Xms512m -Xmx512m -XX:MaxPermSize=256m org.apache.spark.deploy.worker.Worker spark://NKGV50849583FV1:7077
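The long java invocation above is assembled by bin/spark-class: the conf directory, the assembly jar, and the datanucleus jars from lib_managed are joined into one -cp argument. As an illustration (not Spark's actual script), the classpath portion can be rebuilt from the paths seen in this log:

```shell
# Sketch of how the -cp string from the log is assembled; this only
# builds the string and does not require a Spark install.
SPARK_HOME=/root/devExpert/spark-1.4.1
ASSEMBLY_JAR=$SPARK_HOME/assembly/target/scala-2.10/spark-assembly-1.4.1-hadoop2.4.0.jar
CLASSPATH="$SPARK_HOME/conf/:$ASSEMBLY_JAR"
# Datanucleus jars are appended separately because they cannot live in
# the assembly jar (they rely on plugin metadata in their own jars).
for jar in datanucleus-rdbms-3.2.9 datanucleus-core-3.2.10 datanucleus-api-jdo-3.2.6; do
  CLASSPATH="$CLASSPATH:$SPARK_HOME/lib_managed/jars/$jar.jar"
done
echo "$CLASSPATH"
```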

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

15/08/16 12:55:28 INFO Worker: Registered signal handlers for [TERM, HUP, INT]

15/08/16 12:55:28 WARN Utils: Your hostname, NKGV50849583FV1 resolves to a loopback address: 127.0.0.1; using 10.128.184.131 instead (on interface eth0)

15/08/16 12:55:28 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
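The warning above appears because the hostname resolves to 127.0.0.1, so Spark falls back to the eth0 address. It can be silenced by binding Spark to an explicit address; a minimal sketch of conf/spark-env.sh, assuming the addresses from this log:

```shell
# conf/spark-env.sh (sketch; substitute your own host's address)
export SPARK_LOCAL_IP=10.128.184.131    # bind worker services to this address
export SPARK_MASTER_IP=NKGV50849583FV1  # hostname the standalone master binds to
```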

15/08/16 12:55:29 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

15/08/16 12:55:29 INFO SecurityManager: Changing view acls to: root

15/08/16 12:55:29 INFO SecurityManager: Changing modify acls to: root

15/08/16 12:55:29 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)

15/08/16 12:55:30 INFO Slf4jLogger: Slf4jLogger started

15/08/16 12:55:30 INFO Remoting: Starting remoting

15/08/16 12:55:30 INFO Remoting: Remoting started; listening on addresses: [akka.tcp://sparkWorker@10.128.184.131:42568]

15/08/16 12:55:30 INFO Utils: Successfully started service 'sparkWorker' on port 42568.

15/08/16 12:55:30 INFO Worker: Starting Spark worker 10.128.184.131:42568 with 8 cores, 30.4 GB RAM

15/08/16 12:55:30 INFO Worker: Running Spark version 1.4.1

15/08/16 12:55:30 INFO Worker: Spark home: /root/devExpert/spark-1.4.1

15/08/16 12:55:30 INFO Utils: Successfully started service 'WorkerUI' on port 8081.

15/08/16 12:55:30 INFO WorkerWebUI: Started WorkerWebUI at http://10.128.184.131:8081

15/08/16 12:55:30 INFO Worker: Connecting to master akka.tcp://sparkMaster@NKGV50849583FV1:7077/user/Master...

15/08/16 12:55:30 INFO Worker: Successfully registered with master spark://NKGV50849583FV1:7077

If I quit the worker session, the master stops receiving heartbeats from it; I can observe this in the master's log:

15/08/16 12:55:30 INFO Master: Registering worker 10.128.184.131:42568 with 8 cores, 30.4 GB RAM

15/08/16 13:00:19 WARN Master: Removing worker-20150816125530-10.128.184.131-42568 because we got no heartbeat in 60 seconds

15/08/16 13:00:19 INFO Master: Removing worker worker-20150816125530-10.128.184.131-42568 on 10.128.184.131:42568
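The 60-second window in the warning above is the standalone master's default heartbeat timeout, controlled by the spark.worker.timeout property. For the master daemon it can be passed via SPARK_MASTER_OPTS in spark-env.sh; a sketch, assuming you want a more tolerant 120-second window:

```shell
# conf/spark-env.sh (sketch): raise the master's worker heartbeat
# timeout from the default 60 s to 120 s before silent workers are dropped.
export SPARK_MASTER_OPTS="-Dspark.worker.timeout=120"
```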

After reading the above, do you have a better understanding of the log files generated when submitting jobs to a cluster on the Spark platform? Thank you for your support.
