Spark error Resolution-- Error initializing SparkContext

2025-04-04 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/03 Report--

Spark reported the following error while submitting a job:

./spark-shell
19-05-14 05:37:40 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
19-05-14 05:37:49 ERROR spark.SparkContext: Error initializing SparkContext.
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error
    at org.apache.hadoop.hdfs.server.namenode.ha.StandbyState.checkOperation(StandbyState.java:88)
    at org.apache.hadoop.hdfs.server.namenode.NameNode$NameNodeHAContext.checkOperation(NameNode.java:1826)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkOperation(FSNamesystem.java:1404)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4208)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:895)
    at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:527)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:824)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2086)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2082)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2080)

Cause Analysis

Today I enabled Spark's history server. It worked fine during testing, but later I found that Spark jobs could no longer start or be submitted.

Analyzing the log and checking the HDFS web UI showed that Spark could not connect to HDFS's active NameNode. The only HDFS access Spark needs at startup is writing the job event log, so I checked spark-defaults.conf, where the Spark job log path is configured, and sure enough the configured path pointed at the standby NameNode:

spark.eventLog.dir hdfs://hadoop002:8020/g6_direcory

So Spark was trying to write its event logs to HDFS through the standby NameNode, which rejects read/write operations.
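Before changing any config, you can confirm which NameNode is actually active. A quick sketch of the check; the HA service IDs nn1/nn2 are assumptions (they are whatever hdfs-site.xml defines for your cluster, listable with `hdfs getconf -confKey dfs.ha.namenodes.ruozeclusterg6`):

```shell
# Hypothetical service IDs nn1/nn2 -- substitute the ones from hdfs-site.xml.
# Each command prints "active" or "standby" for that NameNode.
hdfs haadmin -getServiceState nn1
hdfs haadmin -getServiceState nn2
```

If the host hard-coded in spark.eventLog.dir reports "standby", that matches the StandbyException in the log above.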

Solution

Change the log directory path in spark-defaults.conf and spark-env.sh from a single NameNode address to the HA namespace (nameservice) path.
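The edit itself is a one-line substitution. A minimal sketch, demonstrated on a throwaway copy of the file, using the host, nameservice, and path from this article:

```shell
# Sketch: switch spark.eventLog.dir from a fixed NameNode host (hadoop002) to
# the logical nameservice (ruozeclusterg6). Run on a throwaway copy here; on a
# real cluster, edit $SPARK_HOME/conf/spark-defaults.conf in place.
conf=$(mktemp)
printf 'spark.eventLog.enabled true\nspark.eventLog.dir hdfs://hadoop002:8020/g6_direcory\n' > "$conf"

# One substitution covers every hdfs:// URI that names the single NameNode.
sed -i 's#hdfs://hadoop002:8020#hdfs://ruozeclusterg6:8020#g' "$conf"

grep eventLog.dir "$conf"   # -> spark.eventLog.dir hdfs://ruozeclusterg6:8020/g6_direcory
```

The same substitution applies to the spark.history.fs.logDirectory value in spark-env.sh.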

My namespace is

fs.defaultFS hdfs://ruozeclusterg6

Modify spark-defaults.conf:

spark.eventLog.enabled true
spark.eventLog.dir hdfs://ruozeclusterg6:8020/g6_direcory

Modify spark-env.sh:

SPARK_HISTORY_OPTS="-Dspark.history.fs.logDirectory=hdfs://ruozeclusterg6:8020/g6_direcory"

Test:

[hadoop@hadoop002 spark]$ spark-shell
19-05-14 06:00:04 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://hadoop002:4040
Spark context available as 'sc' (master = local[*], app id = local-1557828013138).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.2
      /_/

Using Scala version 2.11.12 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_131)
Type in expressions to have them evaluated.
Type :help for more information.

scala>

spark-shell now starts successfully: solved!
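The logical URI hdfs://ruozeclusterg6 only resolves on the client side if HDFS HA is configured in hdfs-site.xml. For reference, a minimal sketch of the standard Hadoop HA client properties; the service IDs nn1/nn2 and the host hadoop001 are assumptions (only hadoop002 appears in this article), so substitute your cluster's actual values:

```xml
<!-- Sketch of the client-side HA settings that make hdfs://ruozeclusterg6
     resolvable. Service IDs nn1/nn2 and host hadoop001 are hypothetical. -->
<property>
  <name>dfs.nameservices</name>
  <value>ruozeclusterg6</value>
</property>
<property>
  <name>dfs.ha.namenodes.ruozeclusterg6</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.ruozeclusterg6.nn1</name>
  <value>hadoop001:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.ruozeclusterg6.nn2</name>
  <value>hadoop002:8020</value>
</property>
<property>
  <name>dfs.client.failover.proxy.provider.ruozeclusterg6</name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
```

With the failover proxy provider in place, the client follows whichever NameNode is currently active, so a failover no longer breaks Spark's event logging.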



© 2024 shulou.com SLNews company. All rights reserved.
