Running Spark in IDEA and saving to Hive: when no Hive information is configured, Spark defaults to its internal Hive.


This article describes an issue with running Spark in IDEA and saving to Hive: because no Hive information is configured, Spark defaults to running its internal Hive. The content is quite detailed; interested readers can use it for reference, and I hope it will be helpful to you.

Reason: Spark is run in IDEA and the result is saved to Hive, but since no Hive information is configured, Spark falls back to its internal (embedded) Hive by default.
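For context, this fallback happens when the SparkSession is created with Hive support but no external metastore is configured (no hive-site.xml on the classpath and no metastore URI set), so Spark starts an embedded metastore of its own. Below is a minimal sketch, not the code from the referenced posts, of building a session that points at a real Hive metastore; the object name, metastore URI, and warehouse path are placeholders to adapt to your environment:

import org.apache.spark.sql.SparkSession

// Hypothetical example object, for illustration only.
object HiveWriteExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("HiveWriteExample")
      .master("local[*]") // running inside IDEA
      // Point at the real metastore instead of letting Spark create an embedded one.
      // Both values are assumptions; take them from your cluster's hive-site.xml.
      .config("hive.metastore.uris", "thrift://localhost:9083")
      .config("spark.sql.warehouse.dir", "/user/hive/warehouse")
      .enableHiveSupport()
      .getOrCreate()

    import spark.implicits._
    // A tiny write to confirm the table lands in the external Hive metastore.
    Seq((1, "a"), (2, "b")).toDF("id", "value")
      .write.mode("overwrite").saveAsTable("default.demo_table")

    spark.stop()
  }
}

Copying the cluster's hive-site.xml (and, if needed, core-site.xml and hdfs-site.xml) into the project's resources directory is an equivalent way to supply the same settings.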

Reference addresses (I tried them and they solved the problem effectively):

https://blog.csdn.net/zgjdzwhy/article/details/71056801

http://mangocool.com/1473838702533.html

Running result:

Exception in thread "main" java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
    at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
    at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
    at MockDataGenerate$.main(MockDataGenerate.scala:167)
    at MockDataGenerate.main(MockDataGenerate.scala)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
    ... 12 more
Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
    at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:169)
    at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
    at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
    at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
    at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
    at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
    at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
    ... 17 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
    ... 25 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:358)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:262)
    at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:66)
    ... 30 more
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
    at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:188)
    ... 38 more
Caused by: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
    at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:612)
    at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
    ... 39 more
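The last two "Caused by" entries are the actual trigger: Hive's SessionState refuses to start because the scratch directory /tmp/hive is not writable for it (rw-rw-rw-, i.e. no execute bit). A common fix is to widen the permissions from the command line, e.g. winutils.exe chmod 777 \tmp\hive on Windows or hadoop fs -chmod -R 777 /tmp/hive when the directory lives on HDFS. As a rough alternative sketch (my own suggestion, not necessarily the referenced posts' exact fix), the same change can be made programmatically through the Hadoop FileSystem API, assuming the scratch directory lives on the default filesystem of your configuration:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.permission.{FsAction, FsPermission}
import org.apache.hadoop.fs.{FileSystem, Path}

// Hypothetical one-off helper, not part of the original project.
object FixScratchDirPermissions {
  def main(args: Array[String]): Unit = {
    // Resolve the filesystem the scratch dir lives on: HDFS if core-site.xml is
    // on the classpath, otherwise the local filesystem.
    val fs = FileSystem.get(new Configuration())
    val scratchDir = new Path("/tmp/hive")
    if (!fs.exists(scratchDir)) fs.mkdirs(scratchDir)
    // Grant rwx to user, group and others (chmod 777), which satisfies the
    // writability check in SessionState.createRootHDFSDir.
    fs.setPermission(scratchDir, new FsPermission(FsAction.ALL, FsAction.ALL, FsAction.ALL))
    println(s"$scratchDir permissions: " + fs.getFileStatus(scratchDir).getPermission)
  }
}

After the permissions are corrected, re-running the job should get past the HiveSessionState instantiation error.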

That's all for the issue of running Spark in IDEA and saving to Hive, where Spark defaults to its internal Hive because no Hive information is configured. I hope the above content is of some help to you and that you can learn more from it. If you found the article useful, feel free to share it so more people can see it.
