The exception information is as follows:
java.lang.RuntimeException: MetaException (message:java.lang.ClassNotFoundException Class org.apache.hadoop.hive.contrib.serde2.RegexSerDe not found)
at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore (Table.java:290)
at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer (Table.java:281)
at org.apache.hadoop.hive.ql.metadata.Table.getCols (Table.java:631)
at org.apache.hadoop.hive.ql.metadata.Table.checkValidity (Table.java:189)
at org.apache.hadoop.hive.ql.metadata.Hive.getTable (Hive.java:1017)
at org.apache.hadoop.hive.ql.metadata.Hive.getTable (Hive.java:950)
at org.apache.spark.sql.hive.HiveMetastoreCatalog.lookupRelation (HiveMetastoreCatalog.scala:201)
at org.apache.spark.sql.hive.HiveContext$$anon$2.org$apache$spark$sql$catalyst$analysis$OverrideCatalog$$super$lookupRelation (HiveContext.scala:262)
at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$$anonfun$lookupRelation$3.apply (Catalog.scala:161)
at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$$anonfun$lookupRelation$3.apply (Catalog.scala:161)
at scala.Option.getOrElse (Option.scala:120)
at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$class.lookupRelation (Catalog.scala:161)
at org.apache.spark.sql.hive.HiveContext$$anon$2.lookupRelation (HiveContext.scala:262)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.getTable (Analyzer.scala:174)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$6.applyOrElse (Analyzer.scala:186)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$6.applyOrElse (Analyzer.scala:181)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply (TreeNode.scala:188)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply (TreeNode.scala:188)
...
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain (SparkSubmit.scala:569)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1 (SparkSubmit.scala:166)
at org.apache.spark.deploy.SparkSubmit$.submit (SparkSubmit.scala:189)
at org.apache.spark.deploy.SparkSubmit$.main (SparkSubmit.scala:110)
at org.apache.spark.deploy.SparkSubmit.main (SparkSubmit.scala)
Background:
CREATE TABLE apachelog (
  host STRING,
  identity STRING,
  user STRING,
  time STRING,
  request STRING,
  status STRING,
  size STRING,
  referer STRING,
  agent STRING)
ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.RegexSerDe'
WITH SERDEPROPERTIES (
  "input.regex" = "([^ ]*) ([^ ]*) ([^ ]*) (-|\[[^\]]*\]) ([^ \"]*|\"[^\"]*\") (-|[0-9]*) (-|[0-9]*)(?: ([^ \"]*|\".*\") ([^ \"]*|\".*\"))?"
)
STORED AS TEXTFILE;
The table was created with 'org.apache.hadoop.hive.contrib.serde2.RegexSerDe' instead of 'org.apache.hadoop.hive.serde2.RegexSerDe'. Because of this, every access to the table from spark-sql or spark-shell reports the exception above: the contrib class is never found, and importing the relevant jar package alone does not solve it.
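Since the root cause is the serde choice, one way to remove the dependency entirely is to repoint the table at the built-in serde. This is a sketch, not part of the original fix; it assumes your Hive build ships org.apache.hadoop.hive.serde2.RegexSerDe and that restating the regex is acceptable:

-- Switch to the built-in RegexSerDe and restate the regex,
-- so the table no longer depends on the contrib jar at all.
ALTER TABLE apachelog SET SERDE 'org.apache.hadoop.hive.serde2.RegexSerDe'
WITH SERDEPROPERTIES (
  "input.regex" = "([^ ]*) ([^ ]*) ([^ ]*) (-|\[[^\]]*\]) ([^ \"]*|\"[^\"]*\") (-|[0-9]*) (-|[0-9]*)(?: ([^ \"]*|\".*\") ([^ \"]*|\".*\"))?"
);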
Solution:
When starting spark-shell or spark-sql, pass the corresponding jar package with --jars xxxxx.jar, as sketched below. (Note: under normal circumstances everyone thinks of importing the corresponding jar. The problem I encountered is that if the path given for the jar is a soft link, the exception above is still reported and the class still cannot be found, so the real path of the jar package must be passed. This may be because Spark has a bug in its handling of soft-linked paths; I am not certain.)
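For concreteness, a minimal launch sketch. The jar name and location are assumptions (they vary by Hive distribution); readlink -f resolves a possible soft link to the real path before it is handed to --jars:

# Assumed jar location; adjust for your installation.
JAR_LINK=/usr/lib/hive/lib/hive-contrib.jar
# Resolve any soft link to the real file, since a symlinked path
# still reproduced the ClassNotFoundException for me.
REAL_JAR=$(readlink -f "$JAR_LINK")
# Sanity check: the contrib RegexSerDe class should be inside the jar.
jar tf "$REAL_JAR" | grep RegexSerDe
# Start spark-sql (or spark-shell) with the real path.
spark-sql --jars "$REAL_JAR"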