2025-04-03 Update From: SLTechnology News&Howtos
Shulou (Shulou.com) 06/01 Report
This article describes how to deal with a problem that arises when using Spark 2.0 together with Hive 0.13.1.
Goal: use Spark 2.0 with Hive 0.13.1 and save data to Hive.
Error: Invalid method name: 'alter_table_with_cascade'
Solution 1:
Leave the hive.metastore.uris configuration item in hive-site.xml empty and use the JDO-related configuration instead. This changes how Spark reads Hive metadata: it queries the metastore database directly rather than going through the metastore Thrift service. This works, but the customer would not grant direct access to the database.
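A minimal sketch of what this hive-site.xml change looks like. The JDBC URL, driver, user, and password below are placeholder assumptions for a MySQL-backed metastore; substitute your own values:

```xml
<!-- hive.metastore.uris is left empty so the Thrift metastore service is bypassed -->
<property>
  <name>hive.metastore.uris</name>
  <value></value>
</property>
<!-- JDO settings let Spark read metadata straight from the metastore database.
     Host, database name, user, and password are placeholders. -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://metastore-host:3306/hive</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>your-password</value>
</property>
```

Note that this requires the metastore database to be reachable from every Spark node, which is exactly the access the customer refused to grant.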
Solution 2:
Add the spark.sql.hive.metastore.jars and spark.sql.hive.metastore.version entries to spark-defaults.conf so that Spark uses the Hive 0.13.1 client classes when talking to the metastore.
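A sketch of the spark-defaults.conf entries, assuming Hive 0.13.1 is installed under /opt/hive-0.13.1 (the paths are placeholders; point the jar path at your own Hive and Hadoop client jars):

```properties
# Tell Spark which Hive metastore client version to use
spark.sql.hive.metastore.version  0.13.1
# Classpath containing the Hive 0.13.1 jars and their Hadoop dependencies
spark.sql.hive.metastore.jars     /opt/hive-0.13.1/lib/*:/opt/hadoop/share/hadoop/common/*
```

With these set, Spark loads a version-matched metastore client via an isolated classloader instead of its built-in Hive client.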
After this configuration, the Invalid method name: 'alter_table_with_cascade' exception disappears, but a new error takes its place: NoSuchMethodException: org.apache.hadoop.hive.ql.metadata.Hive.loadTable(org.apache.hadoop.fs.Path, java.lang.String, boolean, boolean).
Looking at Hive.java, the loadTable method in Hive 0.13.1 actually takes five parameters; there is no four-parameter loadTable at all. The fix is to modify the Shim_v0_13 class in Spark's HiveShim.scala, using Shim_v0_14 as a reference, and add loadTable and loadPartition methods that match the five-parameter signature. Then recompile Spark and run again.
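The failure mode can be reproduced in miniature with plain Java reflection, which is essentially what Spark's HiveShim does when it looks up loadTable by signature. The FakeHive class below is a toy stand-in, not the real org.apache.hadoop.hive.ql.metadata.Hive, and its parameter types are simplified:

```java
// Toy stand-in for Hive 0.13.1's Hive class: loadTable takes five
// parameters, while the mismatched Spark shim looked for a
// four-parameter overload.
class FakeHive {
    // Five-parameter loadTable, mirroring Hive 0.13.1's Hive.java.
    public void loadTable(String path, String table, boolean replace,
                          boolean holdDDLTime, boolean inheritTableSpecs) {
    }
}

public class ShimLookupDemo {
    public static void main(String[] args) {
        // Looking up a four-parameter loadTable fails: this is the
        // NoSuchMethodException the mismatched shim produced.
        boolean fourParamFound = true;
        try {
            FakeHive.class.getMethod("loadTable", String.class, String.class,
                    boolean.class, boolean.class);
        } catch (NoSuchMethodException e) {
            fourParamFound = false;
        }
        System.out.println("4-param loadTable found: " + fourParamFound);

        // Looking up the five-parameter overload succeeds, which is why
        // adding a matching loadTable to Shim_v0_13 fixes the error.
        boolean fiveParamFound = true;
        try {
            FakeHive.class.getMethod("loadTable", String.class, String.class,
                    boolean.class, boolean.class, boolean.class);
        } catch (NoSuchMethodException e) {
            fiveParamFound = false;
        }
        System.out.println("5-param loadTable found: " + fiveParamFound);
        // prints:
        // 4-param loadTable found: false
        // 5-param loadTable found: true
    }
}
```

This is why patching Shim_v0_13 to declare methods with the signatures Hive 0.13.1 actually exposes makes the reflective lookup succeed.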
Surprisingly, that fixed it. Either Spark still has quite a few bugs in this area, or I am using it wrong.
One odd observation from troubleshooting: in yarn-client mode, the loadTable problem does not occur at all.
That is how to handle the problem of using Spark 2.0 with Hive 0.13.1.