2025-02-27 Update From: SLTechnology News & Howtos
Shulou (Shulou.com) 06/02 Report
This article explains how to resolve the tblproperties exception that Spark SQL raises when reading a Hive table. The failure comes up often in practice, so the sections below walk through the symptom, the root cause, and the fix.
1 Problem description
Cluster environment
Spark SQL reports an error when reading a Hive table stored in Parquet format.
The same Parquet table is read normally by Hive and Impala, but spark-sql fails with the exception below.
Exception information
com.fasterxml.jackson.core.JsonParseException: Unexpected end-of-input within/between Object entries
 at [Source: (String)"{"type":"struct","fields":[{"name":"timestamp","type":"string","nullable":true,"metadata":{"HIVE_TYPE_STRING":"string"}},{"name":"xxx","type":"string","nullable":true,"metadata":{"HIVE_TYPE_STRING":"string"}},{"name":"xxx","type":"string","nullable":true,"; line: 1]
    at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1804)
    at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._skipAfterComma2(ReaderBasedJsonParser.java:2323)
    at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._skipComma(ReaderBasedJsonParser.java:2293)
    at com.fasterxml.jackson.core.json.ReaderBasedJsonParser.nextToken(ReaderBasedJsonParser.java:664)
    at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:47)
    at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:39)
    at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:32)
    at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:46)
    at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:39)
    at com.fasterxml.jackson.databind.ObjectReader._bindAndClose(ObjectReader.java:1611)
    at com.fasterxml.jackson.databind.ObjectReader.readValue(ObjectReader.java:1219)
    at org.json4s.jackson.JsonMethods$class.parse(JsonMethods.scala:25)
    at org.json4s.jackson.JsonMethods$.parse(JsonMethods.scala:55)
    at org.apache.spark.sql.types.DataType$.fromJson(DataType.scala:127)
    at org.apache.spark.sql.hive.HiveExternalCatalog$.org$apache$spark$sql$hive$HiveExternalCatalog$$getSchemaFromTableProperties(HiveExternalCatalog.scala:1382)
    at org.apache.spark.sql.hive.HiveExternalCatalog.restoreDataSourceTable(HiveExternalCatalog.scala:845)
    at org.apache.spark.sql.hive.HiveExternalCatalog.org$apache$spark$sql$hive$HiveExternalCatalog$$restoreTableMetadata(HiveExternalCatalog.scala:765)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$getTable$1.apply(HiveExternalCatalog.scala:734)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$getTable$1.apply(HiveExternalCatalog.scala:734)
2 Problem cause
The tblproperties of the Hive table are damaged: the schema JSON stored in tblproperties is incomplete and cannot be parsed, so Spark SQL throws an error while restoring the table's metadata. Hive and Impala do not parse tblproperties when reading the table, which is why they work normally.
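The damaged properties can be inspected directly. For a table written by Spark, the schema JSON is stored under property keys such as spark.sql.sources.schema.part.0 (this is the value the stack trace above fails to parse). A quick check, with a placeholder database and table name, run from the hive CLI since Hive does not try to parse the JSON:

```sql
-- With the bug present, the value of spark.sql.sources.schema.part.0
-- stops mid-JSON at roughly 256 characters.
SHOW TBLPROPERTIES mydb.my_parquet_table;
```

If the printed JSON ends abruptly (as in the exception's "(String)" source above), the property value has been truncated.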
3 Problem solution
Since the tblproperties value is incomplete, the likely culprit is the table in which the Hive metastore stores tblproperties, with the parameter value being truncated there. Checking the TABLE_PARAMS table in the metastore database shows that its PARAM_VALUE column has a length of only 256, which confirms the truncation.
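A sketch of that check, assuming a MySQL-backed metastore whose database is named hive (both assumptions; adjust to your installation):

```sql
-- Connect to the metastore database and inspect the column definition.
USE hive;
SHOW COLUMNS FROM TABLE_PARAMS LIKE 'PARAM_VALUE';

-- Look at the stored schema fragments for the affected table; a value
-- whose length is exactly 256 has almost certainly been cut off.
SELECT PARAM_KEY, CHAR_LENGTH(PARAM_VALUE) AS len
FROM TABLE_PARAMS
WHERE PARAM_KEY LIKE 'spark.sql.sources.schema%';
```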
Change the length of PARAM_VALUE to 8000, and the problem is solved.
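A minimal sketch of the fix, again assuming a MySQL metastore (the exact column type is an assumption; newer metastore schema versions already define PARAM_VALUE as a wider type such as MEDIUMTEXT):

```sql
-- Widen the column so long schema JSON is no longer truncated.
ALTER TABLE TABLE_PARAMS MODIFY PARAM_VALUE VARCHAR(8000);
```

Note that widening the column does not restore values that were already truncated; the affected table's properties still need to be regenerated, for example by re-creating the table.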
This concludes "how to solve the tblproperties exception when Spark SQL reads a Hive table". Thank you for reading; hopefully the walkthrough above helps you diagnose similar metastore issues.