This article explains how to combine Spark SQL and HBase. The approach described here is simple, fast, and practical; if you are interested, read on.
Q1: How does Spark support ad hoc queries? Is it Spark SQL, or Hive on Spark?
Before Spark 1.0, the ad hoc query technology was Shark.
The ad hoc query technology supported by Spark 1.0 and Spark 1.0.1 is Spark SQL.
Starting with the not-yet-released Spark 1.1, Spark SQL becomes the core of ad hoc queries, and Hive on Spark is also expected to support ad hoc queries.
Q2: Does Spark 1.0.0 support Hive on Spark now? Does it support CLI interactive access?
Spark 1.0.0 does not support Hive on Spark.
The Hive on Spark project is under development and is expected to be released with Spark 1.1.
Spark 1.0.0 currently does not directly support CLI access.
Q3: How do Spark SQL and HBase combine?
Spark SQL and HBase can be combined by taking advantage of the RDD functionality of Spark Core: read the HBase table into an RDD, then query it with Spark SQL, as sketched below.
When using HBase, you need to add the HBase packages (client jars) to the Spark classpath.
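As an illustration, here is a minimal sketch of that pattern, using the Spark 1.0.x-era SchemaRDD API and assuming a hypothetical HBase table named "users" with a column family "info" and qualifiers "name" and "age"; the table layout and all names are placeholders, not part of the original article.

```scala
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.hadoop.hbase.util.Bytes
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Hypothetical row layout: table "users", column family "info", qualifiers "name" and "age".
case class User(name: String, age: Int)

object HBaseSparkSqlExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("HBaseSparkSqlExample"))

    // Point the HBase client at the table to scan.
    val hbaseConf = HBaseConfiguration.create()
    hbaseConf.set(TableInputFormat.INPUT_TABLE, "users")

    // Read the HBase table as an ordinary Spark RDD of (row key, Result).
    val hbaseRDD = sc.newAPIHadoopRDD(
      hbaseConf,
      classOf[TableInputFormat],
      classOf[ImmutableBytesWritable],
      classOf[Result])

    // Map raw HBase cells into a case class so Spark SQL can infer a schema.
    val users = hbaseRDD.map { case (_, result) =>
      val name = Bytes.toString(result.getValue(Bytes.toBytes("info"), Bytes.toBytes("name")))
      val age  = Bytes.toInt(result.getValue(Bytes.toBytes("info"), Bytes.toBytes("age")))
      User(name, age)
    }

    val sqlContext = new SQLContext(sc)
    import sqlContext.createSchemaRDD   // implicit RDD -> SchemaRDD conversion in Spark 1.0.x

    // Register the RDD as a table and query it with Spark SQL.
    users.registerAsTable("users")
    sqlContext.sql("SELECT name FROM users WHERE age > 30").collect().foreach(println)
  }
}
```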
Q4: Does Spark SQL support SQL? Can existing PL/SQL be converted directly into Spark SQL?
Spark SQL currently supports the basic functionality of SQL-92 and will be continuously enhanced in subsequent versions.
PL/SQL cannot yet be converted directly into Spark SQL.
For stronger SQL support, consider the Hive functionality available in Spark SQL in the Spark 1.0.0 and 1.0.1 versions (see the sketch below).
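For reference, a minimal sketch of the two entry points in Spark 1.0.x: the plain SQLContext for basic SQL, and HiveContext for the richer HiveQL dialect plus access to existing Hive tables. The "src" table queried here is only an assumption (the standard Hive example table), and a Spark build with Hive support is required.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.hive.HiveContext

object ContextChoiceExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("ContextChoiceExample"))

    // Basic SQL support, no Hive dependency.
    val sqlContext = new SQLContext(sc)

    // HiveQL dialect plus access to existing Hive tables and UDFs.
    // Assumes Spark was built with Hive support and that a "src" table exists in Hive.
    val hiveContext = new HiveContext(sc)
    hiveContext.hql("SELECT key, value FROM src LIMIT 10").collect().foreach(println)
  }
}
```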
Q5: If Hive on Spark is supported, when should Spark SQL be used and when should Hive on Spark be used?
Hive on Spark is expected to be released with Spark 1.1; its core purpose is to use all of existing Hive's data and functionality on Spark.
If the work is not related to Hive, Spark SQL can be used.
In theory, as Spark SQL is continuously enhanced in future versions, it could eventually do everything Hive does.
Q6: Can GROUP BY be used with Spark SQL?
Spark SQL supports GROUP BY.
SchemaRDD provides group-by support through the following method (see the example below):
def groupBy(groupingExprs: Expression*)(aggregateExprs: Expression*): SchemaRDD
Performs a grouping followed by an aggregation.
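A minimal sketch of GROUP BY in Spark 1.0.x, assuming a hypothetical Sale case class introduced only for this example; the plain SQL form is shown, and the language-integrated groupBy signature quoted above can be applied to the same SchemaRDD.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Hypothetical record type used only for this example.
case class Sale(product: String, amount: Int)

object GroupByExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("GroupByExample"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.createSchemaRDD   // implicit RDD -> SchemaRDD conversion (Spark 1.0.x)

    val sales = sc.parallelize(Seq(Sale("a", 10), Sale("a", 5), Sale("b", 7)))
    sales.registerAsTable("sales")      // Spark 1.0.x API (later renamed registerTempTable)

    // GROUP BY through plain SQL.
    sqlContext.sql("SELECT product, SUM(amount) FROM sales GROUP BY product")
      .collect()
      .foreach(println)
  }
}
```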
Q7: Where can I find documentation for the SQL operations that Spark SQL currently supports? I could not find it on the official website.
The official documentation for the Spark SQL API:
http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.sql.SchemaRDD
The official Spark SQL programming guide:
http://spark.apache.org/docs/latest/sql-programming-guide.html
At this point, you should have a deeper understanding of how to combine Spark SQL and HBase. You might as well try it out in practice.