How to Implement Spark SchemaRDD Implicit Conversion


Today I will talk with you about how to implement Spark SchemaRDD implicit conversion, something many people may not know much about. To help you understand it better, the editor has summarized the following content, and I hope you can get something out of this article.

We have already used SchemaRDD in Spark SQL. This article briefly introduces how to convert a standard RDD (org.apache.spark.rdd.RDD) into a SchemaRDD and then perform SQL-related operations on it.
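The rest of the article operates on an RDD named people built from a case class, but its construction is not shown here. The following is only a minimal sketch of the assumed setup in the Spark 1.x spark-shell (the Person case class, the people.txt path, and the commented-out failing call are illustrative assumptions, not part of the original text):

// In the Spark 1.x spark-shell, sc (the SparkContext) is predefined.
import org.apache.spark.sql.SQLContext

// Assumed schema: a case class whose fields become the columns of the table.
case class Person(name: String, age: Int)

val sqlContext = new SQLContext(sc)

// people is an ordinary org.apache.spark.rdd.RDD[Person], not a SchemaRDD.
val people = sc.textFile("people.txt")
  .map(_.split(","))
  .map(p => Person(p(0), p(1).trim.toInt))

// Calling registerTempTable directly on it does not compile, because the
// method is defined on SchemaRDD rather than on RDD:
// people.registerTempTable("people")
// error: value registerTempTable is not a member of org.apache.spark.rdd.RDD[Person]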

This is because people is an ordinary RDD, while the registerTempTable function does not belong to the RDD class and can only be called on a SchemaRDD instance. Therefore this call results in an error, and there are two solutions:

(1) The createSchemaRDD function is defined in the SQLContext class, so we can explicitly convert people into a SchemaRDD, as follows:

/**
 * User: Memory of the Past
 * Date: 14-12-16
 * Time: 10:16 PM
 * blog: http://www.iteblog.com
 * address: http://www.iteblog.com/archives/1224
 * A technical blog focusing on Hadoop, Hive, Spark, Shark and Flume, with plenty of practical material
 * WeChat public account: iteblog_hadoop
 */
scala> val peopleSchema = sqlContext.createSchemaRDD(people)
peopleSchema: org.apache.spark.sql.SchemaRDD =
SchemaRDD[29] at RDD at SchemaRDD.scala:103
== Query Plan ==
== Physical Plan ==
ExistingRdd [name#4,age#5], MapPartitionsRDD[28] at
mapPartitions at basicOperators.scala:217

scala> peopleSchema.registerTempTable("people")
warning: there were 1 deprecation warning(s); re-run with -deprecation for details

This explicitly converts people into a SchemaRDD, on which registerTempTable can then be called.
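Once the temporary table is registered, SQL can be run against it through sqlContext.sql, which itself returns a SchemaRDD. A small sketch under the setup assumed above (the teenager query is illustrative, not from the original text):

// Query the temporary table registered above; the result is another SchemaRDD.
val teenagers = sqlContext.sql("SELECT name FROM people WHERE age >= 13 AND age <= 19")

// Row fields are accessed positionally; bring the names to the driver and print them.
teenagers.map(t => "Name: " + t(0)).collect().foreach(println)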

(2) The method above converts an ordinary RDD into a SchemaRDD by explicitly calling sqlContext.createSchemaRDD. In fact, we can also use Scala's implicit conversion syntax to do this. Let's first look at the definition of the createSchemaRDD function:

/**
 * Creates a SchemaRDD from an RDD of case classes.
 *
 * @group userf
 */
implicit def createSchemaRDD[A <: Product: TypeTag](rdd: RDD[A]) = ...

We can see that createSchemaRDD is declared with the implicit keyword, so we only need to import it into the current scope and Scala will apply the conversion automatically, letting us call SchemaRDD methods on people directly:

scala> import sqlContext.createSchemaRDD
import sqlContext.createSchemaRDD

scala> people.registerAsTable("people")
warning: there were 1 deprecation warning(s); re-run with -deprecation for details

This converts people into a SchemaRDD implicitly. It works because Spark SQL can implicitly convert an RDD whose elements are case class instances into a SchemaRDD.
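The mechanism here is plain Scala implicit conversion: when a value lacks a method, the compiler looks for an in-scope implicit def that converts it into a type that does have that method, which is exactly what importing sqlContext.createSchemaRDD enables. A self-contained sketch of the same pattern, with made-up Plain and Wrapped types purely for illustration:

import scala.language.implicitConversions

// A type that does not define the method we want to call.
case class Plain(value: Int)

// A type that does define it.
class Wrapped(p: Plain) {
  def describe(): String = s"value = ${p.value}"
}

object Conversions {
  // Analogous to createSchemaRDD: an implicit def from Plain to Wrapped.
  implicit def plainToWrapped(p: Plain): Wrapped = new Wrapped(p)
}

object Demo extends App {
  import Conversions.plainToWrapped // analogous to: import sqlContext.createSchemaRDD

  // describe() is not a member of Plain; the compiler inserts plainToWrapped(...) for us.
  println(Plain(42).describe()) // prints: value = 42
}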

After reading the above, do you have a better understanding of how to implement Spark SchemaRDD implicit conversion? If you want to learn more, please follow the industry information channel. Thank you for your support.
