(1) A classic case: word count with a UDTF (explode)
Data format:
Each line is a string of words separated by spaces.
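For illustration only (the actual contents of ip.txt are not shown in the source), an input file in this format might look like:

hello spark hello hadoop
spark sql spark streaming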
Code implementation:
import org.apache.log4j.{Level, Logger}
import org.apache.spark.SparkConf
import org.apache.spark.sql.{DataFrame, SQLContext, SparkSession}

object SparkSqlTest {
  def main(args: Array[String]): Unit = {
    // suppress redundant logs
    Logger.getLogger("org.apache.hadoop").setLevel(Level.WARN)
    Logger.getLogger("org.apache.spark").setLevel(Level.WARN)
    Logger.getLogger("org.project-spark").setLevel(Level.WARN)
    // build the programming entry point
    val conf: SparkConf = new SparkConf()
    conf.setAppName("SparkSqlTest").setMaster("local[2]")
    val spark: SparkSession = SparkSession.builder()
      .config(conf)
      .enableHiveSupport()
      .getOrCreate()
    // create the SQLContext object
    val sqlContext: SQLContext = spark.sqlContext
    // each line of the input file becomes one row in column "line"
    val wordDF: DataFrame = sqlContext.read.text("C:\\z_data\\Test_data\\ip.txt").toDF("line")
    wordDF.createTempView("lines")
    val sql =
      """
        |select t1.word, count(1) counts
        |from (
        |  select explode(split(line, '\\s')) word
        |  from lines
        |) t1
        |group by t1.word
        |order by counts
      """.stripMargin
    spark.sql(sql).show()
  }
}
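For comparison, the same word count can be expressed directly with the DataFrame API instead of SQL. This is only a sketch under the assumption that wordDF has a single string column named "line" as built above; split, explode, and count are standard Spark SQL functions, while the variable name wordCounts is illustrative.

import org.apache.spark.sql.functions.{col, explode, split}

// one row per word, then group and count; ordered by count to mirror the SQL's "order by counts"
val wordCounts: DataFrame = wordDF
  .select(explode(split(col("line"), "\\s+")).as("word"))
  .groupBy("word")
  .count()
  .orderBy(col("count"))

wordCounts.show()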
Results:
(2) Finding the top N with a window function
Data format:
Requirement: take the three students with the best scores in each course.
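The query below only relies on course, name, and score fields, so score.json presumably contains one JSON record per line in roughly this shape (the values here are illustrative, not taken from the source):

{"course":"math","name":"zhangsan","score":98}
{"course":"math","name":"lisi","score":95}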
Code implementation:
import org.apache.log4j.{Level, Logger}
import org.apache.spark.SparkConf
import org.apache.spark.sql.{DataFrame, SQLContext, SparkSession}

object SparkSqlTest {
  def main(args: Array[String]): Unit = {
    // suppress redundant logs
    Logger.getLogger("org.apache.hadoop").setLevel(Level.WARN)
    Logger.getLogger("org.apache.spark").setLevel(Level.WARN)
    Logger.getLogger("org.project-spark").setLevel(Level.WARN)
    // build the programming entry point
    val conf: SparkConf = new SparkConf()
    conf.setAppName("SparkSqlTest").setMaster("local[2]")
    val spark: SparkSession = SparkSession.builder()
      .config(conf)
      .enableHiveSupport()
      .getOrCreate()
    // create the SQLContext object
    val sqlContext: SQLContext = spark.sqlContext
    // each JSON record becomes one row with course, name, and score columns
    val topnDF: DataFrame = sqlContext.read.json("C:\\z_data\\Test_data\\score.json")
    topnDF.createTempView("student")
    val sql =
      """
        |select
        |  t1.course course,
        |  t1.name name,
        |  t1.score score
        |from (
        |  select
        |    course,
        |    name,
        |    score,
        |    row_number() over (partition by course order by score desc) top
        |  from student
        |) t1
        |where t1.top <= 3
      """.stripMargin
    spark.sql(sql).show()
  }
}
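The same top-N query can also be written with the DataFrame API and a window specification. A minimal sketch, assuming topnDF has the course, name, and score columns implied by the SQL above; byCourse and top3 are illustrative names.

import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions.{col, row_number}

// rank rows inside each course by descending score
val byCourse = Window.partitionBy("course").orderBy(col("score").desc)

val top3: DataFrame = topnDF
  .withColumn("top", row_number().over(byCourse))
  .where(col("top") <= 3)              // keep the three best scores per course
  .select("course", "name", "score")

top3.show()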