2025-01-15 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)06/02 Report--
This article introduces the transformation operations of Spark's RDD operators. Many readers have questions about how these transformations work, so the editor has consulted various materials and compiled a simple, practical walkthrough. I hope it helps resolve your doubts; please follow along and study!
## Overview
Each transformation operation takes an RDD and produces a new RDD for the next operation to consume.
## Operator
Solving a problem can be viewed as starting from an initial state and applying a series of operations (operators) that transform that state step by step until the solved state is reached. In Spark, each operator transforms one RDD into another in exactly this way.
## Lazy evaluation
The transformation process of an RDD is lazily evaluated: the chain of transformations only records the lineage (the trajectory of operations), and no real computation takes place. Actual computation is triggered only when an action (such as collect or count) is encountered.
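As a minimal sketch of this laziness (with assumed sample data, runnable under a local Spark installation), the `map` call below returns immediately without touching the data; the side-effecting `println` inside it fires only once the `collect` action runs:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object LazyDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("lazy-demo").setMaster("local")
    val sc = new SparkContext(conf)

    // Transformation: only the lineage is recorded here; nothing is computed yet.
    val doubled = sc.parallelize(List(1, 2, 3)).map { x =>
      println(s"computing $x") // not printed until an action runs
      x * 2
    }

    // Action: triggers actual computation of the whole recorded lineage.
    println(doubled.collect().mkString(",")) // 2,4,6
    sc.stop()
  }
}
```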
## filter(func)
Filters the dataset, keeping only the elements for which the function func returns true, and returns them as a new dataset.
```scala
val conf = new SparkConf().setAppName("spark").setMaster("local")
val sc = new SparkContext(conf)
val rdd = sc.parallelize(List(1, 2, 3, 4, 5)) // sample data; the original elided the list contents
val result = rdd.filter(_ % 2 == 0)           // keep the even numbers
println(result.collect().mkString(","))       // 2,4
```
## map(func)
Passes each element to the function func and returns the results as a new dataset. In the example below, collect() returns the RDD's contents as an array, with every number in the list multiplied by 2.
```scala
val conf = new SparkConf().setAppName("spark").setMaster("local")
val sc = new SparkContext(conf)
val rdd = sc.parallelize(List(1, 2, 3)) // sample data; the original elided the list contents
val mapResult = rdd.map(_ * 2)
println(mapResult.collect().toBuffer)   // ArrayBuffer(2, 4, 6)
```
## flatMap(func)
Similar to map, but each input element can be mapped to zero or more output elements, so func should return a sequence rather than a single element.
```scala
val conf = new SparkConf().setAppName("RDD").setMaster("local[*]")
val sc = new SparkContext(conf)

// Flatten an RDD of lists into an RDD of their elements.
val arrayRDD: RDD[List[Int]] = sc.makeRDD(Array(List(1, 2)))
val listRDD: RDD[Int] = arrayRDD.flatMap(data => data)
listRDD.collect().foreach(println)

// Split each line into words.
val rdd = sc.parallelize(Array("a b c", "b c d"))
val result = rdd.flatMap(_.split(" "))
println(result.collect().mkString(",")) // a,b,c,b,c,d
```
## sample
- Parameter 1 (withReplacement): whether sampled elements are put back and can be drawn again (Boolean)
- Parameter 2 (fraction): the sampling proportion (Double)
- Parameter 3 (seed): the random seed; optional, with a default value
```scala
val conf = new SparkConf().setAppName("spark").setMaster("local")
val sc = new SparkContext(conf)
val rdd = sc.parallelize(1 to 10)
val result = rdd.sample(withReplacement = false, fraction = 0.5)
println(result.collect().mkString(",")) // roughly half the elements; the exact set varies with the seed
```
## union
Returns the union of two RDDs, keeping duplicates.
## intersection
Returns the intersection of two RDDs.
## distinct
Removes duplicate elements from an RDD.
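The three set-style operators above can be sketched together in one minimal example (sample data assumed, runnable under a local Spark installation):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SetOpsDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("set-ops").setMaster("local"))
    val a = sc.parallelize(List(1, 2, 3, 3))
    val b = sc.parallelize(List(3, 4, 5))

    // union keeps duplicates (it does not deduplicate like a mathematical set union)
    println(a.union(b).collect().sorted.mkString(","))        // 1,2,3,3,3,4,5
    println(a.intersection(b).collect().sorted.mkString(",")) // 3
    println(a.distinct().collect().sorted.mkString(","))      // 1,2,3
    sc.stop()
  }
}
```

Chaining `union` with `distinct` gives the deduplicated set union when that is what you want.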
This concludes our introduction to Spark's RDD transformation operators. I hope it has resolved your doubts; pairing the theory above with hands-on practice is the best way to learn, so go and try it! To keep learning related topics, please continue to follow the site.
© 2024 shulou.com SLNews company. All rights reserved.