This article introduces spark-shell and walks through a few basic examples, hoping to give readers a simple and practical way to get started.
spark-shell is an interactive shell script that ships with Spark. When it starts, it has already initialized a SparkContext (available as sc) and a SparkSession (available as spark), so you can use them right away.
Go to the Spark installation directory and launch it:
./bin/spark-shell
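If you want to control where the shell runs or how much memory the driver gets, spark-shell also accepts the usual spark-submit options. A minimal example (the memory value here is only an illustration):
./bin/spark-shell --master local[*] --driver-memory 2g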
If the shell starts successfully and you reach the scala> prompt, everything is working. Next, you can start operating on Spark.
Note that spark-shell uses the Scala language.
val text = sc.textFile("/usr/wordcount.txt")
text.count()
Run these commands to see the line count. Note that here we are loading a local file, not an HDFS file.
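If you also want to peek at the contents of the file, take() is a handy RDD action (the number 5 here is arbitrary):
text.take(5).foreach(println)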
Now let's work with a file on HDFS and write the classic word count program.
First, upload the file to HDFS:
./hdfs dfs -put /usr/a.txt /user/spark
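You can verify that the upload succeeded by listing the target directory:
./hdfs dfs -ls /user/spark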
Then process the data:
val text = sc.textFile("hdfs://192.168.153.11:9000/user/spark/a.txt")
val counts = text.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey(_ + _)
counts.saveAsTextFile("hdfs://192.168.153.11:9000/user/spark/wordcount")
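If you want to check the result directly in the shell instead of (or before) writing it to HDFS, collect() pulls everything back to the driver, so only use it on small data sets:
counts.collect().foreach(println)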
Another common demo is estimating the value of Pi.
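The original article only showed this as a screenshot; as a rough sketch, a Monte Carlo estimate that you can paste into spark-shell looks like this (the sample count n is arbitrary):
// throw n random darts at the unit square and count how many land inside the unit circle
val n = 1000000
val count = sc.parallelize(1 to n).map { _ =>
  val x = scala.util.Random.nextDouble() * 2 - 1
  val y = scala.util.Random.nextDouble() * 2 - 1
  if (x * x + y * y <= 1) 1 else 0
}.reduce(_ + _)
// the ratio of circle area to square area is Pi/4
println(s"Pi is roughly ${4.0 * count / n}")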
That is all on how to learn spark-shell. I hope the above content is of some help to you.