2025-02-25 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)06/02 Report--
This article shows how to write a Spark word-count program in Java. The editor finds it very practical and shares it here for reference; follow along to have a look.
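Before the full Spark program, it helps to see the core idea in isolation: flatten each line into individual words, pair every word with a count of 1, then sum the counts per word. The following is a minimal sketch of those same semantics using only plain JDK streams on a hypothetical in-memory sample (no Spark cluster needed); the sample input and class name are illustrative, not from the original article.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class WordCountSketch {
    public static void main(String[] args) {
        // Hypothetical in-memory stand-in for the lines Spark would read from a text file
        List<String> lines = Arrays.asList("a,b,a", "b,c");

        // flatMapToPair equivalent: split each comma-separated line into words.
        // reduceByKey equivalent: merge identical keys by summing their counts of 1.
        Map<String, Integer> counts = lines.stream()
                .flatMap(line -> Arrays.stream(line.split(",")))
                .collect(Collectors.toMap(w -> w, w -> 1, Integer::sum));

        System.out.println(counts); // e.g. {a=2, b=2, c=1}
    }
}
```

In the Spark version below, the same two steps are distributed across partitions: `flatMapToPair` emits `(word, 1)` tuples and `reduceByKey` combines them with an addition function.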
package hgs.spark;

import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.FlatMapFunction;
import org.apache.spark.api.java.function.Function2;
import org.apache.spark.api.java.function.PairFlatMapFunction;

import scala.Tuple2;

public class JavaRDDWC {
    public static void main(String[] args) {
        // System.setProperty("HADOOP_USER_NAME", "administrator");
        // Location of hadoop's winutils.exe when running on Windows
        System.setProperty("hadoop.home.dir", "D:\\hadoop-2.7.1");
        SparkConf conf = new SparkConf().setAppName("javawc").setMaster("local[2]");
        @SuppressWarnings("resource")
        JavaSparkContext context = new JavaSparkContext(conf);
        JavaRDD<String> rdd = context.textFile("D:\\test.txt");
        // Split each line into an array of words
        JavaRDD<String[]> rdd1 = rdd.map(s -> s.split(","));
        // Only a pair RDD supports reduceByKey, so emit (word, 1) tuples
        JavaPairRDD<String, Integer> rdd2 = rdd1.flatMapToPair(new flatMapFunc());
        JavaPairRDD<String, Integer> rdd3 = rdd2.reduceByKey(new reducefunc());
        rdd3.saveAsTextFile("D:\\fff");
        context.stop();
    }
}

// Merges the counts of identical words by adding them together
class reducefunc implements Function2<Integer, Integer, Integer> {
    private static final long serialVersionUID = 1L;

    @Override
    public Integer call(Integer v1, Integer v2) throws Exception {
        return v1 + v2;
    }
}

// Flattens a word array into individual words (not used in main, kept from the original)
class flatmf implements FlatMapFunction<String[], String> {
    private static final long serialVersionUID = 1L;

    @Override
    public Iterator<String> call(String[] t) throws Exception {
        List<String> list = new ArrayList<>();
        for (String str : t) {
            list.add(str);
        }
        return list.iterator();
    }
}

// Flattens a word array into (word, 1) tuples for counting
class flatMapFunc implements PairFlatMapFunction<String[], String, Integer> {
    private static final long serialVersionUID = 1L;

    @Override
    public Iterator<Tuple2<String, Integer>> call(String[] t) throws Exception {
        List<Tuple2<String, Integer>> list = new ArrayList<>();
        for (String str : t) {
            list.add(new Tuple2<>(str, 1));
        }
        return list.iterator();
    }
}

Thank you for reading! This concludes the article on how to write a Spark application in Java.
I hope the above content has been helpful and that you learned something from it. If you found the article useful, please share it so more people can see it!
Welcome to subscribe to "Shulou Technology Information" for the latest news, interesting stories, and hot topics in the IT industry, and to follow the newest Internet news, technology news, and IT industry trends.
© 2024 shulou.com SLNews company. All rights reserved.