2025-01-18 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)05/31 Report--
This article introduces how word counting is implemented in the classic Hadoop MapReduce WordCount example. Two versions of the program are shown below, followed by a sample input and its output.
Version one:
import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(Object key, Text value, Context context) throws IOException, InterruptedException {
            // Split each line into words and emit each word as a map output record.
            StringTokenizer itr = new StringTokenizer(value.toString());
            // Test whether there are more tokens available.
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, one);
            }
        }
    }

    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        private IntWritable result = new IntWritable();

        /* The input of the reduce function is also in key/value form, but its value
         * is an iterator (Iterable<IntWritable> values): each call to reduce receives
         * one key together with the set of values that map to it. The Context is the
         * same as in map. */
        public void reduce(Text key, Iterable<IntWritable> values, Context context) throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        // Initialize Configuration, which reads the MapReduce system configuration,
        // including hdfs and mapreduce settings.
        Configuration conf = new Configuration();
        // Build a job.
        Job job = Job.getInstance(conf, "word count");
        // Load the program written by the programmer.
        job.setJarByClass(WordCount.class);
        // Mapper class: implements the map function, generating intermediate results
        // from the input pairs; this configures how mapreduce runs map and reduce.
        job.setMapperClass(TokenizerMapper.class);
        // Combiner class: implements the combine function, merging key-value pairs with
        // the same key in the intermediate results. The default is null, i.e.
        // intermediate results are not merged.
        job.setCombinerClass(IntSumReducer.class);
        // Reducer class: implements the reduce function, merging the intermediate
        // results to get the final result.
        job.setReducerClass(IntSumReducer.class);
        // Define the output key/value types, i.e. the types of the key/value pairs
        // ultimately stored in the result file on hdfs.
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        // The first line sets the input data file, the second the output location.
        FileInputFormat.addInputPath(job, new Path("hdfs://192.168.226.129:9000/rootdir/mapreduce.txt"));
        FileOutputFormat.setOutputPath(job, new Path("hdfs://192.168.226.129:9000/rootdir/data/studytest/" + System.currentTimeMillis() + "/"));
        // If the job runs successfully, the program exits normally.
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
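The combiner above can reuse IntSumReducer because word counting is associative: summing per-split partial sums gives the same totals as summing the raw ones. A minimal local sketch of that map → combine → reduce flow, in plain Java with no Hadoop classes (the class and method names here are hypothetical, for illustration only):

```java
import java.util.*;

public class LocalWordCount {
    // Simulate map + combine on one input split: emit (word, 1) and pre-sum locally.
    static Map<String, Integer> mapAndCombine(String split) {
        Map<String, Integer> partial = new HashMap<>();
        StringTokenizer itr = new StringTokenizer(split);
        while (itr.hasMoreTokens()) {
            partial.merge(itr.nextToken(), 1, Integer::sum);
        }
        return partial;
    }

    // Simulate reduce: merge the partial sums produced for every split.
    static Map<String, Integer> reduce(List<Map<String, Integer>> partials) {
        Map<String, Integer> total = new TreeMap<>(); // sorted by key, like reducer output
        for (Map<String, Integer> p : partials) {
            p.forEach((word, count) -> total.merge(word, count, Integer::sum));
        }
        return total;
    }

    public static void main(String[] args) {
        List<Map<String, Integer>> partials = new ArrayList<>();
        partials.add(mapAndCombine("hello world hello"));
        partials.add(mapAndCombine("world again"));
        System.out.println(reduce(partials)); // {again=1, hello=2, world=2}
    }
}
```

Because the combiner only pre-aggregates, removing it changes how much data is shuffled but never the final counts.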
Version two:
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount1 {
    public static final IntWritable ONE = new IntWritable(1);

    public static class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        @Override
        protected void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
            String[] vs = value.toString().split("\\s");
            for (String string : vs) {
                context.write(new Text(string), ONE);
            }
        }
    }

    public static class WordCountReduce extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context) throws IOException, InterruptedException {
            int count = 0;
            for (IntWritable v : values) {
                count += v.get();
            }
            context.write(key, new IntWritable(count));
        }
    }

    public static void main(String[] args) {
        try {
            Configuration conf = new Configuration();
            Job job = Job.getInstance(conf, "word count");
            job.setJarByClass(WordCount1.class);
            // Mapper class: implements the map function, generating intermediate
            // results from the input pairs.
            job.setMapperClass(WordCountMapper.class);
            job.setMapOutputKeyClass(Text.class);
            job.setMapOutputValueClass(IntWritable.class);
            job.setReducerClass(WordCountReduce.class);
            FileInputFormat.addInputPath(job, new Path("hdfs://192.168.226.129:9000/rootdir/mapreduce.txt"));
            FileOutputFormat.setOutputPath(job, new Path("hdfs://192.168.226.129:9000/rootdir/data/studytest/" + System.currentTimeMillis() + "/"));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        } catch (IllegalStateException | IllegalArgumentException | ClassNotFoundException
                | IOException | InterruptedException e) {
            e.printStackTrace();
        }
    }
}
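Note one subtle difference between the two versions: StringTokenizer (version one) skips entire runs of whitespace, while split("\\s") (version two) splits on every single whitespace character, so consecutive spaces produce empty-string tokens that would be counted as a word. A small check (the class name here is hypothetical):

```java
public class SplitCheck {
    // split("\\s") splits on each single whitespace char, keeping interior empty tokens.
    static String[] tokens(String line) {
        return line.split("\\s");
    }

    public static void main(String[] args) {
        String line = "xujun  xj"; // note the doubled space
        System.out.println(tokens(line).length);       // 3: "xujun", "", "xj"
        System.out.println(line.split("\\s+").length); // 2: "\\s+" collapses the run
    }
}
```

Using split("\\s+") in the mapper, or filtering out empty strings, would make version two match version one's tokenization.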
For example:
Enter:
xujun xujun xj cxj xxxx cccc ddd xujun xj yyy jjj ccc yyy
Output:
ccc	1
cccc	1
cxj	1
ddd	1
jjj	1
xj	2
xujun	3
xxxx	1
yyy	2
At this point, the study of "how to achieve word counting in WordCount" is over. I hope it has resolved your doubts; pairing theory with practice is the best way to learn, so go and try it yourself. To continue learning more related knowledge, please keep following the site.
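The sample output can be sanity-checked locally with the same whitespace tokenization the mapper uses, without a cluster (hypothetical class name; the input string assumes the sample word sequence shown above):

```java
import java.util.*;

public class SampleCheck {
    // Count words with the same whitespace tokenization the mapper uses.
    static Map<String, Integer> count(String input) {
        Map<String, Integer> counts = new TreeMap<>(); // sorted, like reducer output
        StringTokenizer itr = new StringTokenizer(input);
        while (itr.hasMoreTokens()) {
            counts.merge(itr.nextToken(), 1, Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        count("xujun xujun xj cxj xxxx cccc ddd xujun xj yyy jjj ccc yyy")
            .forEach((w, c) -> System.out.println(w + "\t" + c));
    }
}
```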
© 2024 shulou.com SLNews company. All rights reserved.