
Study log: first contact with MapReduce


The WordCount program

package org.robby.mr;

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class WordCount {

    // The input key (Object) is the byte offset of each line and the input value (Text) is
    // the line's content; the output key is a word and the output value is its count,
    // using Hadoop's built-in writable types.
    public static class Map extends Mapper<Object, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                // Put each token into word and emit (word, 1) through the context.
                word.set(itr.nextToken());
                context.write(word, one);
            }
        }
    }

    // Receives a word together with an iterator over all the counts emitted for it,
    // and sums them up.
    public static class Reduce extends Reducer<Text, IntWritable, Text, IntWritable> {
        private IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
        if (otherArgs.length != 2) {
            System.err.println("Usage: wordcount <in> <out>");
            System.exit(2);
        }
        Job job = Job.getInstance(conf);
        job.setJarByClass(WordCount.class);

        // Set up the input
        job.setInputFormatClass(TextInputFormat.class);
        TextInputFormat.addInputPath(job, new Path(otherArgs[0]));

        // Mapper
        job.setMapperClass(Map.class);

        // Reducer
        job.setReducerClass(Reduce.class);

        // Output
        job.setOutputFormatClass(TextOutputFormat.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        TextOutputFormat.setOutputPath(job, new Path(otherArgs[1]));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
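To make the data flow concrete, here is a small hand-traced run (the input line is hypothetical):

Input line:      hello world hello
Mapper output:   (hello, 1), (world, 1), (hello, 1)
Reducer input:   (hello, [1, 1]), (world, [1])   after the shuffle groups values by key
Reducer output:  hello 2 and world 1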

Run the job with: hadoop jar web.jar [fully qualified class name] [input directory] [output directory]

Both the input and output directories are HDFS paths.
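For example, assuming the jar is named web.jar as above and using placeholder HDFS paths, an invocation might look like:

hadoop jar web.jar org.robby.mr.WordCount /user/hadoop/input /user/hadoop/output

Note that the output directory must not already exist, or the job will fail. Once the job finishes, the reducer's result can be inspected with hdfs dfs -cat /user/hadoop/output/part-r-00000 (part-r-00000 is the default name of the first reducer's output file).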
