2025-04-17 Update From: SLTechnology News&Howtos
WordcountMapper class

package com.sky.mr.wordcount;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

import java.io.IOException;

public class WordcountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

    // The map method is called once for every line of input, so the Text and
    // IntWritable objects are created outside it to avoid allocating new
    // objects on every call and wasting memory.
    Text k = new Text();
    IntWritable v = new IntWritable(1);

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Get the text content of the current line
        String line = value.toString();
        // Split the line on spaces
        String[] words = line.split(" ");
        // Convert each word to the output format and emit it
        for (String word : words) {
            k.set(word);
            context.write(k, v);
        }
    }
}

WordcountReducer class

package com.sky.mr.wordcount;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

import java.io.IOException;

public class WordcountReducer extends Reducer<Text, IntWritable, Text, IntWritable> {

    IntWritable v = new IntWritable();

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        // Sum the counts of the same key within this group
        int sum = 0;
        for (IntWritable count : values) {
            sum += count.get();
        }
        // Emit the total
        v.set(sum);
        context.write(key, v);
    }
}

WordcountDriver class

package com.sky.mr.wordcount;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

import java.io.IOException;

public class WordcountDriver {
    public static void main(String[] args)
            throws IOException, ClassNotFoundException, InterruptedException {
        // 1. Get the configuration information and the job object
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf);
        // 2. Set the jar package path
        job.setJarByClass(WordcountDriver.class);
        // 3. Associate the custom mapper and reducer classes
        job.setMapperClass(WordcountMapper.class);
        job.setReducerClass(WordcountReducer.class);
        // 4. Set the map output key and value types
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(IntWritable.class);
        // 5. Set the final result key and value types
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        // 6. Set the file input and output paths
        FileInputFormat.setInputPaths(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        // 7. Submit the job object, which encapsulates the MapReduce program's
        //    run parameters, to the YARN cluster
        boolean result = job.waitForCompletion(true);
        System.exit(result ? 0 : 1);
    }
}

Input file:
import org apache hadoop io
import org apache hadoop io
import org apache hadoop
import java io IOException
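With the three classes packaged into a jar, the job can be submitted to the cluster with `hadoop jar`, passing the input and output paths that the driver reads from `args`. The jar name and HDFS paths below are placeholders for illustration; adjust them to your environment:

```shell
# Submit the job to YARN (jar name and paths are hypothetical)
hadoop jar wordcount.jar com.sky.mr.wordcount.WordcountDriver /wordcount/input /wordcount/output

# Inspect the result written by the single reducer
hdfs dfs -cat /wordcount/output/part-r-00000
```

Note that the output directory must not already exist, or the job will fail at submission.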
Output file:

IOException 1
apache 3
hadoop 3
import 4
io 3
java 1
org 3
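As a sanity check, the same counts can be reproduced without a cluster. This small standalone program (not part of the article's code, written here for illustration) applies the mapper's split-on-space step and the reducer's summing step to the four input lines in memory; a TreeMap sorts keys the same way the framework sorts Text keys, so the printed order matches the output file above:

```java
import java.util.Map;
import java.util.TreeMap;

public class LocalWordCount {
    public static void main(String[] args) {
        // The four lines of the sample input file
        String[] lines = {
            "import org apache hadoop io",
            "import org apache hadoop io",
            "import org apache hadoop",
            "import java io IOException"
        };
        // Mapper + reducer logic collapsed into one in-memory pass:
        // split each line on spaces, then sum the occurrences per word
        Map<String, Integer> counts = new TreeMap<>();
        for (String line : lines) {
            for (String word : line.split(" ")) {
                counts.merge(word, 1, Integer::sum);
            }
        }
        // Print word and count, tab-separated, in sorted key order
        counts.forEach((word, count) -> System.out.println(word + "\t" + count));
    }
}
```

Uppercase letters sort before lowercase in Java's natural String order, which is why `IOException` appears first here and in the job's output.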