2025-01-19 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)05/31 Report--
This article mainly introduces how to use the WordCount MapReduce example with Hadoop 2.x. It has some reference value, and interested friends can refer to it; I hope you learn a lot after reading this article. Below, let the editor take you through it.
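Before diving into the code, here is a sketch of how such a job is typically packaged and submitted. The jar name and HDFS paths below are illustrative assumptions, not taken from the article:

```shell
# Assuming the WordCount class below has been compiled and packaged into wordcount.jar
# (jar name and input/output paths are placeholders - adjust for your cluster):
hadoop jar wordcount.jar com.jhl.haoop.examples.WordCount /user/hadoop/input /user/hadoop/output

# Inspect the (word, count) pairs written by the reducer:
hdfs dfs -cat /user/hadoop/output/part-r-00000
```

Note that the output directory must not already exist, or FileOutputFormat will fail the job.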
package com.jhl.haoop.examples;

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class WordCount {

    // map region
    public static class TokenizerMapper extends
            Mapper<LongWritable, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1); // count each word once
        private Text word = new Text();

        public void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // StringTokenizer's default delimiters are space, \t, \n, \r and \f:
            // public StringTokenizer(String str) {
            //     this(str, " \t\n\r\f", false);
            // }
            StringTokenizer itr = new StringTokenizer(value.toString()); // tokenize the current line
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken()); // set the key of the map output
                context.write(word, one);  // emit the (word, 1) pair through the context
            }
        }
    }

    // reduce region
    public static class IntSumReducer extends
            Reducer<Text, IntWritable, Text, IntWritable> {
        private IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values,
                Context context) throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) { // iterate over the grouped counts
                sum += val.get(); // accumulate
            }
            result.set(sum); // set the total number of occurrences
            context.write(key, result);
        }
    }

    // client region
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration(); // load configuration information
        // GenericOptionsParser handles the common Hadoop command-line options
        // and sets the corresponding values on the Configuration object.
        String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
        if (otherArgs.length != 2) {
            System.err.println("Usage: wordcount <in> <out>");
            System.exit(2);
        }
        Job job = new Job(conf, "WordCount"); // create the Job and set its configuration and name
        job.setJarByClass(WordCount.class); // set the class the Job runs from
        job.setMapperClass(TokenizerMapper.class); // set the Mapper class
        job.setCombinerClass(IntSumReducer.class); // reuse the reducer as a combiner
        job.setReducerClass(IntSumReducer.class); // set the Reducer class
        FileInputFormat.addInputPath(job, new Path(otherArgs[0])); // set the input path
        FileOutputFormat.setOutputPath(job, new Path(otherArgs[1])); // set the output path
        job.setOutputKeyClass(Text.class); // set the key and value types of the output
        job.setOutputValueClass(IntWritable.class);
        boolean isSuccess = job.waitForCompletion(true); // submit the Job, wait for it to finish, and print progress
        System.exit(isSuccess ? 0 : 1); // end the program
    }
}
Thank you for reading this article carefully. I hope "how to use hadoop2x WordCount MapReduce", shared by the editor, is helpful to everyone. Please continue to follow the industry information channel; more related knowledge is waiting for you to learn!