MapReduce programming: this article uses WordCount as an example to count the words in a set of input files.
1. Create a Java project in Eclipse, add the jars from the Hadoop lib directory and the Hadoop home directory to the build path, and create a new WordCount class:
package org.scf.wordcount;

import java.io.IOException;
import java.util.*;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapred.*;
import org.apache.hadoop.util.*;

public class WordCount {

    public static class Map extends MapReduceBase
            implements Mapper<LongWritable, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(LongWritable key, Text value,
                        OutputCollector<Text, IntWritable> output, Reporter reporter)
                throws IOException {
            // Split each input line into tokens and emit (word, 1) for every token.
            String line = value.toString();
            StringTokenizer tokenizer = new StringTokenizer(line);
            while (tokenizer.hasMoreTokens()) {
                word.set(tokenizer.nextToken());
                output.collect(word, one);
            }
        }
    }

    public static class Reduce extends MapReduceBase
            implements Reducer<Text, IntWritable, Text, IntWritable> {
        public void reduce(Text key, Iterator<IntWritable> values,
                           OutputCollector<Text, IntWritable> output, Reporter reporter)
                throws IOException {
            // Sum up the counts emitted for each word.
            int sum = 0;
            while (values.hasNext()) {
                sum += values.next().get();
            }
            output.collect(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(WordCount.class);
        conf.setJobName("wordcount");
        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(IntWritable.class);
        conf.setMapperClass(Map.class);
        conf.setCombinerClass(Reduce.class);
        conf.setReducerClass(Reduce.class);
        conf.setInputFormat(TextInputFormat.class);
        conf.setOutputFormat(TextOutputFormat.class);
        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));
        JobClient.runJob(conf);
    }
}
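The class above uses the classic org.apache.hadoop.mapred API. For reference, here is a minimal sketch of the same job written against the newer org.apache.hadoop.mapreduce API; the class name NewApiWordCount is my own, and it assumes your Hadoop release ships this API alongside the classic one. The compile and run steps below still use the original WordCount class.

package org.scf.wordcount;

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class NewApiWordCount {

    public static class TokenizerMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // Emit (word, 1) for every token on the line.
            StringTokenizer tokenizer = new StringTokenizer(value.toString());
            while (tokenizer.hasMoreTokens()) {
                word.set(tokenizer.nextToken());
                context.write(word, one);
            }
        }
    }

    public static class IntSumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            // Add up the 1s emitted for this word.
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = new Job(conf, "wordcount");   // Job.getInstance(conf) on newer releases
        job.setJarByClass(NewApiWordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}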
2. Compile and run the class:
cd /home/Hadoop/
mkdir wordcount_classes
javac -classpath /usr/hadoop-1.0.4/hadoop-core-1.0.4.jar -d /home/Hadoop/wordcount_classes WordCount.java
jar -cvf /home/Hadoop/wordcount.jar -C /home/Hadoop/wordcount_classes/ .
hadoop dfs -put /home/Hadoop/test.txt /user/root/wordcount/input/file2
hadoop dfs -put /home/Hadoop/test1.txt /user/root/wordcount/input/file3
hadoop jar /home/Hadoop/wordcount.jar org.scf.wordcount.WordCount /user/root/wordcount/input /user/root/wordcount/output
hadoop dfs -ls /user/root/wordcount/output
hadoop dfs -cat /user/root/wordcount/output/part-00000
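If the job completes successfully, TextOutputFormat writes one key/value pair per line separated by a tab, so part-00000 should look roughly like the sketch below; the words and counts here are invented for illustration and depend entirely on the contents of test.txt and test1.txt.

hadoop	2
hello	3
world	1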