This article explains in detail how to run WordCount in Eclipse with Hadoop (hadoop-002). It is quite practical, so it is shared here for reference; I hope you get something out of it after reading.
1. If Eclipse reports that a file cannot be compiled because it has no permission on a file in the workspace, grant access to the workspace directory:

chmod -R 777 workspace
2. The following error occurs when running the Hadoop test case in Eclipse:

Exception in thread "main" org.apache.hadoop.mapred.InvalidInputException: Input path does not exist:

The cause is that the program did not find Hadoop's configuration file. For Hadoop 2.5.2 this is core-site.xml, which specifies fs.defaultFS:
<property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
</property>
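To check whether the configuration is actually being picked up, you can print the effective default filesystem before submitting the job. This is a minimal sketch, not from the original article (the class name is only for illustration); when core-site.xml is not on the classpath, Hadoop falls back to the local file:/// filesystem, which is why the input path on HDFS is not found:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class CheckDefaultFs {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Prints file:/// when no core-site.xml is found on the classpath,
        // and hdfs://localhost:9000 when the configuration is picked up.
        System.out.println("fs.defaultFS = " + conf.get("fs.defaultFS"));
        System.out.println("resolved filesystem = " + FileSystem.get(conf).getUri());
    }
}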
Solution:

JobConf conf = new JobConf(WordCount.class);
conf.setJobName("wordcount");
// conf.set("fs.defaultFS", "hdfs://localhost:9000");
// conf.addResource(new Path("/opt/hadoop/etc/hadoop/core-site.xml"));

Uncomment either one of the two commented lines before running: the first sets fs.defaultFS directly, the second loads the setting from the cluster's core-site.xml. The complete code below uses the first approach.
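The second option loads the cluster's own core-site.xml instead of hard-coding the URI. Here is a minimal sketch of that variant, not from the original article, assuming the same /opt/hadoop install path as in the commented-out line above:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;

public class LoadClusterConf {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // Load the cluster's own core-site.xml instead of hard-coding fs.defaultFS
        // (install path assumed; adjust it to your Hadoop installation).
        conf.addResource(new Path("/opt/hadoop/etc/hadoop/core-site.xml"));
        System.out.println("fs.defaultFS = " + conf.get("fs.defaultFS"));
    }
}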
The complete code is as follows:
package com.zwh;

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Mapper: emit (word, 1) for every token in the input line.
    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(Object key, Text value, Context context) throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, one);
            }
        }
    }

    // Reducer (also used as combiner): sum the counts for each word.
    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        private IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values, Context context) throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://localhost:9000");
        Job job = new Job(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path("/user/root/input/"));
        FileOutputFormat.setOutputPath(job, new Path("/user/root/output/wc"));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
Schematic diagram
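After the job finishes, the word counts land under /user/root/output/wc on HDFS. As a quick way to inspect the result from Java, here is a minimal sketch; it is not part of the article's code, and it assumes the default single-reducer output file name part-r-00000:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class PrintWordCountOutput {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://localhost:9000");
        FileSystem fs = FileSystem.get(conf);
        // Default output file written by the single reducer.
        Path out = new Path("/user/root/output/wc/part-r-00000");
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(fs.open(out), StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line); // each line: word <TAB> count
            }
        }
    }
}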
This is the end of the article on how to run WordCount in Eclipse with Hadoop (hadoop-002). I hope the content above has been helpful and that you have learned something from it; if you found the article useful, please share it so that more people can see it.