How to run WordCount with Eclipse in hadoop-002

2025-04-03 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)05/31 Report--

This article explains in detail how to run WordCount with Eclipse in hadoop-002. I find it very practical, so I am sharing it for your reference; I hope you get something out of it.

1. If Eclipse reports that a file cannot be compiled, complaining that there is no permission for the file, fix the workspace permissions:

chmod -R 777 workspace

2. The following error occurs when running the Hadoop test case in Eclipse:

Exception in thread "main" org.apache.hadoop.mapred.InvalidInputException: Input path does not exist:

The reason is that the program did not find Hadoop's configuration file. For 2.5.2 this is core-site.xml, which specifies fs.defaultFS:

<property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
</property>
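As a side note, core-site.xml is an ordinary XML file of <property> entries, and Hadoop's Configuration class resolves fs.defaultFS by scanning them. A minimal plain-JDK sketch of that lookup (the class name and the inline XML string are illustrative assumptions, not part of the article; real code would let Hadoop load the file):

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class CoreSiteDemo {

    // Scan <property> entries for the given key and return its <value>,
    // roughly what Hadoop's Configuration does when it loads core-site.xml.
    public static String lookup(String xml, String key) throws Exception {
        DocumentBuilder builder =
            DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document doc = builder.parse(
            new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        NodeList props = doc.getElementsByTagName("property");
        for (int i = 0; i < props.getLength(); i++) {
            Element p = (Element) props.item(i);
            String name =
                p.getElementsByTagName("name").item(0).getTextContent().trim();
            if (key.equals(name)) {
                return p.getElementsByTagName("value").item(0).getTextContent().trim();
            }
        }
        return null; // key not configured
    }

    public static void main(String[] args) throws Exception {
        // Illustrative stand-in for the contents of core-site.xml.
        String xml =
            "<configuration>"
            + "<property><name>fs.defaultFS</name>"
            + "<value>hdfs://localhost:9000</value></property>"
            + "</configuration>";
        System.out.println(lookup(xml, "fs.defaultFS"));
    }
}
```

If this lookup returns null (the file was not found or the key is absent), the filesystem defaults to the local disk, which is exactly why the InvalidInputException above complains about a missing input path.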

Solution:

JobConf conf = new JobConf(WordCount.class);
conf.setJobName("wordcount");
// conf.set("fs.defaultFS", "hdfs://localhost:9000");
// conf.addResource(new Path("/opt/hadoop/etc/hadoop/core-site.xml"));

Uncomment one of the two commented lines before running: either set fs.defaultFS directly, or load core-site.xml as a resource.

The complete code is as follows:

package com.zwh;

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class WordCount {

    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, one);
            }
        }
    }

    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        private IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://localhost:9000");
        Job job = new Job(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path("/user/root/input/"));
        FileOutputFormat.setOutputPath(job, new Path("/user/root/output/wc"));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
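The map and reduce steps above can also be understood without a cluster: the mapper's StringTokenizer split plus the reducer's summation are together equivalent to counting tokens in a HashMap. A minimal plain-Java sketch of that core logic (no Hadoop dependency; the class and method names here are mine, not from the article):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.StringTokenizer;

public class LocalWordCount {

    // Mirrors TokenizerMapper + IntSumReducer: tokenize each line on
    // whitespace (map phase), then sum the 1s emitted per word (reduce phase).
    public static Map<String, Integer> count(String... lines) {
        Map<String, Integer> counts = new HashMap<>();
        for (String line : lines) {
            StringTokenizer itr = new StringTokenizer(line);
            while (itr.hasMoreTokens()) {
                counts.merge(itr.nextToken(), 1, Integer::sum);
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        Map<String, Integer> counts = count("hello world", "hello hadoop");
        System.out.println(counts.get("hello")); // 2
        System.out.println(counts.get("world")); // 1
    }
}
```

The Hadoop version does the same thing, but shuffles the (word, 1) pairs across the cluster between the two phases, which is why the combiner (IntSumReducer reused as setCombinerClass) pays off: it pre-sums counts on the map side before the shuffle.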

Schematic diagram

This is the end of the article on "how Eclipse runs WordCount in hadoop-002". I hope the content above has been helpful; if you think the article is good, please share it so more people can see it.

