How to build a Hadoop environment for Eclipse

This article explains how to build a Hadoop development environment in Eclipse. The steps are simple and easy to follow; work through them in order and verify each one as you go.

I. Install Eclipse

Download Eclipse and extract it to an installation directory such as /usr/local, giving /usr/local/eclipse.

Version 4.3.1 download address: http://pan.baidu.com/s/1eQkpRgu
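A minimal sketch of the extraction step, assuming the downloaded archive is named eclipse-4.3.1.tar.gz (the file name is an assumption; use whatever you actually downloaded):

# Extract the Eclipse archive into /usr/local (archive name assumed)
sudo tar -zxf eclipse-4.3.1.tar.gz -C /usr/local
# Eclipse now lives in /usr/local/eclipse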

II. Install the Hadoop plug-in in Eclipse

1. Download the hadoop plug-in

Download address: http://pan.baidu.com/s/1mgiHFok

The zip file contains the plug-in source code, but we can use the pre-compiled jar: after decompression, hadoop-eclipse-kepler-plugin-2.2.0.jar in the release folder is the compiled plug-in.

2. Put the plug-in in the eclipse/plugins directory
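For example, a sketch assuming Eclipse was installed to /usr/local/eclipse and the zip was extracted in the current directory (both paths are assumptions):

# Copy the plug-in jar into Eclipse's plugins directory
sudo cp release/hadoop-eclipse-kepler-plugin-2.2.0.jar /usr/local/eclipse/plugins/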

3. Restart Eclipse and configure the Hadoop installation directory

If the plug-in is installed successfully, opening Window -> Preferences will show a Hadoop Map/Reduce option on the left side of the window. Click it and set the Hadoop installation path on the right side of the window.

4. Configure Map/Reduce Locations

Open Window -> Open Perspective -> Other.

Select Map/Reduce and click OK

The Map/Reduce Locations view appears at the lower right.

Click the Map/Reduce Location tab and click the elephant icon on the right to open the Hadoop Location configuration window:

Enter a Location Name (any name is fine). Configure the Host and Port of Map/Reduce Master and DFS Master to match the settings in core-site.xml.
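For reference, a minimal core-site.xml for a single-node setup that matches the hdfs://localhost:9000 paths used later in this article might look like the following sketch; substitute your own cluster's host and port:

<configuration>
  <!-- NameNode address; the DFS Master Host/Port in Eclipse should match this value -->
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>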

Click the "Finish" button to close the window.

Expand DFS Locations -> myhadoop on the left (the location name configured in the previous step). If you can see the user directory, the installation succeeded.

If it fails, check that Hadoop has been started and that Eclipse is configured correctly.

III. Create a new WordCount project

Choose File -> New -> Project, select Map/Reduce Project, and enter WordCount as the project name.

Create a new class in the WordCount project, named WordCount, with the following code:

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class WordCount {

    // Mapper: emit (word, 1) for every token in each input line
    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {

        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, one);
            }
        }
    }

    // Reducer (also used as combiner): sum the counts for each word
    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {

        private IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
        if (otherArgs.length != 2) {
            System.err.println("Usage: wordcount <in> <out>");
            System.exit(2);
        }
        Job job = new Job(conf, "wordcount");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
        FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

IV. Run

1. Create a directory input on HDFS

hadoop fs -mkdir input

2. Copy the local README.txt into the input directory on HDFS

hadoop fs -copyFromLocal /usr/local/hadoop/README.txt input

3. Right-click WordCount.java and choose Run As -> Run Configurations to configure the program arguments, that is, the input and output folders:

hdfs://localhost:9000/user/hadoop/input hdfs://localhost:9000/user/hadoop/output
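These two paths become otherArgs[0] and otherArgs[1] in main(). Equivalently, if you later export the project as a jar, a sketch of running it from the command line (the jar name wordcount.jar is an assumption) would be:

# Run the same job outside Eclipse; jar name is assumed
hadoop jar wordcount.jar WordCount hdfs://localhost:9000/user/hadoop/input hdfs://localhost:9000/user/hadoop/output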

Click the Run button to run the program.

4. After the job completes, view the results.

Method 1:

hadoop fs -ls output

You can see two output files: _SUCCESS and part-r-00000.

Execute hadoop fs -cat output/* to print the results.

Method 2:

Expand DFS Locations and double-click part-r-00000 to open it and view the results.

Thank you for reading. That covers how to build a Hadoop environment for Eclipse. After working through this article you should have a deeper understanding of the setup, though the specific steps still need to be verified in practice.
