

How does Eclipse on Windows remotely connect to a Hadoop cluster and submit tasks to run

2025-01-16 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)05/31 Report--

This article explains how Eclipse on Windows connects to a remote Hadoop cluster and submits tasks to run. The method introduced here is simple, fast and practical; interested readers may wish to follow along.

1 Download the plug-in

hadoop-eclipse-plugin-2.5.1.jar

Download the version of the plug-in that matches your Hadoop release.

2 Configure the plug-in

Put the plug-in in the ..\eclipse\plugins directory, restart Eclipse, and configure the Hadoop installation directory. If the plug-in installed successfully, opening Window > Preferences shows a Hadoop Map/Reduce option on the left side of the window; click it and set the Hadoop installation path on the right. (On Windows, you only need to extract hadoop-2.5.1.tar.gz to a directory of your choice.)

3 Configure Map/Reduce Locations

Open Map/Reduce Locations via Window > Show View, click OK, and the view appears in the console area:

Create a new Hadoop Location in Map/Reduce Locations: right-click in the view > New Hadoop Location. In the pop-up dialog, configure a Location name (for example, Hadoop) as well as the Map/Reduce Master and DFS Master. The Host and Port fields here are the addresses and ports you configured in mapred-site.xml and core-site.xml respectively.

Click the "Finish" button to close the window.
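For reference, the configuration entries those Host/Port fields usually map to look like the following. The host name master and the port numbers are assumptions for illustration — use the values from your own cluster's core-site.xml and mapred-site.xml:

```xml
<!-- core-site.xml: the DFS Master host/port come from fs.defaultFS -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://master:9000</value>
</property>

<!-- mapred-site.xml: the Map/Reduce Master host/port (the property name
     can vary with the Hadoop/plug-in version; this one exists in Hadoop 2.x) -->
<property>
  <name>mapreduce.jobtracker.address</name>
  <value>master:9001</value>
</property>
```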

4 Check whether the connection succeeded

Click DFSLocations > master on the left (the Location name configured in the previous step). If you can see the user directory, the connection succeeded.

5 Create a new MapReduce project and run it

1. Right-click > New > Map/Reduce Project

2. Create a new WordCount.java (find the MapReduce examples in the share directory of the Hadoop distribution and copy one over)

3. Create a test directory in HDFS from the command line (the output directory must not be created in advance; running the MR job creates it automatically), and upload a word.txt file (write a few words at random):

hadoop fs -mkdir /test/

hadoop fs -copyFromLocal word.txt /test/word.txt

4. Right-click WordCount.java > Run As > Run Configurations and set the input and output directory paths, as shown in the figure:

5. Right-click WordCount.java > Run As > Run on Hadoop
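The stock WordCount copied in step 2 uses a Mapper that emits (word, 1) pairs and a Reducer that sums the counts per word. Stripped of the Hadoop API, the core logic is just tokenize-and-sum; a plain-JDK sketch for illustration only (the class and method names here are mine, not Hadoop's):

```java
import java.util.*;

public class WordCountLogic {
    // "Map" phase: split each line into words, treating each word as a
    // (word, 1) pair; "reduce" phase: sum the counts per word. This mirrors
    // what the Hadoop WordCount example's Mapper and Reducer classes do.
    public static Map<String, Integer> count(List<String> lines) {
        Map<String, Integer> counts = new HashMap<>();
        for (String line : lines) {
            for (String word : line.trim().split("\\s+")) {
                if (!word.isEmpty()) {
                    counts.merge(word, 1, Integer::sum);
                }
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        System.out.println(count(Arrays.asList("hello world", "hello hadoop")));
    }
}
```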

A number of problems may come up at this point:

Problem 1: null pointer error

1. Download the Windows build of winutils

On GitHub, someone provides Windows builds of winutils; the project address is https://github.com/srccodes/hadoop-common-2.2.0-bin. Download the project's zip package; the file name is hadoop-common-2.2.0-bin-master.zip. Unpack it to any directory and don't worry about the version mismatch. Then copy winutils.exe into your hadoop-x.x.x\bin, and put hadoop.dll under C:\Windows\System32; if that doesn't solve the problem, put a copy in the bin directory as well.

2. Configure environment variables

Add a user variable HADOOP_HOME pointing to the directory where the downloaded zip package was unpacked, then append %HADOOP_HOME%\bin to the system Path variable.

Restart the computer and run the program again; it should execute normally. If you don't want to restart, you can instead add:

System.setProperty("hadoop.home.dir", "F:\\hadoop\\hadoop-2.5.1");

Note: F:\\hadoop\\hadoop-2.5.1 is the path where Hadoop is unpacked on my machine.
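As a sketch of where that line belongs: the property has to be set before any Hadoop class tries to locate winutils.exe, so it typically goes at the very top of the driver's main. The class name below is hypothetical and the path is the one from this article — adjust both for your own project:

```java
public class WinutilsFix {
    public static void main(String[] args) {
        // Must run before any Hadoop filesystem code is touched, so that
        // Hadoop can find %HADOOP_HOME%\bin\winutils.exe via this property.
        System.setProperty("hadoop.home.dir", "F:\\hadoop\\hadoop-2.5.1");

        // ... normal MapReduce job setup would follow here ...
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```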

Problem 2: winutils.exe error

If the first problem is fully resolved, this error will not occur. It means either that winutils.exe is not under bin in the Hadoop directory, or that the environment variable change has not taken effect.

Problem 3: permission errors

The cause of the problem is that the local Windows user (for example, administrator) wants to operate on the remote Hadoop system and does not have permission.

1. In the HDFS configuration file, set dfs.permissions to false and restart the Hadoop cluster:

<property>
  <name>dfs.permissions</name>
  <value>false</value>
  <description>
    If "true", enable permission checking in HDFS. If "false", permission
    checking is turned off, but all other behavior is unchanged. Switching
    from one parameter value to the other does not change the mode, owner
    or group of files or directories.
  </description>
</property>

2. Run: hadoop fs -chmod -R 777 /user/hadoop

3. Change the Windows computer name to match the Hadoop user name

I did not try the first and third methods, so I cannot say whether they work; the second method does. It modifies the permissions of the corresponding folder in HDFS — the /user/hadoop part is the file path inside HDFS — so that after the change our administrator has permission to write files in that HDFS directory (in fact, all users get write permission).

Problem 4: no log output

After copying hadoop.dll, running WordCount may finish after a moment without any information being output.

Solution: add a log4j configuration file under the project's src directory and watch the log output; the logs may reveal the underlying problem.
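A minimal log4j.properties that makes the client-side logs visible might look like this (placed directly under src so it ends up on the classpath; the pattern is an example, not a requirement):

```properties
# Route everything at INFO and above to the console
log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{2}: %m%n
```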

At this point, I believe you have a deeper understanding of how Eclipse on Windows can remotely connect to a Hadoop cluster and submit tasks to run. You might as well try it in practice.
