This article is about how to remotely debug Hadoop on Linux from Eclipse under Windows 7. The editor thinks it is very practical, so it is shared here for your reference; follow along and have a look.
First of all, let's make a few points:
Remote debugging has very strict compatibility requirements on the local Hadoop version, the remote Hadoop version, and the Eclipse version. The versions I use are as follows:
(1) Local hadoop: 0.20.203 (if the client and server hadoop versions do not match, you may get protocol errors)
(2) Remote hadoop: 0.20.203 (because the IPC protocol has changed between versions)
Download address: http://archive.apache.org/dist/hadoop/core/hadoop-0.20.203.0/
(3) Eclipse version: Indigo (this is also very important; if it does not match, Eclipse may not recognize the hadoop plug-in at all)
File MD5 code and download address:
9017a39354fa65375c6ee748963cf2ff eclipse-jee-indigo-SR2-win32.zip
http://www.eclipse.org/downloads/packages/eclipse-ide-java-ee-developers/indigosr2
(4) Client operating system: Windows 7; remote operating system: Linux (Ubuntu/CentOS, etc.; any Linux will do)
If any of the above combinations does not match, you may hit exceptions that are beyond the scope of this article, so be sure to check your versions before starting the steps below.
Step 1: start the Hadoop daemons first
For more information, see http://www.cnblogs.com/flyoung2008/archive/2011/11/29/2268302.html
Step 2: install the hadoop plug-in on eclipse
1. Copy hadoop installation directory/contrib/eclipse-plugin/hadoop-0.20.203.0-eclipse-plugin.jar to eclipse installation directory/plugins/.
2. Restart Eclipse and configure the Hadoop installation directory.
If the plug-in is installed successfully, open Window --> Preferences and you will find a Hadoop Map/Reduce option, in which you need to configure the Hadoop installation directory. Exit when the configuration is complete.
3. Configure Map/Reduce Locations.
Open Map/Reduce Locations via Window --> Show View.
Create a new Hadoop Location in Map/Reduce Locations: in that view, right-click --> New Hadoop Location. In the pop-up dialog you need to configure a Location name (such as Hadoop), as well as the Map/Reduce Master and the DFS Master. The Host and Port here are the addresses and ports you configured in mapred-site.xml and core-site.xml respectively. For example:
Map/Reduce Master
192.168.1.101 9001
DFS Master
192.168.1.101 9000
Exit after configuration. Click DFS Locations --> Hadoop; if the folders (2 of them here) are shown, the configuration is correct, while if "connection refused" is displayed, please check your configuration.
If there is a problem with this step, please see the last "Note (1)".
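For reference, the two Master addresses above correspond to the fs.default.name and mapred.job.tracker properties in Hadoop 0.20. A minimal sketch of my own (using the sample IP above; adjust to your cluster) that points client code at the same cluster:

import org.apache.hadoop.conf.Configuration;

public class ClusterConf {
    public static Configuration remote() {
        Configuration conf = new Configuration();
        // DFS Master: the namenode address from core-site.xml (fs.default.name)
        conf.set("fs.default.name", "hdfs://192.168.1.101:9000");
        // Map/Reduce Master: the jobtracker address from mapred-site.xml (mapred.job.tracker)
        conf.set("mapred.job.tracker", "192.168.1.101:9001");
        return conf;
    }
}

This is roughly what the plug-in does on your behalf when it connects the DFS Locations view.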
Step 3: create a new project.
File --> New --> Other --> Map/Reduce Project
You can name the project anything you like, such as WordCount.
Copy hadoop installation directory/src/examples/org/apache/hadoop/examples/WordCount.java into the project you just created, or refer to: http://my.oschina.net/leejun2005/blog/83058
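If you would rather type it in than copy it, the shipped example is essentially the classic word count below (a sketch against the 0.20 mapreduce API; it should match the bundled WordCount.java in substance, but prefer the jar's own copy when in doubt):

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Emits (word, 1) for every token in the input line.
    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, one);
            }
        }
    }

    // Sums the counts for each word; also used as the combiner.
    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        private IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // args[0] = input path, args[1] = output path (configured in step 5 below)
        Job job = new Job(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}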
Step 4: upload the sample input data.
In order to run the program, we need an input folder and an output folder.
Create a new word.txt locally with contents such as:
java c++ python c
java c++ javascript
helloworld hadoop
mapreduce java hadoop hbase
Create the /tmp/wordcount directory on HDFS with the hadoop command line, as follows: bin/hadoop fs -mkdir /tmp/wordcount
Copy the local word.txt to HDFS with the copyFromLocal command: bin/hadoop fs -copyFromLocal /home/grid/word.txt /tmp/wordcount/word.txt
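As an aside (my own sketch, not part of the original steps), the same two commands can be done from Java with the FileSystem API, which is convenient once everything else is driven from Eclipse; the IP and paths below are the ones used above:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class Upload {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.default.name", "hdfs://192.168.1.101:9000"); // your DFS Master
        FileSystem fs = FileSystem.get(conf);
        fs.mkdirs(new Path("/tmp/wordcount"));                    // bin/hadoop fs -mkdir
        fs.copyFromLocalFile(new Path("/home/grid/word.txt"),     // bin/hadoop fs -copyFromLocal
                             new Path("/tmp/wordcount/word.txt"));
        fs.close();
    }
}

Run it as a plain Java Application from Eclipse; afterwards the file should be visible under DFS Locations.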
Step 5: run the project
1. In the newly created project, click WordCount.java, then right-click --> Run As --> Run Configurations
2. In the Run Configurations dialog that pops up, click Java Application, right-click --> New; this creates a new application configuration named WordCount.
3. Configure the run parameters: click Arguments, and in Program arguments enter the input folder you want to feed the program and the folder where it should save the results, for example:
hdfs://centos1:9000/tmp/wordcount/word.txt hdfs://centos1:9000/tmp/wordcount/out
4. If the run reports java.lang.OutOfMemoryError: Java heap space, configure VM arguments (below Program arguments):
-Xms512m -Xmx1024m -XX:MaxPermSize=256m
For understanding of jvm memory allocation, please refer to: http://my.oschina.net/leejun2005/blog/122963
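As a small sanity check of my own (not from the original article), you can print the heap limit the JVM actually picked up, either at the start of main or in a scratch class:

public class HeapCheck {
    public static void main(String[] args) {
        // With -Xmx1024m this should report roughly 1024 MB
        // (the exact value varies slightly by JVM).
        long maxHeap = Runtime.getRuntime().maxMemory();
        System.out.println("max heap = " + (maxHeap >> 20) + " MB");
    }
}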
Note:
(1) In step 2, Eclipse may report an error during "Map/Reduce location status updater": org/codehaus/jackson/map/JsonMappingException
Upon investigation, this is caused by some jars missing from hadoop's eclipse plug-in.
After patching the plug-in jar following the instructions in this article, it ran successfully:
http://hi.baidu.com/wangyucao1989/blog/item/279cef87c4b37c34c75cc315.html
I have packaged a fixed version of this eclipse plug-in, which you can download from the following address:
http://vdisk.weibo.com/s/xEJGZ
(2) In step 5, an error may be reported:
12-04-24 15:32:44 ERROR security.UserGroupInformation: PriviledgedActionException as:Administrator cause:java.io.IOException: Failed to set permissions of path: \tmp\hadoop-Administrator\mapred\staging\Administrator-519341271\.staging to 0700
Exception in thread "main" java.io.IOException: Failed to set permissions of path: \tmp\hadoop-Administrator\mapred\staging\Administrator-519341271\.staging to 0700
at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:682)
This is caused by a file-permission check that fails under Windows; under Linux the same code runs normally and the problem does not occur.
Reference: http://blog.csdn.net/keda8997110/article/details/8533633
The solution is to modify F:\programming development\hadoop\older\hadoop-0.20.203.0rc1\hadoop-0.20.203.0\src\core\org\apache\hadoop\fs\RawLocalFileSystem.java and comment out the body of checkReturnValue (somewhat crude: under Windows the check is simply skipped):
private void checkReturnValue(boolean rv, Path p, FsPermission permission) throws IOException {
    /*
    if (!rv) {
        throw new IOException("Failed to set permissions of path: " + p +
                              " to " + String.format("%04o", permission.toShort()));
    }
    */
}
Then recompile and repackage hadoop-core-0.20.203.0.jar, and run again; this time it succeeds. The package above also contains my tutorial document on recompiling hadoop this way. For convenience, I have uploaded the compiled and packaged jar, which you can swap in directly.
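A quick check of my own (not from the original article) to confirm the patched jar is the one actually loaded: print which jar RawLocalFileSystem came from.

public class WhichJar {
    public static void main(String[] args) {
        // Prints the location of the jar that RawLocalFileSystem was loaded from;
        // it should point at the recompiled hadoop-core-0.20.203.0.jar.
        System.out.println(org.apache.hadoop.fs.RawLocalFileSystem.class
                .getProtectionDomain().getCodeSource().getLocation());
    }
}

If it prints the path of the old jar, Eclipse is still picking up the unpatched copy on the build path.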
OK, by now you should be able to remotely debug your hadoop code under Windows 7. If it does not work, please check each step and configuration item carefully. Good luck~
Connect to hadoop on linux from eclipse under windows, and run MR jobs:
http://superlxw1234.iteye.com/blog/1583164
Note: for myeclipse there are two points to pay attention to:
cat eclipse.desktop   # chmod 777, note the permissions
[Desktop Entry]
Encoding=UTF-8
Name=Eclipse
Comment=Eclipse IDE
Exec=/home/june/soft/eclipse/eclipse
Icon=/home/june/soft/eclipse/icon.xpm
Terminal=false
StartupNotify=true
Type=Application
Categories=Application;Development
(1) The Icon=/home/june/download/myeclipse-8.4.200-linux-gtk-x86/icon.xpm path must not contain spaces, double quotation marks, or similar symbols; spaces can be escaped instead.
(2) hadoop's jar packages can be dropped directly into /home/june/Genuitec/MyEclipse 8.5M2/dropins; there is no need to hunt for any plugins directory.
Thank you for reading! This is the end of this article on how to remotely debug linux hadoop from eclipse under windows7. I hope the above content has been of some help; if you think the article is good, share it for more people to see!