What are the common mistakes when deploying Nutch to Eclipse? This article analyzes and answers that question in detail, hoping to help readers facing the same problems find a simpler and easier way through them.
Common errors when deploying Nutch to Eclipse
Failed to set permissions of path: \tmp\hadoop-hadoop\mapred\staging\hadoop1847455384\.staging to 0700
I seem to have encountered this problem before when deploying Hadoop in Eclipse, but I no longer remember how I solved it. Two methods work:
Method 1:
1. In build.xml, comment out the compile-core-native dependency.
2. Get rid of the create-native-configure dependencies in build.xml as well.
3. Modify line 691 of hadoop-1.1.2/src/core/org/apache/hadoop/fs/FileUtil.java, changing throw new IOException to LOG.warn.
4. Rebuild the project with ant. After a successful build, take the new hadoop-core jar file out of the build folder, put it under the lib folder of the Hadoop project in Eclipse, replacing the original hadoop-core.jar, and then add the jar via Build Path.
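For reference, here is a minimal sketch of the Method 1 edit. The method body is reconstructed from the hadoop-1.x source as I remember it, so treat the exact signature and message text as approximate; the essential change is that the throw around line 691 becomes a warning:

    // hadoop-1.1.2: src/core/org/apache/hadoop/fs/FileUtil.java (sketch)
    private static void checkReturnValue(boolean rv, File p,
                                         FsPermission permission
                                         ) throws IOException {
      if (!rv) {
        // Was: throw new IOException("Failed to set permissions of path: ...");
        // Method 1 downgrades the failure to a log warning so the job continues:
        LOG.warn("Failed to set permissions of path: " + p +
                 " to " + String.format("%04o", permission.toShort()));
      }
    }

The check exists because Hadoop 1.x tries to chmod its local staging directories, which fails on Windows filesystems; downgrading it to a warning is harmless for local development runs.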
Method 2:
Do not modify build.xml at all; instead, comment out the line checkReturnValue(rv, f, permission); in FileUtil.java directly, then compile with the ant command to generate hadoop-core.jar and replace the project's hadoop-core.jar with it in the same way.
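Method 2 touches the caller rather than the method itself. A sketch, with surrounding code omitted (the call sits in the permission-setting path of the same FileUtil.java):

    // FileUtil.java, Method 2 (sketch): skip the permission check entirely
    // by commenting out the call site.
    // checkReturnValue(rv, f, permission);

Either way the effect is the same: the failed chmod on Windows no longer aborts the job.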
After resolving the Failed to set permissions of path exception, running the crawl raised another exception. The arguments used were:
urls -dir crawl -depth 2 -topN 2
The exception is:
Input path does not exist: file:/E:/qjay/wrokespace2013/trunk/-dir
This error is caused by mis-specifying the Program arguments. A handful of small mistakes like this kept me tossing around for a whole day. The lesson I took away: don't think about a problem too narrowly, and dare to make bold guesses. It never occurred to me that the Program arguments were the problem, but once suspicion fell in the right place, the cause was found quickly.
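For comparison, a correctly formed run configuration would look like the following. This assumes the class being launched is org.apache.nutch.crawl.Crawl, the usual entry point for these options in Nutch 1.x; note the space separating every flag from its neighbors:

    Main class: org.apache.nutch.crawl.Crawl
    Program arguments: urls -dir crawl -depth 2 -topN 2

When a flag gets treated as a positional argument instead, Hadoop resolves it as a path relative to the project root, which is exactly how file:/E:/qjay/wrokespace2013/trunk/-dir shows up in the error above.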
Viewing the Hadoop source code in Eclipse
Importing the Hadoop source into Eclipse did not succeed at first. Taking hadoop-1.2.1.tar.gz as an example: first extract the Hadoop project from the tarball, then compress it into a zip file, and then the import succeeds.
Total number of urls rejected by filters: 0
The problem I encountered here was not caused by the program itself but by configuration parameters being passed incorrectly.
When running the Injector class in Eclipse, the parameters to pass are:
Main class: org.apache.nutch.crawl.Injector
VM arguments: -Dhadoop.log.dir=logs -Dhadoop.log.file=hadoop.log
Program arguments: crawldb urls
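If you would rather not depend on an Eclipse run configuration, the same invocation can be written as a few lines of Java. This launcher is hypothetical (not part of Nutch), but in Nutch 1.x the Injector is a Hadoop Tool, so ToolRunner can drive it:

    // Hypothetical launcher equivalent to the Eclipse run configuration above.
    import org.apache.hadoop.util.ToolRunner;
    import org.apache.nutch.crawl.Injector;
    import org.apache.nutch.util.NutchConfiguration;

    public class RunInjector {
      public static void main(String[] args) throws Exception {
        // Mirrors the VM arguments: where Hadoop writes its log file.
        System.setProperty("hadoop.log.dir", "logs");
        System.setProperty("hadoop.log.file", "hadoop.log");
        // Mirrors the Program arguments: <crawldb> <url_dir>.
        int res = ToolRunner.run(NutchConfiguration.create(),
                                 new Injector(),
                                 new String[] { "crawldb", "urls" });
        System.exit(res);
      }
    }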
Keep running, and this time a new error appears. Reporting an error is actually a good thing; what you should fear is a failure that reports nothing. With an error message in hand, the next exception can be dealt with:
Nutch java.lang.Exception: java.lang.RuntimeException: Error in configuring object
The following error occurred during Nutch deployment, mainly due to improperly configured plugins. It can be solved by setting the plugin.folders property in nutch-site.xml.
Modify the nutch-default.xml file under conf, changing the value of plugin.folders from plugins to ./src/plugin, because the plugin folder is located in the src directory when running from the source tree.
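A minimal sketch of the override as it would appear in conf/nutch-site.xml (plugin.folders is the stock property from nutch-default.xml; the relative path assumes the working directory is the project root, as it is when running from Eclipse; the description text is mine):

    <property>
      <name>plugin.folders</name>
      <value>./src/plugin</value>
      <description>Directories where Nutch plugins are located.</description>
    </property>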
That is the answer to the question of common mistakes when deploying Nutch to Eclipse. I hope the content above is of some help to you.