What are the common problems in running Hadoop 2.x MapReduce programs with Eclipse

This article looks at the common problems encountered when running Hadoop 2.x MapReduce programs with Eclipse, analyzing and explaining each one from a practical point of view. I hope you find something useful in it.

1. When we write a MapReduce program and click Run on Hadoop, the Eclipse console prints a log4j warning, typically something like "log4j:WARN No appenders could be found for logger" followed by "log4j:WARN Please initialize the log4j system properly."

This message tells us that the log4j.properties file was not found. Without this file, no log output is printed when the program goes wrong, which makes debugging difficult.

Solution: copy the log4j.properties file from the $HADOOP_HOME/etc/hadoop/ directory into the src folder of the MapReduce project.
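
For example, assuming the Eclipse project lives at ~/workspace/MyMapReduce (a hypothetical path; substitute your own project directory), the copy can be done from the command line:

cp $HADOOP_HOME/etc/hadoop/log4j.properties ~/workspace/MyMapReduce/src/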

2. When executing a MapReduce program, Eclipse may report an insufficient heap error (java.lang.OutOfMemoryError: Java heap space). At this point the job's output directory has already been created but is still empty, so we need to delete this output directory before rerunning the program.

Analysis: first we can run the following command to see the maximum heap size currently supported by the JDK, and then increase the heap size from there:

java -client -XX:+UnlockDiagnosticVMOptions -XX:+PrintFlagsFinal -version | grep -i HeapSize
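
The command prints lines like the ones below; the values are only illustrative and depend on your machine, and MaxHeapSize is the figure to look at:

uintx InitialHeapSize := 134217728 {product}
uintx MaxHeapSize := 2147483648 {product}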

Solution: set the VM arguments field in the run configuration of the current program (Run > Run Configurations... > Arguments tab) to raise the heap limit:
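
For example, the following illustrative settings give the JVM a 512 MB initial heap and a 1024 MB maximum heap; adjust the values to what your JDK and machine support:

-Xms512m -Xmx1024m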

Hadoop is developed in Java, but some requirements and operations are not well suited to Java, so the concept of native libraries (Native Libraries) was introduced; through them, Hadoop can perform certain operations more efficiently.

Currently, native libraries in Hadoop are used for file compression:

zlib

gzip

When either of these compression formats is used, Hadoop loads the native libraries from the $HADOOP_HOME/lib/native/Linux-* directory by default.
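
To see which native libraries are actually present (assuming $HADOOP_HOME is set in your shell), you can simply list that directory:

ls $HADOOP_HOME/lib/native/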

If the load is successful, the output is:

DEBUG util.NativeCodeLoader - Trying to load the custom-built native-hadoop library...

INFO util.NativeCodeLoader - Loaded the native-hadoop library

If the load fails, the output is:

INFO util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
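
In addition, recent Hadoop 2.x releases ship a checknative command (assuming the hadoop executable is on your PATH) that reports which native components, such as zlib, were detected:

hadoop checknative -a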

You can control whether native libraries are used in Hadoop's configuration file core-site.xml:

<property>
  <name>hadoop.native.lib</name>
  <value>true</value>
  <description>Should native hadoop libraries, if present, be used.</description>
</property>

Hadoop's default configuration enables native libraries.

In addition, you can specify the location of the native libraries through an environment variable:

export JAVA_LIBRARY_PATH=/path/to/hadoop-native-libs

Sometimes the native libraries that ship with Hadoop cannot be used; in that case you need to compile the native libraries yourself. From the $HADOOP_HOME directory, run the following command:

ant compile-native

After the compilation completes, you will find the corresponding files in the $HADOOP_HOME/build/native directory; then either point Hadoop at that path or move the compiled files to the default directory.
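
For example, one way to point Hadoop at the freshly built libraries is the environment variable mentioned above; the Linux-amd64-64 platform subdirectory shown here is only an assumed example, and the actual name depends on your platform:

export JAVA_LIBRARY_PATH=$HADOOP_HOME/build/native/Linux-amd64-64/lib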

These are the common problems with running Hadoop 2.x MapReduce programs using Eclipse. If you have run into similar issues, the analysis above should help you work through them. If you want to learn more, you are welcome to follow the industry information channel.
