2025-01-16 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)06/01 Report--
This article mainly covers the problems you may encounter when installing and configuring Hadoop. The content is simple and clear, and I hope it helps resolve your doubts. Let me walk you through the problems encountered in the installation and configuration of Hadoop.
Use environment:
Ubuntu 14.04, 64-bit
Java 7
Hadoop 2.4.1
A few points of attention
1. The download link on Hadoop's official website was unavailable, so I downloaded it from Open Source China instead.
2. Be sure to perform all of the following operations under the same Linux account. I ran into problems because I was in the habit of creating configuration files and running commands with sudo:
Formatting HDFS
Setting up passwordless SSH login
Creating and editing configuration files
Starting Hadoop, etc.
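The same-account rule matters most for passwordless SSH, since Hadoop's start scripts log into each node as the current user. A minimal sketch follows; the throwaway key directory is my addition so the sketch is safe to run anywhere, and on a real node the files would go in ~/.ssh instead:

```shell
# Generate a passwordless SSH key pair as the SAME user that will later
# format HDFS and run start-dfs.sh (never mix in sudo for these steps).
# Keys go into a throwaway directory here; use ~/.ssh on a real node.
KEYDIR=$(mktemp -d)
ssh-keygen -t rsa -N '' -f "$KEYDIR/id_rsa" -q
cat "$KEYDIR/id_rsa.pub" >> "$KEYDIR/authorized_keys"
chmod 600 "$KEYDIR/authorized_keys"
ls "$KEYDIR"
```

After copying these files into ~/.ssh, `ssh localhost` should succeed without a password prompt.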
3. Be careful about problems caused by differences between Hadoop versions and choose appropriate tutorials. For example, unlike 1.x, Hadoop 2.x no longer has JobTracker and TaskTracker.
A few questions:
Question 1. Localhost: Error: JAVA_HOME is not set and could not be found.
Change JAVA_HOME in etc/hadoop/hadoop-env.sh under the Hadoop directory to an absolute path:
# The java implementation to use.
#export JAVA_HOME=${JAVA_HOME}
export JAVA_HOME=/usr/lib/jvm/java7
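A quick sanity check can confirm the path you hard-coded is valid. This is a sketch; /usr/lib/jvm/java7 is the path used in this article and may differ on your machine:

```shell
# Verify that the directory written into hadoop-env.sh really contains
# a Java binary. The path below is the article's example, not a given.
JH=/usr/lib/jvm/java7
if [ -x "$JH/bin/java" ]; then
  MSG="OK: $JH looks like a valid JAVA_HOME"
else
  MSG="BAD: $JH/bin/java not found; fix the path in hadoop-env.sh"
fi
echo "$MSG"
```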
Problem 2. The native library cannot be loaded. This reportedly appears on 64-bit machines. The error is as follows:
This script is Deprecated. Instead use stop-dfs.sh and stop-yarn.sh
14/08/10 07:07:57 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Stopping namenodes on [cluster1]
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
sed: -e expression #1, char 6: unknown option to `s'
-c: Unknown cipher type 'cd'
^C
cluster1: stopping namenode
cluster1: stopping datanode
VM: ssh: Could not resolve hostname VM: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
The solution is to set the following variables in hadoop-env.sh; I added them directly at the end of the file:
export HADOOP_HOME=/data/server/hadoop
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
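If the warning persists, one hedged diagnostic is to check which architecture the bundled native library was built for, since a 32-bit libhadoop.so under a 64-bit JVM triggers exactly this stack-guard message. The path below is the one from the error output above; adjust it to your install:

```shell
# Inspect the native library's architecture with file(1). On a mismatch
# you would see e.g. "ELF 32-bit LSB shared object" under a 64-bit JVM.
LIB=/home/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0
if [ -f "$LIB" ]; then
  file "$LIB"
else
  echo "native library not found at $LIB (adjust the path)"
fi
```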
Question 3. After starting with sbin/start-all.sh, jps shows only a NodeManager.
nob@hadoop0:/data/server/hadoop/sbin$ jps
9922 NodeManager
10236 Jps
Reason: I formatted HDFS with the superuser's sudo command, so the user permissions were inconsistent when I started the daemons as a normal user.
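One common remedy (my suggestion, not from the original article) is to re-own the storage directory that root created and then reformat without sudo. /tmp/hadoop-<user> is Hadoop's default storage location when no dfs data directories are configured; adjust the path if yours differ:

```shell
# Re-own the HDFS storage directory, then reformat and start WITHOUT
# sudo. Guarded so the sketch is harmless where hdfs is not installed.
ME=$(id -un)
DATADIR=/tmp/hadoop-$ME
if command -v hdfs >/dev/null 2>&1; then
  sudo chown -R "$ME":"$ME" "$DATADIR"
  hdfs namenode -format
  start-dfs.sh
  start-yarn.sh
else
  echo "hdfs not on PATH; the commands above are illustrative only"
fi
```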
Question 4. After starting with sbin/start-all.sh, jps shows no JobTracker or TaskTracker. Is something wrong?
nob@hadoop0:/data/server/hadoop/sbin$ jps
9444 DataNode
9922 NodeManager
9633 SecondaryNameNode
9790 ResourceManager
9291 NameNode
10236 Jps
The reason is that JobTracker and TaskTracker no longer exist in Hadoop 2.x; see http://blog.csdn.net/skywalker_only/article/details/37905463 for background. The command to start the NameNode and DataNode is start-dfs.sh, and the command to start YARN is start-yarn.sh.
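In 2.x, MapReduce's runtime role is taken over by YARN, which is why the ResourceManager and NodeManager appear in jps instead. The standard way to declare this is the mapreduce.framework.name property in mapred-site.xml; a minimal fragment using the stock property name:

```xml
<!-- mapred-site.xml: tell Hadoop 2.x to run MapReduce jobs on YARN,
     replacing the 1.x JobTracker/TaskTracker daemons. -->
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>
```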
Those are all the problems encountered in the installation and configuration of Hadoop. Thank you for reading! I hope the content shared here helps you; if you want to learn more, welcome to follow the industry information channel.