This article explains in detail how to build a Hadoop service on CentOS 7. The walkthrough is very practical, so I am sharing it for your reference; I hope you get something out of it after reading.
Download Hadoop from the official website: http://hadoop.apache.org/releases.html
Configure the JDK environment first.
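Before unpacking, it helps to confirm the JDK is on the PATH and to fetch the tarball. A minimal sketch; the mirror URL is an assumption, so adjust it to the release you actually chose:
java -version    # should print your installed JDK version, e.g. 1.8.x
# assumed download location for the 2.8.0 release used in this article:
wget https://archive.apache.org/dist/hadoop/common/hadoop-2.8.0/hadoop-2.8.0.tar.gz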
After downloading, decompress it to /usr/local:
tar -zxvf hadoop-2.8.0.tar.gz -C /usr/local
Rename hadoop-2.8.0 to hadoop for convenience:
mv /usr/local/hadoop-2.8.0 /usr/local/hadoop
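An optional sanity check: list the directory to confirm the expected layout. The directory names below are what a stock 2.8.0 tarball contains:
ls /usr/local/hadoop    # expect bin/ etc/ lib/ libexec/ sbin/ share/ among others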
View the host name:
hostname    # prints the current host name
Check whether password-free SSH login works:
ssh localhost    # substitute your host name for localhost if needed
Note: a password is usually required on a fresh installation.
ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
Verify again:
ssh localhost
Note: if you are no longer prompted for a password, password-free login is working.
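If the login still asks for a password, a common culprit is file permissions: sshd ignores key files that are too permissive. A fix that often helps (an assumption about your setup, not something covered above):
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys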
Configure environment variables
vim /etc/profile
Add at the end:
export HADOOP_HOME=/usr/local/hadoop
export PATH=$JAVA_HOME/bin:$PATH:$HOME/bin:$HADOOP_HOME/bin
Make the environment variables take effect immediately:
source /etc/profile
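To confirm the variables landed, a quick check; hadoop version is a standard subcommand that prints the release if the PATH entry works:
echo $HADOOP_HOME    # should print /usr/local/hadoop
hadoop version       # should print Hadoop 2.8.0 and its build info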
Create a temporary file store for Hadoop
mkdir /usr/local/hadoop/tmp
Edit Hadoop's configuration files:
cd /usr/local/hadoop/etc/hadoop
vim hadoop-env.sh
Add at the end
export JAVA_HOME=/usr/local/java/jdk1.8.0_131/
export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
Note: make sure the JAVA_HOME path matches your actual JDK installation.
Make the environment variable effective immediately
source hadoop-env.sh
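As before, a quick check that the variable is visible in the current shell (the expected value is simply the path set above):
echo $HADOOP_CONF_DIR    # should print /usr/local/hadoop/etc/hadoop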
Configure the next file, core-site.xml:
vim core-site.xml
Add the following inside the <configuration> element:
<property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
</property>
<property>
    <name>hadoop.tmp.dir</name>
    <value>file:/usr/local/hadoop/tmp</value>
</property>
Configure the next file, hdfs-site.xml (a replication factor of 1 is appropriate for this single-node setup):
vim hdfs-site.xml
Add the following inside the <configuration> element:
<property>
    <name>dfs.replication</name>
    <value>1</value>
</property>
<property>
    <name>dfs.namenode.name.dir</name>
    <value>file:/usr/local/hadoop/tmp/dfs/name</value>
</property>
<property>
    <name>dfs.datanode.data.dir</name>
    <value>file:/usr/local/hadoop/tmp/dfs/data</value>
</property>
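Optionally, verify the edited files are well-formed XML before moving on. xmllint ships with the libxml2 package on CentOS; this assumes it is installed:
xmllint --noout core-site.xml hdfs-site.xml    # silence means both parse cleanly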
Configure the next file, mapred-site.xml. This file does not exist by default, so create it from the bundled template mapred-site.xml.template:
cp mapred-site.xml.template mapred-site.xml
vim mapred-site.xml
Add the following inside the <configuration> element:
<property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
</property>
Configure the next file, yarn-site.xml:
vim yarn-site.xml
Add the following inside the <configuration> element:
<property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
</property>
Configure the last file, yarn-env.sh
vim yarn-env.sh
Note: around line 23, uncomment the export JAVA_HOME line (delete the leading #) and change it to the correct JDK path.
Format the namenode:
cd /usr/local/hadoop
bin/hdfs namenode -format
Note: on success you will see "successfully formatted" and "Exiting with status 0" in the output; "Exiting with status 1" indicates an error.
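One caution not covered above: if you ever re-format the namenode on this layout, clear the old data directories first, otherwise the DataNode may refuse to start because its stored clusterID no longer matches the namenode's. Note that this deletes all existing HDFS data:
# only when deliberately re-formatting; destroys existing HDFS contents
rm -rf /usr/local/hadoop/tmp/dfs
bin/hdfs namenode -format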
Start HDFS with the provided script:
sbin/start-dfs.sh
Open a browser and visit http://localhost:50070 to verify that HDFS is configured successfully.
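The web UI is one check; a command-line smoke test is another. A minimal sketch that creates a home directory in HDFS and lists the root (the /user/<name> convention is just a common choice, any path works):
bin/hdfs dfs -mkdir -p /user/$(whoami)
bin/hdfs dfs -ls /    # should show the directory just created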
Then start YARN:
sbin/start-yarn.sh
Open a browser and visit http://localhost:8088 to verify that YARN is configured successfully.
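Again, a command-line alternative to the web UI; yarn node -list is a standard subcommand that reports the registered NodeManagers:
bin/yarn node -list    # should list one RUNNING node for this single-node setup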
Note: because of browser caching, the page may fail to open at first; close the browser and retry once or twice. If it still fails, check the configuration files for mistakes.
View the started Hadoop processes:
jps
Note: if NameNode or DataNode is missing from the output, the configuration did not succeed.
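For reference, a healthy single-node setup typically shows the five daemons below (the PIDs here are made up; yours will differ):
jps
# 2321 NameNode
# 2453 DataNode
# 2634 SecondaryNameNode
# 2801 ResourceManager
# 2912 NodeManager
# 3050 Jps
As a final end-to-end test, you can run the bundled example job; the jar name matches the 2.8.0 release used in this article:
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.8.0.jar pi 2 5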
This concludes the article on how to build Hadoop services on CentOS 7. I hope the content above is helpful and that you learned something from it; if you think the article is good, please share it for more people to see.