2025-01-18 Update From: SLTechnology News & Howtos
Shulou (Shulou.com) 06/02 report:
Install a single-node Hadoop instance on Red Hat 6.4. The machine serves as the Hadoop master, i.e. the NameNode, which records how all data blocks are distributed.
1. Modify hostname
[root@hadoop01 ~]# vim /etc/hosts
[root@hadoop01 ~]# hostname
hadoop01
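The /etc/hosts edit above needs a line mapping the node's IP address to its hostname. A minimal sketch, assuming the IP is 172.19.15.151 (the address that appears in the SSH output later in this article):

```shell
# /etc/hosts -- map the node's IP to the hostname Hadoop will use
172.19.15.151   hadoop01
```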
2. Turn off the firewall
[root@hadoop01 ~]# service iptables stop
iptables: Setting chains to policy ACCEPT: filter    [  OK  ]
iptables: Flushing firewall rules:                   [  OK  ]
iptables: Unloading modules:                         [  OK  ]
[root@hadoop01 ~]# chkconfig iptables off
[root@hadoop01 ~]# vi /etc/selinux/config    // set SELINUX=disabled
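The SELinux edit above amounts to setting SELINUX=disabled. A sketch of that change, run here against a scratch copy of the file so it can be rehearsed safely (the stock two-line file layout is an assumption):

```shell
# Stand-in for /etc/selinux/config so the edit can be rehearsed without root.
cfg=$(mktemp)
printf 'SELINUX=enforcing\nSELINUXTYPE=targeted\n' > "$cfg"

# The actual change: force the SELINUX line to "disabled".
sed -i 's/^SELINUX=.*/SELINUX=disabled/' "$cfg"
grep '^SELINUX=' "$cfg"    # prints: SELINUX=disabled
```

On the real system the same sed line can be pointed at /etc/selinux/config. Note that `setenforce 0` only switches to permissive mode for the current boot; the disabled setting takes full effect after a reboot.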
3. Configure the Java environment
[root@hadoop01 ~]# vi /etc/profile
export JAVA_HOME=/usr/lib/jvm/jre-1.7.0-openjdk.x86_64/
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=$JAVA_HOME/lib:$JRE_HOME/lib:$CLASSPATH
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$PATH
[root@hadoop01 ~]# source /etc/profile    // make the environment take effect
[root@hadoop01 ~]# mv hadoop-2.7.6 /opt
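The mv above assumes the Hadoop 2.7.6 release has already been unpacked in the current directory; a typical sequence (the tarball name is an assumption based on the version used here):

```shell
tar -xzf hadoop-2.7.6.tar.gz    # unpack the release archive
mv hadoop-2.7.6 /opt            # install it under /opt, as this article does
```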
4. Create a new hadoop folder
mkdir /root/hadoop
mkdir /root/hadoop/tmp
mkdir /root/hadoop/var
mkdir /root/hadoop/dfs
mkdir /root/hadoop/dfs/name
mkdir /root/hadoop/dfs/data
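The six mkdir calls above can be collapsed into one command with -p, which creates parent directories as needed. Sketched here against a temporary stand-in for /root/hadoop so it can run anywhere:

```shell
# Stand-in base directory; on the real host this would be /root/hadoop.
base="$(mktemp -d)/hadoop"

# One call creates the whole tree: tmp, var, dfs/name, dfs/data.
mkdir -p "$base"/{tmp,var,dfs/name,dfs/data}
ls -R "$base"
```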
5. Configure hadoop
[root@hadoop01 hadoop]# pwd
/opt/hadoop-2.7.6/etc/hadoop
[root@hadoop01 hadoop]# vim hadoop-env.sh    // modify JAVA_HOME
export HADOOP_OPTS="-Djava.library.path=${HADOOP_HOME}/lib/native"
export JAVA_HOME=/usr/lib/jvm/jre-1.7.0-openjdk.x86_64/
[root@hadoop01 hadoop]# vim yarn-env.sh    // modify JAVA_HOME
export JAVA_HOME=/usr/lib/jvm/jre-1.7.0-openjdk.x86_64/
[root@hadoop01 hadoop]# vim core-site.xml
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/root/hadoop/tmp</value>
    <description>A base for other temporary directories.</description>
  </property>
  <property>
    <name>io.file.buffer.size</name>
    <value>131072</value>
  </property>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://hadoop01:9000</value>
  </property>
</configuration>
[root@hadoop01 hadoop]# vim hdfs-site.xml
<configuration>
  <property>
    <name>dfs.namenode.secondary.http-address</name>
    <value>hadoop01:9001</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>/root/hadoop/dfs/name</value>
    <description>Path on the local filesystem where the NameNode stores the namespace and transaction logs persistently.</description>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>/root/hadoop/dfs/data</value>
    <description>Comma-separated list of paths on the local filesystem of a DataNode where it should store its blocks.</description>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
  <property>
    <name>dfs.webhdfs.enabled</name>
    <value>true</value>
  </property>
</configuration>
[root@hadoop01 hadoop]# vim mapred-site.xml
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>hadoop01:9001</value>
  </property>
  <property>
    <name>mapred.local.dir</name>
    <value>/root/hadoop/var</value>
  </property>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
  <property>
    <name>mapreduce.jobhistory.address</name>
    <value>hadoop01:10020</value>
  </property>
  <property>
    <name>mapreduce.jobhistory.webapp.address</name>
    <value>hadoop01:19888</value>
  </property>
</configuration>
[root@hadoop01 hadoop]# vim yarn-site.xml
<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.resourcemanager.hostname</name>
    <value>hadoop01</value>
  </property>
  <property>
    <name>yarn.resourcemanager.webapp.address</name>
    <value>hadoop01:8088</value>
  </property>
  <property>
    <name>yarn.log-aggregation-enable</name>
    <value>true</value>
  </property>
  <property>
    <name>yarn.log-aggregation.retain-seconds</name>
    <value>86400</value>
  </property>
</configuration>
[root@hadoop01 hadoop]# vi /etc/profile
export HADOOP_HOME=/opt/hadoop-2.7.6
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
export PATH=.:${JAVA_HOME}/bin:${HADOOP_HOME}/bin:$PATH:$HADOOP_HOME/sbin
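Once the profile is sourced, the hadoop launcher should resolve from any directory:

```shell
source /etc/profile
hadoop version    # should report Hadoop 2.7.6
```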
6. Initialize and start HADOOP
[root@hadoop01 bin]# ./hadoop namenode -format    // initialize the NameNode
[root@hadoop01 current]# pwd
/root/hadoop/dfs/name/current
[root@hadoop01 current]# ls    // verify that initialization succeeded
fsimage_0000000000000000000  fsimage_0000000000000000000.md5  seen_txid  VERSION
[root@hadoop01 sbin]# pwd
/opt/hadoop-2.7.6/sbin
[root@hadoop01 sbin]# ./start-dfs.sh    // start HDFS
Starting namenodes on [hadoop01]
The authenticity of host 'hadoop01 (172.19.15.151)' can't be established.
RSA key fingerprint is 00:57:cb:a6:0d:28:05:17:58:0b:5f:8c:9d:2e:c5:1f.
Are you sure you want to continue connecting (yes/no)? yes
hadoop01: Warning: Permanently added 'hadoop01,172.19.15.151' (RSA) to the list of known hosts.
root@hadoop01's password:
hadoop01: starting namenode, logging to /opt/hadoop-2.7.6/logs/hadoop-root-namenode-hadoop01.out
root@localhost's password:
localhost: starting datanode, logging to /opt/hadoop-2.7.6/logs/hadoop-root-datanode-hadoop01.out
Starting secondary namenodes [0.0.0.0]
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
RSA key fingerprint is 00:57:cb:a6:0d:28:05:17:58:0b:5f:8c:9d:2e:c5:1f.
Are you sure you want to continue connecting (yes/no)? yes
0.0.0.0: Warning: Permanently added '0.0.0.0' (RSA) to the list of known hosts.
root@0.0.0.0's password:
0.0.0.0: starting secondarynamenode, logging to /opt/hadoop-2.7.6/logs/hadoop-root-secondarynamenode-hadoop01.out
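The start script above prompts for root's password on every connection (hadoop01, localhost, and 0.0.0.0). Passwordless SSH avoids these prompts; a minimal sketch, assuming the openssh client tools are installed:

```shell
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa    # generate a key pair with no passphrase
ssh-copy-id root@hadoop01                   # authorize it for the namenode host
ssh-copy-id root@localhost                  # ...for the datanode
ssh-copy-id root@0.0.0.0                    # ...for the secondary namenode script
```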
[root@hadoop01 sbin]# start-yarn.sh    // start yarn
-bash: start-yarn.sh: command not found    // sbin is not on this shell's PATH; run it with ./
[root@hadoop01 sbin]# ./start-yarn.sh
starting yarn daemons
starting resourcemanager, logging to /opt/hadoop-2.7.6/logs/yarn-root-resourcemanager-hadoop01.out
root@localhost's password:
localhost: starting nodemanager, logging to /opt/hadoop-2.7.6/logs/yarn-root-nodemanager-hadoop01.out
7. Check whether the startup is successful
[root@hadoop01 sbin] # jps
27916 DataNode
28629 Jps
28239 ResourceManager
27779 NameNode
28083 SecondaryNameNode
28531 NodeManager
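All five daemons (plus Jps itself) should appear in the listing above. A scripted version of the same check:

```shell
for d in NameNode DataNode SecondaryNameNode ResourceManager NodeManager; do
  jps | grep -qw "$d" && echo "$d running" || echo "$d NOT running"
done
```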
8. Web access
http://172.19.15.151:8088/cluster    // YARN ResourceManager UI
http://172.19.15.151:50070    // NameNode web UI