Big data: Hadoop pseudo-distributed deployment (HDFS), the final installment

2025-03-30 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/03 Report--

1. Software environment

RHEL 6
jdk-8u45
hadoop-2.8.1.tar.gz
ssh
xx.xx.xx.xx hadoop1
xx.xx.xx.xx hadoop2
xx.xx.xx.xx hadoop3
xx.xx.xx.xx hadoop4
xx.xx.xx.xx hadoop5

This is a pseudo-distributed deployment only, so only host hadoop1 is needed.

2. Pseudo-distributed deployment

Refer to the official Hadoop documentation:

Apache > Hadoop > Apache Hadoop Project Dist POM > Apache Hadoop 2.8.3 > Hadoop: Setting up a Single Node Cluster

Add the hadoop user

[root@hadoop1 ~]# useradd hadoop

[root@hadoop1 ~]# vi /etc/sudoers

# find the line "root ALL=(ALL) ALL" and add below it:

hadoop ALL=(ALL) NOPASSWD:ALL

Upload and extract

[root@hadoop1 software]# rz    # upload hadoop-2.8.1.tar.gz

[root@hadoop1 software]# tar -xzvf hadoop-2.8.1.tar.gz

Create a symbolic link

[root@hadoop1 software]# ln -s /opt/software/hadoop-2.8.1 /opt/software/hadoop
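The point of the link is that all configuration and PATH entries reference the version-independent path. A minimal sketch, using a scratch directory (BASE) in place of /opt/software:

```shell
# Sketch: create or repoint the version symlink; BASE stands in for /opt/software.
BASE="${BASE:-.}"
mkdir -p "$BASE/hadoop-2.8.1"
# -sfn replaces an existing link in place, which is handy when upgrading later
ln -sfn "$BASE/hadoop-2.8.1" "$BASE/hadoop"
readlink "$BASE/hadoop"
```

When a new version is extracted later, rerunning `ln -sfn` against the new directory switches over without touching any configuration.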

Set environment variables

[root@hadoop1 software]# vi /etc/profile

export HADOOP_HOME=/opt/software/hadoop

export PATH=$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$PATH

[root@hadoop1 software]# source /etc/profile
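Appending to /etc/profile a second time leaves duplicate entries, so a guarded append is safer. A sketch, writing to a scratch file (PROFILE) instead of the real /etc/profile:

```shell
# Guarded append: add the Hadoop variables only if they are not already there.
# PROFILE stands in for /etc/profile in this sketch.
PROFILE="${PROFILE:-./profile.sketch}"
touch "$PROFILE"
if ! grep -q 'HADOOP_HOME=' "$PROFILE"; then
  cat >> "$PROFILE" <<'EOF'
export HADOOP_HOME=/opt/software/hadoop
export PATH=$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$PATH
EOF
fi
grep -c 'HADOOP_HOME' "$PROFILE"   # stays at 2 no matter how often you rerun this
```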

Set the owner and group

[root@hadoop1 software]# chown -R hadoop:hadoop hadoop

[root@hadoop1 software]# chown -R hadoop:hadoop hadoop/*

[root@hadoop1 software]# chown -R hadoop:hadoop hadoop-2.8.1

[root@hadoop1 software]# cd hadoop

[root@hadoop1 hadoop]# rm -f *.txt

[root@hadoop1 hadoop]# su - hadoop

[hadoop@hadoop1 hadoop]# ll

total 32
drwxrwxr-x. 2 hadoop hadoop 4096 Jun  2 14:24 bin
drwxrwxr-x. 3 hadoop hadoop 4096 Jun  2 14:24 etc
drwxrwxr-x. 2 hadoop hadoop 4096 Jun  2 14:24 include
drwxrwxr-x. 3 hadoop hadoop 4096 Jun  2 14:24 lib
drwxrwxr-x. 2 hadoop hadoop 4096 Aug 20 13:59 libexec
drwxr-xr-x. 2 hadoop hadoop 4096 Aug 20 13:59 logs
drwxrwxr-x. 2 hadoop hadoop 4096 Jun  2 14:24 sbin
drwxrwxr-x. 4 hadoop hadoop 4096 Jun  2 14:24 share

# bin: executable files

# etc: configuration files

# sbin: scripts that start and stop HDFS, YARN, and other daemons

[hadoop@hadoop1 ~]# cd /opt/software/hadoop

[hadoop@hadoop1 hadoop]# vi etc/hadoop/core-site.xml

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <!-- use the IP of your own machine -->
        <value>hdfs://172.168.18.133:9000</value>
    </property>
</configuration>

[hadoop@hadoop1 hadoop]# vi etc/hadoop/hdfs-site.xml

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>

Configure the ssh trust relationship for the hadoop user

# public/private key pair for password-less login

[hadoop@hadoop1 ~]# ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa

[hadoop@hadoop1 ~]# cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys

[hadoop@hadoop1 ~]# chmod 0600 ~/.ssh/authorized_keys
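The three commands above can be made safe to rerun (no duplicate keys on a second pass). A sketch, where SSH_DIR stands in for the hadoop user's ~/.ssh:

```shell
# Idempotent password-less SSH setup; safe to run more than once.
SSH_DIR="${SSH_DIR:-$HOME/.ssh}"
mkdir -p "$SSH_DIR" && chmod 700 "$SSH_DIR"
# generate a key pair only if one does not already exist
[ -f "$SSH_DIR/id_rsa" ] || ssh-keygen -t rsa -N '' -f "$SSH_DIR/id_rsa" -q
touch "$SSH_DIR/authorized_keys"
# append the public key only if it is not already authorized
grep -qxF "$(cat "$SSH_DIR/id_rsa.pub")" "$SSH_DIR/authorized_keys" \
  || cat "$SSH_DIR/id_rsa.pub" >> "$SSH_DIR/authorized_keys"
chmod 600 "$SSH_DIR/authorized_keys"
```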

# run a date command over ssh to verify the trust relationship works

[hadoop@hadoop1 ~]# ssh hadoop1 date

Wed Apr 25 15:22:58 CST 2018

[hadoop@hadoop1 ~]# ssh localhost date

Wed Apr 25 15:23:25 CST 2018

# the first connection to each host asks for confirmation; answer yes

Format and start

[hadoop@hadoop1 hadoop]# bin/hdfs namenode -format

[hadoop@hadoop1 hadoop]# sbin/start-dfs.sh

ERROR:

hadoop1: Error: JAVA_HOME is not set and could not be found.

localhost: Error: JAVA_HOME is not set and could not be found.

Solution: set JAVA_HOME explicitly in hadoop-env.sh

[hadoop@hadoop1 hadoop]# vi etc/hadoop/hadoop-env.sh

# change export JAVA_HOME=${JAVA_HOME} to:

export JAVA_HOME=/usr/java/jdk1.8.0_45

# start again

[hadoop@hadoop1 hadoop]# sbin/start-dfs.sh

Check whether startup succeeded

[hadoop@hadoop1 hadoop] # jps

8026 Jps

7485 NameNode

7597 DataNode

7807 SecondaryNameNode

Visit: http://192.168.137.130:50070
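The jps check can be scripted so any missing daemon is flagged. A small sketch (the function name is mine, not part of Hadoop):

```shell
# check_hdfs_daemons: inspect jps-style output for the three HDFS daemons.
check_hdfs_daemons() {
  out="$1"; missing=""
  for d in NameNode DataNode SecondaryNameNode; do
    # -w prevents "SecondaryNameNode" from satisfying the "NameNode" check
    printf '%s\n' "$out" | grep -qw "$d" || missing="$missing $d"
  done
  if [ -z "$missing" ]; then echo "HDFS OK"; else echo "missing:$missing"; fi
}
check_hdfs_daemons "$(jps 2>/dev/null || true)"
```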

By default, start-dfs.sh resolves a different host name for each of the three daemons; change all of them to hadoop1:

namenode: hadoop1 (from bin/hdfs getconf -namenodes)

datanode: localhost (from the default etc/hadoop/slaves file)

secondarynamenode: 0.0.0.0

[hadoop@hadoop1 ~]# cd /opt/software/hadoop

[hadoop@hadoop1 hadoop]# echo "hadoop1" > ./etc/hadoop/slaves

[hadoop@hadoop1 hadoop]# cat ./etc/hadoop/slaves

hadoop1

[hadoop@hadoop1 hadoop]# vi ./etc/hadoop/hdfs-site.xml

# add inside the existing <configuration> block:

<property>
    <name>dfs.namenode.secondary.http-address</name>
    <value>hadoop1:50090</value>
</property>
<property>
    <name>dfs.namenode.secondary.https-address</name>
    <value>hadoop1:50091</value>
</property>
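Before restarting, it is worth confirming both properties actually landed in the file. A sketch, where CONF stands in for the real config path:

```shell
# Grep for the two secondary-namenode properties in hdfs-site.xml.
CONF="${CONF:-etc/hadoop/hdfs-site.xml}"
for p in dfs.namenode.secondary.http-address dfs.namenode.secondary.https-address; do
  grep -q "$p" "$CONF" 2>/dev/null && echo "found $p" || echo "MISSING $p"
done
```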

[hadoop@hadoop1 hadoop] # sbin/stop-dfs.sh

[hadoop@hadoop1 hadoop] # sbin/start-dfs.sh

Visit http://192.168.137.130:50070 again to confirm the change.
