2025-04-03 Update From: SLTechnology News & Howtos
Shulou (Shulou.com) 06/01 Report
This article introduces how to install Hadoop and HBase on Linux. It should be a useful reference for anyone setting up a small cluster; I hope you gain a lot from reading it.
Three CentOS 7 hosts:
192.168.122.101 hdfs1
192.168.122.102 hdfs2
192.168.122.103 hdfs3
hdfs1 is the master node and the other two are slave nodes.
The installation and configuration are identical on all three hosts. Set up passwordless SSH authentication: if you only operate from the master node hdfs1, it is enough to establish trust from hdfs1 to the other nodes.
If all three nodes trust each other, you can run the installation from any one of them.
Modify kernel parameters
vim /etc/sysctl.conf
net.ipv4.tcp_syn_retries = 1
net.ipv4.tcp_synack_retries = 1
net.ipv4.tcp_keepalive_time = 600
net.ipv4.tcp_keepalive_probes = 3
net.ipv4.tcp_keepalive_intvl = 15
net.ipv4.tcp_retries2 = 5
net.ipv4.tcp_fin_timeout = 2
net.ipv4.tcp_max_tw_buckets = 65536
net.ipv4.tcp_tw_recycle = 1
net.ipv4.tcp_tw_reuse = 1
net.ipv4.tcp_max_orphans = 32768
net.ipv4.tcp_syncookies = 1
net.ipv4.tcp_max_syn_backlog = 16384
net.ipv4.tcp_wmem = 8192 131072 16777216
net.ipv4.tcp_rmem = 32768 131072 16777216
net.ipv4.tcp_mem = 786432 1048576 1572864
net.ipv4.ip_local_port_range = 1024 65000
net.ipv4.ip_conntrack_max = 65536
net.ipv4.netfilter.ip_conntrack_max = 65536
net.ipv4.netfilter.ip_conntrack_tcp_timeout_established = 180
net.core.somaxconn = 16384
net.core.netdev_max_backlog = 16384
vm.max_map_count = 262144
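After saving, the new values can be loaded without a reboot. A minimal sketch (run as root; note that net.ipv4.tcp_tw_recycle was removed in Linux 4.12+, so sysctl may warn about an unknown key on newer kernels):

```shell
# Reload /etc/sysctl.conf so the settings take effect immediately
sysctl -p

# Spot-check one of the values
sysctl net.core.somaxconn
```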
Modify file handle limits
vim /etc/security/limits.conf
* soft nproc 655360
* hard nproc 655360
* soft nofile 655360
* hard nofile 655360
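The limits.conf changes only apply to new login sessions. A quick way to verify them after logging in again:

```shell
# Verify the per-session limits picked up from /etc/security/limits.conf
ulimit -n   # max open file descriptors; expect 655360 after re-login
ulimit -u   # max user processes; expect 655360 after re-login
```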
Set hostname resolution
On different nodes, set the corresponding host name
hostnamectl set-hostname hdfs1
Add hosts records; you can also use DNS for resolution, which is more flexible.
vim /etc/hosts
192.168.122.101 hdfs1
192.168.122.102 hdfs2
192.168.122.103 hdfs3
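To confirm the entries are in place, resolution can be checked on each node, for example:

```shell
# Each name should resolve to the address configured above
getent hosts hdfs1 hdfs2 hdfs3
```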
Create users and directories
useradd hadoop
passwd hadoop
mkdir -p /apps/
mkdir -pv /data/hdfs/hadoop
mkdir -pv /data/hdfs/hbase
chown -R hadoop:hadoop /data/hdfs/hadoop /data/hdfs/hbase
Set ssh password free
su - hadoop
ssh-keygen
ssh-copy-id hadoop@hdfs1
ssh-copy-id hadoop@hdfs2
ssh-copy-id hadoop@hdfs3
When ssh-keygen generates the key, keep pressing Enter to accept the defaults.
ssh-copy-id prompts for the hadoop user's password on each target node.
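Once the keys are distributed, passwordless login can be verified in one pass (a sketch, assuming the three hostnames above):

```shell
# Each command should print the remote hostname without asking for a password
for h in hdfs1 hdfs2 hdfs3; do
  ssh hadoop@"$h" hostname
done
```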
Download the JDK
Download Address:
www.oracle.com/java/technologies/javase/javase-jdk8-downloads.html
You need to log in to an Oracle account to download.
tar zxvf jdk-8u271-linux-x64.tar.gz
mv jdk1.8.0_271 /apps/
cd /apps/
ln -s jdk1.8.0_271 jdk
cd -
if grep '# modify by script' /etc/profile >>/dev/null 2>&1; then
    echo "already set JAVA_HOME"
else
    cp /etc/profile /etc/profile_bak$(date +%Y%m%d%H%M%S)
    cat >>/etc/profile <<'EOF'
# modify by script
export JAVA_HOME=/apps/jdk
export PATH=$JAVA_HOME/bin:$PATH
EOF
fi
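With the profile updated, the JDK can be checked in the current shell (a sketch; JAVA_HOME here is the /apps/jdk symlink created above):

```shell
# Reload the profile and confirm the JDK is on PATH
source /etc/profile
echo "$JAVA_HOME"    # expect /apps/jdk
java -version        # expect "1.8.0_271"
```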