
Security certification: Kerberos deployment for Hadoop and HBase

2025-01-15 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/01 Report--

(continued from the previous article)

5. Kerberos

1. jsvc

All nodes:

# cd ~/soft

# wget http://mirror.bit.edu.cn/apache/commons/daemon/source/commons-daemon-1.0.15-native-src.tar.gz

# tar zxf commons-daemon-1.0.15-native-src.tar.gz

# cd commons-daemon-1.0.15-native-src/unix; ./configure; make

# cp jsvc /usr/local/hadoop-2.4.0/libexec/

# cd ~/soft

# wget http://mirror.bit.edu.cn/apache//commons/daemon/binaries/commons-daemon-1.0.15-bin.tar.gz

# tar zxf commons-daemon-1.0.15-bin.tar.gz

# cp commons-daemon-1.0.15/commons-daemon-1.0.15.jar /usr/local/hadoop-2.4.0/share/hadoop/hdfs/lib/

# cp commons-daemon-1.0.15/commons-daemon-1.0.15.jar /usr/local/hadoop-2.4.0/share/hadoop/httpfs/tomcat/webapps/webhdfs/WEB-INF/lib/

# rm -f /usr/local/hadoop-2.4.0/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar

# rm -f /usr/local/hadoop-2.4.0/share/hadoop/httpfs/tomcat/webapps/webhdfs/WEB-INF/lib/commons-daemon-1.0.13.jar

# vim /usr/local/hadoop-2.4.0/etc/hadoop/hadoop-env.sh

export JSVC_HOME=/usr/local/hadoop-2.4.0/libexec/
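When the DataNode is started through jsvc, hadoop-env.sh typically needs one more variable besides JSVC_HOME. A minimal sketch that prints both lines; HADOOP_SECURE_DN_USER (the unprivileged user the secure DataNode drops to) is an assumption here, since the article only sets JSVC_HOME:

```shell
# Print the hadoop-env.sh lines a jsvc-based secure DataNode needs.
# JSVC_HOME comes from this article; HADOOP_SECURE_DN_USER is an
# assumption (not set in the article itself).
emit_secure_dn_env() {
  echo "export JSVC_HOME=/usr/local/hadoop-2.4.0/libexec/"
  echo "export HADOOP_SECURE_DN_USER=root"
}
emit_secure_dn_env
```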

2. 256-bit encryption

All nodes:

# wget -c http://download.oracle.com/otn-pub/java/jce/7/UnlimitedJCEPolicyJDK7.zip?AuthParam=1400207941_ee158c414c707a057960c521a7b29866

# unzip UnlimitedJCEPolicyJDK7.zip

# cp UnlimitedJCEPolicy/*.jar /usr/java/jdk1.7.0_65/jre/lib/security/

cp: overwrite "/usr/java/jdk1.7.0_51/jre/lib/security/local_policy.jar"? y

cp: overwrite "/usr/java/jdk1.7.0_51/jre/lib/security/US_export_policy.jar"? y

3. Deploy KDC

Host test3:

Install the KDC server

# yum -y install krb5\*

Configuration file krb5.conf

[logging]

default = FILE:/var/log/krb5libs.log

kdc = FILE:/var/log/krb5kdc.log

admin_server = FILE:/var/log/kadmind.log

[libdefaults]

default_realm = cc.cn

dns_lookup_realm = false

dns_lookup_kdc = false

ticket_lifetime = 365d

renew_lifetime = 365d

forwardable = true

[realms]

cc.cn = {

kdc = test3

admin_server = test3

}

[kdc]

profile = /var/kerberos/krb5kdc/kdc.conf

Configuration file kdc.conf

# vim /var/kerberos/krb5kdc/kdc.conf

[kdcdefaults]

kdc_ports = 88

kdc_tcp_ports = 88

[realms]

cc.cn = {

#master_key_type = aes256-cts

acl_file = /var/kerberos/krb5kdc/kadm5.acl

dict_file = /usr/share/dict/words

admin_keytab = /var/kerberos/krb5kdc/kadm5.keytab

supported_enctypes = aes256-cts:normal aes128-cts:normal des3-hmac-sha1:normal arcfour-hmac:normal des-hmac-sha1:normal des-cbc-md5:normal des-cbc-crc:normal

}

Configuration file kadm5.acl (grants every */admin principal full privileges)

# vim /var/kerberos/krb5kdc/kadm5.acl

*/admin@cc.cn *

Create a database

# kdb5_util create -r cc.cn -s

Enter KDC database master key:

Start the services and enable them at boot

# service krb5kdc start

# service kadmin start

# chkconfig krb5kdc on

# chkconfig kadmin on

Create an administrator user

# kadmin.local

kadmin.local: addprinc root/admin

Enter password for principal "root/admin@cc.cn":

6. Hadoop integrates Kerberos

1. Configure node authentication

Host test1:

# yum -y install krb5\*

# scp test3:/etc/krb5.conf /etc/

# kadmin -p root/admin

kadmin: addprinc -randkey root/test1

kadmin: addprinc -randkey HTTP/test1

kadmin: ktadd -k /hadoop/krb5.keytab root/test1 HTTP/test1

Host test2:

# yum -y install krb5\*

# scp test3:/etc/krb5.conf /etc/

# kadmin -p root/admin

kadmin: addprinc -randkey root/test2

kadmin: addprinc -randkey HTTP/test2

kadmin: ktadd -k /hadoop/krb5.keytab root/test2 HTTP/test2

Host test3:

# kadmin.local

kadmin.local: addprinc -randkey root/test3

kadmin.local: addprinc -randkey HTTP/test3

kadmin.local: ktadd -k /hadoop/krb5.keytab root/test3 HTTP/test3
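The three per-host blocks above follow one pattern: a root/&lt;host&gt; service principal plus an HTTP/&lt;host&gt; SPNEGO principal, both merged into the shared keytab /hadoop/krb5.keytab. A small sketch that prints the kadmin commands for every node (realm and hostnames as in this article; the function itself is illustrative, not part of the original steps):

```shell
# Print the kadmin commands for each cluster node: one service
# principal (root/<host>) and one SPNEGO principal (HTTP/<host>),
# both written to the shared keytab /hadoop/krb5.keytab.
gen_princ_cmds() {
  realm=$1; shift
  for host in "$@"; do
    echo "addprinc -randkey root/${host}@${realm}"
    echo "addprinc -randkey HTTP/${host}@${realm}"
    echo "ktadd -k /hadoop/krb5.keytab root/${host}@${realm} HTTP/${host}@${realm}"
  done
}
gen_princ_cmds cc.cn test1 test2 test3
```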

2. Add configuration file core-site.xml

Host test1:

# vim /usr/local/hadoop-2.4.0/etc/hadoop/core-site.xml

<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>

<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>

Configuration file hdfs-site.xml

Host test1:

# vim /usr/local/hadoop-2.4.0/etc/hadoop/hdfs-site.xml

<property>
  <name>dfs.journalnode.keytab.file</name>
  <value>/hadoop/krb5.keytab</value>
</property>

<property>
  <name>dfs.journalnode.kerberos.principal</name>
  <value>root/_HOST@cc.cn</value>
</property>

<property>
  <name>dfs.journalnode.kerberos.internal.spnego.principal</name>
  <value>HTTP/_HOST@cc.cn</value>
</property>

<property>
  <name>dfs.block.access.token.enable</name>
  <value>true</value>
</property>

<property>
  <name>dfs.namenode.keytab.file</name>
  <value>/hadoop/krb5.keytab</value>
</property>

<property>
  <name>dfs.namenode.kerberos.principal</name>
  <value>root/_HOST@cc.cn</value>
</property>

<property>
  <name>dfs.web.authentication.kerberos.keytab</name>
  <value>/hadoop/krb5.keytab</value>
</property>

<property>
  <name>dfs.web.authentication.kerberos.principal</name>
  <value>HTTP/_HOST@cc.cn</value>
</property>

<property>
  <name>ignore.secure.ports.for.testing</name>
  <value>true</value>
</property>

<property>
  <name>dfs.datanode.keytab.file</name>
  <value>/hadoop/krb5.keytab</value>
</property>

<property>
  <name>dfs.datanode.kerberos.principal</name>
  <value>root/_HOST@cc.cn</value>
</property>

<property>
  <name>hadoop.http.staticuser.user</name>
  <value>root</value>
</property>

Configuration file yarn-site.xml

Host test1:

# vim /usr/local/hadoop-2.4.0/etc/hadoop/yarn-site.xml

<property>
  <name>yarn.resourcemanager.keytab</name>
  <value>/hadoop/krb5.keytab</value>
</property>

<property>
  <name>yarn.resourcemanager.principal</name>
  <value>root/_HOST@cc.cn</value>
</property>

<property>
  <name>yarn.nodemanager.keytab</name>
  <value>/hadoop/krb5.keytab</value>
</property>

<property>
  <name>yarn.nodemanager.principal</name>
  <value>root/_HOST@cc.cn</value>
</property>

Configuration file mapred-site.xml

Host test1:

# vim /usr/local/hadoop-2.4.0/etc/hadoop/mapred-site.xml

<property>
  <name>mapreduce.jobhistory.keytab</name>
  <value>/hadoop/krb5.keytab</value>
</property>

<property>
  <name>mapreduce.jobhistory.principal</name>
  <value>root/_HOST@cc.cn</value>
</property>

3. Synchronize configuration files

Host test1:

# scp -r /usr/local/hadoop-2.4.0/ test2:/usr/local/

# scp -r /usr/local/hadoop-2.4.0/ test3:/usr/local/

4. Start

Host test1:

# start-all.sh

5. Verification

Host test3:

# kinit -k -t /hadoop/krb5.keytab root/test3

# hdfs dfs -ls /

7. HBase integrates Kerberos

1. Add configuration file hbase-site.xml

Host test1:

# vim /usr/local/hbase-0.98.1/conf/hbase-site.xml

<property>
  <name>hbase.security.authentication</name>
  <value>kerberos</value>
</property>

<property>
  <name>hbase.security.authorization</name>
  <value>true</value>
</property>

<property>
  <name>hbase.rpc.engine</name>
  <value>org.apache.hadoop.hbase.ipc.SecureRpcEngine</value>
</property>

<property>
  <name>hbase.coprocessor.region.classes</name>
  <value>org.apache.hadoop.hbase.security.token.TokenProvider</value>
</property>

<property>
  <name>hbase.master.keytab.file</name>
  <value>/hadoop/krb5.keytab</value>
</property>

<property>
  <name>hbase.master.kerberos.principal</name>
  <value>root/_HOST@cc.cn</value>
</property>

<property>
  <name>hbase.regionserver.keytab.file</name>
  <value>/hadoop/krb5.keytab</value>
</property>

<property>
  <name>hbase.regionserver.kerberos.principal</name>
  <value>root/_HOST@cc.cn</value>
</property>

2. Synchronize configuration files

Host test1:

# scp /usr/local/hbase-0.98.1/conf/hbase-site.xml test2:/usr/local/hbase-0.98.1/conf/

# scp /usr/local/hbase-0.98.1/conf/hbase-site.xml test3:/usr/local/hbase-0.98.1/conf/

3. Start

Host test1:

# start-hbase.sh

4. Verification

Host test3:

# kinit -k -t /hadoop/krb5.keytab root/test3

# hbase shell

8. Cluster connection

1. Keytab file location

/etc/xiaofeiyun.keytab

Creation process

Host test1:

# kadmin -p root/admin

Password for root/admin@cc.cn:

kadmin: addprinc -randkey data/xiaofeiyun

kadmin: addprinc -randkey platform/xiaofeiyun

kadmin: ktadd -k /etc/xiaofeiyun.keytab data/xiaofeiyun platform/xiaofeiyun

# scp /etc/xiaofeiyun.keytab test2:/etc/

# scp /etc/xiaofeiyun.keytab test3:/etc/
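Before distributing the keytab, it is worth confirming which principals it actually contains; `klist -k /etc/xiaofeiyun.keytab` prints one line per entry. The sample output below is hypothetical (mimicking the keytab created above), and the awk line sketches how to extract just the principal column:

```shell
# Sketch: parse `klist -k` output to list the principals in a keytab.
# sample_output imitates `klist -k /etc/xiaofeiyun.keytab` for the two
# principals created in this article; the real output comes from klist.
sample_output='Keytab name: FILE:/etc/xiaofeiyun.keytab
KVNO Principal
---- --------------------------------------------------
   2 data/xiaofeiyun@cc.cn
   2 platform/xiaofeiyun@cc.cn'

# Skip the three header lines, keep only the principal column.
echo "$sample_output" | awk 'NR>3 {print $2}'
```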

2. krb5.conf file location

/etc/krb5.conf

3. Hadoop connection

conf.set("fs.defaultFS", "hdfs://cluster1");

conf.set("dfs.nameservices", "cluster1");

conf.set("dfs.ha.namenodes.cluster1", "test1,test2");

conf.set("dfs.namenode.rpc-address.cluster1.test1", "test1:9000");

conf.set("dfs.namenode.rpc-address.cluster1.test2", "test2:9000");

conf.set("dfs.client.failover.proxy.provider.cluster1", "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider");
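The same HA client settings can instead live in the client's core-site.xml / hdfs-site.xml on the classpath, in the property format used throughout this article. A sketch of the first three entries, mirroring the conf.set() calls above:

```xml
<!-- Equivalent client-side configuration; property names exactly as
     in the conf.set() calls above (remaining entries follow the
     same pattern). -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://cluster1</value>
</property>

<property>
  <name>dfs.nameservices</name>
  <value>cluster1</value>
</property>

<property>
  <name>dfs.ha.namenodes.cluster1</name>
  <value>test1,test2</value>
</property>
```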

4. HBase connection

conf.set("ha.zookeeper.quorum", "test1:2181,test2:2181,test3:2181");
