2025-02-28 Update From: SLTechnology News&Howtos
Shulou (Shulou.com) 06/01 Report
This article explains how to configure a client to connect securely to an Apache Kafka cluster.
This is the first in a series of short blog posts about Apache Kafka security. In this post, we show how to configure a client to authenticate to the cluster using different authentication mechanisms.
A secured Apache Kafka cluster can be configured to enforce authentication using any of the following methods:
SSL/TLS client authentication
SASL/GSSAPI (Kerberos)
SASL/PLAIN (LDAP and file-based)
SASL/SCRAM-SHA-256 and SASL/SCRAM-SHA-512
SASL/OAUTHBEARER
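For orientation, here is a sketch of the client-side security.protocol and sasl.mechanism settings that typically correspond to each method, assuming TLS encryption is enabled on the cluster:

```properties
# TLS client authentication (mutual TLS)
security.protocol=SSL

# Kerberos
security.protocol=SASL_SSL
sasl.mechanism=GSSAPI

# LDAP or file-based credentials
security.protocol=SASL_SSL
sasl.mechanism=PLAIN

# SCRAM
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-256   # or SCRAM-SHA-512

# OAuth 2 bearer tokens
security.protocol=SASL_SSL
sasl.mechanism=OAUTHBEARER
```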
In this post, we start with Kerberos authentication, focusing on the client configuration required to authenticate to a cluster that is configured to use Kerberos.
We will not cover the server-side configuration in this article, but we will add references where needed to make the examples clearer.
The examples shown here highlight the authentication-related properties to distinguish them from the other required security properties, as in the example below. TLS is assumed to be enabled for the Apache Kafka cluster, as it should be for every secure cluster.
security.protocol=SASL_SSL
ssl.truststore.location=/opt/cloudera/security/jks/truststore.jks
We use kafka-console-consumer in all of the examples below. The concepts and configurations apply equally to other applications.
Kerberos authentication
Kerberos is by far the most common option we see used to secure Kafka clusters in the field. It simplifies identity management by letting users authenticate with corporate identities stored in services such as Active Directory, Red Hat IPA, and FreeIPA. A Kerberized Kafka cluster also makes it easier to integrate with other services in the big data ecosystem, which typically use Kerberos for strong authentication.
Kafka implements Kerberos authentication through the Simple Authentication and Security Layer (SASL) framework. SASL is an authentication framework and a standard IETF protocol defined by RFC 4422. It supports several different authentication mechanisms; the mechanism that implements Kerberos authentication is called GSSAPI.
The basic Kafka client properties that must be set to configure the client to authenticate via Kerberos are shown below:
# Uses SASL/GSSAPI over a TLS encrypted connection
security.protocol=SASL_SSL
sasl.mechanism=GSSAPI
sasl.kerberos.service.name=kafka
# TLS truststore
ssl.truststore.location=/opt/cloudera/security/jks/truststore.jks
The configuration above uses Kerberos (SASL/GSSAPI) for authentication. TLS (SSL) is used only for data encryption over the wire.
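For reference, the security.protocol property combines the authentication and encryption choices; its four possible values (standard Kafka settings) are:

```properties
security.protocol=PLAINTEXT        # no SASL authentication, no encryption
security.protocol=SSL              # TLS encryption (TLS client auth optional)
security.protocol=SASL_PLAINTEXT   # SASL authentication, no encryption
security.protocol=SASL_SSL         # SASL authentication over a TLS connection
```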
JAAS configuration
The properties above, however, do not provide the client with the credentials it needs to authenticate to the Kafka cluster. We need more information.
When using Kerberos, we can provide credentials to the client application in two ways: as a valid Kerberos ticket stored in the ticket cache, or as a keytab file that the application can use to obtain Kerberos tickets.
Handling of Kerberos credentials in the Kafka client is done by the Java Authentication and Authorization Service (JAAS) library, so we need to give the client the configuration JAAS needs to find the credentials.
There are two ways to set these properties for Kafka clients:
Create a JAAS configuration file and point the Java system property java.security.auth.login.config to it; or
Inline the JAAS configuration by setting the Kafka client property sasl.jaas.config.
This section shows how to use both methods. For simplicity, all other examples in this article use the sasl.jaas.config method.
Using a JAAS configuration file
If you are using a JAAS configuration file, you need to tell the Kafka Java client where to find it. This is done by setting the following Java property on the command line:
... -Djava.security.auth.login.config=/path/to/jaas.conf ...
If you use the Kafka command-line tools in the Cloudera Data Platform (CDP), you can do this by setting the following environment variable:
$ export KAFKA_OPTS="-Djava.security.auth.login.config=/path/to/jaas.conf"
The content of the configuration file depends on where the credentials come from. To use a Kerberos ticket stored in the user's ticket cache, use the following jaas.conf file:
KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    useTicketCache=true;
};
To use a keytab, use the following instead:
KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    keyTab="/etc/security/keytabs/alice.keytab"
    principal="alice@EXAMPLE.COM";
};

Using the sasl.jaas.config property
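As a minimal sketch, the keytab-based JAAS configuration above can be generated and wired up from a shell script. Here /tmp/jaas.conf is an arbitrary location chosen for illustration, and the keytab path and principal are the placeholders from the example:

```shell
#!/bin/sh
# Write the keytab-based JAAS configuration to a file.
# Keytab path and principal are placeholder values, not real credentials.
cat > /tmp/jaas.conf <<'EOF'
KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    keyTab="/etc/security/keytabs/alice.keytab"
    principal="alice@EXAMPLE.COM";
};
EOF

# Point the Kafka command-line tools at it via KAFKA_OPTS.
export KAFKA_OPTS="-Djava.security.auth.login.config=/tmp/jaas.conf"
echo "$KAFKA_OPTS"
```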
I generally prefer not to use a separate JAAS configuration file and instead set the JAAS configuration through the sasl.jaas.config Kafka property. This is usually simpler and does away with the extra configuration file (jaas.conf). The configurations below are equivalent to the jaas.conf examples above.
Note: each of the settings below must be written on a single line. The semicolon at the end of the line is required.
To use Kerberos tickets stored in the ticket cache:
sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required useTicketCache=true;
To use a keytab, use the following instead:
sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required useKeyTab=true keyTab="/etc/security/keytabs/alice.keytab" principal="alice@EXAMPLE.COM";

Example
The following is an example of a Kafka console consumer reading from a topic using Kerberos authentication, connecting directly to a broker (without going through a load balancer):
# Complete configuration file for Kerberos auth using the ticket cache
$ cat krb-client.properties
security.protocol=SASL_SSL
sasl.mechanism=GSSAPI
sasl.kerberos.service.name=kafka
sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required useTicketCache=true;
ssl.truststore.location=/opt/cloudera/security/jks/truststore.jks

# Authenticate with Kerberos to get a valid ticket
$ kinit alice
Password for alice@REALM:
# Connect to Kafka using the ticket in the ticket cache
$ kafka-console-consumer \
    --bootstrap-server host-1.example.com:9093 \
    --topic test \
    --consumer.config /path/to/krb-client.properties

Network connections between clients and Kerberos
The central component of Kerberos is the Key Distribution Center (KDC). The KDC is the service that handles all Kerberos authentication initiated by clients. For Kerberos authentication to work properly, both the Kafka cluster and the clients must be able to connect to the KDC.
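Clients find the KDC through their Kerberos client configuration, typically /etc/krb5.conf. A quick way to check which KDC a client will contact is to inspect the [realms] section; this sketch uses a temporary file with hypothetical realm and host names (88 is the default Kerberos port):

```shell
#!/bin/sh
# Inspect a Kerberos client configuration to find the KDC for a realm.
# The realm and hostnames below are hypothetical, for illustration only.
cat > /tmp/krb5.conf <<'EOF'
[realms]
    EXAMPLE.COM = {
        kdc = kdc.example.com:88
        admin_server = kdc.example.com
    }
EOF

# Print the KDC the client will contact for realm EXAMPLE.COM.
grep -A2 'EXAMPLE.COM' /tmp/krb5.conf | grep 'kdc ='
```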
In a corporate environment this is usually easy to achieve. In some deployments, however, the KDC may be placed behind a firewall, making it impossible for clients to reach it to obtain valid tickets.
Cloud and hybrid (cloud + on-premises) deployments can make it challenging for clients to use Kerberos authentication, because the on-premises KDC is usually not integrated with services deployed in the cloud. Since Kafka supports other authentication mechanisms, however, clients can use alternatives, which we will explore in the next article.