Introduction to Apache Kafka Security

Updated 20 April 2020

In this introductory blog, I will try to explain Kafka security in terms that everyone can understand. We will go over the problems it solves, the need for it, SSL, SASL, ACLs and more.

Apache Kafka and the need for Security

Kafka works as a middle layer that enables back-end systems to share real-time data feeds with each other through Kafka topics. With the standard Kafka configuration, any user or application can write any message to any topic, as well as read data from any topic. But companies are now moving towards a multi-tenancy model, where a number of teams and applications use the same Kafka cluster, or where the Kafka cluster contains critical and confidential information. This is why we need to implement Kafka security.

To know more about Apache Kafka, follow this link.

Problems Solved by Apache Kafka Security

Kafka has three Security components:

Encryption of data in-motion using SSL/TLS

This is used to encrypt data between producers and Kafka and between consumers and Kafka. It is the same very common pattern everyone has followed on the web, where a plain HTTP site gets changed into HTTPS.

Authentication using SSL or SASL

This is used to authenticate producers and consumers to the Kafka cluster by verifying their identity. It is the secure way to identify clients, and a prerequisite for authorizing them.

Authorization using ACLs

Once a user is authenticated, Kafka brokers can check it against Access Control Lists (ACLs) to determine whether that particular client is authorized to write to or read from a given topic.

Encryption (SSL)

Encryption solves the problem of man-in-the-middle attacks. If data travels as PLAINTEXT (the default in Kafka), any router along the way can read the content of the data being sent. With encryption, all data is encrypted and securely transmitted over the network, so only the sender and receiver can access it. Note that this only encrypts data in transit; data at rest on the broker is still unencrypted.

This encryption comes at a cost: CPU is now used on both Kafka clients and Kafka brokers to encrypt and decrypt packets, so SSL security trades some performance for safety. That said, the performance cost has decreased substantially with newer JVM versions.
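As a rough sketch, the client side of an SSL-encrypted connection boils down to a handful of properties. In the snippet below the broker address, SSL port and truststore path/password are placeholder assumptions, and the broker must already expose an SSL listener:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class SslEncryptedProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Placeholder broker address; 9093 is assumed to be the SSL listener port.
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9093");
            // Switch the client from PLAINTEXT to SSL so traffic is encrypted in transit.
            props.put("security.protocol", "SSL");
            // Truststore holding the CA certificate that signed the broker certificates.
            props.put("ssl.truststore.location", "/etc/kafka/secrets/kafka.client.truststore.jks");
            props.put("ssl.truststore.password", "truststore-password");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

            // Everything this producer sends is now encrypted between the client and the broker.
            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("demo-topic", "hello over TLS"));
            }
        }
    }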

Authentication (SSL & SASL)

There are two ways to authenticate Kafka clients to your brokers:

SSL Authentication

It is basically two-way (mutual) authentication: clients get a certificate, signed by a certificate authority, which Kafka brokers use to verify the identity of the clients.

This is the most common setup, especially when leveraging a managed Kafka cluster.
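A minimal sketch of the extra client properties for mutual TLS is shown below, set on the same Properties object as in the producer sketch above. The keystore and truststore paths and passwords are placeholders, and on the broker side ssl.client.auth=required has to be set so that brokers actually verify the client certificate:

    // Client properties for SSL (mutual TLS) authentication; paths and passwords are placeholders.
    props.put("security.protocol", "SSL");
    // Truststore so the client can verify the brokers.
    props.put("ssl.truststore.location", "/etc/kafka/secrets/kafka.client.truststore.jks");
    props.put("ssl.truststore.password", "truststore-password");
    // Keystore with the client certificate (signed by a CA the brokers trust) so the brokers can verify the client.
    props.put("ssl.keystore.location", "/etc/kafka/secrets/kafka.client.keystore.jks");
    props.put("ssl.keystore.password", "keystore-password");
    props.put("ssl.key.password", "key-password");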

SASL Authentication

SASL stands for Simple Authentication and Security Layer. With SASL, the authentication mechanism is separated from the Kafka protocol itself. It is very popular in the Big Data and Hadoop ecosystem.

SASL PLAIN

This is a classic username/password combination. The credentials are stored on the Kafka brokers in advance, and every change requires a restart of the brokers, so this kind of security is not recommended. If you do use SASL/PLAIN, make sure to enable SSL encryption so that the credentials aren't sent as plaintext over the network.
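If you do go this route anyway, the client-side settings could look roughly like the sketch below, again set on the same Properties object as in the producer example; the broker address, username and password are placeholders, and SASL_SSL is chosen so the credentials travel over an encrypted connection:

    // SASL/PLAIN over an SSL-encrypted connection; address and credentials are placeholders.
    props.put("bootstrap.servers", "broker1:9094");
    props.put("security.protocol", "SASL_SSL");
    props.put("sasl.mechanism", "PLAIN");
    props.put("sasl.jaas.config",
        "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"alice\" password=\"alice-secret\";");
    props.put("ssl.truststore.location", "/etc/kafka/secrets/kafka.client.truststore.jks");
    props.put("ssl.truststore.password", "truststore-password");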

SASL SCRAM

This is a username/password combination together with a challenge (salt), which makes it more secure. On top of that, the usernames and salted password hashes are stored in ZooKeeper, which lets you scale security without rebooting brokers. If you use SASL/SCRAM, still make sure to enable SSL encryption so that the credentials aren't sent as plaintext over the network.
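On the client side the configuration is almost identical to SASL/PLAIN; only the mechanism and login module change. In the sketch below the broker address and credentials are placeholders, and the SCRAM user is assumed to have been created beforehand (for example with the kafka-configs tool):

    // SASL/SCRAM over an SSL-encrypted connection; address and credentials are placeholders.
    props.put("bootstrap.servers", "broker1:9094");
    props.put("security.protocol", "SASL_SSL");
    props.put("sasl.mechanism", "SCRAM-SHA-256");   // SCRAM-SHA-512 is the other supported variant
    props.put("sasl.jaas.config",
        "org.apache.kafka.common.security.scram.ScramLoginModule required "
            + "username=\"alice\" password=\"alice-secret\";");
    props.put("ssl.truststore.location", "/etc/kafka/secrets/kafka.client.truststore.jks");
    props.put("ssl.truststore.password", "truststore-password");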

SASL GSSAPI (Kerberos)

This is based on the Kerberos ticket mechanism, a very secure way of providing authentication where all security is managed within the Kerberos server. Additionally, all communication can be encrypted, although SSL encryption is optional with SASL/GSSAPI. The drawback is that Kerberos is difficult to configure with Kafka.
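For completeness, a sketch of the client-side GSSAPI settings is shown below, again added to the same Properties object; the principal, keytab path and service name are placeholders and have to match your own Kerberos setup:

    // SASL/GSSAPI (Kerberos); principal, keytab and service name are placeholders.
    props.put("bootstrap.servers", "broker1:9094");
    props.put("security.protocol", "SASL_SSL");          // SASL_PLAINTEXT also works, since SSL is optional here
    props.put("sasl.mechanism", "GSSAPI");
    // The primary of the Kerberos principal the brokers run as.
    props.put("sasl.kerberos.service.name", "kafka");
    props.put("sasl.jaas.config",
        "com.sun.security.auth.module.Krb5LoginModule required "
            + "useKeyTab=true storeKey=true "
            + "keyTab=\"/etc/security/keytabs/client.keytab\" "
            + "principal=\"client@EXAMPLE.COM\";");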

To keep things short and simple: use SASL/SCRAM or SASL/GSSAPI (Kerberos) for the authentication layer.

Authorization (ACL)

After client authentication is complete, Kafka needs to decide what the client can and cannot do. This is where authorization comes into the picture, controlled by ACLs (Access Control Lists). ACLs are what you would expect them to be: lists of which clients may perform which operations on which resources. In standard Kafka, the "SimpleAclAuthorizer" is used.

ACLs are very helpful for preventing unauthorized writes to and reads from topics.

With the default authorizer, ACLs are stored in ZooKeeper. For that reason it is important to secure ZooKeeper and make sure that only Kafka brokers are allowed to write to it (zookeeper.set.acl=true).
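ACLs are usually managed with the kafka-acls command-line tool, but they can also be created programmatically through the AdminClient, as in the sketch below. The principal, topic name and bootstrap address are placeholders, and the admin client itself has to connect with credentials that are allowed to manage ACLs:

    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.common.acl.AccessControlEntry;
    import org.apache.kafka.common.acl.AclBinding;
    import org.apache.kafka.common.acl.AclOperation;
    import org.apache.kafka.common.acl.AclPermissionType;
    import org.apache.kafka.common.resource.PatternType;
    import org.apache.kafka.common.resource.ResourcePattern;
    import org.apache.kafka.common.resource.ResourceType;

    public class CreateTopicAcl {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            // Placeholder address; add the SSL/SASL client settings from the sections above.
            props.put("bootstrap.servers", "broker1:9094");

            try (AdminClient admin = AdminClient.create(props)) {
                // Allow the user "alice" to read from the topic "payments" from any host.
                ResourcePattern topic =
                    new ResourcePattern(ResourceType.TOPIC, "payments", PatternType.LITERAL);
                AccessControlEntry allowRead =
                    new AccessControlEntry("User:alice", "*", AclOperation.READ, AclPermissionType.ALLOW);

                admin.createAcls(Collections.singletonList(new AclBinding(topic, allowRead)))
                     .all()
                     .get();   // block until the broker has applied the ACL
            }
        }
    }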

Now, to learn how to manage Kafka security with a tool, follow this link.
