Easy Steps: Install Apache Kafka on Debian 12<\/a><\/p>\n\n\n\nGenerate SSL\/TLS Certificates<\/h3>\n\n\n\n
To begin with, you need to generate TLS\/SSL certificates for Kafka brokers and clients.<\/p>\n\n\n\n
In this tutorial, we will be using our own self-signed SSL\/TLS certificate<\/strong>. If possible, please use commercially signed\/trusted CA certificates.<\/p>\n\n\n\nGenerate CA Private Key<\/h4>\n\n\n\n
Run the following OpenSSL command to generate a private key for your CA:<\/p>\n\n\n\n
mkdir \/etc\/ssl\/kafka<\/code><\/pre>\n\n\n\nopenssl genpkey -algorithm RSA -out \/etc\/ssl\/kafka\/ca.key<\/code><\/pre>\n\n\n\nThe command generates an RSA private key and saves it in the file \/etc\/ssl\/kafka\/ca.key<\/code>.<\/p>\n\n\n\nGenerate CA self-signed certificate<\/h4>\n\n\n\n
Once you have the private key, you can now generate the CA self-signed certificate using the command below. When the command runs, you are prompted to provide information about your CA, such as the common name, organization, location, contact email, etc. The Common Name must be provided.<\/p>\n\n\n\n
openssl req -x509 -new -key \/etc\/ssl\/kafka\/ca.key -days 3650 -out \/etc\/ssl\/kafka\/ca.crt<\/code><\/pre>\n\n\n\nSample output;<\/p>\n\n\n\n
\nYou are about to be asked to enter information that will be incorporated\ninto your certificate request.\nWhat you are about to enter is what is called a Distinguished Name or a DN.\nThere are quite a few fields but you can leave some blank\nFor some fields there will be a default value,\nIf you enter '.', the field will be left blank.\n-----\nCountry Name (2 letter code) [AU]:US<\/strong>\nState or Province Name (full name) [Some-State]:California<\/strong>\nLocality Name (eg, city) []:San Francisco<\/strong>\nOrganization Name (eg, company) [Internet Widgits Pty Ltd]:Kifarunix-Demo Inc<\/strong>\nOrganizational Unit Name (eg, section) []:Infrastracture<\/strong>\nCommon Name (e.g. server FQDN or YOUR name) []:kafka.kifarunix-demo.com<\/strong>\nEmail Address []:\n<\/code><\/pre>\n\n\n\nYou can provide all this information from the command line using the -subj <\/strong>option.<\/p>\n\n\n\nopenssl req -x509 -new -key \/etc\/ssl\/kafka\/ca.key -days 3650 -out \/etc\/ssl\/kafka\/ca.crt \\\n-subj \"\/C=US\/ST=California\/L=San Francisco\/O=Kifarunix-Demo Inc\/CN=kafka.kifarunix-demo.com\/emailAddress=admin@kifarunix-demo.com\"<\/code><\/pre>\n\n\n\nNote that it is not recommended to use a wildcard CN. Instead, use the SAN extension to define your other domains\/IPs.<\/p>\n\n\n\n
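If you want to confirm what was generated, you can inspect the new CA certificate with openssl. The sketch below is a self-contained illustration in a throwaway directory; the demo-ca subject is a placeholder, not the tutorial's /etc/ssl/kafka files.

```shell
# Illustrative only: build a throwaway CA in a temp dir and inspect it.
# The "/CN=demo-ca" subject is a placeholder, not the tutorial's CA.
dir=$(mktemp -d)
openssl genpkey -algorithm RSA -out "$dir/ca.key" 2>/dev/null
openssl req -x509 -new -key "$dir/ca.key" -days 3650 -subj "/CN=demo-ca" -out "$dir/ca.crt"
# Print the subject and validity window of the freshly created CA certificate
info=$(openssl x509 -in "$dir/ca.crt" -noout -subject -dates)
echo "$info"
rm -rf "$dir"
```

Against the files created above, the equivalent check is openssl x509 -in /etc/ssl/kafka/ca.crt -noout -subject -dates.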
Generate Server Private Key and CSR<\/h4>\n\n\n\n
Next, generate the server private key and certificate signing request (CSR).<\/p>\n\n\n\n
openssl req -new -newkey rsa:4096 -nodes -keyout \/etc\/ssl\/kafka\/server.key \\\n-out \/etc\/ssl\/kafka\/server.csr \\\n-subj \"\/C=US\/ST=California\/L=San Francisco\/O=Kifarunix-Demo Inc\/CN=kafka.kifarunix-demo.com\/emailAddress=admin@kifarunix-demo.com\"<\/code><\/pre>\n\n\n\nGenerate and Sign Server Certificate<\/h4>\n\n\n\n
Now, you need to generate the server certificate using the CSR, the CA certificate, and the CA private key.<\/p>\n\n\n\n
Note that the openssl x509 signing command does not copy extensions such as Subject Alternative Names from the CSR into the certificate, so you need to provide this information manually.<\/p>\n\n\n\n
The SAN extension allows you to include additional subject names, such as domain names or IP addresses, in a single certificate, making the certificate valid for multiple entities or alternative names.<\/p>\n\n\n\n
So, create a CNF file with your SAN extensions;<\/p>\n\n\n\n
vim \/etc\/ssl\/kafka\/san.cnf<\/code><\/pre>\n\n\n\n\nauthorityKeyIdentifier=keyid,issuer\nbasicConstraints=CA:FALSE\nkeyUsage = digitalSignature, nonRepudiation, keyEncipherment, dataEncipherment\nsubjectAltName = @alt_names\n\n[alt_names]\nDNS.1=kifarunix-demo.com\nDNS.2=*.kifarunix-demo.com\n<\/code><\/pre>\n\n\n\nthen generate and sign the server certificate;<\/p>\n\n\n\n
openssl x509 -req -in \/etc\/ssl\/kafka\/server.csr -CA \/etc\/ssl\/kafka\/ca.crt \\\n-CAkey \/etc\/ssl\/kafka\/ca.key -CAcreateserial -out \/etc\/ssl\/kafka\/server.crt \\\n-days 3650 -extfile \/etc\/ssl\/kafka\/san.cnf<\/code><\/pre>\n\n\n\nSample output;<\/p>\n\n\n\n
Certificate request self-signature ok\nsubject=C = US, ST = California, L = San Francisco, O = Kifarunix-Demo Inc, CN = kafka.kifarunix-demo.com, emailAddress = admin@kifarunix-demo.com<\/code><\/pre>\n\n\n\nCreate Kafka Keystore<\/h3>\n\n\n\n
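Before building the keystore, you can optionally confirm that the signed certificate chains to your CA and carries the SAN entries. The sketch below reproduces the CSR/signing flow with throwaway files (the example.test names are placeholders) and then runs the two verification commands.

```shell
# Illustrative only: throwaway CA, CSR, and signed server certificate.
# The example.test names are placeholders, not the tutorial's domains.
dir=$(mktemp -d)
openssl genpkey -algorithm RSA -out "$dir/ca.key" 2>/dev/null
openssl req -x509 -new -key "$dir/ca.key" -days 3650 -subj "/CN=demo-ca" -out "$dir/ca.crt"
openssl req -new -newkey rsa:2048 -nodes -keyout "$dir/server.key" \
    -subj "/CN=kafka.example.test" -out "$dir/server.csr" 2>/dev/null
cat > "$dir/san.cnf" <<'EOF'
basicConstraints=CA:FALSE
subjectAltName=DNS:kafka.example.test,DNS:*.example.test
EOF
openssl x509 -req -in "$dir/server.csr" -CA "$dir/ca.crt" -CAkey "$dir/ca.key" \
    -CAcreateserial -days 365 -extfile "$dir/san.cnf" -out "$dir/server.crt" 2>/dev/null
# The certificate should verify against the CA...
chain=$(openssl verify -CAfile "$dir/ca.crt" "$dir/server.crt")
# ...and the SAN extension should list the configured names (OpenSSL 1.1.1+)
sans=$(openssl x509 -in "$dir/server.crt" -noout -ext subjectAltName)
echo "$chain"
echo "$sans"
rm -rf "$dir"
```

Against the tutorial's files, the same checks are openssl verify -CAfile /etc/ssl/kafka/ca.crt /etc/ssl/kafka/server.crt and openssl x509 -in /etc/ssl/kafka/server.crt -noout -ext subjectAltName.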
Now that we have the server certificate and key, we need to generate a Kafka keystore.<\/p>\n\n\n\n
Convert Server Certificate to PKCS12<\/h4>\n\n\n\n
First, convert the server certificate and key into PKCS12 format. When prompted, provide the keystore password and store that password somewhere you can easily retrieve it.<\/p>\n\n\n\n
\nopenssl pkcs12 -export \\\n\t-in \/etc\/ssl\/kafka\/server.crt \\\n\t-inkey \/etc\/ssl\/kafka\/server.key \\\n\t-name kafka-broker \\\n\t-out \/etc\/ssl\/kafka\/kafka.p12\n<\/code><\/pre>\n\n\n\nCreate Kafka Java KeyStore (JKS)<\/h4>\n\n\n\n
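Before importing it into a JKS, you can sanity-check the PKCS12 bundle with openssl. The sketch below is self-contained with a throwaway self-signed certificate; the changeit password and example.test name are placeholders. It also shows the real -passout/-passin openssl options, which avoid the interactive prompt.

```shell
# Illustrative only: create a throwaway PKCS12 bundle and read it back.
# "changeit" and "kafka.example.test" are placeholders.
dir=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -keyout "$dir/server.key" \
    -subj "/CN=kafka.example.test" -days 365 -out "$dir/server.crt" 2>/dev/null
openssl pkcs12 -export -in "$dir/server.crt" -inkey "$dir/server.key" \
    -name kafka-broker -passout pass:changeit -out "$dir/kafka.p12"
# List the bundle's certificate entry and print its subject
info=$(openssl pkcs12 -in "$dir/kafka.p12" -passin pass:changeit -nokeys -clcerts 2>/dev/null \
    | openssl x509 -noout -subject)
echo "$info"
rm -rf "$dir"
```

If the subject printed here matches what you put in the server certificate, the bundle is intact and ready for import.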
Next, create Kafka Java KeyStore (JKS) and import the certificate. You will be required to set the destination keystore and source keystore passwords.<\/p>\n\n\n\n
\nkeytool -importkeystore \\\n\t-srckeystore \/etc\/ssl\/kafka\/kafka.p12 \\\n\t-destkeystore \/etc\/ssl\/kafka\/kafka.keystore.jks \\\n\t-srcstoretype pkcs12\n<\/code><\/pre>\n\n\n\nSample output;<\/p>\n\n\n\n
\nImporting keystore \/etc\/ssl\/kafka\/kafka.p12 to \/etc\/ssl\/kafka\/kafka.keystore.jks...\nEnter destination keystore password: \nRe-enter new password: \nEnter source keystore password: \nEntry for alias kafka-broker successfully imported.\nImport command completed: 1 entries successfully imported, 0 entries failed or cancelled\n<\/code><\/pre>\n\n\n\nCreate Kafka TrustStore<\/h4>\n\n\n\n
Similarly, create a Kafka truststore containing your CA root certificate. This allows clients and brokers to verify that certificates presented over TLS connections were signed by your CA.<\/p>\n\n\n\n
keytool -keystore \/etc\/ssl\/kafka\/kafka.truststore.jks -alias CARoot -import -file \/etc\/ssl\/kafka\/ca.crt<\/code><\/pre>\n\n\n\nWhen executed, you will be prompted to set the truststore password and asked whether to trust the certificate. And of course, trust it (yes<\/strong>)!<\/p>\n\n\n\nSave this password as well.<\/p>\n\n\n\n
\nEnter keystore password: \nRe-enter new password: \nOwner: CN=kafka.kifarunix-demo.com, OU=Infrastracture, O=Kifarunix-Demo Inc, L=San Francisco, ST=California, C=US\nIssuer: CN=kafka.kifarunix-demo.com, OU=Infrastracture, O=Kifarunix-Demo Inc, L=San Francisco, ST=California, C=US\nSerial number: 3c91690b7b180a5be423280485b8ea05f3582a6\nValid from: Sun Jul 16 02:02:37 EDT 2023 until: Wed Jul 13 02:02:37 EDT 2033\nCertificate fingerprints:\n\t SHA1: ED:01:33:C4:32:41:26:A0:2D:24:BC:39:0B:DF:F6:28:A1:5B:F3:0D\n\t SHA256: D6:B7:78:58:F3:F6:41:7D:6C:A2:3B:9E:55:D6:1C:13:EA:07:0C:4D:D3:9F:3E:C5:82:EB:03:38:A9:60:1A:78\nSignature algorithm name: SHA256withRSA\nSubject Public Key Algorithm: 2048-bit RSA key\nVersion: 3\n\nExtensions: \n\n#1: ObjectId: 2.5.29.35 Criticality=false\nAuthorityKeyIdentifier [\nKeyIdentifier [\n0000: CE 9A E0 3F 0E F5 DF BF 38 F5 AE 5B 33 B9 31 E7 ...?....8..[3.1.\n0010: 3C AD A0 13 <...\n]\n]\n\n#2: ObjectId: 2.5.29.19 Criticality=true\nBasicConstraints:[\n CA:true\n PathLen: no limit\n]\n\n#3: ObjectId: 2.5.29.14 Criticality=false\nSubjectKeyIdentifier [\nKeyIdentifier [\n0000: CE 9A E0 3F 0E F5 DF BF 38 F5 AE 5B 33 B9 31 E7 ...?....8..[3.1.\n0010: 3C AD A0 13 <...\n]\n]\n\nTrust this certificate? [no]: yes\n<\/code><\/pre>\n\n\n\nConfirm your keystore\/truststore details;<\/p>\n\n\n\n
keytool -list -v -keystore \/etc\/ssl\/kafka\/kafka.keystore.jks<\/code><\/pre>\n\n\n\nkeytool -list -v -keystore \/etc\/ssl\/kafka\/kafka.truststore.jks<\/code><\/pre>\n\n\n\nIt is now time to configure Apache Kafka SSL\/TLS Encryption. This can be done by updating the server.properties<\/code><\/strong> configuration as follows.<\/p>\n\n\n\nNote that we are running Kafka in KRaft mode in our setup.<\/p>\n\n\n\n
Open the Kafka server\/broker configuration for updates;<\/p>\n\n\n\n
vim \/opt\/kafka\/config\/kraft\/server.properties<\/code><\/pre>\n\n\n\nBy default, Kafka is set to accept plain text connections, as you can see under the Socket Server Settings<\/strong> section;<\/p>\n\n\n\n\n############################# Socket Server Settings #############################\n\n# The address the socket server listens on.\n# Combined nodes (i.e. those with `process.roles=broker,controller`) must list the controller listener here at a minimum.\n# If the broker listener is not defined, the default listener will use a host name that is equal to the value of java.net.InetAddress.getCanonicalHostName(),\n# with PLAINTEXT listener name, and port 9092.\n# FORMAT:\n# listeners = listener_name:\/\/host_name:port\n# EXAMPLE:\n# listeners = PLAINTEXT:\/\/your.host.name:9092\nlisteners=PLAINTEXT:\/\/:9092,CONTROLLER:\/\/:9093\n<\/strong>\n# Name of listener used for communication between brokers.\ninter.broker.listener.name=PLAINTEXT\n<\/strong>\n# Listener name, hostname and port the broker will advertise to clients.\n# If not set, it uses the value for \"listeners\".\nadvertised.listeners=PLAINTEXT:\/\/localhost:9092\n<\/strong>\n# A comma-separated list of the names of the listeners used by the controller.\n# If no explicit mapping set in `listener.security.protocol.map`, default will be using PLAINTEXT protocol\n# This is required if running in KRaft mode.\ncontroller.listener.names=CONTROLLER\n<\/strong>\n# Maps listener names to security protocols, the default is for them to be the same. See the config documentation for more details\nlistener.security.protocol.map=CONTROLLER:PLAINTEXT,PLAINTEXT:PLAINTEXT,SSL:SSL,SASL_PLAINTEXT:SASL_PLAINTEXT,SASL_SSL:SASL_SSL<\/strong>\n<\/code><\/pre>\n\n\n\nThus, to enable SSL\/TLS connections, we will update some of the configs here and add a few more SSL settings.<\/p>\n\n\n\n
Note that controller.listener.names<\/code><\/strong> is required when running in KRaft mode. At the moment, we are running a single-node Kafka cluster; if you have a multi-node cluster, ensure you configure the SSL\/TLS settings on all nodes.<\/p>\n\n\n\nWith comment lines removed, this is how our Socket Server Settings<\/strong> section looks;<\/p>\n\n\n\n\n############################# Socket Server Settings #############################\n\nlisteners=SSL:\/\/kafka.kifarunix-demo.com:9092,CONTROLLER:\/\/kafka.kifarunix-demo.com:9093\ninter.broker.listener.name=SSL\nadvertised.listeners=SSL:\/\/kafka.kifarunix-demo.com:9092\ncontroller.listener.names=CONTROLLER\nlistener.security.protocol.map=CONTROLLER:SSL,SSL:SSL\n\n\nssl.keystore.location=\/etc\/ssl\/kafka\/kafka.keystore.jks\nssl.keystore.password=ChangeME\nssl.key.password=ChangeME\nssl.truststore.location=\/etc\/ssl\/kafka\/kafka.truststore.jks\nssl.truststore.password=ChangeME\nssl.client.auth=required\n<\/code><\/pre>\n\n\n\nUpdate the configuration according to your setup.<\/p>\n\n\n\n
Note that the line ssl.client.auth=required<\/strong><\/code> enforces SSL\/TLS client authentication. It specifies that clients connecting to the Kafka brokers must provide a valid client certificate for authentication.<\/p>\n\n\n\nAlso, if you are using KRaft, ensure you update the controller address;<\/p>\n\n\n\n
\n############################# Server Basics #############################\n\n# The role of this server. Setting this puts us in KRaft mode\nprocess.roles=broker,controller\n\n# The node id associated with this instance's roles\nnode.id=1\n\n# The connect string for the controller quorum\ncontroller.quorum.voters=1@kafka.kifarunix-demo.com:9093<\/strong>\n<\/code><\/pre>\n\n\n\nSave and exit the configuration file.<\/p>\n\n\n\n
Test and Validate Kafka SSL\/TLS Connection<\/h3>\n\n\n\nRestart Kafka Service<\/h4>\n\n\n\n
You can now restart the Kafka service to apply the changes;<\/p>\n\n\n\n
systemctl restart kafka<\/code><\/pre>\n\n\n\nCheck the logs;<\/p>\n\n\n\n
journalctl -f -u kafka<\/code><\/pre>\n\n\n\nCheck the status;<\/p>\n\n\n\nsystemctl status kafka<\/code><\/pre>\n\n\n\n
\n\u25cf kafka.service - Apache Kafka\n Loaded: loaded (\/etc\/systemd\/system\/kafka.service; disabled; preset: enabled)\n Active: active (running) since Sun 2023-07-16 07:03:36 EDT; 1min 14s ago\n Main PID: 129624 (java)\n Tasks: 90 (limit: 4642)\n Memory: 715.7M\n CPU: 14.477s\n CGroup: \/system.slice\/kafka.service\n \u2514\u2500129624 java -Xmx1G -Xms1G -server -XX:+UseG1GC -XX:MaxGCPauseMillis=20 -XX:InitiatingHeapOccupancyPercent=35 -XX:+ExplicitGCInvokesConcurrent -XX:MaxInlineLevel=15 -Djava.awt.headless=true \"-Xlog>\n\nJul 16 07:03:42 kafka.kifarunix-demo.com kafka-server-start.sh[129624]: [2023-07-16 07:03:42,877] INFO Awaiting socket connections on kafka.kifarunix-demo.com:9092. (kafka.network.DataPlaneAcceptor)\nJul 16 07:03:42 kafka.kifarunix-demo.com kafka-server-start.sh[129624]: [2023-07-16 07:03:42,881] INFO [BrokerServer id=1] Waiting for all of the authorizer futures to be completed (kafka.server.BrokerServer)\nJul 16 07:03:42 kafka.kifarunix-demo.com kafka-server-start.sh[129624]: [2023-07-16 07:03:42,882] INFO [BrokerServer id=1] Finished waiting for all of the authorizer futures to be completed (kafka.server.Broker>\nJul 16 07:03:42 kafka.kifarunix-demo.com kafka-server-start.sh[129624]: [2023-07-16 07:03:42,882] INFO [BrokerServer id=1] Waiting for all of the SocketServer Acceptors to be started (kafka.server.BrokerServer)\nJul 16 07:03:42 kafka.kifarunix-demo.com kafka-server-start.sh[129624]: [2023-07-16 07:03:42,882] INFO [BrokerServer id=1] Finished waiting for all of the SocketServer Acceptors to be started (kafka.server.Brok>\nJul 16 07:03:42 kafka.kifarunix-demo.com kafka-server-start.sh[129624]: [2023-07-16 07:03:42,882] INFO [BrokerServer id=1] Transition from STARTING to STARTED (kafka.server.BrokerServer)\nJul 16 07:03:42 kafka.kifarunix-demo.com kafka-server-start.sh[129624]: [2023-07-16 07:03:42,883] INFO Kafka version: 3.5.0 (org.apache.kafka.common.utils.AppInfoParser)\nJul 16 07:03:42 kafka.kifarunix-demo.com 
kafka-server-start.sh[129624]: [2023-07-16 07:03:42,883] INFO Kafka commitId: c97b88d5db4de28d (org.apache.kafka.common.utils.AppInfoParser)\nJul 16 07:03:42 kafka.kifarunix-demo.com kafka-server-start.sh[129624]: [2023-07-16 07:03:42,883] INFO Kafka startTimeMs: 1689505422882 (org.apache.kafka.common.utils.AppInfoParser)\nJul 16 07:03:42 kafka.kifarunix-demo.com kafka-server-start.sh[129624]: [2023-07-16 07:03:42,884] INFO [KafkaRaftServer nodeId=1] Kafka Server started (kafka.server.KafkaRaftServer)\n<\/code><\/pre>\n\n\n\nTest Client Topic Creation Over SSL\/TLS<\/h4>\n\n\n\n
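Before testing with the Kafka CLI tools, you can check the TLS handshake itself with openssl s_client. The sketch below is a self-contained illustration: it stands up a local openssl s_server with throwaway certificates (port 19092 and the localhost subject are placeholders) and verifies the handshake the way a Kafka client verifies the broker's certificate.

```shell
# Illustrative only: local TLS endpoint with throwaway certs; port 19092,
# "demo-ca" and "/CN=localhost" are placeholders, not the tutorial's setup.
dir=$(mktemp -d)
openssl genpkey -algorithm RSA -out "$dir/ca.key" 2>/dev/null
openssl req -x509 -new -key "$dir/ca.key" -days 365 -subj "/CN=demo-ca" -out "$dir/ca.crt"
openssl req -new -newkey rsa:2048 -nodes -keyout "$dir/server.key" \
    -subj "/CN=localhost" -out "$dir/server.csr" 2>/dev/null
openssl x509 -req -in "$dir/server.csr" -CA "$dir/ca.crt" -CAkey "$dir/ca.key" \
    -CAcreateserial -days 365 -out "$dir/server.crt" 2>/dev/null
# Start a TLS listener presenting the server certificate
openssl s_server -accept 19092 -cert "$dir/server.crt" -key "$dir/server.key" -quiet &
srv=$!
sleep 1
# Connect, trusting only our CA; a clean handshake reports "Verify return code: 0 (ok)"
result=$(echo | openssl s_client -connect localhost:19092 -CAfile "$dir/ca.crt" 2>/dev/null \
    | grep "Verify return code")
kill $srv 2>/dev/null
echo "$result"
rm -rf "$dir"
```

Against the broker itself, the equivalent check is echo | openssl s_client -connect kafka.kifarunix-demo.com:9092 -CAfile /etc/ssl/kafka/ca.crt; note that since ssl.client.auth=required, the broker may drop the session after the handshake unless you also present a client certificate, but the server certificate verification result is still printed.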
To simulate how a Kafka client would send a stream of data into Kafka and create a topic over SSL, we will use the kafka-topics.sh<\/code><\/strong> command on the Kafka server.<\/p>\n\n\n\nNote that the Kafka server is now using SSL\/TLS and requires client authentication via a certificate. Thus, create a properties file to define the client-broker connection properties;<\/p>\n\n\n\n
vim ~\/kafka-client-ssl-test.properties<\/code><\/pre>\n\n\n\nEnter the following content and update them accordingly!<\/p>\n\n\n\n
\nsecurity.protocol=SSL\nssl.keystore.location=\/etc\/ssl\/kafka\/kafka.keystore.jks\nssl.keystore.password=ChangeME\nssl.truststore.location=\/etc\/ssl\/kafka\/kafka.truststore.jks\nssl.truststore.password=ChangeME\n<\/code><\/pre>\n\n\n\nNote, if client authentication is not required by the broker, then you don't need the ssl.keystore.*<\/strong> settings.<\/p>\n\n\n\nNext, test the SSL\/TLS connection to Kafka;<\/p>\n\n\n\n
\n\/opt\/kafka\/bin\/kafka-topics.sh --create \\\n\t--topic testssl-topic \\\n\t--bootstrap-server kafka.kifarunix-demo.com:9092 \\\n\t--command-config kafka-client-ssl-test.properties\n<\/code><\/pre>\n\n\n\nIf all goes well, then you should see such an output;<\/p>\n\n\n\n
Created topic testssl-topic.<\/code><\/pre>\n\n\n\nList the topics;<\/p>\n\n\n\n
\/opt\/kafka\/bin\/kafka-topics.sh --list \\\n\t--bootstrap-server kafka.kifarunix-demo.com:9092 \\\n\t--command-config kafka-client-ssl-test.properties<\/code><\/pre>\n\n\n\nExclude Internal Kafka Broker Connections from SSL Authentication<\/h4>\n\n\n\n
Now, what if you want to exclude connections made from within the Kafka broker itself from SSL authentication, so that you don't have to provide a path to the SSL configuration file every time, as we did above when listing topics?<\/p>\n\n\n\n
Edit the properties file and set a PLAINTEXT listener for localhost on a specific port;<\/p>\n\n\n\n
vim \/opt\/kafka\/config\/kraft\/server.properties<\/code><\/pre>\n\n\n\nSee the highlighted configs added;<\/p>\n\n\n\n
\nlisteners=SSL:\/\/kafka.kifarunix-demo.com:9092,CONTROLLER:\/\/kafka.kifarunix-demo.com:9093,PLAINTEXT:\/\/localhost:9094<\/strong>\n...\nadvertised.listeners=SSL:\/\/kafka.kifarunix-demo.com:9092,PLAINTEXT:\/\/localhost:9094<\/strong>\n...\nlistener.security.protocol.map=CONTROLLER:SSL,SSL:SSL,PLAINTEXT:PLAINTEXT<\/strong>\n<\/code><\/pre>\n\n\n\nRestart Kafka;<\/p>\n\n\n\n
systemctl restart kafka<\/code><\/pre>\n\n\n\nEnsure the port is opened;<\/p>\n\n\n\n
ss -altnp | grep :90<\/code><\/pre>\n\n\n\n\nLISTEN 0 50 [::ffff:192.168.57.32]:9092 *:* users:((\"java\",pid=140837,fd=159)) \nLISTEN 0 50 [::ffff:192.168.57.32]:9093 *:* users:((\"java\",pid=140837,fd=131)) \nLISTEN 0 50 [::ffff:127.0.0.1]:9094 *:* users:((\"java\",pid=140837,fd=161)) \n<\/code><\/pre>\n\n\n\nYou can then run Kafka commands internally, without need for SSL authentication;<\/p>\n\n\n\n
\/opt\/kafka\/bin\/kafka-topics.sh --list --bootstrap-server localhost:9094<\/code><\/pre>\n\n\n\nAnd that is it on our guide on how to configure Apache Kafka SSL\/TLS encryption.<\/p>\n\n\n\n
Further Reading<\/h3>\n\n\n\n
Read more on the Apache Kafka Security configuration page<\/a>.<\/p>