Use Apache Kafka with PHP

Updated 17 April 2020

Previously, we set up and configured Kafka on our Ubuntu 18.04 machine. Now we will use Apache Kafka with PHP to produce as well as consume messages.

With the help of Kafka we can continuously stream data to the cluster, for example website visit history, financial transactions, online shopping orders, application logs, etc., and apply machine learning or aggregate the data for further analysis. This all takes seconds or minutes instead of hours and days.

You can read more on the Apache Kafka page.

Now we start with Apache Kafka and PHP. Before that, you may follow this link to install ZooKeeper and Apache Kafka on Ubuntu 18.04. If you are already done with the installation, then follow the steps below.

What will we do?

  • Install PHP Client
  • Create Consumer
  • Create Producer
  • Test Environment

Install PHP Client :-

There are several PHP clients for Kafka. I recommend using the kafka-php client installed via Composer (it provides the \Kafka\* classes used below), because it shows better performance than some other PHP clients.

Now go to the same folder in which Kafka is configured and run the Composer command to install the client.
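A minimal sketch of that install step, assuming Composer is already available and that the client used in the examples below is the weiboad kafka-php library, published on Packagist as nmred/kafka-php (package name assumed here):

composer require nmred/kafka-php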

Create Consumer :-

Create a file "Consumer.php" with any editor and set the broker to "127.0.0.1:9092", as the Kafka cluster is local with one broker (node).

Add the following content to the file.

<?php
require 'vendor/autoload.php';

// Configure the consumer: broker list, consumer group, broker version and topic.
$config = \Kafka\ConsumerConfig::getInstance();
$config->setMetadataRefreshIntervalMs(10000);
$config->setMetadataBrokerList('127.0.0.1:9092');
$config->setGroupId('test');
$config->setBrokerVersion('1.0.0');
$config->setTopics(['test']);

$consumer = new \Kafka\Consumer();

// Start consuming: the callback is invoked for every message received.
$consumer->start(function($topic, $part, $message) {
    var_dump($message);
});

Save and exit the file with "Esc" + ":wq!".

Create Producer :-

A producer works in two modes – asynchronous, where a callback supplies the messages and the client sends them on its own, and synchronous, where you call send() yourself with the messages to produce.

For asynchronous messages, create a file Producer.php in any editor.

Add the following content to the file.

<?php
require 'vendor/autoload.php';

// Configure the producer: broker list, broker version, acknowledgements and produce interval.
$config = \Kafka\ProducerConfig::getInstance();
$config->setMetadataRefreshIntervalMs(10000);
$config->setMetadataBrokerList('127.0.0.1:9092');
$config->setBrokerVersion('1.0.0');
$config->setRequiredAck(1);
$config->setIsAsyn(false);
$config->setProduceInterval(500);

// The callback returns the messages to be produced to the "test" topic.
$producer = new \Kafka\Producer(
    function() {
        return [
            [
                'topic' => 'test',
                'value' => 'Cloudkul....message.',
                'key' => 'testkey',
            ],
        ];
    }
);

// Success and error callbacks.
$producer->success(function($result) {
    var_dump($result);
});
$producer->error(function($errorCode) {
    var_dump($errorCode);
});
$producer->send(true);

Save and exit the file with "Esc" + ":wq!".

For synchronous messages, create a file ProducerSyc.php in any editor. Here the producer produces 200 messages in a loop.

Add the following content to the file.

<?php
require 'vendor/autoload.php';

// Configure the producer the same way as before.
$config = \Kafka\ProducerConfig::getInstance();
$config->setMetadataRefreshIntervalMs(10000);
$config->setMetadataBrokerList('127.0.0.1:9092');
$config->setBrokerVersion('1.0.0');
$config->setRequiredAck(1);
$config->setIsAsyn(false);
$config->setProduceInterval(500);

$producer = new \Kafka\Producer();

// Send 200 messages to the "test" topic, one send() call per message.
for ($i = 0; $i < 200; $i++) {
    $producer->send([
        [
            'topic' => 'test',
            'value' => 'Cloudkul... message #' . $i,
            'key' => '',
        ],
    ]);
}

Save and exit the file with "Esc" + ":wq!".

Test Environment :-

Now we have all the required setup on our Ubuntu 18.04 machine, so first we start our consumer by running it in the terminal.
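Assuming the PHP CLI is installed and Consumer.php is in the current directory, the command would look something like:

php Consumer.php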

Then, from another terminal, we will try to send a message with the help of the producer.
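Again assuming the file name used above, from the second terminal:

php Producer.php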

And also check the synchronous producer.
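Similarly, for the synchronous producer file created above:

php ProducerSyc.php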

Now, if we look at the consumer terminal, we find all the messages there, exactly as expected.
