Kafka

Use Case - Logistics revisited



kafka-docker-compose is a tool that lets you easily configure and set up Apache Kafka, together with components such as Kafka brokers, ZooKeeper, and Kafka Connect, in a Docker environment. With docker-compose you define and run multi-container Docker applications, where each service (like a Kafka broker or a ZooKeeper node) is declared in a docker-compose.yml file.

This approach simplifies the complexities of network configurations between these services and ensures that you have a reproducible and isolated environment for development, testing, and potentially production scenarios. It allows for easy scaling of Kafka brokers and other services within your cluster.
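
For orientation, here is a minimal, hand-written docker-compose.yml sketch for a single broker with one ZooKeeper node. It only illustrates the shape of such a file; the file generated later by kafka_docker_composer.py will differ, and the image versions, ports, and settings below are assumptions:

version: "3"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.5.0   # version is an assumption
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  broker:
    image: confluentinc/cp-kafka:7.5.0       # version is an assumption
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"                          # expose the broker to the host
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1   # single broker, so replication factor 1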

  1. Create a Kafka directory.

cd
mkdir -p ~/Kafka
  2. Follow the instructions in the link below to clone and set up the kafka-docker-compose project inside this directory.

Link to kafka-docker-compose

Let's start with a simple cluster that consists of 1 broker and 1 ZooKeeper node (ZooKeeper mode).

  1. Execute the following command (the -p flag adds JMX exporter agents for Prometheus & Grafana).

cd
cd ~/Kafka
python3 kafka_docker_composer.py -b 1 -z 1 -p
  2. Bring the cluster up with the generated docker-compose.yml file.

cd
cd ~/Kafka
docker-compose up -d
Deploy Kafka containers
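
Once the containers are running, it is worth a quick smoke test before moving on. The commands below assume the broker container is named broker-1 and listens on localhost:9092; the actual names and ports come from the generated docker-compose.yml, so adjust accordingly:

# Confirm all services defined in the compose file are up
docker-compose ps

# Create a test topic inside the broker container
# (container name and bootstrap address are assumptions)
docker exec broker-1 kafka-topics --bootstrap-server localhost:9092 \
  --create --topic smoke-test --partitions 1 --replication-factor 1

# Describe the topic to confirm the broker serves metadata
docker exec broker-1 kafka-topics --bootstrap-server localhost:9092 \
  --describe --topic smoke-test

If the -p flag was used, Prometheus is typically reachable at http://localhost:9090 and Grafana at http://localhost:3000 (default ports; another assumption).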

KaDeck

KaDeck is a specialized tool designed for working with Apache Kafka, offering a user-friendly interface for Kafka monitoring, management, and data exploration. It serves as a comprehensive client that allows developers, data engineers, and operations teams to interact with their Kafka clusters more efficiently.

The tool provides real-time visibility into Kafka topics, consumer groups, and messages, allowing users to browse and search through data streams with advanced filtering capabilities. This makes troubleshooting and debugging significantly easier compared to command-line alternatives. KaDeck also offers features for monitoring cluster performance, analyzing consumer lag, and visualizing message flow throughout the system.

One of KaDeck's key strengths is its ability to decode various message formats automatically (including Avro, JSON, and Protocol Buffers), presenting the data in a structured, readable format. The tool supports both cloud-based and on-premises Kafka deployments, making it versatile for different enterprise environments. For teams working extensively with event streaming platforms, KaDeck helps bridge the gap between technical Kafka operations and business-relevant data insights.
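
To give KaDeck something to display, you can produce a few JSON messages first. Here is a minimal sketch using the console producer inside the broker container (the broker-1 name, topic, and bootstrap address carry over the assumptions from the smoke test above):

# Pipe three logistics-flavored JSON events into the test topic
docker exec -i broker-1 kafka-console-producer \
  --bootstrap-server localhost:9092 --topic smoke-test <<'EOF'
{"orderId": 1, "status": "PICKED"}
{"orderId": 2, "status": "SHIPPED"}
{"orderId": 3, "status": "DELIVERED"}
EOF

These messages should then show up in KaDeck's data browser once it is connected to the cluster.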

  1. Try KaDeck for free!

Link to KaDeck

  2. Run the following command, changing the host ports if needed to prevent conflicts.

docker run -d \
  -e xeotek_kadeck_free="<your_email_address>" \
  -e xeotek_kadeck_port=80 \
  -e xeotek_kadeck_secret="" \
  -e xeotek_kadeck_teamid="" \
  -p 80:80 \
  --name kadeck \
  -v kadeck_data:/root/.kadeck/ \
  xeotek/kadeck:6.0.1
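
If port 80 is already in use on your machine, change the host side of the -p mapping; for example, exposing KaDeck on port 8888 instead (8888 is just an illustrative choice):

# Same command, but host port 8888 forwards to KaDeck's internal port 80
docker run -d \
  -e xeotek_kadeck_free="<your_email_address>" \
  -e xeotek_kadeck_port=80 \
  -e xeotek_kadeck_secret="" \
  -e xeotek_kadeck_teamid="" \
  -p 8888:80 \
  --name kadeck \
  -v kadeck_data:/root/.kadeck/ \
  xeotek/kadeck:6.0.1

KaDeck would then be reachable at http://localhost:8888 rather than http://localhost:80.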

  3. Open the KaDeck web UI at: http://localhost:80 (or the host port you mapped above).

  4. Log in with:

Username: admin

Password: admin
