Apache Kafka is a distributed streaming platform. With its rich API (Application Programming Interface) set, we can connect almost anything to Kafka as a source of data, and on the other end we can set up a large number of consumers that will receive the stream of records for processing. Kafka is highly scalable and stores the streams of data in a reliable, fault-tolerant way. From the connectivity perspective, Kafka can serve as a bridge between many heterogeneous systems, which in turn can rely on its capabilities to transfer and persist the data provided.
In this tutorial we will install Apache Kafka on a Red Hat Enterprise Linux 8 system, create the systemd unit files for ease of management, and test the functionality with the shipped command line tools.
In this tutorial you will learn:
- How to install Apache Kafka
- How to create systemd services for Kafka and Zookeeper
- How to test Kafka with command line clients
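As a quick preview of the last two objectives, the sketch below shows roughly what a systemd unit for the Kafka broker and a command line test can look like. This is only an illustration of the approach, not the tutorial's exact steps: the install path `/opt/kafka`, the unit name `zookeeper.service`, and the topic name are assumptions made here for the example.

```
# /etc/systemd/system/kafka.service -- minimal sketch; paths and unit names are assumptions
[Unit]
Description=Apache Kafka broker
Requires=zookeeper.service
After=zookeeper.service

[Service]
Type=simple
# assuming Kafka is unpacked under /opt/kafka
ExecStart=/opt/kafka/bin/kafka-server-start.sh /opt/kafka/config/server.properties
ExecStop=/opt/kafka/bin/kafka-server-stop.sh

[Install]
WantedBy=multi-user.target
```

With the broker running, a single-node setup could be exercised with the command line clients shipped in the Kafka distribution, roughly along these lines (again assuming `/opt/kafka` and default ports):

```
# create a test topic, then publish and read back a message
/opt/kafka/bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic FirstKafkaTopic
echo "hello" | /opt/kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic FirstKafkaTopic
/opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic FirstKafkaTopic --from-beginning
```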