Installing a Single-Node Kafka Cluster

In this tutorial, we will set up a single-node Kafka cluster and run it using the command line.

Step 1) Let’s start by getting the Kafka binary. You can download it from the link below: https://kafka.apache.org/

Step 2) Click on the Download button

Click on the binary download to start the download.

Kafka is downloaded to the Downloads folder.

Move the Kafka download to the Kafka directory (i.e. /home/dataengineer/kafka).
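A minimal sketch of that move, assuming the archive landed in ~/Downloads (adjust the paths if yours differ):

# Create the Kafka directory and move the downloaded archive into it
$ mkdir -p /home/dataengineer/kafka
$ mv ~/Downloads/kafka_2.12-3.6.0.tgz /home/dataengineer/kafka/
$ cd /home/dataengineer/kafka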

Step 3) Extract the Kafka archive

tar -xvzf kafka_2.12-3.6.0.tgz
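The archive extracts into a versioned folder. The bin/ and config/ paths used in the next steps are relative to it, so change into it first:

$ cd kafka_2.12-3.6.0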


Step 4) Start the Kafka environment

NOTE: Your local environment must have Java 8+ installed.
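You can quickly verify this before continuing; the command below should report version 8 or higher:

# Check the installed Java version
$ java -version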

Apache Kafka can be started using ZooKeeper.

Kafka with ZooKeeper

Run the following commands to start all services in the correct order:

# Start the ZooKeeper service

$ bin/zookeeper-server-start.sh config/zookeeper.properties

Keep this terminal running.

Open another terminal session and run:

# Start the Kafka broker service

$ bin/kafka-server-start.sh config/server.properties
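Once both services are running, you have a basic single-node Kafka environment. If you want to confirm the broker is reachable, one option (assuming the default listener on localhost:9092) is the bundled API-versions tool:

# Optional check: the broker should answer on its listener
$ bin/kafka-broker-api-versions.sh --bootstrap-server localhost:9092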



Step 5) Create a topic to store your events

Kafka is a distributed event streaming platform that lets you read, write, store, and process events (also called records or messages in the documentation) across many machines. Example events are payment transactions, geolocation updates from mobile phones, shipping orders, sensor measurements from IoT devices or medical equipment, and much more. These events are organized and stored in topics. Very simplified, a topic is similar to a folder in a filesystem, and the events are the files in that folder.

So before you can write your first events, you must create a topic.

Open another terminal session and run:

$ bin/kafka-topics.sh --create --topic quickstart-events --bootstrap-server localhost:9092

All of Kafka’s command line tools have additional options: run the kafka-topics.sh command without any arguments to display usage information. For example, it can also show you details such as the partition count of the new topic:
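$ bin/kafka-topics.sh --describe --topic quickstart-events --bootstrap-server localhost:9092

For a freshly created topic you should see output reporting PartitionCount: 1 and ReplicationFactor: 1 (the defaults for a single-node setup).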

Step 6) Write some events into the topic

A Kafka client communicates with the Kafka brokers via the network for writing (or reading) events. Once received, the brokers will store the events in a durable and fault-tolerant manner for as long as you need—even forever.

Run the console producer client to write a few events into your topic. By default, each line you enter will result in a separate event being written to the topic.

$ bin/kafka-console-producer.sh --topic quickstart-events --bootstrap-server localhost:9092

>This is my first event
>This is my second event
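You can stop the producer client with Ctrl-C at any time.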


Step 7) Read the events

Open another terminal session and run the console consumer client to read the events you just created:

$ bin/kafka-console-consumer.sh --topic quickstart-events --from-beginning --bootstrap-server localhost:9092

This is my first event
This is my second event
You can stop the consumer client with Ctrl-C at any time.
