Set Up and Run Apache Kafka on Windows

To get started with Apache Kafka on Windows, follow the steps in this quick-start tutorial.

Note: Your computer must have Java 8+ installed to run Kafka.
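
You can confirm that a suitable Java version is installed by running the following command in a command prompt:

java -version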

Follow these steps to set up and run Apache Kafka on a local machine:

Downloading Apache Kafka

Go to the official Apache Kafka download page at https://kafka.apache.org/downloads and download the latest stable binary release.

Extracting the Downloaded File

Keep the extraction path short: do not place the Kafka files in deeply nested folders or folders with long names. A long path causes the Kafka server to fail with the error "The input line is too long. The syntax of the command is incorrect." Move the downloaded .tgz file to the root of the C: drive and extract it there.
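
For example, assuming the downloaded archive is named kafka_2.13-3.5.1.tgz (the same version used in the commands later in this tutorial) and you are on Windows 10 or later, which includes a built-in tar command, you can extract it like this:

cd C:\
tar -xzf kafka_2.13-3.5.1.tgz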

Change Default Configurations

Go to the directory where the Apache Kafka files were extracted and modify the ZooKeeper data directory path in the 'config/zookeeper.properties' file:

# from
dataDir=/tmp/zookeeper

# to
dataDir=D:/kafka/data/zookeeper

Next, open the 'config/server.properties' file and update the Kafka log directory path as shown in the example below:

# from
log.dirs=/tmp/kafka-logs

# to
log.dirs=D:/kafka/data/kafka-logs

In this example, the D: drive is used to store Kafka data and logs. If permission issues prevent you from writing data and logs to the C: drive, use a different drive.

Creating Data Folders for ZooKeeper and Apache Kafka

First, create a "kafka" folder on the D: drive (to match the paths configured above). Inside the "kafka" folder, create a "data" folder, and within the "data" folder create "zookeeper" and "kafka-logs" folders. For example, from a command prompt:
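
mkdir D:\kafka\data\zookeeper
mkdir D:\kafka\data\kafka-logs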

Starting ZooKeeper and Apache Kafka

Open a command prompt and navigate to the '\bin\windows' directory within the extracted Kafka folder by entering the command 'cd C:\kafka_2.13-3.5.1\bin\windows'. Start ZooKeeper by executing the 'zookeeper-server-start.bat' script with the 'config\zookeeper.properties' file using the following command:

.\zookeeper-server-start.bat ..\..\config\zookeeper.properties

Make sure that ZooKeeper started successfully; by default it listens for client connections on port 2181.

Next, open a new command prompt and navigate to the '\bin\windows' directory within the extracted Kafka folder by using the command 'cd C:\kafka_2.13-3.5.1\bin\windows'. Start Apache Kafka by executing the 'kafka-server-start.bat' script with the 'config\server.properties' file using the following command:

.\kafka-server-start.bat ..\..\config\server.properties

Also make sure that the Apache Kafka broker started successfully; by default it listens on port 9092.

Creating a Topic

A topic is required to store events. A topic in Kafka is like a table in a database where data is stored. You can have multiple topics for different kinds of events. To create a new topic, navigate to the '\bin\windows' directory within the extracted Kafka folder by using the command 'cd C:\kafka_2.13-3.5.1\bin\windows', then execute the following command, replacing 'my-topic-name' with the name of your topic:

.\kafka-topics.bat --create --topic my-topic-name --partitions 5 --replication-factor 1 --bootstrap-server localhost:9092

You can also view details of the new topic by running the following command:

.\kafka-topics.bat --describe --topic my-topic-name --bootstrap-server localhost:9092

Writing Events to the Topic

A Kafka client communicates with the Kafka broker to write events. After the events are written, the broker will store the events for as long as you want.

Kafka client libraries are available for many programming languages and can be used to send events to Kafka topics. An application that sends data to a Kafka topic is called a producer application.
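
As a minimal sketch, a producer application using the official Java client library (the 'kafka-clients' dependency) might look like the following; the class name is illustrative, and the broker address and topic name match the examples in this tutorial:

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class SimpleProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Broker address used throughout this tutorial
        props.put("bootstrap.servers", "localhost:9092");
        // Serialize keys and values as plain strings
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        // try-with-resources closes the producer and flushes pending events
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("my-topic-name", "This is my first event"));
            producer.send(new ProducerRecord<>("my-topic-name", "This is my second event"));
        }
    }
}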

You can also use the console Producer client to write events to the topic. By default, each line you enter is treated as a separate event:

> cd C:\kafka_2.13-3.5.1\bin\windows
> .\kafka-console-producer.bat --topic my-topic-name --bootstrap-server localhost:9092
> This is my first event
> This is my second event
> This is my third event

The console Producer client can be stopped by pressing Ctrl+C at any time.

Reading Events from the Topic

A Kafka client communicates with the Kafka broker to read events.

You can use Kafka client libraries in your application to read events from Kafka topics. An application that reads data from Kafka topics is called a consumer application.
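
Similarly, a minimal consumer sketch using the official Java client library might look like the following; the class name and group id are illustrative:

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class SimpleConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        // Consumer group id (illustrative name)
        props.put("group.id", "my-consumer-group");
        // Start from the beginning of the topic if no offsets are stored yet
        props.put("auto.offset.reset", "earliest");
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("my-topic-name"));
            // Poll a few times and print whatever events arrive
            for (int i = 0; i < 5; i++) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println(record.value());
                }
            }
        }
    }
}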

You can also use the console Consumer client to read the events that you created. Run the following commands to read events:

> cd C:\kafka_2.13-3.5.1\bin\windows
> .\kafka-console-consumer.bat --topic my-topic-name --from-beginning --bootstrap-server localhost:9092

You can stop the Consumer client by pressing Ctrl+C at any time, followed by entering Y (Yes).

Stopping the Kafka Services

First, stop the Kafka console Producer and Consumer clients.

Next, stop the Kafka broker by pressing Ctrl+C, followed by entering Y (Yes).

Lastly, stop ZooKeeper by pressing Ctrl+C, followed by entering Y (Yes).
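
Alternatively, the Kafka distribution ships stop scripts in the same '\bin\windows' directory; as a sketch, you can run them from a command prompt (stop the broker before ZooKeeper):

> cd C:\kafka_2.13-3.5.1\bin\windows
> .\kafka-server-stop.bat
> .\zookeeper-server-stop.bat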

Import/Export Data as Streams of Events into Kafka

Sometimes, you may need to bring data from existing relational databases or messaging systems into Kafka. To achieve this, you can use Kafka Connect. Kafka Connect is a tool that streams data reliably and durably between Kafka and external systems. It can continuously import data from external systems into Kafka and export data from Kafka to external systems.
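
As a rough sketch, the Kafka distribution includes a standalone Connect runner and sample file-connector configuration files; a typical invocation looks like the following (file names can vary by version, and newer releases may require pointing 'plugin.path' in 'connect-standalone.properties' at the bundled connect-file JAR):

> cd C:\kafka_2.13-3.5.1\bin\windows
> .\connect-standalone.bat ..\..\config\connect-standalone.properties ..\..\config\connect-file-source.properties ..\..\config\connect-file-sink.properties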