Kafka Messaging with Protobuf and Schema Registry in C#

Intro

In this short article, I’ll demonstrate how to use Protobuf and Confluent Schema Registry to produce and consume Kafka messages in C#.

Additionally, we'll look at how to monitor and manage Kafka topics and schemas via Confluent Control Center.

You can find the complete source code on GitHub.

Setting Up Kafka

To prepare your environment for this demo, you can use this docker-compose file to launch the necessary Docker containers. This file includes the following services:

  • Kafka
  • Zookeeper
  • Schema Registry
  • Control Center
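For reference, a condensed sketch of such a docker-compose file is shown below. The image versions, ports, and environment values here are assumptions for illustration; the actual file in the repo may contain additional settings.

```yaml
version: "3"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.3.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181

  kafka:
    image: confluentinc/cp-kafka:7.3.0
    depends_on: [zookeeper]
    ports: ["9092:9092"]
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # Internal listener for the other containers, host listener for the demo app.
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://localhost:9092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1

  schema-registry:
    image: confluentinc/cp-schema-registry:7.3.0
    depends_on: [kafka]
    ports: ["8081:8081"]
    environment:
      SCHEMA_REGISTRY_HOST_NAME: schema-registry
      SCHEMA_REGISTRY_KAFKASTORE_BOOTSTRAP_SERVERS: kafka:29092

  control-center:
    image: confluentinc/cp-enterprise-control-center:7.3.0
    depends_on: [kafka, schema-registry]
    ports: ["9021:9021"]
    environment:
      CONTROL_CENTER_BOOTSTRAP_SERVERS: kafka:29092
      CONTROL_CENTER_SCHEMA_REGISTRY_URL: http://schema-registry:8081
      CONTROL_CENTER_REPLICATION_FACTOR: 1
```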

Creating the Kafka Topic Using Confluent Control Center

Confluent Control Center is a web-based user interface for monitoring and managing Kafka clusters. It provides an intuitive way to visualize the data flowing through Kafka topics and tracks key performance metrics of the Kafka cluster, such as throughput, latency, and message rate.

With Confluent Control Center, you can:

  • Monitor Kafka cluster health: View the status of brokers, topics, and consumers in real-time, and set up alerts for abnormal behavior.
  • Manage Kafka topics: Create and modify topics, monitor topic-level metrics, and inspect messages in the topic.
  • Track performance metrics: Measure the performance of your Kafka cluster with comprehensive metrics and analytics, including message rates, byte rates, and latency.
  • Troubleshoot issues: Use built-in tools to diagnose and troubleshoot issues with your Kafka cluster.

Creating the “users” Topic

If you’ve used the docker-compose file from the previous section, Control Center should be running on http://localhost:9021/

To create a new topic, you can click on “Topics” and then the “Create topic” button:

Enter the topic name “users” and hit “Create with defaults”:

Configuring the Protobuf Schema

Once the topic is created, go to the “Schema” tab and hit “Set a schema”:

You can use the proto schema from the user.proto file:
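As a point of reference, a minimal user.proto could look like this. The field names, field numbers, and namespace below are assumptions; the actual schema in the repo may define different fields.

```proto
syntax = "proto3";

// Assumed namespace for the generated C# class.
option csharp_namespace = "KafkaProtobufDemo";

message User {
  string name = 1;
  int32 age = 2;
}
```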

Generating the C# Classes

The User class is generated from the user.proto file using the Protobuf compiler. Here is a sample command you can use if you want to regenerate the C# class yourself:

protoc --csharp_out=. user.proto
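The generated class implements Google.Protobuf's IMessage&lt;T&gt;, so instances can be created and populated like regular C# objects. Assuming the schema defines name and age fields (an assumption for this sketch), protoc maps them to PascalCase properties:

```csharp
using Google.Protobuf; // NuGet package: Google.Protobuf

var user = new User
{
    // Name/Age are assumed fields; protoc generates one
    // PascalCase property per field declared in user.proto.
    Name = "Alice",
    Age = 30
};

// IMessage<T> also provides binary round-tripping:
byte[] bytes = user.ToByteArray();
User copy = User.Parser.ParseFrom(bytes);
```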

The Demo App

The sample application is pretty straightforward. We start a Kafka producer and consumer to publish/receive User messages, serialized using the schema from the previous section.

Feel free to explore the complete source code; here, we'll concentrate on the key parts.

Producer

Spend a moment to review the StartProducer method:

It is responsible for setting up a Kafka producer with a Protobuf serializer utilizing Schema Registry. The producer sends messages with a string key and a User value to a Kafka topic.

  • The schemaRegistryConfig object is created with a Url property to connect to the schema registry.
  • ProtobufSerializer is set as the value serializer using the SetValueSerializer method.
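Putting those pieces together, a StartProducer method along these lines would match the description. The topic name, registry URL, broker address, and User fields are assumptions here; the repo's version may differ in detail.

```csharp
using System.Threading.Tasks;
using Confluent.Kafka;
using Confluent.SchemaRegistry;        // NuGet: Confluent.SchemaRegistry
using Confluent.SchemaRegistry.Serdes; // NuGet: Confluent.SchemaRegistry.Serdes.Protobuf

static async Task StartProducer()
{
    var schemaRegistryConfig = new SchemaRegistryConfig { Url = "http://localhost:8081" };
    var producerConfig = new ProducerConfig { BootstrapServers = "localhost:9092" };

    using var schemaRegistry = new CachedSchemaRegistryClient(schemaRegistryConfig);
    using var producer = new ProducerBuilder<string, User>(producerConfig)
        // Registers/looks up the Protobuf schema in Schema Registry and
        // prepends the schema id to each serialized message payload.
        .SetValueSerializer(new ProtobufSerializer<User>(schemaRegistry))
        .Build();

    var user = new User { Name = "Alice", Age = 30 }; // assumed fields
    await producer.ProduceAsync("users",
        new Message<string, User> { Key = user.Name, Value = user });
}
```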

Consumer

The consumer part looks like this:

I think the code here should be pretty self-explanatory. We set up a Kafka consumer with a Protobuf deserializer that handles the incoming User messages.
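A consumer along the lines described could be sketched as follows. The group id and the Name field are assumptions; note that ProtobufDeserializer is async-only, so it is adapted with AsSyncOverAsync() for the blocking Consume() loop.

```csharp
using System;
using System.Threading;
using Confluent.Kafka;
using Confluent.Kafka.SyncOverAsync;
using Confluent.SchemaRegistry.Serdes; // NuGet: Confluent.SchemaRegistry.Serdes.Protobuf

static void StartConsumer(CancellationToken ct)
{
    var consumerConfig = new ConsumerConfig
    {
        BootstrapServers = "localhost:9092",
        GroupId = "users-consumer", // assumed group id
        AutoOffsetReset = AutoOffsetReset.Earliest
    };

    using var consumer = new ConsumerBuilder<string, User>(consumerConfig)
        // ProtobufDeserializer exposes an async API; AsSyncOverAsync()
        // adapts it for the synchronous Consume() call.
        .SetValueDeserializer(new ProtobufDeserializer<User>().AsSyncOverAsync())
        .Build();

    consumer.Subscribe("users");
    while (!ct.IsCancellationRequested)
    {
        var result = consumer.Consume(ct);
        Console.WriteLine($"Received user: {result.Message.Value.Name}"); // assumed field
    }
}
```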

Running the App

If everything is configured correctly, executing the app should produce a result similar to this:

You can check the new messages in Control Center as well:

Summary

This article has demonstrated how to use Protobuf with Kafka and C# to produce and consume messages while also highlighting the use of Schema Registry for schema management.

Thanks for reading, and see you next time!

Resources

  1. GitHub Repo
