“It’s not whether you get knocked down, it’s whether you get up.”
– Vince Lombardi
The Kafka Producer API allows applications to send streams of data to topics in the Kafka cluster.
Looking for a way to implement a custom Kafka producer in your project? This blog post gives you an end-to-end solution for implementing this functionality using the Kafka API.
There are two ways to implement a Kafka producer:
- Implementing a custom Kafka producer using the Kafka API.
- Implementing a Lagom Kafka producer (preferable).

This post covers the first approach, in the following steps:
1. Import the KafkaProducer and ProducerRecord classes from the Producer API.
2. Set up the required configuration for producing a record. You can add this method in the class defined below, or in any configuration class.
3. Create a class that extends KafkaProducerApi.
4. Send the message to a Kafka topic. For example, Message is a case class that I would like to produce to the Kafka topic. By default, the Kafka serializer uses String as the type of both key and value. If you want to produce a custom case class, you should implement a serializer for that case class, as explained below. Add this method in the class defined above.
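The steps above can be sketched as follows. This is a minimal outline, not the exact code from the project: the object name, topic, bootstrap server address, and the sendMessage helper are all assumptions for illustration.

```scala
import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
import org.apache.kafka.common.serialization.StringSerializer

// Hypothetical producer object; names and addresses are assumptions
object MessageProducer {

  // Step 2: configuration required for producing a record
  def producerConfig: Properties = {
    val props = new Properties()
    props.put("bootstrap.servers", "localhost:9092")
    props.put("key.serializer", classOf[StringSerializer].getName)
    // Swap in your custom serializer here when producing a case class
    props.put("value.serializer", classOf[StringSerializer].getName)
    props
  }

  private val producer = new KafkaProducer[String, String](producerConfig)

  // Step 4: send a record to the given topic
  def sendMessage(topic: String, key: String, value: String): Unit = {
    val record = new ProducerRecord[String, String](topic, key, value)
    producer.send(record)
  }
}
```

Note that this sketch uses StringSerializer for both key and value; the next section shows how to plug in a serializer for a custom case class instead.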
Writing your custom serializer

By default, the key.serializer and value.serializer configurations instruct the producer how to turn the key and value objects the user provides with their ProducerRecord into bytes.
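A custom serializer implements Kafka's Serializer interface. Below is a minimal sketch for a hypothetical Message case class; the hand-rolled JSON string is for illustration only — in a real project you would use a JSON library such as play-json or circe.

```scala
import java.util
import org.apache.kafka.common.serialization.Serializer

// Hypothetical case class to be produced to the topic
case class Message(id: Int, text: String)

class MessageSerializer extends Serializer[Message] {

  override def configure(configs: util.Map[String, _], isKey: Boolean): Unit = ()

  // Turn the Message into JSON bytes so the console consumer shows readable JSON
  override def serialize(topic: String, data: Message): Array[Byte] =
    if (data == null) null
    else {
      val json = s"""{"id":${data.id},"text":"${data.text}"}"""
      json.getBytes("UTF-8")
    }

  override def close(): Unit = ()
}
```

To use it, set value.serializer in the producer configuration to the fully qualified name of this class and type the producer as KafkaProducer[String, Message].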
To test the producer:

1. Start ZooKeeper.
sudo bin/zkServer.sh start
2. Start Kafka.
bin/kafka-server-start.sh config/server.properties
3. Create the topic to which your service will produce.
bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic test
4. Run the console consumer to verify that your service produces the desired case class in JSON format.
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --from-beginning
Thanks for reading my blog. I will cover the implementation of a producer using Lagom in upcoming blog posts. Stay tuned.