Apache Kafka

Distributed Transactions and Saga Patterns

In a KnolX session organized by Knoldus, we discussed the idea of following Saga patterns. To make that discussion more accessible, I’d like to share the session through this blog. Service-oriented architecture has given us enough advantages to become a predominant architecture in our industry, but it can’t be all sunshine and rainbows. There are use cases where monoliths are not only Continue Reading

Code Combat II: The Code Battle For The Vanguard Continues…

“If you can dream it, you can do it.” – Walt Disney. For some, coding is a job. For some, it is an exercise. But for us folks here at Knoldus, it’s a passion. So, to bring a twist into the daily work schedule, Knoldus held an overnight hackathon competition within the organization on 18th May 2018, which presented an opportunity for every Knolder (employees Continue Reading

Kafka Streams

Interactive Queries in Apache Kafka

Apache Kafka v0.10 introduced a new feature, the Kafka Streams API – a client library which can be used for building applications and microservices, where the input and output data can be stored in Kafka clusters. Kafka Streams provides state stores, which can be used by stream processing applications to store and query data. Every task in Kafka Streams uses one or more state stores which Continue Reading
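As a companion to the excerpt above, here is a minimal sketch of a Kafka Streams application that materializes a count into a state store and then reads it back with an interactive query. It assumes a newer Kafka Streams release than 0.10 (the Scala DSL arrived around 2.0, and the two-argument store() call shown here is the pre-2.5 form); the application id, broker address, topic and store names are all hypothetical.

import java.util.Properties

import org.apache.kafka.streams.scala.ImplicitConversions._
import org.apache.kafka.streams.scala.Serdes._
import org.apache.kafka.streams.scala.StreamsBuilder
import org.apache.kafka.streams.scala.kstream.Materialized
import org.apache.kafka.streams.state.QueryableStoreTypes
import org.apache.kafka.streams.{KafkaStreams, StreamsConfig}

object WordCountWithStateStore extends App {
  val props = new Properties()
  props.put(StreamsConfig.APPLICATION_ID_CONFIG, "word-count-app")    // hypothetical app id
  props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092") // assumed local broker

  val builder = new StreamsBuilder
  builder
    .stream[String, String]("words-topic")                 // hypothetical input topic
    .groupBy((_, word) => word)
    .count()(Materialized.as("word-counts-store"))         // backing state store

  val streams = new KafkaStreams(builder.build(), props)
  streams.start()

  // Interactive query: once the instance is RUNNING, read the latest count
  // for a key directly from the local state store, without going through a topic.
  val store = streams.store("word-counts-store",
    QueryableStoreTypes.keyValueStore[String, Long]())
  println(s"count for 'kafka': ${store.get("kafka")}")
}

In a real service you would typically expose such a query behind an HTTP endpoint, so other applications can look up the latest state without consuming the changelog themselves.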

Structured Streaming: What is it?

With the advent of streaming frameworks like Spark Streaming, Flink and Storm, developers stopped worrying about issues related to a streaming application, such as fault tolerance (i.e., zero data loss) and real-time processing of data, and started focusing only on solving business challenges. The reason is that these frameworks provide inbuilt support for all of them. For example, in Spark Streaming, by just adding Continue Reading
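To make the fault-tolerance point concrete, here is a minimal Structured Streaming sketch where the only thing the developer adds for recovery is a checkpoint location; offsets and aggregation state are persisted there so the query can resume after a failure. The socket source, host, port and checkpoint path are assumptions for illustration only.

import org.apache.spark.sql.SparkSession

object CheckpointedWordCount extends App {
  val spark = SparkSession.builder()
    .appName("structured-streaming-word-count")
    .master("local[*]")
    .getOrCreate()
  import spark.implicits._

  // Hypothetical socket source: one line of text per event.
  // Note: the socket source is demo-only; true zero data loss needs a replayable source such as Kafka.
  val lines = spark.readStream
    .format("socket")
    .option("host", "localhost")
    .option("port", 9999)
    .load()

  val counts = lines.as[String]
    .flatMap(_.split("\\s+"))
    .groupBy("value")
    .count()

  // The checkpoint location is what gives the query fault tolerance:
  // progress and state are written there and restored on restart.
  val query = counts.writeStream
    .outputMode("complete")
    .format("console")
    .option("checkpointLocation", "/tmp/word-count-checkpoint")
    .start()

  query.awaitTermination()
}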

A Beginner’s Guide to Deploying a Lagom Service Without ConductR

How do you deploy a Lagom service without ConductR? This question has been asked and answered by many on different forums. For example, take a look at this question on StackOverflow – Lagom without ConductR? Here the user is trying to find out whether it is possible to use Lagom in production without ConductR. To which the best answer that came up was – “Yes, it is Continue Reading

Kafka And Spark Streams: The Happily Ever After!!

Hi everyone, today we are going to understand a bit about using Spark Streaming to transform and transport data between Kafka topics. The demand for stream processing is increasing every day. The reason is that, often, processing big volumes of data is not enough. We need real-time processing of data, especially when we need to handle continuously increasing volumes of data and also need Continue Reading
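One way to move transformed records from one Kafka topic to another is with Structured Streaming's Kafka source and sink (the spark-sql-kafka package); the original post may instead use DStreams, so treat this as a hedged sketch. Broker address, topic names and the checkpoint path are assumptions.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object KafkaToKafka extends App {
  val spark = SparkSession.builder()
    .appName("kafka-to-kafka")
    .master("local[*]")
    .getOrCreate()
  import spark.implicits._

  // Read records from the source topic.
  val source = spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "source-topic")
    .load()

  // A trivial transformation: upper-case the value of every record.
  val transformed = source
    .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
    .withColumn("value", upper($"value"))

  // Write the transformed records to the target topic.
  val query = transformed.writeStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("topic", "target-topic")
    .option("checkpointLocation", "/tmp/kafka-to-kafka-checkpoint")
    .start()

  query.awaitTermination()
}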

Error Registering Avro Schema | Multiple Schemas In One Topic

org.apache.kafka.common.errors.SerializationException: Error registering Avro schema: {"type":"record","name":"schema1","namespace":"test","fields":[{"name":"Name","type":"string"},{"name":"Age","type":"int"},{"name":"Location","type":"string"}]}
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Schema being registered is incompatible with an earlier schema; error code: 409
    at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:170)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:188)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:245)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:237)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:232)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.registerAndGetId(CachedSchemaRegistryClient.java:59)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:91)
    at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:72)
    at io.confluent.kafka.formatter.AvroMessageReader.readMessage(AvroMessageReader.java:158)
    at kafka.tools.ConsoleProducer$.main(ConsoleProducer.scala:57)
    at kafka.tools.ConsoleProducer.main(ConsoleProducer.scala)

You might have come across a similar exception while working with Avro schemas. Kafka throws this exception due to a compatibility issue Continue Reading
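When the 409 happens because you genuinely want several record types on one topic, one hedged workaround is to change the subject naming strategy on the producer so each record type is registered under its own subject instead of the shared <topic>-value subject. The config key and strategy class below come from the Confluent Avro serializer (available in its newer releases); broker and registry addresses are assumptions, and relaxing the subject's compatibility setting is an alternative approach.

import java.util.Properties

import org.apache.kafka.clients.producer.{KafkaProducer, ProducerConfig}

object AvroProducerConfig extends App {
  val props = new Properties()
  props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")   // assumed broker
  props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
    "io.confluent.kafka.serializers.KafkaAvroSerializer")
  props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
    "io.confluent.kafka.serializers.KafkaAvroSerializer")
  props.put("schema.registry.url", "http://localhost:8081")              // assumed registry URL

  // Register each record type under its own subject (<topic>-<record name>)
  // instead of the default <topic>-value, so unrelated schemas published to the
  // same topic are no longer compatibility-checked against each other.
  props.put("value.subject.name.strategy",
    "io.confluent.kafka.serializers.subject.TopicRecordNameStrategy")

  val producer = new KafkaProducer[String, AnyRef](props)
  // ... build GenericRecords for schema1, schema2, ... and send them to the topic.
  producer.close()
}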

Spark Streaming with Kafka

Assimilation of Spark Streaming With Kafka

As we know, Spark is used at a wide range of organizations to process large datasets, and it seems like Spark is becoming mainstream. In this blog we will talk about the assimilation of Spark Streaming with Kafka. So, let’s get started. How can Kafka be integrated with Spark? Kafka provides a messaging and integration platform for Spark Streaming. Kafka acts as the central hub for real-time streams of Continue Reading
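For readers who want a starting point, the sketch below wires Spark Streaming (the DStream API) to Kafka through a direct stream, assuming the spark-streaming-kafka-0-10 integration is on the classpath. The broker address, consumer group and topic name are placeholders.

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.{Seconds, StreamingContext}

object SparkStreamingWithKafka extends App {
  val conf = new SparkConf().setAppName("spark-streaming-with-kafka").setMaster("local[*]")
  val ssc = new StreamingContext(conf, Seconds(5))

  val kafkaParams = Map[String, Object](
    "bootstrap.servers"  -> "localhost:9092",        // assumed broker address
    "key.deserializer"   -> classOf[StringDeserializer],
    "value.deserializer" -> classOf[StringDeserializer],
    "group.id"           -> "spark-streaming-group", // hypothetical consumer group
    "auto.offset.reset"  -> "latest",
    "enable.auto.commit" -> (false: java.lang.Boolean)
  )

  // Direct stream: each Spark executor reads its share of the topic's partitions in parallel.
  val stream = KafkaUtils.createDirectStream[String, String](
    ssc,
    PreferConsistent,
    Subscribe[String, String](Seq("source-topic"), kafkaParams)
  )

  stream.map(record => record.value).print()

  ssc.start()
  ssc.awaitTermination()
}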

KnolX: Learning Kafka Streams with Scala

Hello everyone, Knoldus organized a session on 22nd September 2017. The topic was “Learning Kafka Streams with Scala”. Many people attended and enjoyed the session. In this blog post, I am going to share the slides & video of the session. Slides: Video: If you have any queries, please feel free to comment below.

Case Study to understand Kafka Consumer and its offsets

In this blog post, we will mainly discuss the Kafka consumer and its offsets. We will understand this using a case study implemented in Scala. This blog post assumes that you are aware of basic Kafka terminology. CASE STUDY: The producer is continuously producing records to the source topic. The consumer is consuming those records from the same topic, as it has subscribed to that topic. Continue Reading
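To ground the offset discussion, here is a minimal consumer sketch that disables auto-commit and commits offsets only after the polled records have been processed; the committed offset is where the group resumes after a restart or rebalance. Broker, group id and topic name are assumptions, and the converters import targets Scala 2.13.

import java.time.Duration
import java.util.{Collections, Properties}

import org.apache.kafka.clients.consumer.{ConsumerConfig, KafkaConsumer}

import scala.jdk.CollectionConverters._

object OffsetAwareConsumer extends App {
  val props = new Properties()
  props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")  // assumed broker
  props.put(ConsumerConfig.GROUP_ID_CONFIG, "case-study-group")         // hypothetical group id
  props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
    "org.apache.kafka.common.serialization.StringDeserializer")
  props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
    "org.apache.kafka.common.serialization.StringDeserializer")
  // Commit offsets ourselves, only after the records have actually been processed.
  props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false")

  val consumer = new KafkaConsumer[String, String](props)
  consumer.subscribe(Collections.singletonList("source-topic"))         // assumed topic name

  while (true) {
    val records = consumer.poll(Duration.ofMillis(500))
    for (record <- records.asScala) {
      println(s"partition=${record.partition} offset=${record.offset} value=${record.value}")
    }
    // Mark everything polled so far as processed for this consumer group.
    consumer.commitSync()
  }
}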

Joins in Kafka

Join Semantics in Kafka Streams

Introduction to core concepts: Apache Kafka is a distributed streaming platform which enables you to publish and subscribe to streams of records, and also lets you process these streams of records as they occur. Kafka Streams is a client library used for building applications and microservices, where the input and output data are stored in Kafka clusters. The interface KStream<K, V> is an abstraction of Continue Reading
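As a small illustration of stream-stream join semantics, the sketch below performs an inner windowed join between two KStreams, pairing records that share a key and arrive within five minutes of each other. It assumes a Kafka Streams 2.x Scala DSL; the topic names, application id and broker address are hypothetical.

import java.time.Duration
import java.util.Properties

import org.apache.kafka.streams.kstream.JoinWindows
import org.apache.kafka.streams.scala.ImplicitConversions._
import org.apache.kafka.streams.scala.Serdes._
import org.apache.kafka.streams.scala.StreamsBuilder
import org.apache.kafka.streams.{KafkaStreams, StreamsConfig}

object ClickImpressionJoin extends App {
  val props = new Properties()
  props.put(StreamsConfig.APPLICATION_ID_CONFIG, "click-impression-join") // hypothetical app id
  props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")     // assumed broker

  val builder = new StreamsBuilder
  val clicks      = builder.stream[String, String]("clicks")       // hypothetical topics,
  val impressions = builder.stream[String, String]("impressions")  // both keyed by user id

  // Inner stream-stream join: records with the same key that arrive
  // within five minutes of each other are paired up.
  val joined = clicks.join(impressions)(
    (click, impression) => s"$click joined with $impression",
    JoinWindows.of(Duration.ofMinutes(5))
  )
  joined.to("click-impression-joins")

  new KafkaStreams(builder.build(), props).start()
}

Left and outer joins follow the same shape (leftJoin and outerJoin), differing only in whether unmatched records on either side still produce output.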

A Java Lagom service which only consumes from Kafka topic (Subscriber only service)

A subscriber-only service is an application which only consumes and does not produce. We have generally seen applications which both produce and consume data from a Kafka topic, but sometimes we need to write an application which only consumes data, i.e., it consumes data from a third-party service. So in this blog I am going to explain how to write a Lagom service which only Continue Reading
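The post itself targets Lagom's Java API; for consistency with the other snippets on this page, here is a rough sketch of the same idea using Lagom's Scala API. The third-party service descriptor, topic name and message type are all assumptions.

import akka.Done
import akka.stream.scaladsl.Flow
import com.lightbend.lagom.scaladsl.api.broker.Topic
import com.lightbend.lagom.scaladsl.api.{Descriptor, Service}

// Descriptor for the third-party service whose topic we want to consume.
trait ThirdPartyService extends Service {
  def greetingsTopic(): Topic[String]

  override def descriptor: Descriptor = {
    import Service._
    named("third-party").withTopics(
      topic("greetings", greetingsTopic _)
    )
  }
}

// Subscriber-only component: it consumes the topic but exposes no topic of its own.
class GreetingsSubscriber(thirdPartyService: ThirdPartyService) {
  thirdPartyService
    .greetingsTopic()
    .subscribe
    .atLeastOnce(
      Flow[String].map { message =>
        println(s"Received greeting: $message") // replace with real processing
        Done
      }
    )
}

The subscriber is typically wired up once in the application loader, and at-least-once delivery means the processing flow should be idempotent.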

Having an Issue Ordering a Streamed DataFrame?

A few days ago, I had to perform an aggregation on a streaming DataFrame, and the moment I applied groupBy for the aggregation, the data got shuffled. Now the question arises: how do we maintain order? Yes, I can use orderBy with a streaming DataFrame in Spark Structured Streaming, but only in complete mode. There is no way of ordering streaming data in append mode or update mode. I Continue Reading
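The caveat is easiest to see in code: sorting a streaming DataFrame is only accepted after an aggregation and only with complete output mode, while append and update modes reject orderBy. The socket source, host and port below are assumptions for illustration.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.desc

object OrderedStreamingAggregation extends App {
  val spark = SparkSession.builder()
    .appName("ordered-streaming-aggregation")
    .master("local[*]")
    .getOrCreate()
  import spark.implicits._

  // Hypothetical socket source producing one word per line.
  val words = spark.readStream
    .format("socket")
    .option("host", "localhost")
    .option("port", 9999)
    .load()
    .as[String]

  val counts = words.groupBy("value").count()

  // orderBy on a streaming DataFrame works here only because the full
  // aggregated result is re-emitted every trigger in complete mode.
  val query = counts
    .orderBy(desc("count"))
    .writeStream
    .outputMode("complete")
    .format("console")
    .start()

  query.awaitTermination()
}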
