Author: Shashikant Tanti

Event Sourcing CQRS using Axon

Reading Time: 4 minutes Event Sourcing and CQRS are two of the original patterns for data management in a Microservices architecture. In this blog, we look at the various data management patterns for microservices. What is Event Sourcing? At a high level, Event Sourcing means storing the state of the application in the form of domain events. I have also included a working example of Event Sourcing using Axon and Spring Continue Reading
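The core idea behind the pattern can be sketched in plain Java, independent of Axon: instead of storing current state, the aggregate appends domain events and rebuilds its state by replaying them. This is a minimal illustrative sketch (the `Account`, `Deposited`, and `Withdrawn` names are hypothetical, not from the Axon post), assuming Java 16+ for records.

```java
import java.util.ArrayList;
import java.util.List;

// A minimal event-sourced bank account: state is derived by replaying events.
public class EventSourcingSketch {

    interface Event {}
    record Deposited(int amount) implements Event {}
    record Withdrawn(int amount) implements Event {}

    static class Account {
        private final List<Event> events = new ArrayList<>();
        private int balance; // derived state, never persisted directly

        void deposit(int amount) { apply(new Deposited(amount)); }

        void withdraw(int amount) {
            if (amount > balance) throw new IllegalStateException("insufficient funds");
            apply(new Withdrawn(amount));
        }

        private void apply(Event e) {
            // Mutate state from the event, then append it to the event log
            if (e instanceof Deposited d) balance += d.amount();
            else if (e instanceof Withdrawn w) balance -= w.amount();
            events.add(e);
        }

        // Rehydrate an aggregate from its stored event stream
        static Account replay(List<Event> history) {
            Account a = new Account();
            history.forEach(a::apply);
            return a;
        }

        int balance() { return balance; }
        List<Event> events() { return events; }
    }

    public static void main(String[] args) {
        Account a = new Account();
        a.deposit(100);
        a.withdraw(30);
        // Replaying the event log reconstructs the same state
        Account restored = Account.replay(a.events());
        System.out.println(restored.balance()); // prints 70
    }
}
```

In Axon the same roles are played by aggregates annotated for command and event handling, with the framework managing the event store and replay.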

Spring-Webflux: How to test controllers?

Reading Time: 3 minutes Introduction: While working with Spring MVC, you might have used Spring MockMvc to test Spring Web MVC controllers. The MockMvc class is part of the Spring MVC Test framework, which helps in testing controllers without explicitly starting a Servlet container. But this will not work if you are using Spring Boot WebFlux. If you have a Spring application built with WebFlux, the MVC controllers can be tested Continue Reading

Axon vs Kafka

Reading Time: 4 minutes Introduction One of the most common discussion points that comes up regularly in interactions with customers, prospects, and forums is how Axon compares to Apache Kafka. Can one do the job of the other? Are they complementary to each other? Can they work together? Does Axon provide any capabilities to work with Kafka? Apache Kafka is a very popular system for publishing and consuming events. Its architecture Continue Reading

Salient Features of Spring WebFlux

Reading Time: 4 minutes What is Spring WebFlux? Spring WebFlux is a fully non-blocking web framework built on Project Reactor that makes it possible to build reactive applications on the HTTP layer. WebFlux offers a router functions feature that applies functional programming to the web layer, bypassing declarative controllers and RequestMappings. It requires you to import Reactor as a core dependency. WebFlux was added in Spring 5 as a reactive Continue Reading

Spring Cloud GCP for Pub/Sub

Reading Time: 3 minutes This is the first blog in the Spring Cloud GCP – Cloud Pub/Sub series. In this blog, we will create an application that receives messages sent by a sender application. We will use Google Cloud Pub/Sub as the underlying messaging system, and we will use Spring to integrate it with our application. As part of this blog, we have to download the basic Continue Reading

Project Loom-OpenJDK

Reading Time: 4 minutes 1. Project Loom Project Loom is an attempt by the OpenJDK community to introduce a lightweight concurrency construct to Java. The prototypes for Loom so far have introduced a change in the JVM as well as the Java library. Before we discuss the various concepts of Loom, let’s discuss the current concurrency model in Java. 2. Java’s Concurrency Model Presently, Thread represents the core abstraction of concurrency in Java. Continue Reading
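The "current concurrency model" the post refers to can be shown in a few lines of plain Java: each `java.lang.Thread` wraps a relatively heavyweight OS thread, which is exactly the cost Loom's lightweight threads aim to avoid. A minimal sketch (the `worker-1` name is arbitrary):

```java
// Today's model: each java.lang.Thread maps 1:1 to an OS thread,
// so creating many of them is expensive in memory and scheduling cost.
public class ThreadModelSketch {
    public static void main(String[] args) throws InterruptedException {
        Runnable task = () ->
                System.out.println("running on " + Thread.currentThread().getName());

        Thread t = new Thread(task, "worker-1"); // one OS thread per Thread object
        t.start();
        t.join(); // block the caller until the worker finishes
        System.out.println("done");
    }
}
```

Loom's prototypes keep this same `Thread` API but let the JVM multiplex many lightweight threads over a small pool of OS threads.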

How to implement Data Pipelines with the help of Beam

Reading Time: 4 minutes Throughout this blog, I will take a deeper look at this data processing model and explore its pipeline structures and how to process them. Apache Beam Apache Beam is one of the latest projects from Apache: a consolidated programming model for expressing efficient data processing pipelines. It is an open-source, unified model for defining both batch and streaming data-parallel processing pipelines. The Apache Beam programming model Continue Reading

Data Mesh: The Four Principles of the Distributed Architecture

Reading Time: 4 minutes A data mesh is a decentralized architecture devised by Zhamak Dehghani, director of Next Tech Incubation, principal consultant at Thoughtworks, and a member of its Technology Advisory Board. It is an intentionally designed distributed data architecture, under centralized governance and standardization for interoperability, enabled by a shared and harmonized self-serve data infrastructure. Key uses for a data mesh Data mesh’s key aim is to enable you to get Continue Reading

Streaming Kafka Messages to Google Cloud Pub/Sub

Reading Time: 3 minutes In this blog post, I present an example that creates a pipeline to read data from one or more topics in Apache Kafka and write it into a topic in Google Pub/Sub. The example provides code samples to implement simple yet powerful pipelines, and also offers an out-of-the-box solution that you can adapt to your needs. The example is built with Apache Beam and can be downloaded here. We hope you will find this Continue Reading

Big Data Processing with Apache Beam

Reading Time: 4 minutes Introduction Every minute of every day, vast amounts of data are generated from a variety of data sources, so it is very tedious to extract and process information from them. Apache Beam comes into the picture to solve these problems. Apache Beam is an open-source, unified programming model to define and execute data processing pipelines and transformations, including ETL and processing batch Continue Reading

Google BigQuery: Cloud Data Warehouse

Reading Time: 6 minutes BigQuery is a data warehouse for working with large amounts of data. With BigQuery, one can collect data from various sources, store the data, analyze the data, and eventually visualize the analysis in multiple ways. This blog talks about BigQuery, its various features, and use cases, and will go through the following topics: Introduction, BigQuery Working, and Features of BigQuery. Introduction Ever since the dawn of computing, and subsequently the internet, Continue Reading

Java 8: Lambda Streams

Reading Time: 4 minutes The official Java 8 release came with a myriad of features, the most prominent of which are undoubtedly lambdas and the Java stream API. Many projects upgraded to Java 8 just to leverage the sweet lambda syntax, or because existing frameworks updated themselves to use them. Java streams are no less important. What Are Streams in Java? The whole idea of Java streams is to Continue Reading
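The declarative flavor the post describes is easy to see in a small pipeline. This is a generic illustrative example (not taken from the post itself), using only the Java 8 stream API:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class StreamSketch {
    public static void main(String[] args) {
        List<Integer> nums = Arrays.asList(1, 2, 3, 4, 5, 6);

        // Declarative pipeline: keep the even numbers, square them,
        // and collect the results into a new list.
        List<Integer> squaresOfEvens = nums.stream()
                .filter(n -> n % 2 == 0)
                .map(n -> n * n)
                .collect(Collectors.toList());

        System.out.println(squaresOfEvens); // prints [4, 16, 36]
    }
}
```

The lambda arguments to `filter` and `map` are the "sweet lambda syntax" the post mentions; before Java 8 each would have required an anonymous inner class.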

Vert.x — Learnings about a reactive framework

Reading Time: 4 minutes What is Vert.x? A typical Vert.x application consists of multiple verticles, which are modules that can be deployed and scaled independently. Verticles communicate with each other by sending messages over an event bus. Since Vert.x is a polyglot framework, we can implement each verticle in a different programming language (currently officially supported languages: Java, JavaScript, Groovy, Ruby, Ceylon, Scala, and Kotlin). Vert.x is unopinionated, which means that it doesn’t Continue Reading
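The verticles-plus-event-bus idea can be sketched in plain Java without the framework: handlers register under an address, and messages published to that address reach every registered handler. This toy sketch only illustrates the communication pattern; real Vert.x verticles use `io.vertx.core.Verticle` and `io.vertx.core.eventbus.EventBus`, and the names below are hypothetical.

```java
import java.util.Collections;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.function.Consumer;

// A toy event bus: "verticles" register handlers per address and exchange messages.
public class EventBusSketch {

    static class EventBus {
        private final Map<String, List<Consumer<String>>> handlers = new ConcurrentHashMap<>();

        // A verticle subscribes to an address
        void consumer(String address, Consumer<String> handler) {
            handlers.computeIfAbsent(address, k -> new CopyOnWriteArrayList<>()).add(handler);
        }

        // Another verticle publishes; every subscriber on the address is notified
        void publish(String address, String message) {
            handlers.getOrDefault(address, Collections.emptyList())
                    .forEach(h -> h.accept(message));
        }
    }

    public static void main(String[] args) {
        EventBus bus = new EventBus();
        // Verticle A listens on an address; verticle B publishes to it
        bus.consumer("greetings", msg -> System.out.println("received: " + msg));
        bus.publish("greetings", "hello from B"); // prints: received: hello from B
    }
}
```

Because verticles only share the bus and not each other's state, each one can be deployed, scaled, or rewritten in another language independently, which is the decoupling the post highlights.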