Author: Shashikant Tanti


Reliable Database Migrations with Liquibase and Spring Boot

Reading Time: 2 minutes In this blog, we look at database migrations with the very popular Liquibase database migration library and how you can use it in the context of a Spring Boot application. Setting Up Liquibase in Spring Boot By default, Spring Boot auto-configures Liquibase when we add the Liquibase dependency to our build file. Spring Boot uses the primary DataSource to run Liquibase (i.e. the one annotated with @Primary if there is more than one). In case we want to use a different DataSource, we can mark that bean as @LiquibaseDataSource. Alternatively, we can set Continue Reading
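To illustrate the idea from that excerpt, here is a minimal sketch of how a dedicated migration DataSource might be declared. Only @Primary and @LiquibaseDataSource come from the post itself; the class name, JDBC URLs, and credentials are hypothetical placeholders.

```java
import javax.sql.DataSource;
import org.springframework.boot.autoconfigure.liquibase.LiquibaseDataSource;
import org.springframework.boot.jdbc.DataSourceBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;

@Configuration
public class DataSourceConfig {

    // Primary DataSource used by the application itself
    @Bean
    @Primary
    public DataSource appDataSource() {
        return DataSourceBuilder.create()
                .url("jdbc:postgresql://localhost:5432/appdb") // placeholder URL
                .username("app_user")
                .password("app_pass")
                .build();
    }

    // Dedicated DataSource that Liquibase will use to run migrations
    @Bean
    @LiquibaseDataSource
    public DataSource liquibaseDataSource() {
        return DataSourceBuilder.create()
                .url("jdbc:postgresql://localhost:5432/appdb")
                .username("migration_user") // typically a user with DDL privileges
                .password("migration_pass")
                .build();
    }
}
```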


Liquibase with Spring Boot

Reading Time: 3 minutes The purpose of this blog is to show you the process of using Liquibase as a part of your Spring Boot workflow. Spring Boot makes it easy to create standalone, production-grade Spring-based applications. Introduction Liquibase is an open-source, database-independent library for tracking, managing, and applying database schema changes. Liquibase was started in 2006 and is used to make tracking database changes easier, especially in an agile software Continue Reading


Liquibase: Database change management

Reading Time: 3 minutes Liquibase is one of the most versatile tools for database migration. It works with almost all SQL database platforms. So whether you use MySQL, SQL Server, Oracle, Firebird, or a combination of these or any other common database platforms, it’s got you covered. It also runs in any CI/CD pipeline as long as you can run Java, install the connectors, and connect to your databases. This Continue Reading

Event Sourcing CQRS using Axon

Reading Time: 4 minutes Event Sourcing and CQRS are two of the original patterns for data management in a Microservices architecture. In this blog, we understand the various data management patterns for microservices. What is Event Sourcing? At a high level, Event Sourcing means storing the state of the application in the form of domain events. I have also used a working example for Event Sourcing using Axon and Spring Continue Reading
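As a rough sketch of the "state as domain events" idea, the aggregate below applies an event from its command handler and mutates its fields only in an event-sourcing handler, so it can be rebuilt by replaying events. OrderAggregate, CreateOrderCommand, and OrderCreatedEvent are invented names for illustration; the annotations are the ones Axon Framework provides.

```java
import org.axonframework.commandhandling.CommandHandler;
import org.axonframework.eventsourcing.EventSourcingHandler;
import org.axonframework.modelling.command.AggregateIdentifier;
import org.axonframework.spring.stereotype.Aggregate;
import static org.axonframework.modelling.command.AggregateLifecycle.apply;

// Hypothetical command and event types for illustration
record CreateOrderCommand(String orderId) {}
record OrderCreatedEvent(String orderId) {}

@Aggregate
public class OrderAggregate {

    @AggregateIdentifier
    private String orderId;

    protected OrderAggregate() {
        // Required by Axon to reconstruct the aggregate from its event stream
    }

    @CommandHandler
    public OrderAggregate(CreateOrderCommand command) {
        // Instead of mutating state directly, publish a domain event
        apply(new OrderCreatedEvent(command.orderId()));
    }

    @EventSourcingHandler
    public void on(OrderCreatedEvent event) {
        // State changes happen only here, during event replay or application
        this.orderId = event.orderId();
    }
}
```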

Spring-Webflux: How to test controllers?

Reading Time: 3 minutes Introduction: While working with Spring MVC, you might have used Spring MockMvc to perform testing of Spring Web MVC controllers. The MockMvc class is part of the Spring MVC test framework, which helps in testing controllers without explicitly starting a Servlet container. But this is not something that will work if you are using Spring Boot WebFlux. If you have a Spring application built with WebFlux, the controllers can be tested Continue Reading
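In a WebFlux application, WebTestClient takes over the role MockMvc plays for Spring MVC. Below is a minimal sketch, assuming a hypothetical GreetingController that returns a plain string from GET /greeting (shown in the same file for brevity).

```java
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.reactive.WebFluxTest;
import org.springframework.test.web.reactive.server.WebTestClient;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Mono;

// Hypothetical controller under test (would normally live in its own file)
@RestController
class GreetingController {
    @GetMapping("/greeting")
    Mono<String> greeting() {
        return Mono.just("Hello, WebFlux!");
    }
}

// Loads only the web layer for this controller, without a running server
@WebFluxTest(GreetingController.class)
class GreetingControllerTest {

    @Autowired
    private WebTestClient webTestClient;

    @Test
    void greetingReturnsHello() {
        webTestClient.get().uri("/greeting")
                .exchange()
                .expectStatus().isOk()
                .expectBody(String.class)
                .isEqualTo("Hello, WebFlux!");
    }
}
```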

Axon vs Kafka

Reading Time: 4 minutes Introduction One of the most common discussion points that comes up regularly in interactions with customers, prospects, and forums is: how does Axon compare to Apache Kafka? Can one do the job of the other? Are they complementary to each other? Can they work together? Does Axon provide any capabilities to work with Kafka? Apache Kafka is a very popular system for publishing and consuming events. Its architecture Continue Reading

Salient Features of Spring WebFlux

Reading Time: 4 minutes What is Spring WebFlux? Spring WebFlux is a fully non-blocking, annotation-based web framework built on Project Reactor that makes it possible to build reactive applications on the HTTP layer. WebFlux uses a new router functions feature to apply functional programming to the web layer and bypass declarative controllers and RequestMappings. It requires you to import Reactor as a core dependency. WebFlux was added in Spring 5 as a reactive Continue Reading
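A small sketch of the router functions style mentioned above, with a hypothetical /hello endpoint declared as a functional route instead of an annotated controller:

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.reactive.function.server.RouterFunction;
import org.springframework.web.reactive.function.server.ServerResponse;

import static org.springframework.web.reactive.function.server.RequestPredicates.GET;
import static org.springframework.web.reactive.function.server.RouterFunctions.route;
import static org.springframework.web.reactive.function.server.ServerResponse.ok;

@Configuration
public class GreetingRouter {

    // Functional route that replaces a declarative @RestController + @RequestMapping
    @Bean
    public RouterFunction<ServerResponse> greetingRoute() {
        return route(GET("/hello"),
                request -> ok().bodyValue("Hello from WebFlux router functions"));
    }
}
```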

Spring Cloud GCP for Pub/Sub

Reading Time: 3 minutes This is the first blog in the Spring Cloud GCP Cloud Pub/Sub series. In this blog, we will create an application to receive messages that are sent by a sender application. We will use Google Cloud Pub/Sub as the underlying messaging system. To integrate Google Cloud Pub/Sub with our application, we will be using Spring. As part of this blog, we have to download the basic Continue Reading
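A rough sketch of a receiver using Spring Cloud GCP's PubSubTemplate; the subscription name is a placeholder, and the exact package names depend on the Spring Cloud GCP version in use (older releases use org.springframework.cloud.gcp instead of com.google.cloud.spring).

```java
import com.google.cloud.spring.pubsub.core.PubSubTemplate;
import com.google.cloud.spring.pubsub.support.BasicAcknowledgeablePubsubMessage;
import org.springframework.boot.ApplicationRunner;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class PubSubReceiverConfig {

    // Subscribes to a placeholder subscription and acknowledges each message
    @Bean
    public ApplicationRunner subscribeOnStartup(PubSubTemplate pubSubTemplate) {
        return args -> pubSubTemplate.subscribe("example-subscription",
                (BasicAcknowledgeablePubsubMessage message) -> {
                    System.out.println("Received: "
                            + message.getPubsubMessage().getData().toStringUtf8());
                    message.ack();
                });
    }
}
```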

Project Loom-OpenJDK

Reading Time: 4 minutes 1. Project Loom Project Loom is an attempt by the OpenJDK community to introduce a lightweight concurrency construct to Java. The prototypes for Loom so far have introduced a change in the JVM as well as the Java library. Before we discuss the various concepts of Loom, let’s discuss the current concurrency model in Java. 2. Java’s Concurrency Model Presently, Thread represents the core abstraction of concurrency in Java. Continue Reading
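Loom's APIs have evolved across JDK releases; the sketch below uses Thread.startVirtualThread as it appears in recent JDKs to contrast the existing platform-thread abstraction with a lightweight virtual thread.

```java
public class VirtualThreadDemo {
    public static void main(String[] args) throws InterruptedException {
        // Platform thread: the traditional 1:1 mapping onto an OS thread
        Thread platform = new Thread(() -> System.out.println("platform thread"));
        platform.start();
        platform.join();

        // Virtual thread: a lightweight thread scheduled by the JVM (Project Loom);
        // requires a JDK where virtual threads are available
        Thread virtual = Thread.startVirtualThread(() -> System.out.println("virtual thread"));
        virtual.join();
    }
}
```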

How to implement Data Pipelines with the help of Beam

Reading Time: 4 minutes Throughout this blog, I will provide a deeper look into this specific data processing model and explore its data pipeline structures and how to process them. Apache Beam Apache Beam is one of the latest projects from Apache, a consolidated programming model for expressing efficient data processing pipelines. It is an open-source, unified model for defining both batch and streaming data-parallel processing pipelines. The Apache Beam programming model Continue Reading
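A minimal sketch of that unified model: the same pipeline shape applies to bounded (batch) and unbounded (streaming) inputs. The element values and transform here are made up for illustration.

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptors;

public class WordLengthPipeline {
    public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        pipeline
            // Bounded in-memory input; a streaming source would plug in the same way
            .apply(Create.of("apache", "beam", "pipeline"))
            // Simple element-wise transformation: word -> word length
            .apply(MapElements.into(TypeDescriptors.integers())
                    .via((String word) -> word.length()));

        pipeline.run().waitUntilFinish();
    }
}
```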

Data Mesh: The Four Principles of the Distributed Architecture

Reading Time: 4 minutes A data mesh is a decentralized architecture devised by Zhamak Dehghani, director of Next Tech Incubation, principal consultant at Thoughtworks, and a member of its technology Advisory Board. An intentionally designed distributed data architecture, under centralized governance and standardization for interoperability, enabled by a shared and harmonized self-serve data infrastructure. Key uses for a data mesh Data mesh’s key aim is to enable you to get Continue Reading

Streaming Kafka Messages to Google Cloud Pub/Sub

Reading Time: 3 minutes In this blog post, I present an example that creates a pipeline to read data from a single topic or multiple topics in Apache Kafka and write the data into a topic in Google Pub/Sub. The example provides code samples to implement simple yet powerful pipelines and also provides an out-of-the-box solution that you can simply plug in. The example is built in Apache Beam and can be downloaded here. So, we hope you will find this Continue Reading
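A simplified sketch of such a pipeline using Beam's KafkaIO and PubsubIO connectors; the broker address, topic names, and project ID are placeholders, and the downloadable example from the post may differ in detail.

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Values;
import org.apache.kafka.common.serialization.StringDeserializer;

public class KafkaToPubSubPipeline {
    public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        pipeline
            // Read key/value records from a Kafka topic (placeholder addresses)
            .apply(KafkaIO.<String, String>read()
                    .withBootstrapServers("localhost:9092")
                    .withTopic("source-topic")
                    .withKeyDeserializer(StringDeserializer.class)
                    .withValueDeserializer(StringDeserializer.class)
                    .withoutMetadata())
            // Keep only the record values
            .apply(Values.create())
            // Publish each value as a message to a Pub/Sub topic
            .apply(PubsubIO.writeStrings()
                    .to("projects/my-project/topics/target-topic"));

        pipeline.run().waitUntilFinish();
    }
}
```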

Big Data Processing with Apache Beam

Reading Time: 4 minutes Introduction In this world, every minute and every second, lots of data is generated from a variety of data sources, so it is very tedious to extract and process information from it. In order to solve these problems, Apache Beam comes into the picture. Apache Beam is an open-source, unified programming model to define and execute data processing pipelines and transformations, including ETL, and processing batch Continue Reading