Author: Chiranjeev Kumar

Introduction to Circuit Breaker

Reading Time: 3 minutes Resilience4j is a lightweight fault-tolerance library for Java-based applications. It provides a set of simple, easy-to-use APIs to help developers build resilient and fault-tolerant applications. Resilience4j is built on the principles of the circuit breaker, bulkhead, and retry patterns, and is designed to help developers handle and recover from failures in distributed systems. Resilience4j is designed to be modular and extensible, so developers Continue Reading
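To make the circuit breaker idea concrete, here is a minimal, hand-rolled sketch of the pattern that Resilience4j implements. The class and method names below are illustrative, not the library's API: the breaker stays CLOSED while calls succeed, counts consecutive failures, and trips OPEN once a threshold is reached, after which it fails fast with a fallback.

```java
import java.util.function.Supplier;

// Minimal sketch of the circuit breaker pattern (not Resilience4j's API).
public class CircuitBreakerSketch {
    enum State { CLOSED, OPEN }

    private State state = State.CLOSED;
    private int consecutiveFailures = 0;
    private final int failureThreshold; // illustrative tuning knob

    CircuitBreakerSketch(int failureThreshold) {
        this.failureThreshold = failureThreshold;
    }

    // Runs the call while the circuit is closed; fails fast once it is open.
    <T> T call(Supplier<T> supplier, T fallback) {
        if (state == State.OPEN) {
            return fallback;          // fail fast, protect the downstream service
        }
        try {
            T result = supplier.get();
            consecutiveFailures = 0;  // a success resets the failure count
            return result;
        } catch (RuntimeException e) {
            if (++consecutiveFailures >= failureThreshold) {
                state = State.OPEN;   // trip the breaker
            }
            return fallback;
        }
    }

    State state() { return state; }
}
```

Resilience4j adds much more on top of this idea (a HALF_OPEN probing state, sliding windows, metrics), but the core decision — short-circuit calls once failures pile up — is the same.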

Connecting Axon application to DocumentDb 

Reading Time: 5 minutes Step 1: Create an Amazon EC2 instance. This EC2 instance is used for SSH tunneling between the AWS DocumentDB cluster and the application running locally. Run cmd: chmod 400 keypair-name, which gives the owner read permission and removes all other permissions. Step 2: Create an Amazon DocumentDB cluster. Step 3: Create a Java application with the Axon framework for event sourcing (AWS DocumentDB as event Continue Reading
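The key-permission and tunneling steps above can be sketched as shell commands. The key file name, cluster endpoint, and EC2 host below are placeholders; substitute your own values.

```shell
# Owner read-only on the key file: required before ssh will accept it
chmod 400 keypair.pem

# Forward local port 27017 through the EC2 instance to the DocumentDB cluster,
# so an application on localhost:27017 reaches the cluster inside the VPC.
# (-N: no remote command, tunnel only; endpoint and host are placeholders)
ssh -i keypair.pem -N \
    -L 27017:docdb-cluster.cluster-example.us-east-1.docdb.amazonaws.com:27017 \
    ec2-user@ec2-public-dns.amazonaws.com
```

The tunnel is needed because DocumentDB clusters are only reachable from inside their VPC; the EC2 instance acts as the jump host.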

Connecting to EventStoreDB

Reading Time: 3 minutes EventStoreDB is an industrial-strength Event Sourcing database that stores your critical data in streams of immutable events. It was built from the ground up for Event Sourcing and offers an unrivaled solution for building event-sourced systems. Core features such as guaranteed writes, its concurrency model, granular streams, and stream APIs make EventStoreDB the best choice for event-sourced systems – especially when compared with other database Continue Reading

Kafka Connect Concepts

Reading Time: 5 minutes Kafka Connect is a framework to stream data into and out of Apache Kafka. A few major concepts: Connectors – the high-level abstraction that coordinates data streaming by managing tasks. Tasks – the implementation of how data is copied to or from Kafka. Workers – the running processes that execute connectors and tasks. Converters – the code used to translate data between Connect and the system sending or receiving data Continue Reading
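These concepts come together in a connector configuration. As an illustration, here is a standalone configuration for the FileStreamSource connector that ships with Kafka; the file path and topic name are examples.

```properties
# Example standalone source connector config (file path and topic are placeholders)
name=local-file-source
connector.class=FileStreamSource
tasks.max=1
file=/tmp/input.txt
topic=connect-test
```

A worker running this config spawns one task that tails the file and publishes each line to the `connect-test` topic, with the worker's configured converter serializing the records.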

Kafka connector with MongoDB

Reading Time: 3 minutes The MongoDB Kafka connector is a Confluent-verified connector that persists data from Kafka topics into MongoDB as a data sink, and publishes changes from MongoDB into Kafka topics as a data source. Apache Kafka Apache Kafka is an open-source publish/subscribe messaging system. It provides a flexible, fault-tolerant, and horizontally scalable system to move data among datastores and applications. A system is fault tolerant if the Continue Reading
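As a sketch of the sink direction, here is an example configuration for registering the MongoDB sink connector with a Kafka Connect cluster. The topic, database, collection names, and connection URI are placeholders.

```json
{
  "name": "mongo-sink",
  "config": {
    "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
    "topics": "orders",
    "connection.uri": "mongodb://localhost:27017",
    "database": "shop",
    "collection": "orders"
  }
}
```

Posting this JSON to the Connect REST API starts a connector that writes every record from the `orders` topic into the `shop.orders` collection.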

Streaming Data Pipelines using Kafka connect

Reading Time: 3 minutes Are the messages produced to Apache Kafka by one part of an application consumed only by another part of that same application, with other applications uninterested in those messages? Let's imagine that the data sets computed from these messages also need to be pushed into an external system, or pulled in from an external service. So having Continue Reading

Paradigms in Pentaho Data Integration

Reading Time: 4 minutes PDI has three paradigms for storing user input: arguments, parameters, and variables. Arguments A PDI argument is a named, user-supplied, single-value input given as a command-line argument (when running a transformation or job manually from Pan or Kitchen, or as part of a script). Each transformation or job can have a maximum of 10 arguments. Arguments are declared as space-separated values given after the rest of the Continue Reading
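The distinction shows up on the command line. The file paths and values below are illustrative, not from the original post.

```shell
# Arguments: positional, space-separated, up to 10 per transformation
./pan.sh -file=/path/to/sales_load.ktr 20240101 US

# Parameters: named, passed explicitly with -param
./pan.sh -file=/path/to/sales_load.ktr -param:START_DATE=20240101 -param:REGION=US

# Variables: typically defined in kettle.properties or via a Set Variables step,
# then referenced inside a transformation as ${VAR_NAME}
```

Arguments are matched by position, parameters by name with optional defaults, and variables by scope, which is why parameters and variables are generally easier to maintain in scripts.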

JDBC connection with Pentaho

Reading Time: 3 minutes Pentaho Data Integration allows you to define connections to multiple databases provided by multiple database vendors (MySQL, Oracle, Postgres, and many more). Pentaho Data Integration ships with the most suitable JDBC drivers for supported databases, and its primary interface to databases is through JDBC. Vendors write a driver that matches the JDBC specification and Pentaho Data Integration uses the driver. Unless you require extensive debugging or have Continue Reading
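The JDBC mechanism PDI relies on can be sketched in plain Java. The host, database name, and credentials below are placeholders, and the MySQL driver JAR (which PDI bundles for you) must be on the classpath for the connection to succeed.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

// Sketch of connecting to a MySQL database via JDBC, the same mechanism
// PDI uses internally. All connection details are placeholders.
public class JdbcSketch {
    // Builds a vendor-specific JDBC URL from its parts
    static String mysqlUrl(String host, int port, String database) {
        return "jdbc:mysql://" + host + ":" + port + "/" + database;
    }

    public static void main(String[] args) {
        String url = mysqlUrl("localhost", 3306, "sales");
        // Requires the vendor's JDBC driver on the classpath
        try (Connection conn = DriverManager.getConnection(url, "user", "password")) {
            System.out.println("Connected: " + conn.getMetaData().getDatabaseProductName());
        } catch (SQLException e) {
            System.err.println("Connection failed: " + e.getMessage());
        }
    }
}
```

Swapping vendors means swapping the URL prefix and driver JAR; the `Connection` code itself stays the same, which is exactly what PDI's database dialog abstracts for you.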

Pentaho Data Integration – Getting Started With Transformations

Reading Time: 5 minutes Pentaho Data Integration (PDI) is an extract, transform, and load (ETL) solution that uses an innovative metadata-driven approach. PDI includes the DI Server, a design tool, three utilities, and several plugins. You can download Pentaho from https://sourceforge.net/projects/pentaho/ Uses of Pentaho Data Integration Pentaho Data Integration is an extremely flexible tool that addresses a broad range of use cases, including: Data warehouse population with Continue Reading

Postman (CRUD operations)

Reading Time: 3 minutes Postman is a collaboration platform for API development. Postman simplifies each step of building an API and streamlines collaboration so you can create better APIs. CRUD operations CRUD is an acronym that comes from the world of computer programming and refers to the four functions considered necessary to implement a persistent storage application: create, read, update, and delete. Persistent storage refers to any data storage device Continue Reading
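The four CRUD functions map onto the HTTP methods you would select in Postman. As a sketch, this helper builds (but does not send) the corresponding requests with Java's standard `HttpClient` API; the URL and JSON body are placeholders.

```java
import java.net.URI;
import java.net.http.HttpRequest;

// Maps CRUD operations to the HTTP methods used in Postman.
// URL and body are placeholders; no request is actually sent.
public class CrudRequests {
    static HttpRequest request(String method, String url) {
        HttpRequest.Builder b = HttpRequest.newBuilder(URI.create(url));
        String body = "{\"name\":\"demo\"}"; // example payload
        switch (method) {
            case "POST":   return b.POST(HttpRequest.BodyPublishers.ofString(body)).build(); // Create
            case "GET":    return b.GET().build();                                           // Read
            case "PUT":    return b.PUT(HttpRequest.BodyPublishers.ofString(body)).build();  // Update
            case "DELETE": return b.DELETE().build();                                        // Delete
            default: throw new IllegalArgumentException("Unknown method: " + method);
        }
    }
}
```

In Postman you pick the same method from the dropdown next to the URL bar; the request builder above is just the programmatic equivalent.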

React JS : Components

Reading Time: 6 minutes React JS is a JavaScript library for building user interfaces. Now we will study the components of React JS. Components Components let you split the UI into independent, reusable pieces, and think about each piece in isolation. This page provides an introduction to the idea of components. Conceptually, components are like JavaScript functions. They accept arbitrary inputs (called "props") and return React elements describing Continue Reading
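The "components are like JavaScript functions" idea can be shown without any framework code. This sketch uses a tiny stand-in for `React.createElement` (the stub and the `Greeting` component are illustrative, not React's implementation) to show that a component is just a function from props to an element description.

```javascript
// Tiny stand-in mimicking the shape of React.createElement's output,
// to illustrate the idea without depending on React itself.
function createElement(type, props, ...children) {
  return { type, props: { ...props, children } };
}

// A function component: props in, element description out.
function Greeting(props) {
  return createElement('h1', null, 'Hello, ', props.name);
}
```

In real React you would write `function Greeting(props) { return <h1>Hello, {props.name}</h1>; }` and let JSX compile to `React.createElement` calls; the shape of the idea is the same.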