scala

Use of Either in Scala

Reading Time: 3 minutes In this blog, we are going to see the use of Either in Scala. We use Option in Scala, but why would we want to go for Either? Either is a better approach in the respect that if something fails, we can track down the reason, which is not possible in the None case of Option. We simply pass None, but what is the reason we got None? Continue Reading
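
As a rough illustration of the contrast the post describes, here is a minimal sketch (the division example and the error message are illustrative assumptions, not taken from the post):

```scala
object EitherVsOption extends App {
  // Option can only signal that something went wrong, not why
  def divideOption(a: Int, b: Int): Option[Int] =
    if (b == 0) None else Some(a / b)

  // Either carries the reason for the failure in its Left channel
  def divideEither(a: Int, b: Int): Either[String, Int] =
    if (b == 0) Left("division by zero") else Right(a / b)

  println(divideOption(10, 0))  // None -- no clue about the cause
  println(divideEither(10, 0))  // Left(division by zero) -- the reason travels with the failure
  println(divideEither(10, 2))  // Right(5)
}
```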

Reading Avro files using Apache Flink

Reading Time: 2 minutes In this blog, we will see how to read Avro files using Flink. Before reading the files, let’s get an overview of Flink. There are two types of processing – batch and real-time. Batch Processing: Processing based on the data collected over time. Real-time Processing: Processing based on immediate data for an instant result. Real-time processing is in demand, and Apache Flink is the Continue Reading
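
For a flavour of what reading an Avro file with Flink’s batch API can look like, here is a hedged sketch using the AvroInputFormat from the flink-avro module (the file path is a placeholder, and the setup in the full post may differ):

```scala
import org.apache.avro.generic.GenericRecord
import org.apache.flink.api.scala._
import org.apache.flink.core.fs.Path
import org.apache.flink.formats.avro.AvroInputFormat

object ReadAvro {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment

    // Point the input format at an Avro file (path is a placeholder)
    val avroInput = new AvroInputFormat[GenericRecord](
      new Path("/tmp/users.avro"), classOf[GenericRecord])

    // Read the records as a DataSet and print them
    val records: DataSet[GenericRecord] = env.createInput(avroInput)
    records.print()
  }
}
```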

Using Apache Flink for Kinesis to Kafka Connect

Reading Time: 3 minutes In this blog, we are going to use Kinesis as the source and Kafka as the sink. Let’s get started. Step 1: Apache Flink provides the Kinesis and Kafka connector dependencies. Let’s add them to our build.sbt. Step 2: The next step is to create a pointer to the environment on which this program runs. Step 3: Setting a parallelism of x here will cause all Continue Reading
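
To give an idea of what those steps amount to, here is a hedged sketch: the connector dependencies in build.sbt and a pipeline that reads from a Kinesis stream and writes to a Kafka topic. The versions, stream and topic names, region, and broker address are placeholders, and the post’s actual code may differ.

```scala
// build.sbt -- connector dependencies (versions are placeholders)
libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-streaming-scala"   % "1.10.0",
  "org.apache.flink" %% "flink-connector-kinesis" % "1.10.0",
  "org.apache.flink" %% "flink-connector-kafka"   % "1.10.0"
)
```

```scala
import java.util.Properties

import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer
import org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer
import org.apache.flink.streaming.connectors.kinesis.config.{AWSConfigConstants, ConsumerConfigConstants}

object KinesisToKafka {
  def main(args: Array[String]): Unit = {
    // Step 2: pointer to the environment on which this program runs
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    // Step 3: parallelism for the whole job
    env.setParallelism(1)

    // Kinesis source configuration (region and initial position are placeholders)
    val kinesisProps = new Properties()
    kinesisProps.setProperty(AWSConfigConstants.AWS_REGION, "us-east-1")
    kinesisProps.setProperty(ConsumerConfigConstants.STREAM_INITIAL_POSITION, "LATEST")

    val fromKinesis = env.addSource(
      new FlinkKinesisConsumer[String]("input-stream", new SimpleStringSchema(), kinesisProps))

    // Kafka sink (broker and topic are placeholders)
    fromKinesis.addSink(
      new FlinkKafkaProducer[String]("localhost:9092", "output-topic", new SimpleStringSchema()))

    env.execute("kinesis-to-kafka")
  }
}
```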

Going stateful with Gatling Session API

Reading Time: 4 minutes Hello everyone, in our previous blog post, Gatling feeders and feeder strategies, we discussed the different ways to inject data into our simulation from different data sources. Today we will discuss: a real-time use case (problem statement), the solution: Gatling Session and the Session API, injecting data with the Gatling Session API, extracting data with the Gatling Session API, and some common exceptions. Real-time use case (Problem Continue Reading
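
As a taste of the Session API the post covers, here is a minimal, hedged sketch (assuming Gatling 3) of injecting a value into the session and reading it back; the base URL, endpoint, and attribute names are made up for illustration:

```scala
import io.gatling.core.Predef._
import io.gatling.http.Predef._

class SessionApiSimulation extends Simulation {

  val httpProtocol = http.baseUrl("http://localhost:8080") // placeholder base URL

  val scn = scenario("Gatling Session API demo")
    // Injecting data: Session is immutable, so set returns a new Session
    .exec(session => session.set("userId", java.util.UUID.randomUUID().toString))
    // Using the injected attribute through the Gatling Expression Language
    .exec(
      http("create user")
        .post("/users")
        .body(StringBody("""{"id":"${userId}"}""")).asJson)
    // Extracting data: read the attribute back from the session
    .exec { session =>
      println(s"userId in session: ${session("userId").as[String]}")
      session
    }

  setUp(scn.inject(atOnceUsers(1))).protocols(httpProtocol)
}
```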

Building API with gRPC using ProtoBuf – Part 3

Reading Time: 3 minutes In the previous blog, we discussed Protocol Buffers. How do they work? How do we define a protobuf and compile it? What happens when we compile our protobuf file? How were Protocol Buffers designed to solve many problems? And how do we generate Java or Scala code? To continue our series, we will try to create a small basic application with simple client requests Continue Reading
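
To hint at where the series is heading, here is a hedged sketch of a blocking Scala client built on stubs that ScalaPB’s gRPC code generation would produce from a hypothetical greeter.proto; the Greeter service, HelloRequest message, package name, and port are assumptions, not the actual service from the post:

```scala
import io.grpc.ManagedChannelBuilder

// Hypothetical classes generated by ScalaPB from greeter.proto:
//   service Greeter { rpc SayHello (HelloRequest) returns (HelloReply); }
import com.example.greeter.greeter.{GreeterGrpc, HelloRequest}

object GreeterClient {
  def main(args: Array[String]): Unit = {
    // Channel to a locally running gRPC server (address and port are placeholders)
    val channel = ManagedChannelBuilder
      .forAddress("localhost", 50051)
      .usePlaintext()
      .build()

    // Blocking stub generated alongside the message case classes
    val stub = GreeterGrpc.blockingStub(channel)

    val reply = stub.sayHello(HelloRequest(name = "Knoldus"))
    println(s"Server replied: ${reply.message}")

    channel.shutdown()
  }
}
```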

Dotty – Union Data Types

Reading Time: 2 minutes Hello folks, as we know, Dotty is the new Scala compiler (also known as Scala 3.0), which comes with some new features and improvements. To get more details about Dotty and its environment setup, please follow our beginner’s guide blog. In this blog, I will describe the newly introduced union data type of Dotty and its various properties. A union is a data type which Continue Reading
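
As a quick preview of the feature, here is a minimal sketch of a union type in Dotty/Scala 3 (the example itself is illustrative, not from the post):

```scala
// A value of type Int | String can hold either an Int or a String
def describe(value: Int | String): String = value match {
  case i: Int    => s"got an Int: $i"
  case s: String => s"got a String: $s"
}

@main def demo(): Unit = {
  println(describe(42))      // got an Int: 42
  println(describe("hello")) // got a String: hello
}
```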

Achieving Concurrency with Akka Actors

Reading Time: 3 minutes Java comes with a built-in multi-threading model based on shared data and locks. To use this model, you decide what data will be shared by multiple threads and mark the sections of code that access the shared data as “synchronized”. It also provides a locking mechanism to ensure that only one thread can access the shared data at a time. Lock operations remove possibilities for Continue Reading
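
By contrast, here is a minimal sketch of the actor approach, where mutable state stays private to the actor and messages are processed one at a time, so no explicit locks are needed (the names and messages are illustrative):

```scala
import akka.actor.{Actor, ActorSystem, Props}

// The counter's state is confined to the actor; it is only touched by one message at a time
class Counter extends Actor {
  private var count = 0

  def receive: Receive = {
    case "increment" => count += 1
    case "print"     => println(s"count = $count")
  }
}

object CounterApp extends App {
  val system  = ActorSystem("demo")
  val counter = system.actorOf(Props[Counter](), "counter")

  (1 to 5).foreach(_ => counter ! "increment") // communicate via messages, not shared memory
  counter ! "print"                            // eventually prints: count = 5
}
```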

Combining Gatling Reports

Reading Time: 4 minutes Hi guys, in this blog we shall discuss report generation through Gatling and combining Gatling reports. As you may already know, when we run a Gatling test, the report gets generated automatically. However, you cannot compare the Gatling test reports of two different tests in a single report unless you have the enterprise version of Gatling. I will try to explain an easy Continue Reading

Getting Started with Akka-Streams

Reading Time: 4 minutes As the world is growing, so is its data, and the analysis of this data has become important. But how will you do it? How will you work with data whose size is unknown to you? A solution to this scenario is Akka-Streams, and here I’m going to discuss it. In this blog you will get to know the basics of Akka-Streams just to get Continue Reading
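
For a first feel of the library, here is a minimal sketch of a stream with a Source, a transformation, and a Sink, assuming Akka 2.6+ where the ActorSystem provides the materializer (the example is illustrative, not from the post):

```scala
import akka.actor.ActorSystem
import akka.stream.scaladsl.{Sink, Source}

object StreamsDemo extends App {
  implicit val system: ActorSystem = ActorSystem("streams-demo")
  import system.dispatcher

  // Source emits 1..10, map doubles each element, the Sink prints them
  Source(1 to 10)
    .map(_ * 2)
    .runWith(Sink.foreach(println))
    .onComplete(_ => system.terminate())
}
```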

Streaming from Kafka to PostgreSQL through Spark Structured Streaming

Reading Time: 3 minutes Hello everyone, in this blog we are going to learn how to do Structured Streaming in Spark with Kafka and PostgreSQL on our local system. We will be doing all this using Scala, so without any further pause, let’s begin. Setting up the necessities first: 1. Dependencies: Set up the required dependencies for Scala, Spark, Kafka and PostgreSQL. 2. PostgreSQL setup: Let’s start fresh by Continue Reading
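
To sketch where the post ends up, here is a hedged outline of reading a Kafka topic with Structured Streaming and writing each micro-batch to PostgreSQL via JDBC; the topic, broker, database, table, and credentials are placeholders, and the post’s actual code may differ:

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}

object KafkaToPostgres extends App {
  val spark = SparkSession.builder()
    .appName("kafka-to-postgres")
    .master("local[*]")
    .getOrCreate()

  // Read the Kafka topic as a streaming DataFrame
  val kafkaDf = spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "test-topic")
    .load()
    .selectExpr("CAST(value AS STRING) AS value")

  // Write every micro-batch to PostgreSQL over JDBC
  val query = kafkaDf.writeStream
    .foreachBatch { (batch: DataFrame, batchId: Long) =>
      batch.write
        .format("jdbc")
        .option("url", "jdbc:postgresql://localhost:5432/testdb")
        .option("dbtable", "messages")
        .option("user", "postgres")
        .option("password", "postgres")
        .mode("append")
        .save()
    }
    .start()

  query.awaitTermination()
}
```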

Beginner’s Guide to Design Patterns in DAML

Reading Time: 6 minutes DAML is an open-source language used to write distributed applications quickly, concisely and correctly. It runs on leading blockchain platforms like Hyperledger Sawtooth, Hyperledger Fabric and Corda. It is used to build smart contracts for distributed ledgers and provides us with the ability to focus more on the business workflow instead of the blockchain implementation. In our previous blogs, Building Powerful Smart Contracts, Getting started with building Templates Continue Reading

Building DAML Applications with Scala Bindings

Reading Time: 3 minutes “Blockchain by itself isn’t transformational, however it is foundational. As a foundational innovation, Blockchain’s value can only be fully realized when the business process is transformed to take advantage of its capabilities, leading to ROI for existing business models and the ability to create value through new ones.” ― Tom Golway, Planning and Managing ATM Networks. So, what is a Blockchain? Blockchain Continue Reading