Introduction to Akka Streams-Part 2

In the previous blog, Introduction to Akka Streams-Part 1, I gave a brief introduction to Akka Streams and its components. In this blog, I'll discuss the features of Akka Streams, which is an implementation of Reactive Streams. Three of its features are discussed in detail below: reusable pieces, time-based processing, and backpressure.

Reusable pieces

In many streaming APIs, the code for a source, flow, or sink cannot be reused. The Akka Streams library makes such reuse easy.

A source can be defined once and reused to form multiple streams. And not only sources: every Akka Streams component is a reusable blueprint, so the same applies to flows and sinks. Suppose there is a source producing User data. Using that one source, an organisation wants to create two streams that filter users on the basis of age, convert the data to a ByteString, and then write it to a file.
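The reuse described above can be sketched as follows. This is a minimal illustration (the object name and element values are made up); the key point is that `numbers` is a blueprint that is materialized twice, once per stream.

```scala
import akka.actor.ActorSystem
import akka.stream.scaladsl.{Sink, Source}

object ReusableSourceDemo extends App {
  implicit val system: ActorSystem = ActorSystem("reuse-demo")

  // A Source is only a description of a stream; defining it
  // does not run anything yet.
  val numbers = Source(1 to 5)

  // First stream: print each element.
  numbers.runWith(Sink.foreach[Int](println))

  // Second stream: sum the elements of the very same source.
  numbers.runWith(Sink.fold(0)((acc, n) => acc + n))
}
```

Each `runWith` call materializes the blueprint into a fresh, independent running stream.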

For this case, we can implement a function that converts a user to a ByteString and writes it to a file. This function, which builds the transforming-and-writing part of the stream, can then be reused as many times as needed.
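A sketch of that idea, assuming a hypothetical `User` case class and illustrative file names: `lineSink` builds a reusable Sink that serializes users and writes them to a file, and two differently filtered streams share it.

```scala
import java.nio.file.Paths

import akka.actor.ActorSystem
import akka.stream.scaladsl.{FileIO, Flow, Keep, Sink, Source}
import akka.util.ByteString

object UserStreamsDemo extends App {
  implicit val system: ActorSystem = ActorSystem("user-demo")

  // Hypothetical user model, for illustration only.
  final case class User(name: String, age: Int)

  val users = Source(List(User("Alice", 34), User("Bob", 17)))

  // Reusable piece: convert each User to a line of bytes and
  // write the lines to the given file.
  def lineSink(filename: String): Sink[User, _] =
    Flow[User]
      .map(u => ByteString(s"${u.name},${u.age}\n"))
      .toMat(FileIO.toPath(Paths.get(filename)))(Keep.right)

  // Two streams reuse the same source and the same sink-building
  // function, differing only in their age filter.
  users.filter(_.age >= 18).runWith(lineSink("adults.txt"))
  users.filter(_.age < 18).runWith(lineSink("minors.txt"))
}
```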

Time-Based Processing

Akka Streams allows limiting how many elements flow through in a given duration. Using the throttle method that the library provides, we can specify the maximum number of elements to pass per time window. In the example below, the source emits one element per second.
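A minimal sketch of throttling (element values are illustrative):

```scala
import scala.concurrent.duration._

import akka.actor.ActorSystem
import akka.stream.scaladsl.{Sink, Source}

object ThrottleDemo extends App {
  implicit val system: ActorSystem = ActorSystem("throttle-demo")

  // Allow at most 1 element per second downstream; the source
  // is slowed down to that rate.
  Source(1 to 5)
    .throttle(1, 1.second)
    .runWith(Sink.foreach[Int](println))
}
```

With these settings, printing all five elements takes roughly five seconds instead of completing almost instantly.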

Backpressure

Backpressure is the most important feature, and it comes directly from Reactive Streams. Before looking at how backpressure works, let's look at the problem it solves: why is backpressure needed?

Suppose the source produces data at a very high rate while the consumer is slower and cannot handle data arriving at that pace. To handle this scenario, the consumer sends non-blocking, asynchronous demand messages upstream, signalling the rate at which it can receive elements. The publisher of the stream, i.e. the source, then ensures that it does not emit more data than the consumer has demanded. The backpressure protocol guarantees that the publisher will never signal more elements than the signalled demand.
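In Akka Streams this demand signalling happens automatically; we never send the messages ourselves. A small sketch of the effect, assuming a deliberately slow consumer (the `Thread.sleep` is only there to simulate slowness):

```scala
import akka.actor.ActorSystem
import akka.stream.scaladsl.{Sink, Source}

object BackpressureDemo extends App {
  implicit val system: ActorSystem = ActorSystem("backpressure-demo")

  // A fast source: all elements are available immediately.
  Source(1 to 100)
    .map { n => println(s"emitted $n"); n }
    // A slow consumer: it processes roughly 10 elements per second,
    // so demand flows upstream slowly and the source is automatically
    // held back to the consumer's rate.
    .runWith(Sink.foreach[Int] { n =>
      Thread.sleep(100) // simulate slow processing
      println(s"consumed $n")
    })
}
```

Because of internal buffering, the "emitted" messages run slightly ahead of the "consumed" ones, but the source can never outrun the demand signalled by the sink.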

Stream Ordering in Akka Streams

Almost all processing stages in Akka Streams preserve the input order of elements. Suppose the source produces elements A1, A2, A3, ..., An. After the stream passes through a processing stage, the processed elements are B1, B2, ..., Bn, and the order is preserved: if Ai comes before Aj in the input, then the processed element Bi appears before Bj in the output of the stage.
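A sketch of this guarantee, using mapAsync as an example: even though up to four transformations run concurrently, the results are emitted in the original upstream order (the values and parallelism level are illustrative).

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.duration._

import akka.actor.ActorSystem
import akka.stream.scaladsl.{Sink, Source}

object OrderingDemo extends App {
  implicit val system: ActorSystem = ActorSystem("ordering-demo")
  import system.dispatcher

  // mapAsync runs up to 4 futures at a time, yet emits results
  // in the order the inputs arrived.
  val result = Source(1 to 10)
    .mapAsync(parallelism = 4)(n => Future(n * n))
    .runWith(Sink.seq)

  // Prints Vector(1, 4, 9, ..., 100): input order preserved.
  println(Await.result(result, 5.seconds))
  system.terminate()
}
```

By contrast, mapAsyncUnordered trades this ordering guarantee for potentially better throughput.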

There are many other benefits of using Akka Streams, and the library fits a wide range of use cases. For more information, refer to the Akka documentation.

