What is Lagom?
Lagom is an open source microservice framework built on the Akka message-driven runtime and the Play web framework, with Lightbend service orchestration. Together, these technologies abstract away the complexities of building, running, and managing microservice architectures.
Lagom is a Swedish word meaning “just the right amount”.
Often, when people talk about microservices, they focus on the micro part of the name, assuming it means that a service should be small. We want to emphasize that the important thing when splitting a system into services is to find the right boundaries between services. This means aligning them with bounded contexts, business capabilities, and isolation requirements. A system of right-sized microservices will naturally achieve scalability and resilience requirements and be easy to deploy and manage. So, rather than focus on how small your services should be, design “Lagom” size services.
Lagom gives you two APIs to work with, Java and Scala, and two build tools, Maven and sbt.
In this article, we will build the wordcount example using Scala with the sbt build tool.
A Little Bit About Lagom Architecture: –
Lagom is a framework for creating microservice-based systems. It offers four main features:
- Service API
- Persistence API
- Development Environment
- Production Environment
The Service API provides a way to declare and implement service interfaces, to be consumed by clients. This is the layer where a service's routes, or API calls, are declared. For location transparency, clients discover services through a Service Locator. The Service API supports synchronous request-response calls as well as asynchronous streaming between services.
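A minimal sketch of what such a service interface might look like in Lagom's Scala API. The trait name and the `addWord` call are hypothetical, for illustration only; the actual declarations in the wordcount project may differ.

```scala
import akka.NotUsed
import com.lightbend.lagom.scaladsl.api.{Descriptor, Service, ServiceCall}

// Hypothetical service interface (names are illustrative).
trait WordCountService extends Service {

  // A simple synchronous request-response call.
  def addWord(word: String): ServiceCall[NotUsed, String]

  override final def descriptor: Descriptor = {
    import Service._
    // The service name is what clients look up through the Service Locator.
    named("wordcount").withCalls(
      pathCall("/api/word/:word", addWord _)
    )
  }
}
```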
The second layer, known as the impl layer, implements the service: the methods declared in the API layer are given their implementations here.
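As a sketch, assuming a hypothetical `WordCountService` trait that declares an `addWord` call, an impl-layer class might look like this (the real implementation in the project will differ):

```scala
import akka.NotUsed
import com.lightbend.lagom.scaladsl.api.ServiceCall
import scala.concurrent.Future

// Hypothetical implementation class (names are illustrative).
class WordCountServiceImpl extends WordCountService {

  override def addWord(word: String): ServiceCall[NotUsed, String] =
    ServiceCall { _ =>
      // In the real service this is where the word would be published to
      // Kafka and persisted; here we simply acknowledge it.
      Future.successful(s"Received word: $word")
    }
}
```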
The Persistence API provides event-sourced persistent entities for services that store data. Command Query Responsibility Segregation (CQRS) read-side support is provided as well. Lagom manages the distribution of persisted entities across a cluster of nodes, enabling sharding and horizontal scaling. Cassandra is provided as a default database.
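A minimal sketch of an event-sourced persistent entity using Lagom's Scala Persistence API. The command, event, and state types here are hypothetical, chosen only to illustrate the command-persist-reply cycle:

```scala
import akka.Done
import com.lightbend.lagom.scaladsl.persistence.PersistentEntity
import com.lightbend.lagom.scaladsl.persistence.PersistentEntity.ReplyType

// Hypothetical command and event types (names are illustrative).
final case class AddWord(word: String) extends ReplyType[Done]
final case class WordAdded(word: String)

class WordEntity extends PersistentEntity {
  override type Command = AddWord
  override type Event   = WordAdded
  override type State   = List[String]

  override def initialState: State = Nil

  override def behavior: Behavior =
    Actions()
      .onCommand[AddWord, Done] {
        // Persist an event for the command, then reply to the caller.
        case (AddWord(word), ctx, _) =>
          ctx.thenPersist(WordAdded(word))(_ => ctx.reply(Done.done()))
      }
      .onEvent {
        // Rebuild the in-memory state from the persisted event.
        case (WordAdded(word), words) => word :: words
      }
}
```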
The Development Environment allows running all your services, and the supporting Lagom infrastructure, with one command. It hot-reloads your services when code changes; no fragile scripts are needed to set up and maintain a development environment. With Lagom, a developer can bring up a new service or join an existing Lagom development team in just a few minutes.
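With the sbt build used in this article, that one command is Lagom's `runAll` task, run from the project root:

```shell
# Starts all Lagom services plus the supporting infrastructure
# (embedded Kafka, Cassandra, and the service locator) in dev mode.
sbt runAll
```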
Lightbend ConductR is the out-of-the-box supported production environment. Lightbend ConductR allows simple deployment, monitoring, and scaling of Lagom services in a container environment. In our next blog, we will see how we can run the services using ConductR.
Lagom Project Description: –
The code for the Lagom service is available at: https://github.com/piyushknoldus/lagom-scala-wordcount
This is a Scala sbt project that demonstrates the classic wordcount example using the Lagom framework.
This is a microservices-based Kafka producer/consumer application in which one Lagom service produces data to Kafka and persists events in Cassandra, while the other service consumes data from Kafka and returns the word counts.
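The counting logic on the consumer side can be sketched in plain Scala (the object and function names here are illustrative; in the real service the counts are computed from messages read off the Kafka topic):

```scala
object WordCounter {
  /** Count the occurrences of each word in the given text.
    * Splits on non-word characters and ignores case. */
  def wordCount(text: String): Map[String, Int] =
    text.toLowerCase
      .split("\\W+")
      .filter(_.nonEmpty)
      .groupBy(identity)
      .map { case (word, occurrences) => word -> occurrences.length }
}
```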
Here, we are using embedded Kafka and Cassandra. You can use external ones as well: uncomment the last 4 lines of the build.sbt and install Kafka and Cassandra on your machine. You will also need to start the ZooKeeper server, Kafka server, and Cassandra before starting the application.
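For reference, the build.sbt settings Lagom documents for switching to an external Kafka and Cassandra typically look like the following; the exact lines in the project's build.sbt may differ:

```scala
// Disable the embedded Kafka and point Lagom at an external broker.
lagomKafkaEnabled in ThisBuild := false
lagomKafkaAddress in ThisBuild := "localhost:9092"

// Disable the embedded Cassandra and register the external one
// with the service locator under the "cas_native" name.
lagomCassandraEnabled in ThisBuild := false
lagomUnmanagedServices in ThisBuild := Map("cas_native" -> "http://localhost:9042")
```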
The Runnable code and steps to configure it are present at: https://github.com/piyushknoldus/lagom-scala-wordcount/wiki/Setting-up-the-development-environment
Go ahead and give it a try, and if you run into any issue or problem, ping me on my email ID or create an issue at https://github.com/piyushknoldus/lagom-scala-wordcount/issues
For more details about Lagom and further tutorials, visit https://www.lagomframework.com/