Creating Object Pool(s) in Scala

We are currently working on a very exciting web-scale project. The framework is built with Scala and Akka actors, and so far we are quite pleased with the performance. The architecture is plugin-based: we can dynamically add plugins to the framework to process incoming messages. This is where it gets interesting, and it is the reason for this post.

Some of the plugins are expensive to create, so we would rather create them once when the framework starts and pool them in memory. That brings us to the way we created our object pool(s). In our scenario we can actually end up with multiple object pools, because each plugin can have a different number of pooled instances. So we end up with something like this in our configuration file:
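
A minimal sketch of such a configuration, using Typesafe Config (HOCON) syntax with assumed key names, could look like this:

```
plugins {
  foo {
    poolSize = 10
  }
  bar {
    poolSize = 5
  }
}
```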

As you will notice, we have a pool of 10 for foo and 5 for bar. Now, when our Akka actors need a plugin, they go to the PoolManager and ask for it.

The PoolManager (a Scala object) keeps track of the PluginPool(s), which are populated when the framework is initialized. It maintains a map of all the available PluginPool(s). Hence, at framework initialization the following function of the PoolManager is called:
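
A minimal sketch of what that initialization function could look like is shown below; the method name initializePools, the plugins.<name>.poolSize configuration keys, and the factory map parameter are assumptions for illustration, and it relies on the Plugin trait and PluginPool class sketched in the next snippet:

```scala
import com.typesafe.config.ConfigFactory
import scala.collection.mutable

// A minimal sketch of the PoolManager described above (names are assumptions).
object PoolManager {

  // One PluginPool per plugin name, populated at framework initialization.
  private val pools = mutable.Map[String, PluginPool]()

  /**
   * Called once when the framework starts. `factories` maps a plugin name to
   * the (expensive) function that creates a fresh instance of that plugin.
   */
  def initializePools(factories: Map[String, () => Plugin]): Unit = {
    val config = ConfigFactory.load()
    factories.foreach { case (name, createPlugin) =>
      val poolSize = config.getInt(s"plugins.$name.poolSize")
      pools += name -> new PluginPool(name, poolSize, createPlugin)
    }
  }
}
```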

The PluginPool class looks like this:
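
What follows is a rough sketch of such a pool, assuming a simple Plugin trait; it eagerly creates poolSize instances up front and blocks fetchPlugin callers until an instance is free:

```scala
import scala.collection.mutable

// A hypothetical plugin contract; the real trait comes from the framework.
trait Plugin {
  def name: String
  def process(message: Any): Unit
}

// A simple blocking pool of pre-created plugin instances.
class PluginPool(val name: String, poolSize: Int, createPlugin: () => Plugin) {

  private val available = mutable.Queue[Plugin]()

  // Plugins are expensive to create, so build all of them once, up front.
  (1 to poolSize).foreach(_ => available.enqueue(createPlugin()))

  /** Hand out a plugin, waiting until one is free if the pool is exhausted. */
  def fetchPlugin(): Plugin = synchronized {
    while (available.isEmpty) wait()
    available.dequeue()
  }

  /** Put a plugin back into the pool and wake up any waiting callers. */
  def releasePlugin(plugin: Plugin): Unit = synchronized {
    available.enqueue(plugin)
    notifyAll()
  }
}
```

A java.util.concurrent.BlockingQueue would work just as well here; the wait/notifyAll version simply keeps the sketch dependency-free.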

This has been inspired by the connection pool written for Scala.

Now let us look at how the Akka actor gets the plugin. It calls getPlugin() with the name of the plugin it needs:
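
A sketch of what that call site could look like is below; the Process message and the MessageProcessor actor are assumptions for illustration, and the getPlugin and releasePlugin methods of the PoolManager are shown in the snippets that follow:

```scala
import akka.actor.Actor

// A hypothetical message carrying the plugin name and the payload to process.
case class Process(pluginName: String, payload: Any)

class MessageProcessor extends Actor {
  def receive = {
    case Process(pluginName, payload) =>
      // Ask the PoolManager for a pooled instance of the named plugin.
      val plugin = PoolManager.getPlugin(pluginName)
      try {
        plugin.process(payload)
      } finally {
        // Always hand the plugin back, even if processing fails.
        PoolManager.releasePlugin(plugin)
      }
  }
}
```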

The PoolManager picks the right pool and calls fetchPlugin on it.
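
Continuing the PoolManager sketch from above (names still assumed), that lookup-and-delegate step could be as simple as:

```scala
// Inside the PoolManager object sketched earlier.
/** Look up the pool registered under `pluginName` and take a plugin from it. */
def getPlugin(pluginName: String): Plugin =
  pools(pluginName).fetchPlugin()
```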

Once the actor is done with its processing, it releases the plugin back to the pool so that it is available for a different actor.
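
The corresponding release path on the PoolManager side of the sketch is the mirror image: it finds the pool the plugin belongs to and hands the instance back.

```scala
// Inside the PoolManager object sketched earlier.
/** Return a plugin to the pool it was taken from. */
def releasePlugin(plugin: Plugin): Unit =
  pools(plugin.name).releasePlugin(plugin)
```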
