Today we are going to talk about a very important aspect of reactive web application development. It is nothing alien: we talk about it all the time, yet we rarely want to deal with it. It's the cache. In today's mammoth, scalable architectures we are mostly occupied with big design issues, and here and there we neglect the benefits of a very useful concept: caching.
Here, the superman of the reactive world, Lightbend Inc., has given us a solution to this problem as well: Akka-HTTP caching. It is built on top of Caffeine, which is itself a highly efficient Java 8 based caching library, and it gives us the ability to implement caching in highly concurrent, asynchronous environments. That is what makes it special, because scale and concurrency are inseparable when building a robust application that serves zillions of users.
Akka-HTTP provides caching in two different forms: request-response caching (also called caching directives) and object caching. In this post, we will discuss object caching in Akka-HTTP.
Quite often there is a requirement to cache objects that are expensive to compute and are served to many client requests. In such cases a cache saves us from recomputing these objects again and again; instead, we can serve subsequent requests directly from the value that was stored in the cache when the first request arrived.
This is handled very well by the Akka-HTTP caching solution, backed by Caffeine under the hood.
Let’s roll up our sleeves and write some real code to explain it better.
package com.knoldus.inc

import akka.actor.ActorSystem
import akka.http.caching.LfuCache
import akka.http.caching.scaladsl.Cache
import akka.http.scaladsl.Http
import akka.stream.ActorMaterializer
import com.knoldus.inc.routes.UserRoutes
import com.knoldus.inc.shopping.ComputeCart
import com.typesafe.config.ConfigFactory

import scala.concurrent.ExecutionContextExecutor
import scala.io.StdIn
/**
 * *******************Driver Program for Akka-Http Application********************************
 *
 * Akka-Http example demonstrating the usage of Cache in Akka-Http
 * 1. We have built the cart component of an online shopping application
 * 2. For every product added to the cart we compute the total cart value by adding the prices
 *    of all products
 * 3. For a request for the total cart value we don't want to run this heavy computation
 *    again and again as long as no new products have been added to the shopping cart
 * 4. We have taken a simple scenario to stand in for a heavy computation, but in production
 *    it can be a call to some unmanaged service whose result we want to cache
 * 5. Since floating-point arithmetic is a heavy computation from a CPU perspective, we use
 *    the cache to avoid recomputing the total cart value until new products are added
 */
object Client {

  def main(args: Array[String]): Unit = {
    implicit val system: ActorSystem = ActorSystem()
    implicit val materializer: ActorMaterializer = ActorMaterializer()
    implicit val executionContext: ExecutionContextExecutor = system.dispatcher

    /**
     * 1. Instantiation of the Akka-Http cache, LfuCache
     * 2. The default caching strategy provided by Akka-Http is Least Frequently Used
     * 3. For customizing additional settings like time-to-live etc.,
     *    we can use the CachingSettings class and configure it for our requirements
     */
    val cache: Cache[String, Float] = LfuCache[String, Float]
    val computeCart = new ComputeCart(cache)
    val userRoutes: UserRoutes = new UserRoutes(cache, computeCart)

    /**
     * Setting up the Akka-Http server and binding the routes
     */
    val config = ConfigFactory.load("settings.properties")
    val hostname = config.getString("http.host")
    val port = config.getInt("http.port")
    val server = Http().bindAndHandle(userRoutes.routes, hostname, port)
    println(s"Listening on $hostname:$port")
    println("Http server started!")

    StdIn.readLine()
    server.flatMap(_.unbind())
    system.terminate()
    println("Http server terminated!")
  }
}
This is the driver object of our Akka-HTTP application, containing the main method. We have instantiated the Cache here with the implementation class LfuCache, which implements the Least Frequently Used caching strategy. It is a frequency-based strategy where eviction of a cached object depends upon how frequently it is accessed. Internally, an access counter is maintained and incremented with each access to the cache. This counter saturates at a value of 15; when accesses would push it further, all counters are downsampled (halved). In other words, the access frequency is only tracked over a small time window, which keeps the counter at just 4 bits and saves storage space. For its admission policy it follows TinyLFU (based on Bloom-filter-style frequency sketches), a very efficient approach that uses this windowed frequency to decide which keys are admitted to and evicted from the cache. For further reading, you can refer to TinyLFU.
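As the comments in the driver program mention, the defaults can be tuned. Below is a minimal sketch, based on the CachingSettings API of the Akka-HTTP caching module, of how an LfuCache with a custom capacity, time-to-live and time-to-idle could be created; the object and value names here are purely illustrative.

import akka.actor.ActorSystem
import akka.http.caching.LfuCache
import akka.http.caching.scaladsl.{Cache, CachingSettings}

import scala.concurrent.duration._

object TunedCacheExample {

  implicit val system: ActorSystem = ActorSystem()

  // Start from the default settings and tune only the LFU-specific knobs
  private val defaultSettings = CachingSettings(system)
  private val lfuSettings = defaultSettings.lfuCacheSettings
    .withInitialCapacity(25)
    .withMaxCapacity(1000)
    .withTimeToLive(20.seconds) // entries are dropped 20 seconds after they were written
    .withTimeToIdle(10.seconds) // ... or 10 seconds after their last access, whichever comes first

  // Same Cache[String, Float] type as the cart example, now built with the custom settings
  val tunedCache: Cache[String, Float] =
    LfuCache(defaultSettings.withLfuCacheSettings(lfuSettings))
}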
package com.knoldus.inc.shopping

import akka.http.caching.scaladsl.Cache
import akka.stream.Materializer
import com.knoldus.inc.entities._

import scala.collection.mutable.ListBuffer
import scala.concurrent.{ExecutionContext, Future}
import scala.util.control.Breaks
/**
 * Computes, caches and refreshes the total value of the shopping cart
 *
 * @param cache            cache holding the computed total cart value
 * @param executionContext execution context used for Future composition
 * @param materializer     materializer for Akka Streams
 */
class ComputeCart(cache: Cache[String, Float])(implicit val executionContext: ExecutionContext, implicit val materializer: Materializer) {

  var productList: ListBuffer[Product] = ListBuffer.empty
  val CART_VALUE: String = "cartValue"

  /**
   * Calculates the total cart value from the current shopping cart
   *
   * @return the total price of all products currently in the cart
   */
  def calculateTotalCartValue(): Future[Float] = {
    var total: Float = 0
    for (product <- productList) {
      total += product.productPrice
    }
    Future.successful(total)
  }

  /**
   * If the total value is present in the cache, return it; otherwise put the Future of the
   * computation into the cache, which will eventually hold the computed value
   *
   * @return the (possibly cached) total cart value
   */
  def getCartValue(): Future[Float] = {
    cache.getOrLoad(CART_VALUE, _ => calculateTotalCartValue())
  }

  /**
   * When a product is added to the cart, refresh the value of the key CART_VALUE in the cache
   *
   * @param product the product to add to the cart
   * @return the updated product list
   */
  def addProduct(product: Product): ListBuffer[Product] = {
    productList += product
    cache.remove(CART_VALUE)
    cache.getOrLoad(CART_VALUE, _ => calculateTotalCartValue())
    productList
  }

  /**
   * When a product is removed from the cart, refresh the value of the key CART_VALUE in the cache
   *
   * @param productId the id of the product to remove from the cart
   */
  def removeProduct(productId: String): Unit = {
    val break = new Breaks
    break.breakable {
      for (product <- productList) {
        if (product.productId.equalsIgnoreCase(productId)) {
          productList -= product
          cache.remove(CART_VALUE)
          cache.getOrLoad(CART_VALUE, _ => calculateTotalCartValue())
          break.break()
        }
      }
    }
  }
}
Akka-HTTP made a smart move and did not expose set and get as two separate methods. Instead, both setting and getting go through the higher-order function getOrLoad(key, loadValue), where loadValue is a function that computes the value for the given key and returns it wrapped in a Future.
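To make these semantics concrete, here is a small illustrative snippet (the object and key names are just for this example): the first lookup for a key runs the loader and stores its Future, a later lookup for the same key returns the stored Future without touching the loader, and get inspects the cache without loading anything.

import akka.actor.ActorSystem
import akka.http.caching.LfuCache
import akka.http.caching.scaladsl.Cache

import scala.concurrent.Future

object GetOrLoadExample extends App {

  implicit val system: ActorSystem = ActorSystem()

  val cartCache: Cache[String, Float] = LfuCache[String, Float]

  // First call: "cartValue" is absent, so the loader runs and its Future is stored
  val first: Future[Float] = cartCache.getOrLoad("cartValue", _ => Future.successful(199.99f))

  // Second call: the key is present, so this loader never runs and the cached Future is returned
  val second: Future[Float] = cartCache.getOrLoad("cartValue", _ => Future.successful(0f))

  // get is a plain lookup that never loads: Some(cached Future) or None
  val maybeCached: Option[Future[Float]] = cartCache.get("cartValue")

  system.terminate()
}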
It’s not over yet; superheroes also have to take care of supervillains. Here the supervillain is concurrency: what happens when concurrent requests want to cache the same object? How do we handle this situation?
Solution
The whole process is asynchronous, and it avoids caching multiple copies of the same object. Only the first access puts a Future into the cache; every subsequent request for the same key gets that cached Future back. If the Future has already completed, the caller simply sees the completed value; otherwise, the still-running Future is returned from the cache.
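Here is a small runnable sketch of that behaviour (the names are illustrative and the sleep only stands in for a heavy computation): two requests ask for the same key while the computation is still in flight, yet the loader runs only once.

import java.util.concurrent.atomic.AtomicInteger

import akka.actor.ActorSystem
import akka.http.caching.LfuCache
import akka.http.caching.scaladsl.Cache

import scala.concurrent.duration._
import scala.concurrent.{Await, ExecutionContextExecutor, Future}

object ConcurrentCacheDemo extends App {

  implicit val system: ActorSystem = ActorSystem()
  implicit val ec: ExecutionContextExecutor = system.dispatcher

  val cache: Cache[String, Float] = LfuCache[String, Float]
  val computations = new AtomicInteger(0)

  // Counts how often the "heavy" computation actually runs
  def expensiveTotal(): Future[Float] = Future {
    computations.incrementAndGet()
    Thread.sleep(500) // stand-in for an expensive calculation
    149.50f
  }

  // Both lookups arrive while the first Future is still running
  val request1 = cache.getOrLoad("cartValue", _ => expensiveTotal())
  val request2 = cache.getOrLoad("cartValue", _ => expensiveTotal())

  val results = Await.result(Future.sequence(List(request1, request2)), 5.seconds)
  println(s"results = $results, loader ran ${computations.get()} time(s)") // loader ran 1 time(s)

  system.terminate()
}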
Here, every time a product is added to the cart, the total cart value is recomputed with the price of the new product, and the cache key CART_VALUE is refreshed with the new value accordingly. But when the cart value is merely read, we do not recompute the total; we simply look up CART_VALUE in the cache and return the value.
Computing the cart value stands in for a heavy task that we do not want to run for every single request; instead we serve it from the cache and avoid redundant computation. In a real-life scenario it could be a call to some third-party API that we want to skip as long as the response has not changed for our business use case.
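For example, the same getOrLoad pattern could wrap an outgoing HTTP call. The sketch below is hypothetical and not part of the sample project: the endpoint URL, the cache key and all names are assumptions, chosen only to show the shape of such a solution.

import akka.actor.ActorSystem
import akka.http.caching.LfuCache
import akka.http.caching.scaladsl.Cache
import akka.http.scaladsl.Http
import akka.http.scaladsl.model.HttpRequest
import akka.http.scaladsl.unmarshalling.Unmarshal
import akka.stream.ActorMaterializer

import scala.concurrent.{ExecutionContextExecutor, Future}

object ThirdPartyCacheExample {

  implicit val system: ActorSystem = ActorSystem()
  implicit val materializer: ActorMaterializer = ActorMaterializer()
  implicit val ec: ExecutionContextExecutor = system.dispatcher

  private val responseCache: Cache[String, String] = LfuCache[String, String]

  // Hypothetical third-party endpoint; the response body is cached as a plain String
  private def fetchExchangeRates(): Future[String] =
    Http()
      .singleRequest(HttpRequest(uri = "https://example.com/exchange-rates"))
      .flatMap(response => Unmarshal(response.entity).to[String])

  // Every caller shares the same cached Future until the key expires or is removed explicitly
  def cachedExchangeRates(): Future[String] =
    responseCache.getOrLoad("exchangeRates", _ => fetchExchangeRates())
}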
For the complete application with details, please click here.
For further reading, please click here for the official Akka-HTTP documentation from Lightbend Inc.
Hope this helps you build even better reactive applications to serve the world ♥