An Introduction to Caching in MuleSoft 4

Reading Time: 5 minutes

Mule runtime engine (Mule) offers customizable caching strategies, such as the Cache scope and the HTTP Caching API Gateway policy, to enable caching according to your needs. Let's first look at what caching is, and then at the Cache scope.


Caching is the process of storing frequently used data in memory, a file system, or a database, which saves the processing time and load that would be incurred if the data had to be fetched from the original source on every request.

Cached data is stored across the project's flows and is available to all flows of the project. Caching is most useful when data does not change frequently or is static in nature. It allows you to efficiently reuse previously retrieved or computed data, speeding up the response time for user requests and reducing the load on the back end.

For example, suppose we have to create an API that retrieves user information from an external database hosted on a different server. (Assumption: the external DB does not change frequently.)

Implementation without cache:
For every request, the API connects to the external DB server, searches for the record, and fetches it.
If we receive 100 requests for the same user, we need to connect and execute the DB query 100 times.

Implementation with cache:
For every request for a different user, the API connects to the external DB server, searches for the record, fetches it, and saves it in an internal cache.
If we receive 100 requests for the same user, we need to connect and execute the DB query only once; for the remaining 99 requests we can fetch the record from the cache and return the response.

As the example clearly shows, with a cache we avoid the cost of reconnecting to the DB 99 times for the same request. This makes a huge difference when we are dealing with a large number of requests in a short duration.
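The scenario above can be sketched as a Mule flow in which a Cache scope wraps the database lookup, so the first request for a given user hits the database and repeat requests are served from the cache. This is a minimal sketch; the configuration names, listener path, and table are assumptions for illustration:

	<flow name="get-user-flow">
		<http:listener path="/users/{id}" config-ref="HTTP_Listener_config" />
		<ee:cache doc:name="Cache">
			<!-- Executed only on a cache miss; the result is stored and reused -->
			<db:select config-ref="Database_Config">
				<db:sql>SELECT * FROM users WHERE id = :id</db:sql>
				<db:input-parameters><![CDATA[#[{id: attributes.uriParams.id}]]]></db:input-parameters>
			</db:select>
		</ee:cache>
	</flow>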

Mule Cache Scope and its configuration details:

Cache Scope: The Cache scope stores and reuses frequently called data. We can use a Cache scope to reduce the processing load on the Mule instance and to increase the speed of message processing within a flow. The Cache scope only caches non-consumable payloads. It does not cache consumable payloads (such as a streaming payload), which can be read only once before they are lost. On each request, it computes a key using the algorithm below, then stores the response payload of the flow as the value for that key.

Key = SHA256KeyGenerator + SHA-256 digest (by default, the key is a SHA-256 digest computed from the message payload)
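When the default payload-based key does not suit the use case (for example, when the key should come from a URI parameter rather than the payload), the key-generation expression can be overridden on a caching strategy that the Cache scope references. A sketch, where the strategy name and expression are assumptions:

	<!-- Generate the cache key from the empID URI parameter instead of hashing the payload -->
	<ee:object-store-caching-strategy
		name="empID_Caching_Strategy"
		keyGenerationExpression="#[attributes.uriParams.empID]" />

	<!-- Reference the strategy from the Cache scope -->
	<ee:cache doc:name="Cache" cachingStrategy-ref="empID_Caching_Strategy">
		<!-- processors whose result should be cached go here -->
	</ee:cache>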

By default, the Cache scope uses an in-memory caching strategy. Its drawback is that when Mule starts caching large payloads, it may hit the memory limit and throw a Java heap exception. Consequently, the default strategy should be replaced in such cases, which can be done by creating an object-store caching strategy.

How Cache Scope Works


Caching Configurations

Mainly, there are two caching strategies in Mule:

Default Caching:

If you do not specify a caching strategy, the Cache scope uses a default caching strategy, which provides a basic caching mechanism. Everything is cached in memory, which is volatile RAM and non-persistent, i.e., if you restart your application, the cached data is lost. If you want to store a huge static payload, you should use a custom caching strategy.


Reference to a Strategy:

With this option, you can create a custom caching strategy. You can use an Object Store and then define the cache size, time to live, and other settings as per your requirements.

There are a few steps to configure a Cache Strategy:

  • Open the Caching Strategy Configuration window.
  • Define the name of the caching strategy.
  • Define the Object Store by selecting between Edit Inline and Global Reference.
  • Select the mechanism that produces the key under which events are stored in the caching strategy.
  • Open the Advanced tab in the property window to configure the advanced setting.
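The steps above result in a configuration along these lines (a sketch; the store and strategy names, entry limit, and TTL values are assumptions):

	<!-- Persistent object store: cached entries survive an application restart -->
	<os:object-store name="cache_store"
		persistent="true"
		maxEntries="100"
		entryTtl="1" entryTtlUnit="HOURS" />

	<!-- Caching strategy backed by the object store (Global Reference option) -->
	<ee:object-store-caching-strategy
		name="Custom_Caching_Strategy"
		objectStore="cache_store" />

Setting persistent="true" addresses both drawbacks of the default strategy: entries are written to disk instead of filling the Java heap, and the cache survives a restart.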


Implementing Caching in Mule 4

To demonstrate the caching mechanism, I have created a simple flow.

Here is the configuration XML for the flow:

<?xml version="1.0" encoding="UTF-8"?>

<mule xmlns="http://www.mulesoft.org/schema/mule/core"
	xmlns:ee="http://www.mulesoft.org/schema/mule/ee/core"
	xmlns:http="http://www.mulesoft.org/schema/mule/http"
	xmlns:doc="http://www.mulesoft.org/schema/mule/documentation"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd
		http://www.mulesoft.org/schema/mule/http http://www.mulesoft.org/schema/mule/http/current/mule-http.xsd
		http://www.mulesoft.org/schema/mule/ee/core http://www.mulesoft.org/schema/mule/ee/core/current/mule-ee.xsd">
	<http:listener-config name="HTTP_Listener_config"
		doc:name="HTTP Listener config">
		<http:listener-connection host="0.0.0.0"
			port="8081" />
	</http:listener-config>
	<flow name="caching-demoFlow">
		<http:listener doc:name="Listener"
			path="/cache-demo/{empID}" config-ref="HTTP_Listener_config" />
		<logger level="INFO" doc:name="Start Log"
			doc:id="19c9912e-46d6-42a7-8edb-7b904c6402e0" message="Flow Started" />
		<ee:cache doc:name="Cache">
			<logger level="INFO" doc:name="Logger"
				message="Started Caching" />
			<!-- Placeholder URL: replace with your own back-end service -->
			<http:request method="GET"
				doc:name="Consume req res service"
				url="https://example.com/api/users/#[attributes.uriParams.empID]" />
			<ee:transform doc:name="Set Response Payload">
				<ee:message>
					<ee:set-payload><![CDATA[%dw 2.0
output application/json
---
payload]]></ee:set-payload>
				</ee:message>
			</ee:transform>
			<logger level="INFO" doc:name="Logger"
				message="Ended Caching" />
		</ee:cache>
		<logger level="INFO" doc:name="End Log"
			doc:id="177f655a-6dcb-4ac6-a768-a6986aaecf66" message="Flow Ended" />
	</flow>
</mule>

Application flow


Deploying the caching application


Now we can test our application.




First hit


Second hit after 2 minutes: the response comes from the cached data, with a shorter response time.



By implementing caching, we can improve our application's performance by limiting the number of external HTTP requests. When used correctly, it will not only result in significantly faster load times but also decrease the load on your server. In many cases, we are not just making a request: we have to process the request as well, which could involve hitting a database, performing some sort of filtering, etc. Thus, caching can cut the latency of request processing as well.