A New Take on Caching

Any application out there in the world, especially a production-level one, handles a huge amount of data. That data is typically kept in a secure data store, and every time the application needs it, it has to go back to that store. This becomes inconvenient when the data is needed quickly or frequently, and when reading data is an expensive operation, it is hard for the application to perform robustly. If you're looking to overcome this inconvenience, caching is your solution.

Caching enables you to access your stored data much faster. The data stored in a cache can come directly from the data source or be generated by processing the data for a request. Hence, on subsequent requests the application does not have to access the data source or reprocess the data; it can simply read from the cache and serve the request smoothly and much faster.
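
This idea is often called the cache-aside pattern. Here is a minimal sketch in Java; `CacheAside` and `slowLookup` are hypothetical names standing in for your application and its data source, not the white paper's API:

```java
import java.util.HashMap;
import java.util.Map;

public class CacheAside {
    private final Map<String, String> cache = new HashMap<>();

    public String get(String key) {
        String value = cache.get(key);
        if (value == null) {            // cache miss: go to the data source once
            value = slowLookup(key);
            cache.put(key, value);      // subsequent requests are served from here
        }
        return value;
    }

    static String slowLookup(String key) {
        return "value-for-" + key;      // stand-in for an expensive database read
    }
}
```

Only the first request for a given key pays the cost of the slow lookup; every later request is served straight from the in-memory map.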

Using caching as is has one small problem: the cache is only available while the application is up and running. The moment the application restarts, the cache is empty, and the data that was previously in it is lost. The solution is a cache that can persist.

A persistent cache stores the data in the file system or another durable store. If the application stops or crashes, the data in the cache is not lost; it remains on disk, waiting to be loaded into the new cache created when the application restarts. This ensures that the application picks up from the same state it was in before the crash or shutdown.
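
As a rough sketch of the idea, assuming nothing about the white paper's implementation, a map can be serialized to a file on every write and reloaded on startup; `FilePersistentCache` is a hypothetical name for illustration:

```java
import java.io.*;
import java.util.HashMap;

public class FilePersistentCache {
    private final File file;
    private HashMap<String, String> store;

    @SuppressWarnings("unchecked")
    public FilePersistentCache(File file) throws IOException, ClassNotFoundException {
        this.file = file;
        if (file.exists()) {
            // On restart, reload the previous state instead of starting empty.
            try (ObjectInputStream in = new ObjectInputStream(new FileInputStream(file))) {
                store = (HashMap<String, String>) in.readObject();
            }
        } else {
            store = new HashMap<>();
        }
    }

    public void put(String key, String value) throws IOException {
        store.put(key, value);
        persist();                      // write-through: a crash loses at most the in-flight write
    }

    public String get(String key) {
        return store.get(key);
    }

    private void persist() throws IOException {
        try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream(file))) {
            out.writeObject(store);
        }
    }
}
```

Rewriting the whole map on every put is simple but slow; practical implementations batch writes or append to a log, which is where the trade-offs discussed in the white paper come in.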

Check out the linked white paper for an overview of a few existing persistent cache solutions and some of their limitations.

Furthermore, the white paper introduces an approach that addresses the limitations of those existing solutions. It is worth noting that the introduced approach is a customization built for certain stipulations; check out the white paper to see what those stipulations are. If you have similar ones, this approach is the perfect fit for implementing a persistent cache.

The following are the different types of caches built using this customization.

  • Basic Persistent Cache
  • Persistent Cache with TTL (Time-to-Live)
  • Persistent Cache with Per Row TTL
  • Persistent Loading Cache
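
To illustrate the last of these, a loading cache is configured with a loader function up front, so `get()` transparently fetches missing entries. The sketch below uses illustrative names, not the white paper's API:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

public class LoadingCache<K, V> {
    private final Map<K, V> store = new ConcurrentHashMap<>();
    private final Function<K, V> loader;

    public LoadingCache(Function<K, V> loader) {
        this.loader = loader;
    }

    // Returns the cached value, invoking the loader only on the first request for a key.
    public V get(K key) {
        return store.computeIfAbsent(key, loader);
    }
}
```

Callers never deal with misses explicitly; the cache itself knows how to fetch what it lacks.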

The white paper discusses each of these types extensively, along with the related documentation you need to get started. Before that, let's have a look at the features of this customization.

  • Any object type can be used as the key or value, without depending on library-specific wrapper objects.
  • Does not require external configuration files.
  • Can specify a common TTL or a per-row TTL.
  • Loading cache features with TTL.
  • TTL can be specified in any time unit using Java 8's ChronoUnit.
  • Supports batch storing of data and put-if-absent semantics for keys/values.
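
As a sketch of how a per-row TTL with ChronoUnit could work (under my own assumptions, not the white paper's implementation), each entry can record its own expiry instant and be lazily evicted on read:

```java
import java.time.Instant;
import java.time.temporal.ChronoUnit;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class TtlCache<K, V> {
    private record Entry<V>(V value, Instant expiresAt) {}

    private final Map<K, Entry<V>> store = new ConcurrentHashMap<>();

    // Each row carries its own TTL, expressed in any time-based ChronoUnit.
    public void put(K key, V value, long amount, ChronoUnit unit) {
        store.put(key, new Entry<>(value, Instant.now().plus(amount, unit)));
    }

    public V get(K key) {
        Entry<V> e = store.get(key);
        if (e == null) return null;
        if (Instant.now().isAfter(e.expiresAt())) {
            store.remove(key);          // lazy eviction: expired rows vanish on read
            return null;
        }
        return e.value();
    }
}
```

Note that `Instant.plus` supports time-based units up to `ChronoUnit.DAYS`; calendar units like months would need a different clock type.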

To wrap up, we have introduced a customization for a feature-rich persistent cache, useful whenever you need a caching mechanism with persistence. To get a comprehensive understanding of this approach, do check out the white paper linked below.

Read More >

Let us know what you think of this approach.
Happy coding!

Mariam Zaheer

Senior Solutions Analyst - Solutioning

Related Blog post

Towards SOC Compliance

September 2, 2020
