Caching is a quick and easy way to save roundtrips between your application and where you store your data. However, it's not as simple as dropping a map into your application. To really leverage a cache, you have to understand not only what caching is and where a cache can be used, but also how it affects what your application does and how your architecture behaves.
Caching is the use of an intermediate, temporary data store to save computation or data-retrieval time.
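To make that definition concrete, here is a minimal sketch of the idea in Java: check an in-memory map before paying the roundtrip to a slower backing store. The names here (User, DataStore, loadUser, UserCache) are illustrative assumptions, not any particular library's API.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical record and backing store, defined here only so the
// sketch is self-contained; real code would use your own types.
record User(String id, String name) {}

interface DataStore {
    User loadUser(String id);
}

// A minimal local cache: consult the in-memory map first, and only
// pay the roundtrip to the slower store on a miss.
class UserCache {
    private final Map<String, User> cache = new ConcurrentHashMap<>();
    private final DataStore store;

    UserCache(DataStore store) {
        this.store = store;
    }

    User findUser(String id) {
        // computeIfAbsent loads from the store only when the key is
        // absent; every later lookup for the same id is served locally.
        return cache.computeIfAbsent(id, store::loadUser);
    }
}
```

The first call to findUser for a given id hits the store; subsequent calls are answered from memory, which is the entire value proposition of a cache in one method.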
There are four basic strategies for caching, two of them fairly rare: local caching, distributed caching, system of record with local storage, and event handling. Of the four, the first is by far the most common, the second is growing in deployments, the third is very rare but shouldn't be, and the last is almost entirely inappropriate for caching, but it is used nonetheless.
That doesn't mean this distribution of usage is appropriate. The plain caching approach is workable, but if it's used for the most common case, intermediate data storage, it usually means your data takes too long to retrieve in the first place. The second most common case is memoization, the storage of intermediate and calculated results. Local caching is generally fine for this, although distributed caching is better because it keeps different participating processors from having to calculate the same intermediate results.
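As a rough illustration of memoization with a local cache, the sketch below stores the result of an expensive calculation keyed by its input. The ScoreMemoizer class and its expensiveScore method are hypothetical stand-ins for whatever intermediate result your application derives; in a distributed cache, the map would be replaced by a shared store so that other nodes could reuse the same results instead of recomputing them.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Memoization with a local cache: each expensive result is computed
// once per input and served from the map on every later call.
class ScoreMemoizer {
    private final Map<Integer, Double> results = new ConcurrentHashMap<>();

    double score(int input) {
        // Note: computeIfAbsent briefly blocks other writers to the same
        // map bin, so very long computations may warrant a different
        // locking strategy; this keeps the sketch simple.
        return results.computeIfAbsent(input, this::expensiveScore);
    }

    // Stand-in for an expensive derivation (illustrative only).
    private double expensiveScore(int input) {
        double total = 0;
        for (int i = 1; i <= 1_000_000; i++) {
            total += Math.sqrt(input * (double) i);
        }
        return total;
    }
}
```

The first call to score(42) pays the full cost of the loop; repeated calls with the same input return immediately from the map.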