Jargon Monday: Caching

Welcome to Jargon Monday, a series that explains technical terms without needing a PhD in Computer Science to understand!

In this first instalment, we’ll dive into what caching is and how it can help you build better software.


Imagine you fly weekly to your regional branch office. Let’s assume that the connection is very stable and the flight times don’t change much throughout the year.

Now, each week when planning the trip you could visit the airline’s website and get up-to-date flight information directly from them. But their website is slow. So slow that you can probably finish your performance reviews while waiting for it to load. And they make you click through five screens just to see the information you seek. Also, bookmarking the page doesn’t work (I’ll touch on why that can happen in another post). You get the point – you’d rather avoid going there until it’s really necessary.

But wait, you just remembered that the connection is pretty stable. So you decide to print out the flight times on a piece of paper and leave it on your desk, thus speeding the process up considerably. Of course, once in a while you will still need to visit the website to update your information. For the purpose of this example we’ll assume that you know exactly when that happens (maybe the airline sends you an email if the flight times have changed).

Voilà! You’ve just invented caching. That piece of paper is your cache, and the flight times are the information you are caching.

Now you can use your “cache” to speed up fetching the flight information. However, it can also act as your fail-safe: imagine that one day you get ready to plan your trip, but the airline’s website is down. If you haven’t copied the information anywhere, you are stuck waiting for the site to become available again. But with your cached copy, you can still plan out your trip. You might need to adjust your plans a bit when you finally get access to the site again, but at least you are not just sitting there waiting.


In more formal terms, caching is the process of duplicating some relevant information from the source system (the authoritative source) and placing it somewhere more easily accessible (the cache).

The primary reason to cache data is to improve the performance of your application (the time it takes to retrieve that information). Caching can also be used as a fallback mechanism to access the data when the source system is unavailable.
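For the technically curious, the airline example can be sketched in a few lines of Python. Everything here is made up for illustration: `fetch_from_source` stands in for the slow airline website, the dictionary plays the role of the printed piece of paper, and the route name is invented.

```python
# A minimal sketch of caching with a fallback, using an in-memory dict.
cache = {}

def fetch_from_source(route):
    # Stand-in for the slow, sometimes-unavailable airline website.
    if route == "HQ->branch":
        return "Mon 07:45, Fri 18:10"
    raise ConnectionError("source system unavailable")

def get_flight_times(route):
    try:
        times = fetch_from_source(route)   # slow but authoritative
        cache[route] = times               # keep a copy for next time
        return times
    except ConnectionError:
        if route in cache:
            return cache[route]            # fallback: possibly stale, but usable
        raise                              # no cached copy either, so give up

print(get_flight_times("HQ->branch"))
```

Note the two benefits in one place: repeat lookups can be served from the dictionary, and if the source is down, the cached copy keeps you going.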


Caching is an excellent way to improve the system’s performance if the information you are trying to access doesn’t change too frequently or you can tolerate slightly out-of-date information. You can control how fresh the data in the cache is by the process of cache invalidation, which we’ll cover in another instalment of Jargon Monday.
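As a small preview of that future instalment, one common way to control freshness is to give each cached entry an age limit. This sketch is hypothetical (the names `cache_put`, `cache_get`, and the 60-second limit are invented), and it shows just one of several invalidation strategies:

```python
import time

# Each cache entry remembers when it was stored; lookups older than
# max_age seconds are treated as missing, forcing a fresh fetch.
CACHE = {}
MAX_AGE = 60.0  # seconds; choose based on how stale you can tolerate

def cache_put(key, value):
    CACHE[key] = (value, time.monotonic())

def cache_get(key, max_age=MAX_AGE):
    entry = CACHE.get(key)
    if entry is None:
        return None
    value, stored_at = entry
    if time.monotonic() - stored_at > max_age:
        del CACHE[key]   # too old: invalidate the entry
        return None
    return value
```

In the airline analogy, this is like throwing away the printed flight times after a set period, just in case, rather than waiting for the airline’s email.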

Meanwhile, don’t be afraid to reach out to your tech team to ask what can and what should not be cached in your system!




