Micro-caching is the process of temporarily storing event-driven web content for very brief periods of time (as short as one second).
Overview
A website may handle a given number of requests per second with ease, but if traffic rises beyond that rate, the web server can no longer handle requests as expected and the site slows down. Micro-caching temporarily caches certain content for a very short period of time. During this time, all other requests for the same content are served from the cache instead of the origin server. Micro-caching works the same way as standard content caching, with two differences: 1) the period of time the content is cached (very short) and 2) the type of content that is cached (event-driven, often mistaken for dynamic).
How Micro-Caching Works
Micro-caching caps the number of requests reaching the origin server by having cache servers store content for very short periods of time. Enabling micro-caching involves setting the cache path and parameters with the proxy_cache_path directive and activating caching with the proxy_cache directive:
proxy_cache_path /tmp/cache levels=1:2 keys_zone=cache:10m inactive=600s max_size=100m;

server {
    # Serve responses from the "cache" zone and keep 200 OK responses for one second
    proxy_cache cache;
    proxy_cache_valid 200 1s;
    ...
}
In the example above, responses with a 200 OK status code are cached for one second. The path (/tmp/cache) defines where the cached content is stored. This can be any directory, although a RAM-backed filesystem such as tmpfs gives the best performance. The keys_zone=cache:10m parameter creates a 10 MB shared memory zone named "cache" that holds the cache keys and metadata. The max_size=100m parameter limits the total size of the cache on disk; when the limit is reached, the least recently used content is removed. The inactive=600s parameter is the period after which content that has received no hits is removed from the cache. When a user requests an asset, the response is cached so that subsequent users requesting the same asset are served from this cache. As a result, if 200 users request the asset within 5 seconds (40 requests per second), only 5 of those 200 requests, or 1 in 40, reach the origin server, because the cached copy expires and is refreshed once per second.
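To make this behavior easy to observe, the sketch below expands the example above into a fuller server block. It assumes a hypothetical origin application listening on 127.0.0.1:8080; the X-Cache-Status header name is simply a conventional label for nginx's built-in $upstream_cache_status variable, and proxy_cache_lock and proxy_cache_use_stale are standard directives often paired with micro-caching to smooth over the moment the one-second copy expires:
proxy_cache_path /tmp/cache levels=1:2 keys_zone=cache:10m inactive=600s max_size=100m;

server {
    listen 80;

    location / {
        proxy_pass http://127.0.0.1:8080;   # hypothetical origin/application server
        proxy_cache cache;
        proxy_cache_valid 200 1s;

        # Let only one request refresh an expired entry; the rest wait for it
        proxy_cache_lock on;
        # Serve the stale copy while the fresh one is fetched, or if the origin fails
        proxy_cache_use_stale updating error timeout;

        # Expose whether the response was a HIT, MISS, or EXPIRED for easy verification
        add_header X-Cache-Status $upstream_cache_status;
    }
}
Repeated requests to the same URL should then return X-Cache-Status: HIT for all but roughly one request per second, with the occasional MISS, EXPIRED, or UPDATING response when the cached copy is refreshed.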
Example of Micro-Caching
A CDN can be used to cache event-driven content for high-traffic websites. The content on these sites remains static for an unpredictable length of time until an event takes place. Examples of such sites are:
- News sites: The content is updated in an unpredictable manner as the news happens. In addition, the content may have to be updated as new comments are added.
- Stock trading websites: The stock prices change frequently during trading and only remain static for a very short period of time. However, the prices remain static for longer durations at night, on weekends and holidays, and at other times when there is no trading.
- Sports score websites: Scores change at unpredictable times, and the displayed results should be updated every time a team scores.
Caching the event-driven content gives the impression that the content is dynamic and ensures that visitors continue to have fast access to the latest content.
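Because different parts of such a site change at different rates, the cache lifetime can be tuned per location. The paths, durations, and upstream address in the sketch below are purely illustrative:
server {
    proxy_cache cache;

    # Hypothetical live-scores endpoint: cache for one second so updates appear almost instantly
    location /scores/ {
        proxy_pass http://127.0.0.1:8080;
        proxy_cache_valid 200 1s;
    }

    # Hypothetical article pages: comments arrive less often, so a slightly longer period is acceptable
    location /news/ {
        proxy_pass http://127.0.0.1:8080;
        proxy_cache_valid 200 10s;
    }
}
A score feed tolerates only a second of staleness, while an article page whose comments trickle in can safely be held a little longer.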
Conclusion
Micro-caching is an effective method for accelerating the delivery of “dynamic,” non-personalized website content without compromising the end-user experience. In most cases, micro-caching is only suitable for data that does not affect the operation, integrity, or security of the application. It ensures that, when a web page or site is under heavy load, most visitors get a copy of the page served from the cache rather than from the origin server. This reduces the load on the server and improves the overall performance of the website.
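One common way to respect that restriction with the configuration shown earlier is to bypass the micro-cache whenever a response is personalized, for example when a session cookie is present. The cookie name sessionid and the upstream address below are hypothetical placeholders:
location / {
    proxy_pass http://127.0.0.1:8080;
    proxy_cache cache;
    proxy_cache_valid 200 1s;

    # Do not serve cached copies to logged-in users, and do not store their responses
    proxy_cache_bypass $cookie_sessionid;
    proxy_no_cache $cookie_sessionid;
}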
As always, if you have any questions or concerns about any of the topics mentioned in this article, please feel free to reach out to support. Live chat and ticket support are available 24/7.