Content caching is a type of performance optimization in which files are loaded from nearby servers instead of a single origin server. When enabled, users save time by loading content from a local server instead of a potentially distant one.
Page load time is a key factor for an online service, and directly impacts pageviews, sales and customer satisfaction. If a site is only available from a single server, users can face delays of hundreds of milliseconds when loading content (for example, when a server in North America is accessed from Asia).
The solution is to store files on servers around the globe, with multiple servers covering each region. Enterprises coordinate with content delivery networks (CDNs) to distribute their files and ensure users have fast, local access to a site no matter their location.
How it Works
CDN providers distribute servers in data centers around the world, and work with ISPs and telecom carriers to ensure they have fast access to the Internet. An enterprise then grants the CDN provider access to their central server (called an origin server) to enable the content caching solution.
Here’s a step-by-step description of how content caching works:
- A user clicks a link to a webpage, which may contain references to large media files (videos, audio or images).
- The user’s browser automatically connects to the closest available caching server. Under the hood, this communication uses “Anycast DNS”, which allows the closest server to respond to a request (just as emergency numbers like 911 or 112 automatically connect callers to a nearby dispatch center).
- If the requested file is available on the caching server, it is delivered to the user. This is the common case, and the page loads quickly since the server is the closest one to the user.
- If the requested file is not on the caching server, it is fetched from the enterprise’s origin server, delivered to the user, and stored for later use. This scenario is slower, since the file must travel from the origin server, but it is rare: it only happens when a file is first accessed or after the cache has been reset.
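The hit/miss logic in the steps above can be sketched as a minimal in-memory cache. This is only an illustration: the names (serve, fetch_from_origin) are hypothetical, and a real caching server also handles expiry, storage limits and concurrent requests.

```python
# Minimal sketch of a caching server's hit/miss logic (illustrative only).

cache = {}  # maps a file path to its contents


def fetch_from_origin(path):
    """Stand-in for a slow request to the enterprise's origin server."""
    return f"contents of {path}"


def serve(path):
    if path in cache:
        # Cache hit: the common, fast case — serve the local copy.
        return cache[path]
    # Cache miss: fetch from the origin, then store for later users.
    contents = fetch_from_origin(path)
    cache[path] = contents
    return contents
```

The first request for a file takes the slow path through fetch_from_origin; every later request for the same file is served from the local cache.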
Once any user has accessed a file through the CDN, the caching server provides fast access for all users in the same region. (Content can additionally be cached on a user’s computer for later visits to the same site, but this does not speed up access for other users.)
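Browser-side caching of this kind is typically controlled with HTTP response headers. The snippet below is a hedged sketch of headers a server might send so that a browser keeps a local copy; the specific values are examples, not a recommendation.

```python
# Example HTTP response headers that tell a browser to cache a file locally.
# The values shown are illustrative only.
headers = {
    # Allow caching and keep the copy for up to one day (86400 seconds).
    "Cache-Control": "public, max-age=86400",
    # A validator that lets the browser cheaply ask "has this file changed?"
    "ETag": '"abc123"',
}
```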
Content Caching Example
Say a Netflix subscriber in London wants to stream the show House of Cards. To ensure fast access and minimum buffering time, Netflix would copy the videos from their origin servers in Los Gatos, CA to the caching server closest to London. All subscribers in London would see a speedup when accessing the show, and avoid the need for a trans-Atlantic file transfer.
Moving content within a distributed network is called “pushing content to the edge,” and Netflix excels at this operation. Over time, they have built up a caching empire by partnering with numerous ISPs as part of their Netflix Open Connect program. Of course, this is all hidden from subscribers, who just notice the results of faster content delivery.
Content caching is a true win-win, and benefits both users and providers of online services.
- Users see faster load times for digital content, whether that means videos, images, compressed files, web pages or online games.
- Enterprises see higher customer satisfaction and engagement, reducing the chance that distant users abandon the site for performance reasons.
- Additionally, enterprises see lower bandwidth costs, since files are served from local caching servers, which typically offer bulk data transfer rates.
We live in an age where 47% of consumers expect pages to load in under 2 seconds, and 40% abandon sites that take over 3 seconds to load. Given figures like these, content caching is a vital performance optimization for any business that serves users online.
If you have any questions about the content of this article, please feel free to reach out to the Support Team for assistance; we're available 24/7 for your convenience.