Fastly and Magento

This is a quick post on some of the technical benefits of Fastly and how it can be used with a Magento store. This is not intended as an endorsement of Fastly over other CDN offerings – it is just some technical information on Fastly that I think is interesting.

Note: Magento Cloud has announced support for Fastly – this post is partly in response to a tweet question asking why Magento Cloud picked Fastly. TL;DR: it was a good feature match.

(Okay, so this started as a quick post and grew longer as I got into it. 3,000+ words is still somewhat “quick”, isn’t it? Well, for me anyway! 😉)

What is a CDN vs a Cache

A quick bit of background for those not familiar with the terms CDN and cache. A CDN (Content Delivery Network) is a set of machines, typically distributed across the world, that can be used to serve content. Because the machines are closer (in network distance) to end users, download latency is reduced. It also reduces the load on your main web server – for example, it is common to offload the serving of product images from your main web server.

One of the questions with a CDN is how to get content loaded into it. If you want to put your images on the CDN and leave HTML pages on the main web server, that implies the application has to decide in advance which HTTP requests should go to the CDN and which go to the web server. And you need to keep both in sync.

A cache in the context of this post (sometimes also referred to as a “HTTP accelerator”) is a server that receives HTTP requests and tries to respond to the request from local high speed storage. If it does not have the content, it sends the request on to the real web server (“the origin”) and then saves the response for later use. The HTTP response from the web server includes headers to tell the cache how long the content is suitable to be cached for.
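To make that concrete, here is a tiny sketch (not Magento code – render_product_page and $productId are made up for illustration) of a PHP script telling any cache sitting in front of it how long the response may be reused:

    <?php
    // Tell any cache in front of us (Fastly, the browser, etc.) that this
    // response can be reused for up to an hour before going back to the origin.
    header('Cache-Control: public, max-age=3600');

    echo render_product_page($productId);  // hypothetical rendering function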

Caches can make management of a site a bit easier as all content is managed centrally. Caches can also safely lose content at any time – the master copy of the content is always available centrally. Caches are effectively transparent to the application in that the application will behave the same if the cache is present or not. The cache just speeds up requests by responding immediately without wasting CPU time on the main web server.

Fastly has both the properties of a CDN (with servers close to end users distributed across the globe) and a cache (there is no special need to upload content in advance). It will fetch content on first request and then keep a copy (acting as a reverse proxy for the origin). So there is no need to FTP files to Fastly – you just have all the files accessible from your web server and away you go. Nice and easy.

Partial Cache Invalidation

“There are only two hard things in Computer Science: cache invalidation and naming things.” — Phil Karlton

Before getting too far into details, it is worth noting that Fastly has great support for both Magento 1 and Magento 2. In fact, there is a great deal of overlap between the Magento 2 support for Varnish and what Fastly supports. For example, Fastly adds a form of partial cache invalidation support to Magento 1. With Magento 2, Fastly leverages the improved caching support that comes with the platform, but Fastly is a good solution for both Magento 1 and 2 sites.

To expand on this further, the approach is to tag pages in the cache with a label (e.g. the product id). If the product is updated, a purge request is sent to the cache specifying the product id, so all pages holding information about that product are purged from the cache. This allows the web server to return pages that can be cached for weeks or longer.
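As a rough sketch of what that looks like with Fastly (the Surrogate-Key response header and the purge-by-key API endpoint are from Fastly’s documentation as I understand it; the PHP around them, including $productId, $serviceId and $apiToken, is hypothetical – the Fastly Magento extensions do all of this for you):

    <?php
    // When rendering a product page, tag the response so Fastly can find it later.
    // Fastly strips this header before the response reaches the browser.
    header('Surrogate-Key: product-' . $productId);

    // Later, when the product changes, purge every cached page carrying that tag.
    function purgeByKey(string $serviceId, string $apiToken, string $key): void
    {
        $ch = curl_init("https://api.fastly.com/service/{$serviceId}/purge/{$key}");
        curl_setopt_array($ch, [
            CURLOPT_CUSTOMREQUEST  => 'POST',
            CURLOPT_HTTPHEADER     => ["Fastly-Key: {$apiToken}"],
            CURLOPT_RETURNTRANSFER => true,
        ]);
        curl_exec($ch);
        curl_close($ch);
    }

    purgeByKey($serviceId, $apiToken, 'product-' . $productId);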

This tag-and-purge approach has quite profound performance implications that are worth understanding well.

Without cache tags on pages, how do you flush a page from a cache if something about the product changes? One approach is to set a short cache time-to-live (TTL) on the page. When the time expires, the page is removed from the cache automatically. That way it caches for a while, then expires from the cache forcing the origin to serve up the page. It’s a simple approach, but can result in much lower cache hit rates. Why? This is where maths comes into the equation (pun intended!).

Server Cache vs Browser Cache Time

Before diving into the maths side, one more little detail on TTL based caching. With Fastly there are actually two levels of cache – Fastly and the web browser. Thus the web server can return both how long Fastly should cache the page (which can be a long duration, since the web server can proactively tell Fastly to purge its cached copy) and how long the web browser should cache the page for. The web browser can, for example, perform conditional “fetch if modified” requests, which are still faster than a full page request (the required bandwidth is reduced, as Fastly only has to report that the old content is still valid). So a web server may commonly specify a long TTL for Fastly and a short (or zero) TTL for the web browser.
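In concrete terms this is just two response headers. A hedged sketch (as I understand Fastly’s docs, Surrogate-Control is honoured by Fastly and stripped before the response reaches the browser, while Cache-Control is what the browser sees):

    <?php
    // Fastly may keep this page for a week – we will purge it explicitly if it changes...
    header('Surrogate-Control: max-age=604800');

    // ...but browsers should revalidate ("fetch if modified") on every visit.
    header('Cache-Control: no-cache');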

Distributions

So back to the maths side of the story. To keep the following discussion simple, I will restrict the discussion to product pages. Obviously life is more complex than this – there are category pages, search results, CMS pages, product images, and so on. But let’s focus on product pages for a moment.

There are several numbers that impact cache effectiveness including:

  • Number of unique products,
  • Frequency of product page access (typically a small number of products are accessed far more frequently than the rest),
  • Arrival rate of requests, and
  • Frequency of product updates (requiring affected pages to be purged from the cache).

Let’s look at these numbers a bit closer, starting with frequency of access.

It is common for a subset of products to be accessed disproportionally. These could be items in a sale or your site’s best sellers. If you sort all products by probability of access, you typically get a curve such as the following.

[Graph: products sorted by probability of access – a long-tail curve where a small fraction of products receive most of the views]

That is, the probability of a cache entry being hit is not linear with the number of products. It is important to realize the shape of the curve can vary widely. The graph above has a pretty sharp knee, indicating a small number of products are accessed a lot more frequently than the rest. The curve would be flatter if your sales were more evenly balanced across your product line.

And of course the curve may have more kinks and bumps in real life based on your real distribution of items and how frequently they are viewed.
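If you want a feel for your own curve, a few lines of PHP are enough. The numbers below are purely illustrative (a classic 1/rank Zipf curve, not your catalogue):

    <?php
    // Purely illustrative: under a Zipf-like popularity curve, what share of
    // page views do the most popular products account for?
    $skuCount = 10000;
    $weights  = [];
    for ($rank = 1; $rank <= $skuCount; $rank++) {
        $weights[] = 1 / $rank;   // popularity proportional to 1/rank
    }
    $total = array_sum($weights);

    foreach ([100, 1000, 5000] as $topK) {
        $share = array_sum(array_slice($weights, 0, $topK)) / $total;
        printf("Top %5d of %d products: %.1f%% of page views\n", $topK, $skuCount, 100 * $share);
    }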

Cache Miss Rate

The cache miss rate directly influences the rate at which new entries are added to the cache. When the cache is empty, the miss rate is high, as every request will be a cache miss. This in turn puts more load on the origin, which is important to understand if you want to reduce the latency of requests. As the cache fills, the probability of a cache miss decreases. The miss rate only reaches zero once every page that will ever be requested is already in the cache.

So what is the likely cache miss rate for a site? There is no easy answer for this. Let’s explore why.

  • A site with a larger number of SKUs will take longer to fill the cache. The more common pages will be hit first giving the greatest benefit, but the cache will not fill until all pages are accessed. (Side note: This has a side effect that if a product is never accessed, you will never incur any CPU to generate the page content for it.) Different curves will have different cache hit/miss rates and take different lengths of time to fill the cache.
  • The cache size can make a difference, although in the case of Fastly they have pretty big caches so this is not likely to be an issue. (Disk is cheap these days, and Fastly uses SSDs to maximize speed for cache hits.)
  • CDNs are typically distributed across the world to help international traffic. You will not get the same traffic patterns in all geographies, so geographically distributed storage may fill at different rates.

Another Fastly feature is worth mentioning here – a feature they call “Origin Shield”. Fastly operates a globally distributed cache, but cache misses all funnel through an “Origin Shield” which will not permit multiple concurrent requests for the same resource to hit the origin. (This is often referred to as a “stampede” and can occur when a cache is emptied while the server is under load.) It will wait for the existing request to complete, then serve all the other requests from the one response. This can be very useful for high volume sites as it stops the web server from being unnecessarily swamped by requests for the same common items (or flash sale items). Fastly protects the web server in this way, allowing it to focus on other productive work. (Remember you need to purchase hardware to cope with peak loads, not the average load. So reducing the peak load reduces hardware costs.)
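This is not how Fastly implements it, but if “request collapsing” is new to you, here is a hypothetical PHP sketch of the idea – only the first concurrent miss for a key does the expensive work, and everyone else waits for and reuses that result:

    <?php
    // Hypothetical sketch of request collapsing ("single flight"), not Fastly code.
    function fetchWithCollapsing(string $key, callable $generate): string
    {
        $cacheFile = sys_get_temp_dir() . '/page_' . md5($key);
        $lock      = fopen($cacheFile . '.lock', 'c');

        flock($lock, LOCK_EX);               // later arrivals block here
        try {
            if (is_file($cacheFile)) {       // an earlier request already filled it
                return file_get_contents($cacheFile);
            }
            $content = $generate();          // only one request hits the origin
            file_put_contents($cacheFile, $content);
            return $content;
        } finally {
            flock($lock, LOCK_UN);
            fclose($lock);
        }
    }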

Product Update Patterns

Another aspect is product update patterns. Do you do a batch import of the whole product catalog every night? Multiple times a day? Do you blindly update products even if not actually changed? What is the arrival rate of product updates? What percentage of the products are updated? All these factors influence whether it is better to purge individual products from the cache, or whether it makes more sense to do a global cache flush. (In some circumstances, wiping the whole global cache may be faster than deleting all the items one-by-one. Fastly can purge cache entries at 150ms per entry, which is very fast for a CDN, but not ideal if you need to update 100,000 products in a single transaction.)

Thus it may be worth reviewing how you get product updates into your system, as it may have a direct impact on your cache hit rate and thus site performance. If you are updating products unnecessarily, you may be increasing your cache miss rate for no purpose.
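As a hypothetical sketch of that decision (the purge_all endpoint is in Fastly’s API docs as I understand them; the threshold, get_changed_product_ids and purgeAll wrapper are made up, and purgeByKey is the function from the earlier sketch):

    <?php
    // Hypothetical: a few changed products get individual purges; a bulk import
    // that touched most of the catalogue gets one global flush instead.
    $changedIds = get_changed_product_ids();    // made-up helper

    if (count($changedIds) < 1000) {            // made-up threshold
        foreach ($changedIds as $id) {
            purgeByKey($serviceId, $apiToken, 'product-' . $id);
        }
    } else {
        // One call to POST /service/{service_id}/purge_all instead of thousands of purges.
        purgeAll($serviceId, $apiToken);        // made-up wrapper
    }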

Stale Cache Entries

There is also an interesting question of how fresh the cache needs to be. It is always possible that a person views a product page, but before they go to purchase, it has sold out or been updated in some way. Having said that, you want to minimize the delay to avoid unfortunate user experiences.

One feature Fastly has is the ability to mark pages as stale (https://docs.fastly.com/guides/performance-tuning/serving-stale-content). Fastly will then fetch a new copy of the page from the origin in the background, but until it arrives it will still serve the old page from cache. This means that even stale pages are cache hits – the user gets the super-fast experience as per normal cache hits. You do need to work out how important correctness is to you. The stale content would have been returned if the user had hit the site a few seconds earlier, so you need to decide whether fast response or up-to-the-moment accuracy is more important for your site. For most sites it probably won’t matter, but it will affect your cache hit rate figures.
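In header terms, opting into this might look roughly like the following (a hedged sketch – check the Fastly guide linked above for the exact directives and behaviour):

    <?php
    // Fastly may cache the page for a week; for a day beyond that it may serve
    // the stale copy while refetching in the background, and it may also serve
    // stale for a day if the origin is returning errors.
    header('Surrogate-Control: max-age=604800, stale-while-revalidate=86400, stale-if-error=86400');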

Time to Live

It has already been mentioned that some sites work by setting a short time to live for cache entries. That way you don’t need to purge items from the cache – you just leave them and wait for them to expire naturally. If you have a smaller set of SKUs with very high hit rates (e.g. on a flash sale site) this can work well. As your number of SKUs increases, this will increase your cache miss rates, slowing down the average customer experience.

This is why both Fastly and Magento 2 support partial cache invalidation. For many sites, the product update rate is such that you are better leaving entries in the cache forever. (Fastly claim they can purge pages in 150ms, which is pretty impressive.) Combine this with the ability for Fastly to serve stale pages from cache, and the “Origin Shield”, then you start to see that pages will have a cache miss once, then will sit in the cache a long time (days, weeks, potentially months). This can result in extremely high cache hit rates compared to other schemes. It depends on the application, but it is quite feasible to hit 99% cache hit rates (of pages that can be cached – I am not talking about uncacheable pages which are discussed below).

This is where the cache miss rate is an important metric. The cache miss rate for an empty (or near empty) cache is where you get poorer performance. As the cache fills, you can get increasingly higher cache hit rates.

This of course adds complexity to benchmarking – performance of a site is likely to improve over time until you reach some form of steady state. Do you care more about startup cost? Or steady state? You need to understand the system behavior to correctly interpret system results.

So if you have a change that is going to require a full cache flush (e.g. after a new code deployment), it is worthwhile to deploy such changes during lower site traffic periods so the arrival rate of requests is reduced, taking pressure off the origin servers while the cache fills.

Non-Cacheable Pages and Private Data

It is worth a sidebar here as a reminder that not all pages are cacheable. For example, a checkout page is generally not cacheable as it should not be shared between users. So the overall cache hit rate of a site is also affected by the ratio of product page views to checkout pages. For example, if 49 out of every 50 requests are cacheable catalog pages and one is an uncacheable checkout step, the best overall hit rate you could achieve is 98%. (So another variable in the mix.)

Magento 2 makes it easier for extension developers to deal with caching correctly. You specify on a block whether it is cacheable or not and Magento does much of the heavy lifting for you. Private data comes into play here as well. Private data is data that is not sharable between users. Magento 2 caches private data in the web browser as it is specific to each user. For example, the user’s name can be cached in the browser. JavaScript on the page can then be used to inject such locally cached data into static HTML pages (which are cacheable by Fastly). Magento 2 will fetch the data for all the private blocks on a page and then cache that data for later reuse.
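Part of that heavy lifting is the cache tagging discussed earlier: a block can declare which entities its output depends on by implementing IdentityInterface, roughly along these lines (a simplified sketch, not the real Magento catalog block):

    <?php
    use Magento\Framework\DataObject\IdentityInterface;
    use Magento\Framework\View\Element\Template;

    // Simplified sketch: a block that reports which cache tags its output
    // depends on. The page cache collects these identities from all blocks on
    // the page, which is what makes purge-by-tag possible when a product changes.
    class ProductTeaser extends Template implements IdentityInterface
    {
        /** @var \Magento\Catalog\Model\Product */
        private $product;

        public function getIdentities()
        {
            return $this->product->getIdentities();  // e.g. ['cat_p_123']
        }
    }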

A side note here on Magento 2 compared to Magento 1 – many of these tricks can be done with Magento 1 if you code them up yourself. Magento 2 is frequently more about reducing the effort to use such features and standardizing them to reduce upgrade costs and the need for local customizations (which you have to pay for). That is one of the joys of an open source product – if you are willing to pay, you can get improvements made. With Magento 2 a lot of work has been done to reduce such implementation and maintenance costs for customers. GoDaddy and Cloudways (and undoubtedly more) shipping with Varnish support configured out of the box is an example of how this will benefit customers. Even the bottom end of the market can now get access to Varnish caching without incurring additional costs.

Conclusions

“Hey Alan! Are you saying the performance of cache misses does not matter?” No. They matter! They were simply not the topic of this blog post.

  • Cache misses occur while the cache fills (less common products may not have hit the cache yet), and
  • There are un-cacheable pages like checkout flows.

Also remember that you need to spec your hardware taking into consideration peak loads, not average load. So flattening spikes may be more important than steady state if you are trying to keep costs down.

Hopefully this post sheds some light on both how Fastly works and why Magento Cloud selected it. The Fastly approach aligns well with Magento 2 and partial cache invalidation. Apologies if I have missed some of the details.

To repeat, this post is not a comparison of different CDN offerings – it is in part a description of Fastly, in part an exploration of the intricacies of caching, with occasional digressions into performance analysis complexity. Fastly delivers significant benefits to Magento sites providing both distributed CDN capabilities (reducing latency of static page assets) and cache functionality including “purge tagged cache entries” support. So there is good alignment of functionality.

Also be aware this post was knocked up pretty quickly in response to a question. It is a bit rough around the edges as a result. But hey, it’s a blog post. You get what you pay for!

Hopefully this post was also interesting with respect to why you cannot just pick up a performance benchmark (including ones Magento publishes) and say “this test got a 23.4% improvement, so I will get exactly that result on my production server as well”. Unfortunately it is not that simple. Performance results are a useful tool, but if the benchmark’s site profile does not match your site then you will not get the same results.

To explore this further, I am playing around with the idea of building a very simple cache simulator in PHP so you can plug in your own site’s distributions and see how caching and product updates are likely to affect your site (and I would love to hear your results!). Good content for a future blog post.
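As a taste of what I have in mind (a very rough, hypothetical sketch – the Zipf popularity curve, catalogue size and update rate below are all made up, so plug in your own numbers):

    <?php
    // Very rough sketch of a cache simulator: Zipf-distributed page views,
    // random product updates that purge the matching cache entry, and a
    // running hit-rate figure at the end.
    $skuCount   = 10000;    // products on the site
    $requests   = 200000;   // page views to simulate
    $updateRate = 0.001;    // chance per request that some product gets updated

    // Build a cumulative 1/rank (Zipf) distribution so popular products are
    // sampled more often than unpopular ones.
    $cumulative = [];
    $total = 0.0;
    for ($rank = 1; $rank <= $skuCount; $rank++) {
        $total += 1 / $rank;
        $cumulative[] = $total;
    }

    $pickProduct = function () use ($cumulative, $total, $skuCount) {
        $r  = mt_rand() / mt_getrandmax() * $total;
        $lo = 0;
        $hi = $skuCount - 1;
        while ($lo < $hi) {                  // binary search the cumulative weights
            $mid = intdiv($lo + $hi, 2);
            if ($cumulative[$mid] < $r) { $lo = $mid + 1; } else { $hi = $mid; }
        }
        return $lo;                          // product id 0 .. skuCount-1
    };

    $cache = [];   // productId => true when its page is sitting in the cache
    $hits  = 0;

    for ($i = 0; $i < $requests; $i++) {
        // Occasionally a product is updated and its cached page is purged.
        if (mt_rand() / mt_getrandmax() < $updateRate) {
            unset($cache[mt_rand(0, $skuCount - 1)]);
        }

        $productId = $pickProduct();
        if (isset($cache[$productId])) {
            $hits++;                          // cache hit – served by the cache
        } else {
            $cache[$productId] = true;        // cache miss – origin renders the page
        }
    }

    printf("Hit rate after %d requests: %.1f%%\n", $requests, 100 * $hits / $requests);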

And as a final note, if you don’t use Fastly, Magento 2 makes it easier (and hence cheaper) to get a site going with Varnish. It does not have all the Fastly features above, but does deliver significant benefit. What particularly excites me is seeing low end hosting solutions like GoDaddy and Cloudways (and others) support Varnish caching out of the box with zero installation effort or cost. That aspect is probably most relevant to smaller sites with lower budgets. I mean larger projects don’t need to save money do they? 😉

Standard disclaimer: This post is my opinions and not necessarily those of my employer. This post is not official information about Fastly. Errors are my own, but hopefully not too numerous!
