The globally distributed cache that sits between your origin server and your users — and why most solo SaaS apps already have one without configuring it.
Research-based overview. This article synthesizes public documentation, pricing pages, and user reports.
When a browser in Tokyo requests a JavaScript bundle from a server in Virginia, the round-trip alone is roughly 150–200 milliseconds before the first byte arrives. Multiply that across the dozens of assets a modern SaaS app loads on its first page, and you have a sluggish app for anyone who is not sitting next to your origin. A CDN solves this by caching those assets in dozens or hundreds of edge locations worldwide. The Tokyo browser hits an edge server in Tokyo, gets a 5–15 ms response, and never bothers your origin at all.
For solo SaaS founders, the practical version of this story is simpler: if you deploy on Vercel, Netlify, or Cloudflare Pages, you already have a CDN. The question is rarely “should I add a CDN?” but “what is the CDN doing for me, and when do I need to think about it?”
The benefits stack across four distinct categories, and most apps pick up all of them at once when they put a CDN in front of their origin.
The architecture is conceptually straightforward but has a handful of moving parts that determine whether your CDN actually helps or quietly hurts you.
Each edge location maintains its own cache. When a request arrives, the edge checks for a fresh cached copy of the requested URL. If it finds one (a “cache hit”), it serves that copy directly. If not (a “cache miss”), it forwards the request to your origin, stores the response, and serves it. Subsequent requests for the same URL at that edge location are served from cache until the entry expires.
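The hit/miss loop above can be sketched as a tiny in-memory cache. This is a toy model of a single edge location with a fixed 60-second TTL, not any vendor's implementation:

```typescript
// Toy model of one edge location's cache: a Map keyed by URL, with a
// fixed TTL. Real CDNs run one of these per point of presence.
type CacheEntry = { body: string; storedAt: number; ttlMs: number };

class EdgeCache {
  private store = new Map<string, CacheEntry>();

  constructor(private fetchOrigin: (url: string) => string) {}

  get(url: string, now: number): { body: string; status: "HIT" | "MISS" } {
    const entry = this.store.get(url);
    // Fresh entry: serve from cache without touching the origin.
    if (entry && now - entry.storedAt < entry.ttlMs) {
      return { body: entry.body, status: "HIT" };
    }
    // Miss or expired: fetch from origin, store the response, serve it.
    const body = this.fetchOrigin(url);
    this.store.set(url, { body, storedAt: now, ttlMs: 60_000 });
    return { body, status: "MISS" };
  }
}
```

The first request at this location misses and hits the origin; every later request for the same URL is a hit until the entry ages past its TTL, at which point the origin is consulted again.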
Without an origin shield, every edge location independently fetches from the origin on its first miss — a lot of duplicate origin hits across hundreds of locations. An origin shield is an intermediate cache layer that all edges consult before going to the origin. Cloudflare calls it Tiered Cache; CloudFront calls it Origin Shield; Fastly bakes it in via shielding. The Cloudflare documentation on Tiered Cache walks through the topology in detail.
The cache key is the unique identifier under which a response is stored. By default it is the request URL plus a few selected headers. If you cache by URL alone but the response varies by Accept-Language, you will serve French content to English users. If you cache by full URL including query strings but your tracking parameters create a unique URL on every page load, your cache hit ratio drops to zero. Most mistakes in CDN configuration come from a misaligned cache key.
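A sketch of a well-aligned cache key: strip tracking parameters so they do not fragment the cache, sort the rest so parameter order does not matter, and fold in only the headers the response actually varies on. The `TRACKING_PARAMS` list is illustrative, not any CDN's default:

```typescript
// Tracking params that should never be part of the cache key; this list
// is an assumption for the sketch, tune it to your own traffic.
const TRACKING_PARAMS = ["utm_source", "utm_medium", "utm_campaign", "fbclid", "gclid"];

function cacheKey(
  rawUrl: string,
  headers: Record<string, string>,
  vary: string[] = []
): string {
  const url = new URL(rawUrl);
  // Drop tracking params so /page?utm_source=x and /page share one entry.
  for (const p of TRACKING_PARAMS) url.searchParams.delete(p);
  // Sort the remaining params so ?a=1&b=2 and ?b=2&a=1 collide correctly.
  url.searchParams.sort();
  // Fold in only the headers named by the vary list, nothing else.
  const headerPart = vary
    .map((h) => `${h.toLowerCase()}=${headers[h.toLowerCase()] ?? ""}`)
    .join("&");
  return `${url.origin}${url.pathname}?${url.searchParams}|${headerPart}`;
}
```

With `Accept-Language` in the vary list, French and English responses get separate entries; without it, they would overwrite each other, which is exactly the wrong-language bug described above.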
Each cache entry has a time-to-live (TTL) after which it is considered stale and revalidated on the next request. TTLs are set by the origin via response headers, overridden by CDN configuration, or both. When you ship a new asset and need the old one removed before its TTL expires, you call the CDN's purge API to invalidate specific URLs. Purge mechanics vary — Cloudflare offers per-URL and tag-based purge; Fastly is famous for sub-second purge across the entire network.
| Provider | Best for | Notable features |
|---|---|---|
| Cloudflare | Default for solo SaaS | Largest network footprint. Free tier covers DDoS protection, TLS, and basic caching at $0. Workers run code at the edge. |
| Fastly | Developer-focused teams | VCL config language for fine-grained behavior. Sub-second instant purge. Used by GitHub, Stripe, and Shopify. |
| AWS CloudFront | Apps already on AWS | Tight integration with S3, Lambda@Edge, and CloudFront Functions. Pay-as-you-go with no monthly minimum. |
| Akamai | Enterprise / regulated industries | The original CDN. Largest enterprise-grade footprint and most complex configuration. Rarely the right choice for a solo founder. |
| Vercel Edge Network | Next.js apps | A managed CDN built on top of Cloudflare and AWS. Zero configuration; tightly integrated with Next.js cache primitives. |
For most solo SaaS founders, the choice is made by the deployment platform — picking a deployment target is, indirectly, picking a CDN. Our breakdown of Vercel vs Cloudflare Pages walks through how the trade-offs play out in practice.
Static assets — images, fonts, JavaScript bundles, CSS files — cache cleanly. They have a stable URL, do not vary by user, and can be cached for months at the edge with a long TTL. Bundlers like Vite, Webpack, and Turbopack hash the file name on every build, so you can ship updates without manual purges; the new build produces a new URL, and the old URL stays cached harmlessly.
Dynamic content — HTML pages with personalized data, API responses, server-rendered routes — is harder. Two strategies dominate.
Stale-while-revalidate (SWR) is an HTTP caching directive (RFC 5861) that tells the CDN to serve a stale cached copy immediately while asynchronously revalidating in the background. The user gets a fast response; the next user gets a fresh one. Vercel exposes this directly via the Cache-Control response header and the revalidate Next.js option.
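The SWR decision can be written as a small state machine over the entry's age: fresh within `max-age`, served stale (with a background refresh) within the `stale-while-revalidate` window, and a blocking refetch after that. The string parsing here is a simplified sketch, not a full Cache-Control parser:

```typescript
// Decide how to answer a request given the cached entry's age and the
// max-age / stale-while-revalidate (RFC 5861) directives.
type SwrDecision = "fresh" | "serve-stale-and-revalidate" | "must-refetch";

function swrDecision(cacheControl: string, ageSeconds: number): SwrDecision {
  // Naive directive extraction; a real parser handles quoting and commas.
  const num = (directive: string): number => {
    const m = cacheControl.match(new RegExp(`${directive}=(\\d+)`));
    return m ? parseInt(m[1], 10) : 0;
  };
  const maxAge = num("max-age");
  const swr = num("stale-while-revalidate");
  if (ageSeconds < maxAge) return "fresh";
  // Inside the SWR window: answer from cache now, refresh in background.
  if (ageSeconds < maxAge + swr) return "serve-stale-and-revalidate";
  return "must-refetch";
}
```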
Edge config and on-demand revalidation use a key-value store at the edge to handle small pieces of dynamic data without round-tripping to the origin on every request. Vercel Edge Config, Cloudflare KV, and Fastly Edge Dictionaries all serve this role. The pattern: store feature flags, config, or small lookups at the edge; let the rest of the page render from a cached HTML shell.
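A minimal sketch of the pattern, with a plain `Map` standing in for the edge key-value store (Cloudflare KV, Vercel Edge Config, or a Fastly dictionary). The key names are hypothetical; a real KV read is asynchronous and eventually consistent:

```typescript
// Stand-in for an edge key-value store: flags and small config live
// here, while the page itself renders from a cached HTML shell.
const edgeConfig = new Map<string, string>([
  ["feature:new-billing", "true"],   // hypothetical feature flag
  ["banner:maintenance", "off"],     // hypothetical config value
]);

function flagEnabled(name: string): boolean {
  // Synchronous for the sketch; real stores return a Promise.
  return edgeConfig.get(`feature:${name}`) === "true";
}
```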
The HTTP Cache-Control header is how your origin tells the CDN how to cache a response. The relevant directives:
- `public` — the response can be cached by any cache, including shared caches like a CDN. Use for assets that are not user-specific.
- `private` — the response can be cached only by the user's own browser, never by a shared cache. Use for personalized HTML and authenticated API responses.
- `max-age=N` — the response is fresh for N seconds in any cache.
- `s-maxage=N` — same as `max-age`, but only applies to shared caches (the CDN). Lets you have a long edge TTL and a short browser TTL.
- `immutable` — the response will never change. Browsers skip revalidation entirely. Use on hashed asset filenames.

The `ETag` header is a content fingerprint; the CDN sends it back on the next request, and the origin returns a quick 304 Not Modified if nothing changed. The `Vary` header tells the cache which request headers should be part of the cache key — `Vary: Accept-Encoding` ensures gzipped and uncompressed responses are stored separately. The MDN reference on Cache-Control is the canonical guide.
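These directives combine per response class. A sketch with illustrative values (the exact numbers are reasonable defaults, not a standard):

```typescript
// Map each response class to a Cache-Control value; the numbers here
// are illustrative defaults, not mandated by any spec or platform.
type ResponseClass = "hashed-asset" | "html" | "authenticated";

function cacheControlFor(kind: ResponseClass): string {
  switch (kind) {
    case "hashed-asset":
      // Content-hashed filename: cache for a year, skip revalidation.
      return "public, max-age=31536000, immutable";
    case "html":
      // Short browser TTL, longer edge TTL the CDN can revalidate.
      return "public, max-age=0, s-maxage=300, stale-while-revalidate=60";
    case "authenticated":
      // Never let a shared cache store user-specific responses.
      return "private, no-store";
  }
}
```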
The original CDN idea was “cache static stuff at the edge.” The 2026 reality is that the edge runs your application code, too. Cloudflare Workers, Vercel Edge Functions, Fastly Compute, and AWS Lambda@Edge all let you execute serverless functions in the same physical locations where the cache lives. The benefits: cold-start times measured in single-digit milliseconds, a request never hits a centralized region, and you can rewrite responses or do auth checks at the edge.
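An auth check at the edge can be sketched in the Workers-style fetch-handler shape. The cookie name and redirect target are hypothetical, and a real deployment would verify a signed session token rather than check for the cookie's presence:

```typescript
// Workers-style handler: reject unauthenticated dashboard requests at
// the edge so they never reach a centralized region. Uses the standard
// Request/Response globals (available in Node 18+ and edge runtimes).
function handleRequest(req: Request): Response {
  const cookie = req.headers.get("cookie") ?? "";
  const isDashboard = new URL(req.url).pathname.startsWith("/dashboard");
  // "session=" is a hypothetical cookie; verify a signed token in practice.
  if (isDashboard && !cookie.includes("session=")) {
    return new Response(null, { status: 302, headers: { location: "/login" } });
  }
  return new Response("ok", { status: 200 });
}
```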
Edge functions blur the line between “CDN” and “application platform.” A Vercel deployment routes a request first to an edge cache, then to an Edge Function if the route is configured for it, then potentially to a serverless function in a centralized region for heavier work. The CDN is no longer just a cache; it is the first layer of compute. Our deploy guide for Next.js on Vercel covers how to think about this in practice.
Images are the largest category of bytes most SaaS apps serve, and modern formats (AVIF, WebP) cut the byte count by 30–70% compared to JPEG/PNG. Image optimization at the edge means: the CDN receives the image, transcodes it to the best format the requesting browser supports, resizes to the requested dimensions, and caches the result. The next user with the same combination gets it from cache; the next user with a different device profile gets a freshly transcoded version.
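The format negotiation step can be sketched from the Accept header alone. The preference order (AVIF, then WebP, then the original) mirrors typical byte savings; real services also weigh encode cost:

```typescript
// Pick the best image format the requesting browser advertises support
// for via its Accept header.
function bestImageFormat(acceptHeader: string): "avif" | "webp" | "original" {
  if (acceptHeader.includes("image/avif")) return "avif";
  if (acceptHeader.includes("image/webp")) return "webp";
  return "original";
}

// The cache key must include the negotiated format and requested width,
// or one transcoded variant would be served to every browser.
function imageCacheKey(path: string, width: number, accept: string): string {
  return `${path}|w=${width}|f=${bestImageFormat(accept)}`;
}
```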
The major options:
- Vercel's built-in image optimization plugs into the Next.js `<Image>` component; the workflow is essentially zero-config for Next.js apps.
- Self-hosting `sharp` can do the same work on a server, but you give up the edge cache and pay origin CPU on every miss.

If your stack is Next.js on Vercel, Remix on Netlify, or any framework on Cloudflare Pages, you have a fully configured CDN today. You did not pay extra for it; it ships with the platform. The cache headers Next.js writes by default are reasonable for most apps. The image component already routes through edge optimization. You did not configure an origin shield, but the platform did.
This means the question for most solo founders is not “which CDN should I buy?” The default is fine. The question is “am I doing anything that quietly bypasses my CDN?” That is where the real wins and the real bugs live.
A handful of situations move CDN configuration from “the platform handles it” to “you need to make decisions.”
Things that will quietly break in production if you do not pay attention:
- `Cache-Control: public` on a logged-in dashboard route is an immediate cross-user data leak. Always use `private` or `no-store` on user-specific content.
- (`no-cache`) on HTML responses.
- Putting `/webhook/stripe` behind a cache that occasionally serves a stale `200 OK` is an outage waiting to happen. See our notes on webhook security best practices for the production checklist.

A CDN is the layer of cached, geographically distributed servers that turns a single-region origin into a globally fast app. For solo SaaS founders in 2026, the more accurate framing is: your deployment platform already runs a CDN for you, and your job is to know what it is doing — not to operate it from scratch. Understand cache headers, never cache authenticated content, and treat your CDN as a first-class part of your stack rather than an invisible accelerator. Once you do, the rest of the architecture — edge functions, image optimization, dynamic caching strategies — falls into place naturally.