Code that runs at the network edge — geographically close to the user — in a lightweight runtime, instead of in a single origin region.
Research-based overview. This article synthesizes public documentation, pricing pages, and user reports. How we research.
An edge function is a serverless function that is deployed to a globally distributed network of compute nodes (often hundreds of points of presence) and executed in a lightweight runtime — usually a V8 isolate — physically close to the requesting user.
If you already understand serverless functions and CDNs, the cleanest analogy is this: an edge function is to a serverless function as a CDN is to an origin server. A CDN caches static content close to the user; an edge function executes dynamic logic close to the user. Same geographic distribution principle, applied to compute instead of cache.
The lifecycle of a single request to an edge function, roughly:

1. The user's request is routed (typically via anycast DNS) to the nearest point of presence.
2. The platform reuses a warm V8 isolate or spins up a new one, usually in single-digit milliseconds.
3. The function runs against the request, optionally fetching from the origin or a nearby data store.
4. The response returns to the user without a round-trip to a single origin region.
The runtime constraints are the interesting part. V8 isolates can't run arbitrary Node.js code — they implement a subset of the Web platform APIs (fetch, Response, crypto, etc.) and disallow most of node:. This is why edge functions are advertised in terms of "Web standard" APIs rather than "the full Node.js API surface." Some platforms (Vercel, Netlify) now offer a Node.js compatibility mode at the edge, but the original V8-isolate model is still the cleanest mental picture.
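Concretely, "Web standard APIs only" looks like this: a handler that takes a `Request` and returns a `Response`, with no `node:` imports. This is a minimal sketch in the fetch-handler shape used by Workers-style runtimes; the function name is illustrative, not any platform's required export.

```typescript
// A minimal edge-style handler using only Web-standard APIs
// (Request, Response, URL). No node: modules are available,
// so everything here works in a V8 isolate.
export function handleRequest(request: Request): Response {
  const url = new URL(request.url);

  if (url.pathname === "/api/ping") {
    // Respond directly from the edge: no origin round-trip.
    return new Response(JSON.stringify({ ok: true }), {
      status: 200,
      headers: { "content-type": "application/json" },
    });
  }

  // Anything else: 404, also served from the edge.
  return new Response("Not found", { status: 404 });
}
```

The same code runs unchanged in Node 18+, Deno, and Workers-style runtimes precisely because it sticks to the Web-standard surface.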
| Provider | Runtime | Locations | Notable |
|---|---|---|---|
| Cloudflare Workers | V8 isolate (custom) | ~330 cities | The category-defining product. Tightly integrated with KV, R2, D1, Durable Objects. |
| Vercel Edge Functions | Edge Runtime (V8) + Node compat | 18+ regions | Tightly integrated with Next.js. Best for Next-heavy stacks. |
| Deno Deploy | Deno (V8) | 35+ regions | First-class TypeScript without a build step. Loved by Deno users. |
| Netlify Edge Functions | Deno-based | Global via Deno Deploy | Solid for Netlify-hosted sites. Less compelling on its own. |
| AWS Lambda@Edge / CloudFront Functions | Node.js / JS | ~250 PoPs | Powerful but operationally complex. AWS-native shops only. |
For a solo founder, the practical choice is usually between Cloudflare Workers (if you want to go all-in on the Cloudflare stack) and Vercel Edge Functions (if you're building with Next.js). The other three are good but niche. We compare two of the underlying platforms in Vercel vs Railway and Fly.io vs Railway.
The shared characteristic of every good edge use case: short, stateless work that benefits from being close to the user.
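A concrete instance of that pattern: light personalization derived purely from request headers. It is short, stateless, and a pure function of the incoming request, which is exactly what the edge is good at. A sketch (the function name and the French/English split are illustrative):

```typescript
// Edge-friendly work: personalize a response from a request header.
// Stateless, fast, and benefits from running close to the user.
export function localizedGreeting(request: Request): Response {
  // Browsers send e.g. "fr-FR,fr;q=0.9,en;q=0.8"; take the first tag.
  const lang = (request.headers.get("accept-language") ?? "en")
    .split(",")[0]
    .trim()
    .toLowerCase();

  const greeting = lang.startsWith("fr") ? "Bonjour" : "Hello";

  return new Response(greeting, {
    headers: {
      "content-type": "text/plain",
      // Tell caches the response varies by language.
      vary: "Accept-Language",
    },
  });
}
```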
| Property | Traditional server | Serverless function | Edge function |
|---|---|---|---|
| Where it runs | One region (you pick) | One region (you pick) | ~all regions, automatically |
| Cold start | None (always running) | 100–1000+ ms (Lambda) | < 5 ms (V8 isolate) |
| Runtime | Full OS, full Node/Python/etc. | Full Node/Python/etc. | Web-standard subset |
| CPU/time limits | None | Up to 15 min (Lambda) | 10–50 ms typical |
| Ideal for | Long-running, stateful, predictable load | Stateless API endpoints, infrequent jobs | Latency-sensitive routing, auth, light personalization |
The decision rule. Default to a regional serverless function. Move to the edge only when you can prove the user-perceived latency matters — or when the work is genuinely just a request-router with no heavy compute. Don't put your whole app at the edge because a blog post said you should. The full picture of where this fits sits in our solo founder tech stack guide.
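On Next.js, this rule is easy to apply incrementally because the opt-in is per route. A sketch, assuming the App Router's route segment config (Next.js 13+); the `x-vercel-ip-city` header is set by Vercel's platform, and the route path is illustrative:

```typescript
// app/api/geo/route.ts — only this route runs at the edge;
// the rest of the app stays on the regional default.
export const runtime = "edge";

export function GET(request: Request): Response {
  // Light, stateless work: echo a request-derived value.
  const city = request.headers.get("x-vercel-ip-city") ?? "unknown";
  return Response.json({ city });
}
```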