Engineering · 7 min read

Deploying Next.js to the Edge: Vercel vs Cloudflare Workers

Edge computing moves your code closer to users. But the two dominant platforms have fundamentally different runtime models. Here's what you need to know before choosing.


Nextcraft Engineering Team

What Edge Deployment Actually Means

Edge deployment runs your code on servers distributed globally — often 200+ locations — rather than in a single data center region. When a user in Tokyo requests your site, they're served by a node in Tokyo, not one in Virginia.

The practical result: TTFB (Time to First Byte) drops dramatically for globally distributed users. Response times that were 300–400ms from a distant region become 30–50ms from a nearby edge node.

The tradeoff: edge runtimes are constrained. They run V8 isolates (not Node.js), which means no Node-specific APIs, limited memory, and restricted access to native modules. Understanding these constraints is the key to making edge deployment work.

Vercel's Edge Runtime

Vercel has two compute options for Next.js:

  • Node.js runtime: Full Node.js compatibility, single-region (or multi-region on Enterprise)
  • Edge runtime: V8 isolates, globally distributed, constrained API surface

You opt into the Edge runtime per-route:

```typescript
// Route Handlers — opt into the Edge runtime per route
export const runtime = 'edge';

export async function GET(request: Request) {
  return new Response('Hello from the edge!');
}
```

```typescript
// Middleware (always runs on the Edge runtime)
import { NextRequest, NextResponse } from 'next/server';

export function middleware(request: NextRequest) {
  // This runs on edge nodes worldwide.
  // Note: request.geo is populated by Vercel; in Next.js 15 it was
  // removed from NextRequest in favor of geolocation() from @vercel/functions.
  const country = request.geo?.country;
  if (country === 'DE') {
    return NextResponse.redirect(new URL('/de', request.url));
  }
}
```

What works on Vercel Edge:

  • fetch, Request, Response (Web APIs)
  • crypto.subtle (Web Crypto API)
  • TextEncoder, TextDecoder
  • Streaming responses
  • Environment variables (as strings)

What doesn't work:

  • Node.js fs, path, os
  • Native modules (bcrypt, sharp, etc.)
  • Most database clients that use TCP connections directly

For database access from edge, you need an HTTP-based database client: Neon serverless, PlanetScale HTTP driver, or Upstash for Redis.
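As a hedged sketch of what HTTP-based access looks like, here's a read from Upstash Redis over its REST API, which works on edge runtimes because it goes through `fetch` over HTTPS rather than a raw TCP connection. The endpoint and token are placeholders, and the exact response shape should be checked against Upstash's documentation:

```typescript
// Upstash encodes a Redis command as URL path segments, e.g. /get/mykey.
function commandUrl(baseUrl: string, ...parts: string[]): string {
  return `${baseUrl}/${parts.map(encodeURIComponent).join('/')}`;
}

// Read a key using only fetch — no TCP client, so it runs on the edge.
async function getKey(
  baseUrl: string,
  token: string,
  key: string,
): Promise<string | null> {
  const res = await fetch(commandUrl(baseUrl, 'get', key), {
    headers: { Authorization: `Bearer ${token}` },
  });
  const body = (await res.json()) as { result: string | null };
  return body.result;
}
```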

Cloudflare Workers

Cloudflare Workers also run in V8 isolates, but with a different ecosystem and a broader global network (300+ cities versus roughly 80 Vercel edge locations).

Next.js on Cloudflare runs via the @cloudflare/next-on-pages adapter, which builds your Next.js app into a format compatible with Cloudflare Pages Functions.

```tsx
// next.config.ts — nothing special needed for most cases;
// the adapter handles the build transformation.

// Your pages work as normal:
export default async function Page() {
  // But no Node.js APIs — stick to Web APIs and fetch
  const data = await fetch('https://api.example.com/data');
  return <Content data={await data.json()} />;
}
```

Cloudflare-specific advantages:

  • Workers KV: Key-value store native to the edge runtime, with low-latency, eventually consistent reads
  • Durable Objects: Stateful edge compute, useful for WebSockets and coordination
  • D1: SQLite-compatible database at the edge
  • R2: S3-compatible storage without egress fees
  • Cache API: Fine-grained cache control at the CDN layer

This is a full-stack edge platform, not just compute.
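To illustrate the kind of logic this enables, here's a sketch of a cache-aside read against a KV-style binding. The interface below is a reduced slice of what Workers KV exposes; in a real Worker the binding comes from your Wrangler configuration, but an in-memory stand-in keeps the pattern self-contained:

```typescript
// Minimal slice of a KV-style interface, just the methods used below.
interface KVStore {
  get(key: string): Promise<string | null>;
  put(key: string, value: string): Promise<void>;
}

// In-memory stand-in for a KV binding, for local testing.
class MemoryKV implements KVStore {
  private map = new Map<string, string>();
  async get(key: string) { return this.map.get(key) ?? null; }
  async put(key: string, value: string) { this.map.set(key, value); }
}

// Cache-aside read: return the stored value if present,
// otherwise compute it, store it, and return it.
async function cachedGet(
  kv: KVStore,
  key: string,
  compute: () => Promise<string>,
): Promise<string> {
  const hit = await kv.get(key);
  if (hit !== null) return hit;
  const value = await compute();
  await kv.put(key, value);
  return value;
}
```

Because the origin fetch happens only on a miss, repeat reads are served entirely from edge-local storage.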

The Critical Limitation: Cold Starts

V8 isolates start faster than serverless functions (microseconds vs hundreds of milliseconds), but there's still a concept of warm vs cold instances.

On Vercel Edge, this is largely managed for you. On Cloudflare, an isolate stays warm while it keeps receiving traffic but is evicted once idle, so the first request to an idle Worker pays a small cold-start penalty.

In practice, both platforms handle this well for production traffic. It becomes a concern only for very low-traffic routes or cron-like patterns.

Choosing Between Them

Choose Vercel Edge when:

  • You're already on Vercel for simplicity
  • You need to run edge logic alongside Node.js routes on the same deployment
  • Your edge code is primarily middleware (auth checks, geo-routing, A/B tests, header manipulation)
  • Your data layer is in a Vercel-integrated service (Neon, Upstash, PlanetScale)

Choose Cloudflare Workers when:

  • You want Cloudflare's storage ecosystem (KV, D1, R2, Durable Objects)
  • You need the broadest possible geographic distribution
  • You want fine-grained CDN control with the Cache API
  • Cost at scale is a concern (Cloudflare's pricing is very competitive)

What Actually Runs Well on the Edge

The best use cases for edge compute in Next.js:

Middleware: Authentication checks, feature flags, A/B routing, locale detection. These run on every request and benefit maximally from edge latency reduction.

API Routes with simple logic: Rate limiting, URL shortening, webhook processing, simple aggregations from cached data.
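A rate limiter is a good example of edge-suitable API-route logic. The sketch below uses a fixed-window counter; in production the counters would live in shared storage (Upstash Redis, or Durable Objects on Cloudflare) since each edge node has its own memory, but an in-memory Map keeps the algorithm self-contained:

```typescript
// Fixed-window rate limiter. Each key (e.g. a client IP) gets
// at most `limit` requests per `windowMs` window.
class RateLimiter {
  private windows = new Map<string, { start: number; count: number }>();

  constructor(
    private limit: number,     // max requests per window
    private windowMs: number,  // window length in milliseconds
  ) {}

  allow(key: string, now: number = Date.now()): boolean {
    const w = this.windows.get(key);
    // No window yet, or the current window has expired: start fresh.
    if (!w || now - w.start >= this.windowMs) {
      this.windows.set(key, { start: now, count: 1 });
      return true;
    }
    w.count++;
    return w.count <= this.limit;
  }
}
```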

Personalized static pages: Render a static page but inject user-specific content (name, preferences) at the edge based on a cookie — combining the performance of static with the personalization of dynamic.

What shouldn't run on the edge: Heavy computation, file processing, video transcoding, operations requiring large memory allocations, anything using Node-native modules.

A Practical Architecture

For most production Next.js apps, the optimal architecture mixes runtimes:

```
Edge (Middleware):
  - Auth session validation
  - Geo-based routing
  - A/B test assignment

Edge (API Routes):
  - Simple reads from edge-native storage
  - Rate limiting
  - Webhook entry points

Node.js (Server):
  - Database-heavy operations
  - File processing
  - Complex business logic
  - Third-party SDK integrations
```

Deploy to Vercel with edge middleware and edge-enabled API routes where appropriate, Node.js for everything else. This is the pattern that gives you the majority of edge benefits without fighting the runtime constraints.
