Here's the deployment story you're probably living: your MCP server works locally. It handles tools, resources, prompts, maybe even agents. Now you need it in production. Traditional deployment means:
  • Provisioning servers
  • Setting up load balancers
  • Configuring Redis for sessions
  • Managing SSL certificates
  • Worrying about scaling
  • Paying for idle capacity
What if deployment was just… vercel deploy? FrontMCP has first-class Vercel support. One command generates the right build artifacts. Your server runs on Vercel’s edge network, scales automatically, and costs nothing when idle. Let’s ship it.

What You’ll Deploy

By the end of this guide, you’ll have:

Global Edge Deployment

Your MCP server running on Vercel’s worldwide network—low latency everywhere.

Automatic Scaling

Zero to thousands of requests with no configuration. Pay only for what you use.

Vercel KV Sessions

Edge-compatible session storage that works with serverless architecture.

One-Command Deploys

Push to git, deployment happens automatically. Or run vercel deploy.

Prerequisites

Before you start:
1. FrontMCP Project

You need a working FrontMCP server. If you don’t have one yet:
npx frontmcp create my-mcp-server
cd my-mcp-server
2. Vercel Account

Sign up at vercel.com if you haven’t already. The Hobby tier is free.
3. Vercel CLI

Install the Vercel CLI:
npm i -g vercel
vercel login

One-Command Build

FrontMCP’s CLI handles all the Vercel-specific configuration:
frontmcp build --adapter vercel
That’s it. This command:
  1. Compiles your TypeScript to ESM
  2. Bundles everything into a single handler.cjs file
  3. Generates Vercel’s Build Output API structure
  4. Detects your package manager (npm/yarn/pnpm/bun)
  5. Creates vercel.json with correct configuration
After running frontmcp build --adapter vercel, your project contains:
.vercel/
└── output/
    ├── config.json          # Routing configuration
    └── functions/
        └── index.func/
            ├── .vc-config.json  # Runtime config
            └── handler.cjs      # Your bundled server

vercel.json                   # Build commands
The .vc-config.json configures:
  • Runtime: Node.js 22.x
  • Handler: handler.cjs
  • Launcher: Nodejs type
Routes in config.json point all traffic to your function.


Deployment Steps

1. Build for Vercel

Run the build command:
frontmcp build --adapter vercel
You should see .vercel/output/ directory created.
2. Add Environment Variables

Your MCP server likely needs API keys. Add them to .env.local:
OPENAI_API_KEY=sk-...
DATABASE_URL=postgresql://...
For production, set them in Vercel Dashboard or via CLI:
vercel env add OPENAI_API_KEY production
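A missing key usually surfaces only at request time in serverless, so it is worth checking for required variables at startup. A minimal sketch (the variable names are examples; use whatever your tools need):

```typescript
// Report which required environment variables are missing.
function missingEnv(
  env: Record<string, string | undefined>,
  names: string[],
): string[] {
  return names.filter((name) => !env[name]);
}

// At server startup, fail fast instead of erroring mid-request:
// const missing = missingEnv(process.env, ["OPENAI_API_KEY", "DATABASE_URL"]);
// if (missing.length > 0) {
//   throw new Error(`Missing environment variables: ${missing.join(", ")}`);
// }
```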
3. Deploy

Run the deploy command:
vercel deploy
For production:
vercel deploy --prod
Vercel returns a URL like https://my-mcp-server.vercel.app
4. Verify

Test your deployment:
curl https://my-mcp-server.vercel.app/health
# Expected: {"status":"ok","serverless":true}
Or test the MCP endpoint:
curl -X POST https://my-mcp-server.vercel.app/mcp \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test","version":"1.0"}}}'
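The same smoke test can live in a small script, which is handy for CI. This sketch builds the JSON-RPC initialize payload from the curl example above; the URL is a placeholder for your own deployment:

```typescript
// Build the MCP initialize request used in the curl example above.
function initializeRequest(id: number) {
  return {
    jsonrpc: "2.0" as const,
    id,
    method: "initialize",
    params: {
      protocolVersion: "2024-11-05",
      capabilities: {},
      clientInfo: { name: "test", version: "1.0" },
    },
  };
}

// Send it (Node 18+ ships a global fetch):
// const res = await fetch("https://my-mcp-server.vercel.app/mcp", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(initializeRequest(1)),
// });
// console.log(res.status); // a healthy deployment returns 200
```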

Adding Vercel KV for Sessions

Standard Redis requires a persistent TCP connection—which serverless doesn’t have. Vercel KV is a REST-based key-value store that works perfectly with serverless.
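FrontMCP talks to Vercel KV for you, but it helps to see why REST fits serverless: every operation is a plain HTTPS request, so there is no TCP connection to hold open across invocations. A rough sketch of what a single read looks like (the Upstash-style `/get/{key}` endpoint shape and env var names are assumptions for illustration):

```typescript
// Build an HTTPS request for a single KV read. No persistent connection
// needed: each operation is one stateless request, which is exactly what
// a serverless function can do.
function kvGetRequest(baseUrl: string, token: string, key: string) {
  return {
    url: `${baseUrl}/get/${encodeURIComponent(key)}`,
    headers: { Authorization: `Bearer ${token}` },
  };
}

// Usage with the env vars Vercel injects when you link a KV database:
// const { url, headers } = kvGetRequest(
//   process.env.KV_REST_API_URL!,
//   process.env.KV_REST_API_TOKEN!,
//   "mcp:session:abc123",
// );
// const value = await fetch(url, { headers }).then((r) => r.json());
```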

Enable Vercel KV

1. Add KV to Your Project

In Vercel Dashboard:
  1. Go to your project → Storage
  2. Click “Create Database”
  3. Select “KV”
  4. Name it (e.g., my-mcp-sessions)
Vercel automatically adds KV_REST_API_URL and KV_REST_API_TOKEN to your environment.
2. Configure FrontMCP

Update your server configuration:
@FrontMcp({
  info: { name: 'My MCP Server', version: '1.0.0' },
  redis: {
    provider: 'vercel-kv',
    // Environment variables are auto-detected
    // url: process.env.KV_REST_API_URL,
    // token: process.env.KV_REST_API_TOKEN,
    keyPrefix: 'mcp:',
    defaultTtlMs: 7200000, // 2 hours
  },
})
export default class MyServer {}
3. Redeploy

frontmcp build --adapter vercel
vercel deploy --prod

Vercel KV vs Standard Redis

Feature                | Vercel KV               | Standard Redis
Transport              | REST (edge-compatible)  | TCP (persistent connection)
Latency                | ~5-15 ms                | ~1-5 ms
Pub/Sub                | Not supported           | Supported
Resource Subscriptions | Requires hybrid setup   | Fully supported
Setup                  | One-click in dashboard  | Provision + configure
Cost                   | Pay-per-request         | Fixed instance cost
For most MCP servers, Vercel KV is the right choice. It handles sessions, caching, and state—which is 90% of what you need. Only use standard Redis if you need real-time resource subscriptions.

Serverless Considerations

Function Timeout

Vercel serverless functions have timeout limits:
Tier       | Max Duration
Hobby      | 60 seconds
Pro        | 300 seconds
Enterprise | 900 seconds
If your MCP operations take longer than 60 seconds on Hobby tier, they’ll fail. For long-running agent tasks, consider Pro tier or chunking the work.
FrontMCP uses Vercel’s Build Output API, so max duration is configured in the function’s .vc-config.json. You can customize this via the CLI:
frontmcp build --adapter vercel --max-duration 300
This generates .vercel/output/functions/index.func/.vc-config.json:
{
  "runtime": "nodejs22.x",
  "handler": "handler.cjs",
  "launcherType": "Nodejs",
  "maxDuration": 300
}
The maxDuration value must not exceed your Vercel tier limit. Hobby tier caps at 60 seconds.

Cold Starts

The first request after idle may take 1-3 seconds while Vercel spins up your function. Subsequent requests (warm starts) are much faster. FrontMCP tracks this for you:
// Available in your tools/resources
const info = ctx.scope.serverlessInfo;
console.log(info.isColdStart);      // true on first request
console.log(info.invocationCount);  // number of requests since cold start
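A common pattern alongside cold-start tracking is to cache expensive setup at module scope, so warm invocations reuse it instead of paying the cost again. This is a generic Node serverless sketch, not a FrontMCP API; `ExpensiveClient` is a stand-in for whatever you construct (a DB pool, an SDK client, parsed config):

```typescript
// Stand-in for any expensive-to-construct object.
class ExpensiveClient {
  constructor(public readonly createdAt = Date.now()) {}
}

// Module scope survives warm invocations but resets on each cold start.
let cached: ExpensiveClient | undefined;

function getClient(): ExpensiveClient {
  // Constructed once per cold start; warm invocations hit the cache.
  if (!cached) {
    cached = new ExpensiveClient();
  }
  return cached;
}
```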

Session Persistence Across Cold Starts

Here’s where FrontMCP shines: clients don’t need to re-initialize MCP when functions wake up from idle.

Traditional MCP servers lose all state when they restart. Clients must detect the disconnect, re-send initialize, and rebuild their session. This creates a terrible user experience in serverless environments, where functions spin down after ~15 minutes of inactivity. FrontMCP handles this automatically:
As long as the client sends its session-id header (or uses the same SSE connection URL with session token), FrontMCP reconstructs the full session context from Vercel KV. The client continues as if nothing happened—no initialize call required.
This means:
  • No client-side reconnection logic needed for cold starts
  • Session state persists across function invocations
  • Seamless experience even with aggressive function recycling
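From the client's side, session continuity is just a matter of echoing the session id on every request. A sketch of what that looks like over HTTP; the `Mcp-Session-Id` header name follows MCP's Streamable HTTP transport convention, so adjust it if your transport differs:

```typescript
// Build fetch options that carry the session id back to the server,
// letting FrontMCP reconstruct the session from Vercel KV after a cold start.
function withSession(sessionId: string, body: unknown) {
  return {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Mcp-Session-Id": sessionId,
    },
    body: JSON.stringify(body),
  };
}

// Usage:
// await fetch("https://my-mcp-server.vercel.app/mcp",
//   withSession("abc123", { jsonrpc: "2.0", id: 2, method: "ping" }));
```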

Resource Subscriptions (Hybrid Setup)

Vercel KV doesn’t support Pub/Sub, which means real-time resource subscriptions won’t work out of the box. If you need subscriptions:
@FrontMcp({
  info: { name: 'My Server', version: '1.0.0' },
  // Sessions via Vercel KV (edge-compatible)
  redis: {
    provider: 'vercel-kv',
  },
  // Pub/Sub via external Redis (for subscriptions)
  pubsub: {
    host: process.env.REDIS_HOST,
    port: 6379,
    password: process.env.REDIS_PASSWORD,
  },
})

Project Structure

A typical Vercel-ready FrontMCP project:
my-mcp-server/
├── src/
│   ├── main.ts           # Server entry point
│   ├── apps/
│   │   └── my-app/
│   │       ├── index.ts  # App definition
│   │       └── tools/    # Tool implementations
│   └── providers/        # Shared providers
├── .env.local            # Local environment
├── package.json
├── tsconfig.json
└── vercel.json           # Generated by build
Your vercel.json (auto-generated):
{
  "version": 2,
  "buildCommand": "npm run build",
  "installCommand": "npm install"
}
FrontMCP auto-detects your package manager. If you use yarn or pnpm, the generated vercel.json will have the correct commands.

Monitoring & Debugging

Vercel Dashboard

The Vercel dashboard shows:
  • Function invocations and duration
  • Error rates and logs
  • Request/response details
Navigate to Project → Functions to see real-time data.

Logging

FrontMCP logs are captured by Vercel:
@Tool({ name: 'my-tool' })
class MyTool extends ToolContext {
  async execute(input: any) {
    console.log('Processing:', input); // Visible in Vercel logs
    this.notify('Starting processing...', 'info');
    // ...
  }
}
View logs with:
vercel logs https://my-mcp-server.vercel.app

Error Tracking

For production, connect Vercel to your error tracking service:
// In your server configuration
@FrontMcp({
  // ...
  logging: {
    level: 'info',
    format: 'json', // Structured logs for parsing
  },
})
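With `format: 'json'`, each log line is a single JSON object that Vercel log drains and error trackers can parse. A minimal sketch of what a structured entry might look like; the field names here are illustrative, not FrontMCP's exact output:

```typescript
// Produce one structured log line as a single JSON object.
function logEntry(
  level: "info" | "warn" | "error",
  message: string,
  extra: Record<string, unknown> = {},
): string {
  return JSON.stringify({
    level,
    message,
    timestamp: new Date().toISOString(),
    ...extra,
  });
}

// console.log(logEntry("info", "tool invoked", { tool: "my-tool" }));
```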

CI/CD Setup

Connect your git repository for automatic deployments:
1. Connect Repository

In Vercel Dashboard:
  1. Import your GitHub/GitLab/Bitbucket repo
  2. Vercel auto-detects the build settings
2. Configure Build

Set the build command in project settings:
  • Build Command: frontmcp build --adapter vercel
  • Output Directory: .vercel/output
  • Install Command: npm install (or yarn/pnpm)
3. Push to Deploy

Every push to main triggers a production deployment. Pull requests get preview deployments.

Cost Optimization

Vercel’s serverless pricing is usage-based:
Resource             | Hobby (Free) | Pro ($20/mo)
Function Invocations | 100K/mo      | 1M/mo
Function Duration    | 100 GB-hrs   | 1,000 GB-hrs
Bandwidth            | 100 GB       | 1 TB
KV Operations        | 30K/day      | 150K/day
For MCP servers, the main cost drivers are function duration (LLM calls take time) and KV operations (session storage). Most projects stay well within free tier limits.
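To estimate where you land, note that GB-hours are simply function memory multiplied by execution time. A back-of-the-envelope calculation (the 1 GB memory figure is an assumption; check your function's configured memory):

```typescript
// GB-hours = memory (GB) × average duration (s) × invocations ÷ 3600
function gbHours(
  memoryGb: number,
  avgDurationSec: number,
  invocations: number,
): number {
  return (memoryGb * avgDurationSec * invocations) / 3600;
}

// e.g. 1 GB memory, 2 s per request (an LLM call), 50,000 requests/month:
const usage = gbHours(1, 2, 50_000); // ≈ 27.8 GB-hrs, under the 100 GB-hr Hobby cap
```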

Troubleshooting

Symptom: 504 Gateway Timeout
Solution:
  • Increase the function's max duration (frontmcp build --adapter vercel --max-duration 300)
  • Upgrade to Pro tier for longer limits
  • Break long operations into smaller chunks
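One way to stay under the timeout is to split the work and process one chunk per invocation, persisting progress (for example in Vercel KV) between calls. A small helper for the splitting step:

```typescript
// Split an array into fixed-size chunks; the last chunk may be smaller.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// chunk([1, 2, 3, 4, 5], 2) → [[1, 2], [3, 4], [5]]
```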
Symptom: OPENAI_API_KEY is not defined
Solution:
vercel env add OPENAI_API_KEY production
vercel deploy --prod
Symptom: First request takes 2-3 seconds
Solution: This is expected for serverless. For lower latency:
  • Keep functions small (faster cold starts)
  • Use edge functions for ultra-low latency
  • Consider Pro tier with more resources
Symptom: Could not connect to Vercel KV
Solution:
  • Ensure KV database is linked to project
  • Check that KV_REST_API_URL and KV_REST_API_TOKEN are set
  • Redeploy after linking KV

What’s Next

Serverless Documentation

Deep dive into serverless patterns, hybrid configurations, and edge optimization

Vercel KV Guide

Complete guide to session storage, caching, and KV best practices

Production Checklist

Security hardening, monitoring, and production readiness

AWS Lambda Deployment

Alternative: Deploy to AWS Lambda with SAM or Serverless Framework

FrontMCP supports multiple deployment targets: Vercel, AWS Lambda, Cloudflare Workers, and traditional Node.js servers. Choose what fits your infrastructure. Star us on GitHub to follow development.