Deployment

Deploy your One app to production

One supports multiple deployment targets for web applications. Choose the target that best fits your infrastructure needs.

Configuration

Set your deployment target in vite.config.ts:

vite.config.ts

import { one } from 'one/vite'

export default {
  plugins: [
    one({
      web: {
        deploy: 'node' // 'node' (default), 'vercel', or 'cloudflare'
      }
    })
  ]
}

Deployment Targets

Node (Default)

The default deployment target outputs a Node.js server using Hono. This is suitable for any Node.js hosting environment.

Terminal

# Build for production
npx one build
# Start the production server
npx one serve

The build output is in the dist directory:

  • dist/client - Static assets and client-side bundles
  • dist/server - Server-side rendering bundles
  • dist/api - API route handlers

You can deploy this to any platform that supports Node.js:

  • Docker: Create a Dockerfile that runs npx one serve
  • Railway, Render, Fly.io: Point to your repo and set the start command to npx one serve
  • AWS, GCP, Azure: Deploy as a containerized application or using their Node.js runtimes
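For the Docker option above, a minimal Dockerfile sketch might look like the following. The base image, package manager, and exposed port are assumptions, not anything One prescribes; adjust them to your setup.

```dockerfile
# Minimal sketch — base image, npm, and the port are assumptions.
FROM node:20-alpine
WORKDIR /app

# Install dependencies first so this layer caches between builds
COPY package*.json ./
RUN npm ci

# Copy the rest of the app and build for production
COPY . .
RUN npx one build

# Adjust to whatever port your `one serve` listens on
EXPOSE 3000
CMD ["npx", "one", "serve"]
```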

Vercel

One supports deploying to Vercel's serverless infrastructure using the Build Output API.

vite.config.ts

import { one } from 'one/vite'

export default {
  plugins: [
    one({
      web: {
        deploy: 'vercel'
      }
    })
  ]
}

Project Setup

  1. Add vercel.json to your project root:

vercel.json

{
  "framework": null,
  "buildCommand": "npx one build",
  "installCommand": "npm install"
}

Setting framework: null tells Vercel to use the pre-built output from the Build Output API instead of auto-detecting a framework.

  2. Install the Vercel CLI (optional, for manual deploys):
npm install -g vercel

Git Push Deploys

The easiest way to deploy is connecting your Git repository to Vercel:

  1. Go to vercel.com/new
  2. Import your Git repository
  3. Vercel auto-detects settings from vercel.json
  4. Click Deploy

Once connected, every push to your main branch triggers a production deploy. Pull requests get preview deployments automatically.

Manual CLI Deploy

For manual deploys or CI/CD pipelines:

Terminal

# Build for Vercel
npx one build
# Deploy using Vercel CLI
vercel deploy --prebuilt

The --prebuilt flag tells Vercel to use the pre-generated output in .vercel/output instead of running its own build process.

For production deploys:

Terminal

vercel deploy --prebuilt --prod
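In a CI/CD pipeline, the build and deploy steps above can be wired into a workflow. Here is a hypothetical GitHub Actions sketch — the secret name, action versions, and Node version are assumptions, not part of One's setup:

```yaml
# Hypothetical CI sketch — assumes a VERCEL_TOKEN secret is configured.
name: deploy
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      - run: npx one build
      - run: npx vercel deploy --prebuilt --prod --token "$VERCEL_TOKEN"
        env:
          VERCEL_TOKEN: ${{ secrets.VERCEL_TOKEN }}
```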

Testing Production Build Locally

You can do a basic sanity check of your production build locally:

Terminal

# Build the app
npx one build
# Serve the production build
npx one serve

Note: This runs the Hono server from dist/, not the actual Vercel serverless functions. It's useful for catching build errors and checking that pages render correctly, but it's not a perfect simulation of the Vercel environment.

For true Vercel testing, use a preview deploy:

Terminal

# Deploy a preview (not production)
vercel deploy --prebuilt

Every push to a non-main branch also triggers a preview deployment automatically when using Git integration. This is the most reliable way to test before promoting to production.

What Gets Built

When targeting Vercel, One generates:

  • .vercel/output/static - Static assets served from Vercel's Edge Network
  • .vercel/output/functions - Serverless functions for:
    • SSR pages (+ssr.tsx files)
    • API routes (app/api/**)
    • Loaders (data fetching functions)

Static routes (+ssg.tsx) are pre-rendered to .vercel/output/static as HTML files.

Environment Variables

Set environment variables in your Vercel project dashboard under Settings > Environment Variables:

Terminal

# Required for SSR and API routes
ONE_SERVER_URL=https://your-app.vercel.app

For local development, add to .env.local:

Terminal

ONE_SERVER_URL=http://localhost:8081

Vercel Project Settings

When using vercel.json, settings are auto-configured. If you need to configure manually in the Vercel dashboard:

  1. Framework Preset: Other
  2. Build Command: npx one build
  3. Output Directory: .vercel/output
  4. Install Command: npm install (or bun install, pnpm install)

Static-Only Sites

If your site only uses SSG routes (+ssg.tsx) without SSR, loaders, or API routes, you can simplify:

vite.config.ts

one({
  web: {
    deploy: 'vercel',
    defaultRenderMode: 'ssg',
  }
})

This generates only static files with no serverless functions, resulting in faster deploys and lower costs.

Troubleshooting

Build fails with "Cannot find module": Ensure every package used at runtime is listed in dependencies, not devDependencies.

404 on dynamic routes: Check that your route files use the correct suffix (+ssr.tsx for server-rendered, +ssg.tsx with generateStaticParams for static).

Environment variables not available: Variables must be added in Vercel dashboard and redeployed. Local .env files are not uploaded.

Slow cold starts: Consider using +ssg.tsx for routes that don't need real-time data. Static routes have no cold start.
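As a concrete sketch of the static-route pattern mentioned above, a hypothetical app/blog/[slug]/+ssg.tsx could export generateStaticParams like this. Only the export is shown — the page component is elided, and the slug list and file path are illustrative, not part of One's API beyond the generateStaticParams name the docs use:

```typescript
// Sketch of the generateStaticParams export from a hypothetical
// app/blog/[slug]/+ssg.tsx — the page component itself is elided.
type Params = { slug: string }

export async function generateStaticParams(): Promise<Params[]> {
  // In a real app these might come from a CMS or the filesystem.
  const slugs = ['hello-world', 'release-notes']
  // The build pre-renders one HTML file per returned params object.
  return slugs.map((slug) => ({ slug }))
}
```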

Static Export

If you're only using SSG and SPA routes without loaders or API routes, you can statically serve the dist/client directory from any static hosting:

  • Netlify
  • GitHub Pages
  • Cloudflare Pages (static only)
  • AWS S3 + CloudFront
  • Any CDN or static file server

Terminal

# Build the app
npx one build
# Then upload dist/client to your static host

Cloudflare Workers

One supports deploying to Cloudflare Workers with full SSR, API routes, and edge performance.

vite.config.ts

import { one } from 'one/vite'

export default {
  plugins: [
    one({
      web: {
        deploy: 'cloudflare'
      }
    })
  ]
}

Build and Deploy

Terminal

# Build for Cloudflare
npx one build
# Deploy using Wrangler
cd dist && wrangler deploy

What Gets Built

When targeting Cloudflare, One generates:

  • dist/worker.js - The Cloudflare Worker entry point
  • dist/wrangler.jsonc - Wrangler configuration with lazy loading
  • dist/client - Static assets served from Cloudflare's edge
  • dist/server - Server-side bundles (lazy-loaded per route)
  • dist/api - API route handlers (lazy-loaded per route)

Lazy Loading

One uses a lazy loading pattern for Cloudflare Workers. Route modules are loaded on-demand when matched, not all upfront. This improves cold start times for apps with many routes.

The generated wrangler.jsonc includes:

{
  "find_additional_modules": true,
  "rules": [
    { "type": "ESModule", "globs": ["./server/**/*.js"], "fallthrough": true },
    { "type": "ESModule", "globs": ["./api/**/*.js"], "fallthrough": true }
  ]
}
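The lazy-loading pattern described above can be illustrated with a small router sketch. This is not One's actual worker code — the names and shapes here are invented for illustration — but it shows the idea: each route maps to a loader, and a module is only evaluated the first time its route is matched, then cached.

```typescript
// Illustrative sketch of lazy route loading (not One's actual worker code).
type Handler = (path: string) => string
type Loader = () => Promise<{ default: Handler }>

export function createRouter(loaders: Record<string, Loader>) {
  const cache = new Map<string, Handler>()
  return async function handle(path: string): Promise<string> {
    // Serve from cache after the first hit — the module loads only once.
    const cached = cache.get(path)
    if (cached) return cached(path)
    const load = loaders[path]
    if (!load) return '404'
    // Unmatched routes never reach this point, so they add no cold-start cost.
    const mod = await load()
    cache.set(path, mod.default)
    return mod.default(path)
  }
}
```

In the real build, the loader map corresponds to the dist/server and dist/api modules that wrangler discovers via the rules above.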

Requirements

  • Cloudflare Workers with nodejs_compat compatibility flag (auto-configured)
  • Wrangler CLI for deployment

Wrangler Configuration

The generated wrangler.jsonc is ready to use. To customize, create your own wrangler.jsonc in your project root and One will use it as a base.

Key settings:

{
  "name": "your-app-name",
  "compatibility_flags": ["nodejs_compat"],
  "assets": { "directory": "client" }
}
