One supports multiple deployment targets for web applications. Choose the target that best fits your infrastructure needs.
Set your deployment target in vite.config.ts:
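A minimal sketch of what this configuration might look like. The `deployTarget` key and the `one/vite` import path are assumptions based on the surrounding text — check the One Configuration reference for the exact option names in your version:

```typescript
// vite.config.ts — hedged sketch; option names may differ in your One version
import { defineConfig } from 'vite'
import { one } from 'one/vite'

export default defineConfig({
  plugins: [
    one({
      // placeholder key — this guide describes targets 'node' (default),
      // 'vercel', and 'cloudflare'
      deployTarget: 'node',
      build: {
        // fail the build if secrets leak into client bundles (see
        // Configuration: build.securityScan)
        securityScan: 'error',
      },
    }),
  ],
})
```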
We recommend setting securityScan: 'error' for production deployments to prevent accidentally shipping API keys or tokens in client-facing JavaScript. See Configuration: build.securityScan for details.
The default deployment target outputs a Node.js server using Hono. This is suitable for any Node.js hosting environment.
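Build the app with the One CLI (this is the same `npx one build` command referenced later in this guide under manual Vercel configuration):

```shell
# produce the production build in dist/
npx one build
```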
The build output is in the dist directory:
- `dist/client` - Static assets and client-side bundles
- `dist/server` - Server-side rendering bundles
- `dist/api` - API route handlers

You can deploy this to any platform that supports Node.js:
```shell
npx one serve
```

One supports deploying to Vercel’s serverless infrastructure using the Build Output API.
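A hedged sketch of the Vercel target configuration — the `deployTarget` key is a placeholder for whatever option name your One version uses (see the Configuration reference):

```typescript
// vite.config.ts — sketch; the option name is an assumption
import { defineConfig } from 'vite'
import { one } from 'one/vite'

export default defineConfig({
  plugins: [
    one({
      // placeholder key name; switches the build to the Build Output API format
      deployTarget: 'vercel',
    }),
  ],
})
```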
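Based on the settings this guide describes below (`framework: null` and `cleanUrls: true`), a minimal `vercel.json` would look like:

```json
{
  "framework": null,
  "cleanUrls": true
}
```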
Setting framework: null tells Vercel to use the pre-built output from the Build Output API instead of auto-detecting a framework.
The cleanUrls: true setting is important for SSG routes - it allows URLs like /about to serve about.html without needing the .html extension. Without this, direct navigation to SSG pages will 404.
The easiest way to deploy is connecting your Git repository to Vercel:
Once connected, every push to your main branch triggers a production deploy. Pull requests get preview deployments automatically.
For manual deploys or CI/CD pipelines:
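A sketch of a manual deploy, combining the build step with the Vercel CLI's `--prebuilt` flag described below:

```shell
# build locally, then upload the pre-generated .vercel/output
npx one build
vercel deploy --prebuilt
```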
The --prebuilt flag tells Vercel to use the pre-generated output in .vercel/output instead of running its own build process.
For production deploys:
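The same flow with the Vercel CLI's `--prod` flag to promote to production:

```shell
npx one build
vercel deploy --prebuilt --prod
```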
You can do a basic sanity check of your production build locally:
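A local sanity check, using the `npx one serve` command shown earlier for the Node.js target:

```shell
# builds, then runs the Hono server from dist/ (not Vercel's functions)
npx one build
npx one serve
```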
Note: This runs the Hono server from dist/, not the actual Vercel serverless functions. It’s useful for catching build errors and checking that pages render correctly, but it’s not a perfect simulation of the Vercel environment.
For true Vercel testing, use a preview deploy:
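Without `--prod`, the Vercel CLI creates a preview deployment:

```shell
# preview deploy — gets its own URL, does not affect production
npx one build
vercel deploy --prebuilt
```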
Every push to a non-main branch also triggers a preview deployment automatically when using Git integration. This is the most reliable way to test before promoting to production.
When targeting Vercel, One generates:
- `.vercel/output/static` - Static assets served from Vercel’s Edge Network
- `.vercel/output/functions` - Serverless functions for:
  - Server-rendered routes (`+ssr.tsx` files)
  - API routes (`app/api/**`)

Static routes (`+ssg.tsx`) are pre-rendered to `.vercel/output/static` as HTML files.
Set environment variables in your Vercel project dashboard under Settings > Environment Variables.
For local development, add to .env.local:
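For example (the variable name and value here are illustrative placeholders, not names One requires):

```shell
# .env.local — placeholder entries; use your own variable names
MY_API_KEY=your-key-here
```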
When using vercel.json, settings are auto-configured. If you need to configure manually in the Vercel dashboard:
- Build Command: `npx one build`
- Output Directory: `.vercel/output`
- Install Command: `npm install` (or `bun install`, `pnpm install`)

If your site only uses SSG routes (`+ssg.tsx`) without SSR, loaders, or API routes, you can simplify:
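A hedged sketch of a static-only setup — the option name and value are placeholders for whatever One calls its static output mode (check the Configuration reference):

```typescript
// vite.config.ts — sketch; 'static' is a placeholder value
import { defineConfig } from 'vite'
import { one } from 'one/vite'

export default defineConfig({
  plugins: [
    one({
      // placeholder key/value — the text describes a static-files-only mode
      deployTarget: 'static',
    }),
  ],
})
```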
This generates only static files with no serverless functions, resulting in faster deploys and lower costs.
Build fails with “Cannot find module”: Ensure every package used at runtime is listed in dependencies, not devDependencies.
404 on dynamic routes: Check that your route files use the correct suffix (+ssr.tsx for server-rendered, +ssg.tsx with generateStaticParams for static).
Environment variables not available: Variables must be added in Vercel dashboard and redeployed. Local .env files are not uploaded.
Slow cold starts: Consider using +ssg.tsx for routes that don’t need real-time data. Static routes have no cold start.
If you’re only using SSG and SPA routes without loaders or API routes, you can statically serve the dist/client directory from any static hosting:
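Any static file server works for this; as one example, the `serve` package:

```shell
# serve the client build as plain static files
npx serve dist/client
```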
One supports deploying to Cloudflare Workers with full SSR, API routes, and edge performance.
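A hedged sketch of the Cloudflare target — as above, the `deployTarget` key is a placeholder for the actual option name in your One version:

```typescript
// vite.config.ts — sketch; option name is an assumption
import { defineConfig } from 'vite'
import { one } from 'one/vite'

export default defineConfig({
  plugins: [
    one({
      // placeholder key name; emits a Worker entry plus wrangler.jsonc in dist/
      deployTarget: 'cloudflare',
    }),
  ],
})
```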
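A sketch of the build-and-deploy flow, assuming Wrangler is pointed at the generated config in `dist/` (the exact flag or working directory may differ in your setup):

```shell
npx one build
# deploy using the wrangler.jsonc One generates into dist/
npx wrangler deploy --config dist/wrangler.jsonc
```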
When targeting Cloudflare, One generates:
- `dist/worker.js` - The Cloudflare Worker entry point
- `dist/wrangler.jsonc` - Wrangler configuration with lazy loading
- `dist/client` - Static assets served from Cloudflare’s edge
- `dist/server` - Server-side bundles (lazy-loaded per route)
- `dist/api` - API route handlers (lazy-loaded per route)

One uses a lazy loading pattern for Cloudflare Workers. Route modules are loaded on-demand when matched, not all upfront. This improves cold start times for apps with many routes.
The generated wrangler.jsonc includes:
- `nodejs_compat` compatibility flag (auto-configured)

The generated wrangler.jsonc is ready to use. To customize, create your own wrangler.jsonc in your project root and One will use it as a base.
Key settings:
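Based on the generated output described above (a `worker.js` entry, the `nodejs_compat` flag, and static assets in `client/`), a hedged sketch of what the key fields look like — names and paths are illustrative, since One generates the real file in `dist/`:

```jsonc
{
  // illustrative values — One writes the actual file to dist/wrangler.jsonc
  "name": "my-app",
  "main": "worker.js",
  "compatibility_flags": ["nodejs_compat"],
  "assets": { "directory": "./client" }
}
```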