Your Files Are APIs. One Comment Changes Everything.
Write a TypeScript function — it's instantly a live HTTP endpoint. Add one comment to control auth, CORS, mode, or AI. No config. No middleware.
// @mode serverless
// @cors reflective
// @token my-secret-key
return { message: 'Hello!', time: Date.now() };
// Live at:
https://proj-cont-exec-1.us1.containers.hoody.com/hello
Two Modes. One Comment.
Every script picks a mode. Worker for stateful real-time apps. Serverless for isolated, ephemeral execution. One comment is all it takes.
Worker Mode
- Persistent V8 isolate — stays warm forever
- Shared state across all requests
- WebSocket supported
- Zero cold start after first request
Serverless Mode
- Fresh V8 isolate per request
- Complete isolation, no state leakage
- Configurable concurrency via @concurrent
- Ideal for webhooks and sporadic traffic
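The difference is easiest to see with state. Here is a minimal sketch of the worker-mode behavior described above, with the warm isolate's state stubbed as a module-level object so the logic runs standalone:

```typescript
// @mode worker
// A persistent isolate keeps module-level state warm between requests.

// Stub: in a real worker script, this object survives from request to request.
const state = { hits: 0 };

// Each incoming request would run this handler against the same state.
function handleRequest(): { hits: number } {
  state.hits += 1;
  return { hits: state.hits };
}

handleRequest();
const second = handleRequest(); // { hits: 2 } in worker mode
```

In serverless mode the isolate is rebuilt per request, so every response would report hits: 1.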
Every Behavior. One Line.
Drop a comment at the top of your file. No code changes, no config files, no middleware. Change the comment — behavior changes instantly.
Execution Mode
Choose between a persistent Worker VM or a fresh Serverless isolate per request. Determines state, WebSocket, and cold start behavior.
CORS Control
Mirror the request origin with 'reflective', open all origins with '*', or lock to a specific URL. Zero middleware required.
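The three documented values, shown as alternative one-liners (pick one per script; the pinned URL is an example):

```typescript
// @cors reflective               // mirror the request's Origin header back
// @cors *                        // allow every origin
// @cors https://app.example.com  // lock to one specific site
```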
Request Timeout
Set the timeout in milliseconds. Default is 30s. Use 0 or 'unlimited' for long-running scripts. Prevents runaway executions.
Concurrency Cap
Serverless-only: cap simultaneous executions. Set to 'false' for serial processing — essential for webhook ordering.
Endpoint Auth
Protect any endpoint with a shared secret. Clients authenticate via Bearer, Basic, X-Token header, or query param. Constant-time comparison.
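The constant-time comparison mentioned above can be sketched with Node's crypto module; this illustrates the technique, not the platform's internal code:

```typescript
import { timingSafeEqual } from "node:crypto";

// Compare a presented token against the shared secret without leaking,
// via response timing, how many leading characters matched.
function tokenMatches(presented: string, secret: string): boolean {
  const a = Buffer.from(presented);
  const b = Buffer.from(secret);
  if (a.length !== b.length) return false; // different lengths: reject
  return timingSafeEqual(a, b);
}
```

Clients present the same secret via Bearer, Basic, the X-Token header, or a query param, as listed above.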
WebSocket Support
Enable real-time bidirectional connections. Requires Worker mode. ws.message, ws.open, ws.close handlers injected automatically.
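A rough sketch of the room pattern this enables. The real ws.open, ws.message, and ws.close handlers are injected by the platform; here they are stubbed as plain functions so the broadcast logic runs standalone:

```typescript
// @mode worker
// WebSocket support requires a persistent isolate (worker mode).

type Send = (msg: string) => void;
const room = new Set<Send>(); // shared state: one room for all sockets

const ws = {
  open(send: Send) { room.add(send); },                     // client joined
  message(text: string) { room.forEach((s) => s(text)); },  // broadcast
  close(send: Send) { room.delete(send); },                 // client left
};

// Simulate two connected clients receiving one broadcast:
const inbox: string[] = [];
ws.open((m) => inbox.push("a:" + m));
ws.open((m) => inbox.push("b:" + m));
ws.message("hello");
```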
AI Helpers
Injects generateText, streamText, generateObject from the Vercel AI SDK. No imports, no API key setup. Model pre-configured.
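A sketch of what a script body could look like once the helpers are injected. The call shape follows the Vercel AI SDK's generateText; the globally injected helper and pre-configured default model are the platform behaviors described above:

```typescript
// AI helpers enabled via the platform's magic comment.
// generateText is injected globally: no import, model pre-configured.
const { text } = await generateText({
  prompt: "Summarize this request in one sentence.",
});
return { summary: text };
```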
AI Model Selection
Override the default AI model per script. Default: google/gemini-2.5-flash-lite. Supports 300+ models from 14 providers.
AWS Integration
Enable the AWS SDK inside your script with a single comment. Access S3, DynamoDB, Lambda, and any other AWS service directly.
Complete Magic Comment Reference.
Every comment, every value, every default. All in one place.
@mode
Sets the execution mode for the script. 'worker' creates a persistent VM; 'serverless' (default) creates a fresh VM per request.
Syntax: // @mode worker | serverless
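As a quick crib, here are the four comments whose names appear on this page, with their documented values. Timeout, WebSocket, AI, model, and AWS toggles exist as well; see their sections above for the accepted values:

```typescript
// @mode serverless        // 'worker' | 'serverless' (default)
// @cors reflective        // 'reflective' | '*' | a specific origin URL
// @token my-secret-key    // shared-secret auth for this endpoint
// @concurrent false       // serverless only: cap or serialize executions
```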
From Webhook to WebSocket in Minutes.
Six patterns that cover the full range — from instant HTTP APIs to real-time AI proxies.
Instant APIs
Skip Express setup entirely. Create a file — it's a live HTTP endpoint. Worker for high traffic, serverless for isolation.
Webhook Receivers
Stripe, GitHub, Slack webhooks with serverless isolation. Use @concurrent false for serial processing and consistent ordering.
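A sketch of the shape such a receiver could take. The magic comments are from this page; the injected request object and its body helper are assumptions about the platform's script API:

```typescript
// @mode serverless
// @concurrent false
// One event at a time: no interleaving, delivery order preserved.

const event = await request.json(); // 'request' assumed injected
// ...verify the provider's signature, then branch on event.type...
return { received: true };
```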
WebSocket Servers
Chat servers, live dashboards, SSE streams. Worker mode maintains persistent connections with shared room state.
AI MITM Proxy
Intercept and control AI requests. Add safety checks, modify prompts, block sensitive data, track usage — all in one script.
Rate Limiting
Track per-IP request counts in the shared object across requests. Worker mode makes in-memory rate limiting trivial.
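A sketch of a fixed-window variant of this idea. The per-IP map would live in the warm worker isolate; here it is an ordinary module-level Map so the logic runs standalone (limit and window are illustrative):

```typescript
// @mode worker
// In worker mode this Map survives across requests.
const hits = new Map<string, { count: number; windowStart: number }>();
const LIMIT = 3;          // illustrative: 3 requests...
const WINDOW_MS = 60_000; // ...per minute, per IP

function allow(ip: string, now: number): boolean {
  const entry = hits.get(ip);
  if (!entry || now - entry.windowStart >= WINDOW_MS) {
    hits.set(ip, { count: 1, windowStart: now }); // start a fresh window
    return true;
  }
  entry.count += 1;
  return entry.count <= LIMIT;
}
```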
Script Composition
Every script is an HTTP endpoint. Call other scripts with fetch(). Compose microservices from simple functions — no queues, no discovery.
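A sketch of one script calling another over plain HTTP. The URL follows the example deployment shown at the top of this page; substitute your own endpoint:

```typescript
// @mode serverless
// Every script is already an HTTP endpoint, so composition is just fetch().
const res = await fetch(
  "https://proj-cont-exec-1.us1.containers.hoody.com/hello",
);
const upstream = await res.json();
return { upstream };
```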
Every API Surface, One Place.
Script execution, management, validation, templates, routing, monitoring, and magic comment control — all in one API.
Monitoring & Magic API
8 endpoints, e.g. GET /api/v1/exec/monitor/stats
Execution & Scripts
7 endpoints, e.g. GET/POST /:path
Validation
6 endpoints, e.g. POST /api/v1/exec/validate/script
Templates & Routing
6 endpoints, e.g. GET /api/v1/exec/templates/list