Problem
When you ship an API, consumers expect a polished TypeScript SDK and, increasingly, an MCP server so AI assistants can interact with it. Hand-rolling these is tedious: you must keep types in sync with every endpoint change, write boilerplate for pagination, error handling, and retries, and then repeat the work for MCP tool definitions. Any schema drift between the OpenAPI spec, the SDK, and the MCP server causes silent failures.
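The retry boilerplate alone shows the maintenance burden. A minimal hand-rolled sketch of what each client otherwise reimplements (the helper name and backoff policy here are illustrative, not from any SDK):

```typescript
// Retry an async operation with exponential backoff.
// This is the kind of per-client boilerplate a generated SDK absorbs.
async function withRetries<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 100,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Exponential backoff: 100ms, 200ms, 400ms, ...
      const delay = baseDelayMs * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
  throw lastError;
}
```

Multiply this by timeouts, pagination cursors, and typed error classes, and the case for generating it all from one spec becomes clear.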
Solution
Step 1: Connect your OpenAPI spec to Stainless
Sign up at stainless.com and point it at your OpenAPI spec (JSON or YAML). Stainless automatically generates a polished, fully typed TypeScript SDK.
# Example: create a Stainless project via the dashboard or API
# (the endpoint and payload below are illustrative; see the Stainless docs)
curl -X POST https://api.stainless.com/v1/projects \
  -H "Authorization: Bearer $STAINLESS_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"openapi_url": "https://api.example.com/openapi.json"}'
Step 2: Use the generated SDK
The output is a fully typed Node SDK with retries, pagination, and error classes:
import ExampleAPI from "@example/sdk";
const client = new ExampleAPI({ apiKey: process.env.EXAMPLE_API_KEY });
const users = await client.users.list({ limit: 10 });
// Fully typed response with pagination support built in
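Stainless-generated SDKs typically expose list endpoints as async iterables, so callers can `for await` across pages without managing cursors by hand. A self-contained stand-in sketching that surface (the types and generator below are illustrative, not the real @example/sdk API):

```typescript
interface User {
  id: string;
  name: string;
}

// Stand-in for an auto-paginating list endpoint: yields items one by one
// while fetching successive "pages" behind the scenes.
async function* listUsers(pageSize: number): AsyncGenerator<User> {
  const all: User[] = [
    { id: "u1", name: "Ada" },
    { id: "u2", name: "Grace" },
    { id: "u3", name: "Alan" },
  ];
  for (let i = 0; i < all.length; i += pageSize) {
    for (const user of all.slice(i, i + pageSize)) yield user;
  }
}

async function main(): Promise<string[]> {
  const names: string[] = [];
  // With a generated SDK this loop would read:
  //   for await (const user of client.users.list({ limit: 2 })) { ... }
  for await (const user of listUsers(2)) names.push(user.name);
  return names;
}
```

The caller never sees page boundaries; the iterator abstracts them away, which is exactly the boilerplate the generator removes.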
Step 3: Deploy the generated MCP server to Cloudflare Workers
Stainless also generates an MCP server from the same spec. Deploy it to Cloudflare Workers with bearer auth:
// worker.ts - generated MCP server entry point
import { createMCPHandler } from "@example/mcp-server";

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const auth = request.headers.get("Authorization");
    if (auth !== `Bearer ${env.MCP_SECRET}`) {
      return new Response("Unauthorized", { status: 401 });
    }
    return createMCPHandler({ apiKey: env.EXAMPLE_API_KEY })(request);
  },
};
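The `!==` check in the worker compares tokens with an early-exit string comparison, which can leak token prefixes through response timing. A constant-time comparison is a common hardening step; a minimal sketch (the helper name is mine, not part of the generated code):

```typescript
// Compare two strings in constant time so response latency does not
// reveal how many leading characters of the token matched.
function timingSafeEqual(a: string, b: string): boolean {
  const enc = new TextEncoder();
  const ab = enc.encode(a);
  const bb = enc.encode(b);
  // Fold the length difference into the result instead of returning early.
  let diff = ab.length ^ bb.length;
  const len = Math.max(ab.length, bb.length);
  for (let i = 0; i < len; i++) {
    diff |= (ab[i] ?? 0) ^ (bb[i] ?? 0);
  }
  return diff === 0;
}
```

Swap it in as `if (!timingSafeEqual(auth ?? "", `Bearer ${env.MCP_SECRET}`))`. For low-risk internal deployments the plain comparison is often acceptable.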
# Store the secrets the worker reads from env, then deploy
npx wrangler secret put MCP_SECRET
npx wrangler secret put EXAMPLE_API_KEY
npx wrangler deploy
Step 4: Keep everything in sync
When your API changes, update the OpenAPI spec and Stainless regenerates both the SDK and MCP server. Delete your hand-rolled MCP code entirely.
Why It Works
Stainless (the same tool OpenAI uses for its official SDKs) treats the OpenAPI spec as the single source of truth. Every endpoint, type, and parameter flows through one pipeline to produce both the SDK and the MCP server, eliminating the drift that creeps in when two hand-written implementations are maintained in parallel. Cloudflare Workers provide low-latency edge deployment for the MCP server with simple bearer-token authentication.
Context
- Stainless is free for open-source projects and offers paid tiers for private APIs
- The generated SDK includes retry logic, streaming support, and proper TypeScript generics
- MCP servers on Cloudflare Workers can be connected to Claude Desktop, Claude Code, or any MCP-compatible client
- This pattern replaces the need for hand-rolled MCP implementations that must be manually updated with each API change
- Stainless supports generating SDKs in Python, Go, Java, and Kotlin in addition to TypeScript
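For clients like Claude Desktop that launch MCP servers over stdio, one common way to reach the deployed Worker is the mcp-remote shim. A sketch of the config entry (the server name, Worker URL, and token are placeholders; check the mcp-remote docs for current flags):

```json
{
  "mcpServers": {
    "example-api": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://example-mcp.your-subdomain.workers.dev",
        "--header",
        "Authorization: Bearer <MCP_SECRET>"
      ]
    }
  }
}
```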