Turn Existing APIs into MCP Servers

Many teams already have a fully documented REST API described with OpenAPI 3.0+. Rather than rewriting that logic from scratch in a server framework, you can generate a thin MCP proxy that forwards requests to your upstream API while speaking the Model Context Protocol to AI assistants.

These generators focus on wrapping existing services, whereas the framework comparison targets building new servers from the ground up.
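Conceptually, each OpenAPI operation becomes one MCP tool whose handler simply forwards the call to the upstream REST endpoint. A hand-written sketch of that pattern with the TypeScript MCP SDK (the Petstore URL, operation, and parameter names below are illustrative, not any generator's literal output) looks roughly like this:

import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "petstore-proxy", version: "0.1.0" });

// One OpenAPI operation (GET /pets/{petId}) exposed as one MCP tool.
server.tool(
  "get_pet_by_id",
  "Fetch a single pet from the upstream Petstore API",
  { petId: z.string().describe("ID of the pet to fetch") },
  async ({ petId }) => {
    // Illustrative upstream base URL; a generated proxy would take this from the spec.
    const res = await fetch(`https://petstore.example.com/v1/pets/${petId}`);
    if (!res.ok) {
      return {
        content: [{ type: "text", text: `Upstream error: ${res.status}` }],
        isError: true,
      };
    }
    // Hand the raw upstream JSON back to the model as text.
    return { content: [{ type: "text", text: await res.text() }] };
  }
);

await server.connect(new StdioServerTransport());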

Available Generators

Quick Start (stdio example)

# Install globally
npm i -g openapi-mcp-generator

# Generate a stdio MCP server from your OpenAPI file
openapi-mcp-generator \
  --input ./petstore.json \
  --output ./petstore-mcp

cd petstore-mcp && npm install && npm start
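Beyond the bundled npm start script, a quick way to confirm the generated tools are exposed is to drive the server from a small MCP client script, which spawns the server itself over stdio. A minimal sketch, assuming the generated entry point ends up at build/index.js (the path may differ in your project):

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the generated server over stdio; the entry-point path is an assumption.
const transport = new StdioClientTransport({
  command: "node",
  args: ["./petstore-mcp/build/index.js"],
});

const client = new Client({ name: "smoke-test", version: "0.1.0" });
await client.connect(transport);

// Confirm the OpenAPI operations were exposed as MCP tools.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

await client.close();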

The CLI will:

  1. Parse your OpenAPI document
  2. Create a fully-typed TypeScript project with @modelcontextprotocol/sdk
  3. Generate Zod schemas for validation (see the sketch after this list)
  4. Scaffold a stdio entry point (or a web / streamable-http one, depending on the flags you pass)
  5. Provide sample HTML clients for quick testing
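As a rough illustration of steps 2 and 3, an OpenAPI component schema such as Pet (id: integer, name: string, optional tag: string) would typically be mirrored by a Zod schema and an inferred TypeScript type along these lines (a sketch of the pattern, not the generator's literal output):

import { z } from "zod";

// Hypothetical mirror of an OpenAPI "Pet" component schema.
export const PetSchema = z.object({
  id: z.number().int(),
  name: z.string(),
  tag: z.string().optional(),
});

// Typed output (step 2): the TypeScript type is inferred from the schema.
export type Pet = z.infer<typeof PetSchema>;

// Runtime validation (step 3): check the upstream payload before
// returning it to the model.
export function parsePet(payload: unknown): Pet {
  const result = PetSchema.safeParse(payload);
  if (!result.success) {
    throw new Error(`Upstream response failed validation: ${result.error.message}`);
  }
  return result.data;
}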

When to Choose Automatically Generated Servers

  • Legacy REST APIs needing AI integration fast
  • Backend teams that don’t want to learn MCP details
  • Prototyping multiple microservices for LLM agents
  • Consistency – guarantees your MCP contract stays in sync with the OpenAPI source

Generated code is an excellent starting point, but you may still want to harden authentication, logging, and error handling before production use.
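One way to add that hardening, sketched below under the assumption that each generated tool handler is an async function returning an MCP tool result (the withUpstreamGuard helper is invented for this example), is to wrap every upstream call with a timeout, server-side logging, and a short, model-friendly error message:

type ToolResult = {
  content: { type: "text"; text: string }[];
  isError?: boolean;
};

// Hypothetical wrapper applied around generated handlers before production use.
async function withUpstreamGuard(
  toolName: string,
  call: (signal: AbortSignal) => Promise<ToolResult>,
  timeoutMs = 10_000,
): Promise<ToolResult> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), timeoutMs);
  const started = Date.now();
  try {
    // The handler should pass `signal` to fetch so slow upstream calls are cut off.
    return await call(controller.signal);
  } catch (err) {
    // Log full details server-side; give the model only a short, actionable message.
    console.error(`[${toolName}] upstream call failed after ${Date.now() - started} ms`, err);
    return {
      content: [{ type: "text", text: `${toolName} failed; retry or adjust the parameters.` }],
      isError: true,
    };
  } finally {
    clearTimeout(timer);
  }
}

Generated handlers can then be passed through a wrapper like this at registration time, keeping stack traces and raw upstream payloads out of the model's context.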

Feature Highlights

openapi-mcp-generator supports:

  • OpenAPI 3.0+ documents
  • Transports: stdio, web (with SSE), and streamable-http
  • Auth schemes: API key, Bearer, Basic, OAuth2
  • Runtime validation with Zod
  • Typed output in TypeScript
  • HTML test clients

Limitations & Pitfalls

LLM experts (e.g. David Cramer) warn that a naïve OpenAPI-as-MCP shim runs into several issues:

  • Tool overload: LLMs struggle to pick from dozens (or hundreds) of endpoints. Similarly named verbs (create_post vs post_message) cause frequent mis-selections.
  • Excessive parameters: Classic REST endpoints expose many optional fields; the model must decide what to fill, without UI context.
  • Context limits: Many models cap tool descriptions (~1 kB for some). Rich OpenAPI docs get truncated, losing critical guidance.
  • Response verbosity: Machine-oriented JSON responses (IDs, UUIDs, nested data) are hard for models to interpret without additional reasoning.
  • Mismatch in abstractions: Great MCP tools are designed around user-intent tasks (“Send a Slack message to #sales”) rather than low-level CRUD endpoints.

Treat the generated code as a starting template. Refactor the resulting tools, prune unnecessary endpoints, and wrap complex flows into higher-level, LLM-friendly actions.
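For instance, rather than exposing generated list_channels and post_message tools separately, you might collapse them into a single intent-level tool. The Slack-style operations, names, and the api client below are purely illustrative:

import { z } from "zod";
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";

// Illustrative thin client over the generated low-level upstream calls.
interface ChatApi {
  listChannels(): Promise<{ id: string; name: string }[]>;
  postMessage(args: { channelId: string; text: string }): Promise<void>;
}

declare const api: ChatApi;      // stands in for the generated per-endpoint functions
declare const server: McpServer; // the MCP server created elsewhere in the project

server.tool(
  "send_slack_message",
  "Send a message to a Slack channel by name, e.g. #sales",
  {
    channel: z.string().describe("Channel name, with or without the leading #"),
    text: z.string().describe("Message text to post"),
  },
  async ({ channel, text }) => {
    // Two low-level REST calls hidden behind one user-intent action.
    const channels = await api.listChannels();
    const target = channels.find((c) => c.name === channel.replace(/^#/, ""));
    if (!target) {
      return { content: [{ type: "text", text: `No channel named ${channel} was found.` }], isError: true };
    }
    await api.postMessage({ channelId: target.id, text });
    return { content: [{ type: "text", text: `Posted the message to #${target.name}.` }] };
  }
);

The model then sees one task-shaped tool with two required parameters instead of several CRUD endpoints, which directly addresses the tool-overload and abstraction-mismatch issues above.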

Further Reading