> For the complete documentation index, see [llms.txt](https://developer.paddle.com/llms.txt).

# Build Paddle integrations faster with the Paddle docs MCP server

The Paddle docs MCP server connects Cursor, VS Code, Claude Code, and any MCP-compatible AI assistant to live Paddle documentation, the complete API spec, and current SDK references.

---

## What's new?

We've released [the Paddle docs MCP server](https://developer.paddle.com/sdks/ai/docs-mcp.md), a hosted [Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) server that gives your AI assistant up-to-date knowledge of the entire Paddle platform.

Install it everywhere you use AI so your conversations in clients and IDEs immediately draw on current Paddle documentation, the full API spec, and live SDK references.

{% callout type="note" %}
To let AI assistants take actions in your account, such as creating catalog items, setting up notification destinations, or managing subscriptions, set up the [Paddle MCP server](https://developer.paddle.com/sdks/ai/paddle-mcp.md).
{% /callout %}

## How it works

AI assistants are only as good as what they know. Without access to the latest documentation, they can give outdated advice, miss important details, or confidently suggest things that are no longer accurate.

[Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) is an open standard that lets AI assistants query live external sources during a conversation, rather than relying entirely on what they learned during training.

With the Paddle docs MCP server installed, your AI assistant retrieves current Paddle content in real time whenever it needs to respond to a Paddle-related question. Answers are grounded in current documentation rather than whatever was indexed at training time.

The biggest gains are in AI-powered IDEs like [Cursor](https://cursor.com), [VS Code](https://code.visualstudio.com), and [Claude Code](https://claude.ai), where your assistant already has context about your codebase. You can ask it to write a webhook handler, explain what a transaction status means in the context of your provisioning logic, or review your subscription management functionality against the latest API spec — and it'll draw on both your code and live Paddle content to respond.
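As a concrete example of the webhook-handler case, the sketch below shows the kind of signature check a handler typically starts with. It is an illustrative sketch, not Paddle's official code: it assumes the `Paddle-Signature` header uses the `ts=<timestamp>;h1=<hmac>` format with an HMAC-SHA256 over `ts + ":" + rawBody`, and the function name `verifyPaddleSignature` is our own. With the docs MCP server installed, your assistant can confirm these details against the live webhook documentation before generating code like this.

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Illustrative sketch: verify a Paddle webhook signature.
// Assumes the header format "ts=<unix-ts>;h1=<hex-hmac>" and that the
// signed payload is `${ts}:${rawBody}` — confirm against the live docs.
function verifyPaddleSignature(
  rawBody: string,
  signatureHeader: string,
  secret: string,
): boolean {
  // Parse "ts=...;h1=..." into key/value pairs.
  const parts = Object.fromEntries(
    signatureHeader.split(";").map((kv) => kv.split("=") as [string, string]),
  );
  const { ts, h1 } = parts;
  if (!ts || !h1) return false;

  // Recompute the HMAC over the timestamp and raw request body.
  const expected = createHmac("sha256", secret)
    .update(`${ts}:${rawBody}`)
    .digest("hex");

  // Constant-time comparison to avoid timing side channels.
  const a = Buffer.from(expected, "hex");
  const b = Buffer.from(h1, "hex");
  return a.length === b.length && timingSafeEqual(a, b);
}
```

In a real handler you would call this with the raw (unparsed) request body before doing any JSON parsing, and reject the request if verification fails.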

The Paddle docs MCP server works with any AI client that supports MCP, including [Cursor](https://cursor.com), [VS Code](https://code.visualstudio.com), [Claude Code](https://claude.ai), [OpenAI Codex](https://openai.com/codex), and [Raycast](https://raycast.com).

You only need to install it once per AI tool. The server runs in the background and performs searches automatically with no extra instruction required. Most clients show a visible indicator when a search is in progress.
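In most clients, installation amounts to adding one entry to the client's MCP configuration file. The fragment below is a sketch of that shape only: the file name (`mcp.json`), the server key, and the URL are placeholders, not official values, so use the exact configuration from the installation guide linked under Next steps.

```json
{
  "mcpServers": {
    "paddle-docs": {
      "url": "https://example.com/paddle-docs-mcp"
    }
  }
}
```

Once the entry is saved, the client connects to the server automatically; there's nothing further to run.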

{% callout type="info" %}
On first use, you may be asked to authenticate with [Kapa.ai](https://kapa.ai) using your Google account. This confirms the request is legitimate and keeps the service reliable for everyone.
{% /callout %}

## Next steps

The Paddle docs MCP server is available now.

Installation takes a few minutes and varies slightly by AI client. For step-by-step instructions covering Claude Code, Cursor, VS Code, and other MCP-compatible tools, see [Give AI assistants up-to-date Paddle knowledge](https://developer.paddle.com/sdks/ai/docs-mcp.md).