Context7 MCP Server: Up-to-Date Docs for Your AI Coding Assistant (Setup Guide)
Nikhil Tiwari
TL;DR
- Context7 is an MCP server that fetches up-to-date, version-specific library documentation for AI assistants
- Helps reduce outdated code and hallucinated APIs by adding current docs to the model's context
- Add "use context7" to your prompt to trigger a documentation lookup
- Works with Cursor, Claude Code, VS Code, Windsurf, and other MCP clients
- Open source by Upstash — MIT license (GitHub)
A common frustration with AI coding assistants: you ask for code using a library, and the response is based on an outdated API or a function that no longer exists. The model's training data has a cut-off, and libraries move fast.
Context7 is an MCP server designed to help with this. It fetches up-to-date, version-specific documentation and code examples from indexed library sources and adds them to your AI assistant's context window. The idea is straightforward: give the model current docs so it has a better chance of generating correct code.
The Problem Context7 Solves
Without Context7:
- Code may rely on outdated training data
- Risk of hallucinated APIs that don't exist
- Answers may target the wrong package version
- Manual verification often needed

With Context7:
- Docs fetched from indexed library sources
- Version-specific examples when available
- Model has current API surface in context
- Triggered by adding "use context7" to your prompt
Quick Facts
| Detail | Info |
|---|---|
| GitHub | upstash/context7 |
| npm | @upstash/context7-mcp |
| Remote URL | https://mcp.context7.com/mcp |
| License | MIT |
| Built by | Upstash |
| API key | Free at context7.com/dashboard (higher rate limits) |
| Docs | context7.com/docs |
How It Works
Context7 exposes two MCP tools. When you include "use context7" in a prompt, the AI assistant calls them as part of its tool-use flow:
- resolve-library-id: Searches for a library by name (e.g., "nextjs", "react") and returns a Context7 library ID. Results are ranked by name match, description relevance, snippet count, and trust score.
- get-library-docs: Fetches the actual documentation and code examples for a resolved library ID. Returns version-specific content from the source.
Typical flow: your prompt mentions a library → the AI calls resolve-library-id → gets the library ID → calls get-library-docs → the returned documentation is added to context → the model uses it when generating a response.
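The two-step flow above can be sketched with a toy in-memory index. This is purely illustrative: the function names mirror the MCP tools, but the index entries, matching logic, and docs text are invented (the real tools query Context7's hosted index):

```python
# Toy sketch of Context7's two-step lookup. The entries and matching
# logic here are invented for illustration; the real tools query
# Context7's hosted index and rank results by name match, description
# relevance, snippet count, and trust score.
TOY_INDEX = {
    "/vercel/next.js": {
        "name": "nextjs",
        "docs": "Next.js middleware runs before a request is completed...",
    },
    "/facebook/react": {
        "name": "react",
        "docs": "useEffect fires after the component is committed...",
    },
}

def resolve_library_id(library_name: str) -> str:
    """Map a human-readable library name to a Context7-style library ID."""
    for library_id, meta in TOY_INDEX.items():
        if meta["name"] == library_name.lower():
            return library_id
    raise LookupError(f"{library_name!r} not found in index")

def get_library_docs(library_id: str) -> str:
    """Fetch the docs text for a resolved library ID."""
    return TOY_INDEX[library_id]["docs"]

library_id = resolve_library_id("nextjs")  # "/vercel/next.js"
docs = get_library_docs(library_id)        # the text added to the model's context
```

The split into two tools matters: resolution is fuzzy (a name can match several indexed libraries), while the docs fetch is exact, so the assistant can show or reconsider the chosen library ID between the two calls.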
Setup: Cursor
Edit your Cursor MCP config file at ~/.cursor/mcp.json:
Option A: Remote HTTP server (simpler, no local process)
{
"mcpServers": {
"context7": {
"url": "https://mcp.context7.com/mcp",
"headers": {
"CONTEXT7_API_KEY": "YOUR_API_KEY"
}
}
}
}
Option B: Local stdio server via npx
{
"mcpServers": {
"context7": {
"command": "npx",
"args": ["-y", "@upstash/context7-mcp"],
"env": {
"CONTEXT7_API_KEY": "YOUR_API_KEY"
}
}
}
}
An API key is optional but recommended for higher rate limits. Get one free at context7.com/dashboard. After saving the config, restart Cursor for the server to load.
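If you already have other MCP servers configured, editing the JSON by hand risks clobbering them. A small script can merge the Context7 entry in non-destructively. The helper below is hypothetical (not part of Context7); it writes the same remote-server entry shown in Option A:

```python
import json
import pathlib

def add_context7(config_path: str, api_key: str) -> dict:
    """Merge a Context7 entry into a Cursor-style mcp.json,
    preserving any servers that are already configured."""
    path = pathlib.Path(config_path).expanduser()
    config = json.loads(path.read_text()) if path.exists() else {}
    servers = config.setdefault("mcpServers", {})
    servers["context7"] = {
        "url": "https://mcp.context7.com/mcp",
        "headers": {"CONTEXT7_API_KEY": api_key},
    }
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(config, indent=2) + "\n")
    return config

# Example (uncomment to run against your real config):
# add_context7("~/.cursor/mcp.json", "YOUR_API_KEY")
```

Note that mcp.json is strict JSON: a trailing comma or a comment will typically cause the whole server config to be ignored, so round-tripping through json.dumps as above also doubles as a validity check.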
Setup: Claude Code
Option A: Remote HTTP server
claude mcp add --transport http \
--header "CONTEXT7_API_KEY: YOUR_API_KEY" \
context7 https://mcp.context7.com/mcp
Option B: Local stdio server via npx
claude mcp add --transport stdio \
--env CONTEXT7_API_KEY=YOUR_API_KEY \
context7 -- npx -y @upstash/context7-mcp
Verify the server is registered:
claude mcp list
You can also check status inside Claude Code with /mcp.
Setup: VS Code (Copilot Chat)
Add to .vscode/mcp.json in your project or your VS Code user settings:
{
"servers": {
"context7": {
"type": "http",
"url": "https://mcp.context7.com/mcp",
"headers": {
"CONTEXT7_API_KEY": "YOUR_API_KEY"
}
}
}
}
Setup: Claude Desktop
Claude Desktop supports stdio servers only. Add to your claude_desktop_config.json:
{
"mcpServers": {
"context7": {
"command": "npx",
"args": ["-y", "@upstash/context7-mcp"],
"env": {
"CONTEXT7_API_KEY": "YOUR_API_KEY"
}
}
}
}
Restart Claude Desktop after saving. The hammer icon in the text input indicates tools are loaded.
Usage
Once installed, just add "use context7" to any prompt:
# Examples
"Create a Next.js middleware that checks for a valid JWT. use context7"
"Set up a Hono API route with Zod validation. use context7"
"Configure a Cloudflare Worker to cache API responses for 5 minutes. use context7"
"Write a React Server Component that fetches data with Suspense. use context7"
What happens behind the scenes:
- The AI identifies the library mentioned in your prompt
- It calls resolve-library-id to find the library in Context7's index
- It calls get-library-docs to fetch relevant documentation
- The returned docs are added to context, and the model uses them when generating its response
Advanced Features
Private Repositories
Paid plans support private repositories. You can add them through the Context7 dashboard so the server can index and serve documentation for internal libraries.
Library Verification
Library owners can claim and verify their libraries on Context7. Verified libraries are flagged with a higher trust score in search results.
Direct API Usage
Context7 also has a REST API for programmatic access:
# Search for a library
curl -H "Authorization: Bearer YOUR_API_KEY" \
"https://context7.com/api/v2/libs/search?query=nextjs"
# Get documentation
curl -H "Authorization: Bearer YOUR_API_KEY" \
"https://context7.com/api/v2/context?libraryId=/vercel/next.js"
TypeScript SDK
For building custom integrations, Context7 provides a TypeScript SDK and a CLI tool. See the API Guide for details.
Why Use Context7 Over Just Searching Docs?
| Approach | Pros | Cons |
|---|---|---|
| LLM training data | No setup needed | Has a knowledge cut-off; may hallucinate or use old APIs |
| Manual doc search | Authoritative, current | Slow, requires context-switching |
| Custom RAG pipeline | Fully customizable, works with any source | Significant setup and maintenance effort |
| Context7 MCP | Quick setup, docs fetched on demand | Library must be in Context7's index; quality varies by library |
Supported Clients
Context7 should work with any MCP-compatible client. Cursor, Claude Code, Claude Desktop, VS Code (Copilot Chat), and Windsurf have documented setup guides on context7.com.
Test MCP Servers in Your Browser
Try Context7 and other remote MCP servers with MCP Playground
Open MCP Playground →
Related Content
- What Is the Model Context Protocol (MCP)?
- Claude Code MCP: Best Servers Setup Guide
- Awesome MCP Servers List
- Build Your First MCP Server with Python and FastMCP
Frequently Asked Questions
Is Context7 free?
What libraries does Context7 support?
Should I use the remote server or local npx?
The remote server (https://mcp.context7.com/mcp) is simpler — no local process to manage. The local npx option (@upstash/context7-mcp) runs on your machine, which may be preferable behind corporate firewalls or for lower latency. Both connect to the same Context7 backend for documentation data.
Does "use context7" work automatically or do I have to type it every time?
Can I use Context7 with ChatGPT?
Written by Nikhil Tiwari
15+ years in product development. AI enthusiast building developer tools that make complex technologies accessible to everyone.