Handles 10M+ token contexts with chunking, sub-queries, and local Ollama inference.
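The chunking-plus-sub-query approach named above can be sketched as a map-reduce over the context: split the long input into overlapping chunks, ask the model the question against each chunk, then combine the partial answers. This is an illustrative sketch only; the function names, chunk sizes, and the `ask` callback are assumptions, not this server's actual API.

```python
# Illustrative sketch of chunked sub-query inference over a long context.
# All names here are hypothetical; they do not mirror massive-context-mcp's code.
from typing import Callable, List

def chunk_text(text: str, chunk_size: int = 2000, overlap: int = 200) -> List[str]:
    """Split a long context into overlapping chunks so no boundary loses information."""
    if chunk_size <= overlap:
        raise ValueError("chunk_size must exceed overlap")
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(text), step):
        chunks.append(text[start:start + chunk_size])
        if start + chunk_size >= len(text):
            break
    return chunks

def answer_over_chunks(question: str, context: str, ask: Callable[[str], str]) -> str:
    """Map the question over each chunk, then reduce the partial answers.
    In practice `ask(prompt) -> str` would call a local Ollama model."""
    partials = [ask(f"Context:\n{c}\n\nQuestion: {question}")
                for c in chunk_text(context)]
    return ask("Combine these partial answers:\n" + "\n".join(partials))
```

The overlap keeps sentences that straddle a chunk boundary visible to at least one sub-query; the final reduce call trades one extra model invocation for a coherent merged answer.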
io.github.egoughnour/massive-context-mcp
https://github.com/egoughnour/massive-context-mcp
STDIO
No auth required
Hosted endpoint: paste into any MCP client.
Configuration this server reads at startup.
Directory for storing context data
URL for Ollama server (default: http://localhost:11434)
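A typical MCP client entry for a STDIO server with these two settings might look like the following. The launch command, package runner, and the environment-variable names (`CONTEXT_DIR`, `OLLAMA_URL`) are assumptions for illustration; check the project's README for the actual names.

```json
{
  "mcpServers": {
    "massive-context": {
      "command": "npx",
      "args": ["-y", "massive-context-mcp"],
      "env": {
        "CONTEXT_DIR": "/path/to/context-data",
        "OLLAMA_URL": "http://localhost:11434"
      }
    }
  }
}
```

If Ollama runs on its default port, `OLLAMA_URL` can usually be omitted, matching the default noted above.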
Where to find authoritative docs and source for massive-context-mcp.