MCP Server · STDIO · Official · v3.0.1

massive-context-mcp MCP Server

Handles 10M+ token contexts with chunking, sub-queries, and local Ollama inference.
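The chunk-and-sub-query pattern described above can be sketched as: split the oversized context into overlapping chunks, run the query against each chunk with the local model, then combine the answers. A minimal, hypothetical illustration of the chunking step follows; the chunk size and overlap values are assumptions for illustration, not this server's actual parameters:

```python
def chunk_text(text: str, chunk_size: int = 4000, overlap: int = 200) -> list[str]:
    """Split text into overlapping windows so content near a chunk
    boundary is never cut off from its surrounding context."""
    if chunk_size <= overlap:
        raise ValueError("chunk_size must exceed overlap")
    chunks = []
    step = chunk_size - overlap  # each window starts `step` chars after the last
    for start in range(0, len(text), step):
        chunks.append(text[start:start + chunk_size])
        if start + chunk_size >= len(text):
            break  # last window already reaches the end of the text
    return chunks
```

Each chunk would then be answered independently (the "sub-queries") and the partial answers merged by a final inference pass.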

io.github.egoughnour/massive-context-mcp

Repository

https://github.com/egoughnour/massive-context-mcp

Transport

STDIO

Auth

No auth required

Connect to massive-context-mcp

This server uses the STDIO transport, so MCP clients launch it as a local process rather than connecting to a hosted endpoint; install it from the repository below.

https://github.com/egoughnour/massive-context-mcp
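A STDIO server is registered in a client's configuration file with a launch command and environment variables. A minimal sketch of such an entry follows; the command name `massive-context-mcp` and the data-directory path are assumptions for illustration, not documented values:

```json
{
  "mcpServers": {
    "massive-context-mcp": {
      "command": "massive-context-mcp",
      "env": {
        "RLM_DATA_DIR": "/path/to/context-data",
        "OLLAMA_URL": "http://localhost:11434"
      }
    }
  }
}
```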

Environment variables

Configuration this server reads at startup.

  • RLM_DATA_DIR

    Directory for storing context data

  • OLLAMA_URL

    URL for Ollama server (default: http://localhost:11434)

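The two variables above can be set in the shell before launching the server. A hedged setup sketch follows; the data-directory path is a placeholder, and the only documented default is the Ollama URL `http://localhost:11434`:

```shell
# Export the two variables the server reads at startup.
# RLM_DATA_DIR path here is a placeholder, not a project default.
export RLM_DATA_DIR="${HOME}/.massive-context-mcp"
export OLLAMA_URL="http://localhost:11434"

# Create the data directory if it does not exist yet.
mkdir -p "$RLM_DATA_DIR"

# Optional: check that the Ollama server is reachable before starting.
curl -sf "$OLLAMA_URL/api/tags" > /dev/null \
  && echo "Ollama reachable at $OLLAMA_URL" \
  || echo "Ollama not reachable; start it with 'ollama serve'"
```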

Resources

Where to find authoritative docs and source for massive-context-mcp:

  • https://github.com/egoughnour/massive-context-mcp
