A stateless MCP server that proxies research queries to the Gemini CLI, reducing the calling agent's context and model usage
io.github.capyBearista/gemini-researcher
https://github.com/capyBearista/gemini-researcher
STDIO
No auth required
Configuration this server reads at startup.
Gemini API key (optional if you already authenticated Gemini CLI via "gemini" login)
Override the project root directory used for path validation (defaults to current working directory)
Size threshold, in KB, above which large responses are split into chunks (default: 10)
Chunk cache TTL in milliseconds (default: 3600000 / 1 hour)
Enable debug logging (set to "true" or "1")
Vertex AI / Google auth: path to a service account JSON credentials file
Vertex AI / Google auth: GCP project ID (used by some auth configurations)
Vertex AI / Google auth: Vertex AI project identifier (used by some auth configurations)
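The settings above are read from environment variables, which an MCP client passes to the server at launch. A minimal sketch of a client configuration entry, assuming the server is started with `npx` and using illustrative variable names and a placeholder package name (the exact names and launch command are documented in the repository linked below, not confirmed here):

```json
{
  "mcpServers": {
    "gemini-researcher": {
      "command": "npx",
      "args": ["-y", "gemini-researcher"],
      "env": {
        "GEMINI_API_KEY": "<your key; variable name is an assumption>",
        "DEBUG": "<true to enable debug logging; name is an assumption>"
      }
    }
  }
}
```

Because the transport is STDIO, the client spawns the server process itself; no URL or hosted endpoint is involved, and the API key can be omitted if the Gemini CLI is already authenticated on the machine.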
Where to find authoritative docs and source for gemini-researcher.