MCP Server · STDIO · Official · v1.0.2

gemini-researcher MCP Server

Stateless MCP server that proxies research queries to the Gemini CLI, reducing the calling agent's context and model usage

io.github.capyBearista/gemini-researcher

Hosted URL

https://github.com/capyBearista/gemini-researcher

Transport

STDIO

Auth

No auth required

Connect to gemini-researcher

Repository URL (this server runs locally over the STDIO transport):

https://github.com/capyBearista/gemini-researcher

Environment variables

Configuration this server reads at startup.

  • GEMINI_API_KEY (Secret)

    Gemini API key (optional if you have already authenticated the Gemini CLI via "gemini" login)

  • PROJECT_ROOT

    Override the project root directory used for path validation (defaults to current working directory)

  • RESPONSE_CHUNK_SIZE_KB

    Chunk size threshold (KB) for large responses (default: 10)

  • CACHE_TTL_MS

    Chunk cache TTL in milliseconds (default: 3600000 / 1 hour)

  • DEBUG

    Enable debug logging (set to "true" or "1")

  • GOOGLE_APPLICATION_CREDENTIALS

    Vertex AI / Google auth: path to a service account JSON credentials file

  • GOOGLE_CLOUD_PROJECT

    Vertex AI / Google auth: GCP project ID (used by some auth configurations)

  • VERTEX_AI_PROJECT

    Vertex AI / Google auth: Vertex AI project identifier (used by some auth configurations)
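As an illustration, a typical MCP client configuration for a STDIO server sets these variables in its `env` block. This is a hedged sketch: the launch command, file path, and key value below are placeholders, not taken from this page or the repository.

```json
{
  "mcpServers": {
    "gemini-researcher": {
      "command": "node",
      "args": ["/path/to/gemini-researcher/dist/index.js"],
      "env": {
        "GEMINI_API_KEY": "your-api-key",
        "RESPONSE_CHUNK_SIZE_KB": "10",
        "CACHE_TTL_MS": "3600000",
        "DEBUG": "false"
      }
    }
  }
}
```

If you have already authenticated the Gemini CLI locally, `GEMINI_API_KEY` can be omitted; likewise, the Vertex AI variables are only needed for Google Cloud auth configurations.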

Resources

Where to find authoritative docs and source for gemini-researcher.

