Talk to your BigQuery warehouse without writing SQL. The hosted Google MCP endpoint exposes datasets, tables, schemas, the query API and the dry-run cost estimator as MCP tools — this template wires them up with a system prompt tuned for cost-aware analytics. Great for "how many DAU last quarter, broken down by plan tier?"-style questions where you want the answer plus the SQL plus the byte estimate.
Default model
Claude Sonnet 4.5
MCP servers
bigquery.googleapis.com
Auth
gcloud access token + GCP project ID (run `gcloud auth print-access-token`)
A few things this template does well out of the box.
Three steps to go from template to a live chat.
Click "Use this template"
Agent Studio opens with the MCP server, model and system prompt pre-filled.
Add your access token
Paste the token from `gcloud auth print-access-token` into the auth field, along with your GCP project ID.
Start chatting
Ask a question, watch live tool calls and switch models at any time to compare answers.
The endpoints this template connects to by default. You can swap any of them in Agent Studio.
https://bigquery.googleapis.com/mcp
bigquery.googleapis.com
A quick walkthrough for the credential this template needs.
Copy one into the studio to see the agent in action.
List the top 5 most expensive queries from yesterday in this project. How much did each one scan?
Schema for `my_dataset.events`? Sample 5 rows so I can see the shape.
Daily active users for the last 30 days, grouped by plan tier. Run it but show me the cost estimate first.
Find tables in `analytics_*` datasets that have not had a partition load in 48 hours.
Convert this question into BigQuery SQL: "What's the 7-day retention curve for users who signed up in March?" Don't run it, just show me the SQL.
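The "show me the cost estimate first" prompts lean on BigQuery's dry-run API, which returns the bytes a query would scan. Turning that into dollars is simple arithmetic — a sketch assuming the published US on-demand rate of $6.25 per TiB and the 10 MiB minimum billed per query (verify both against your region's pricing page):

```python
# Assumed figures: US on-demand rate and per-query minimum at the time
# of writing -- check your region's BigQuery pricing before relying on them.
USD_PER_TIB = 6.25
MIN_BILLED_BYTES = 10 * 1024**2  # 10 MiB minimum billed per query

def dry_run_cost_usd(bytes_processed: int) -> float:
    """Estimate on-demand cost from a dry-run bytes-scanned figure."""
    billed = max(bytes_processed, MIN_BILLED_BYTES)
    return billed / 1024**4 * USD_PER_TIB

print(f"${dry_run_cost_usd(1_500_000_000_000):.4f}")  # ~1.36 TiB scanned
```

This is why the system prompt below insists on surfacing the estimate before running anything above ~1GB: the scan size, not the row count returned, is what you pay for.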
The default instructions the model starts with. Edit it any time inside Agent Studio.
You are a BigQuery analytics assistant connected via Google's hosted MCP server. Use the available tools to:

- Discover datasets, tables and views in the connected project
- Inspect table schemas and sample rows before writing queries
- Translate natural-language questions into well-formed BigQuery SQL (Standard SQL dialect)
- Run queries — but always run a dry-run first to surface the bytes-scanned estimate
- Surface stale partitions, slow queries, and obvious cost problems

Cost discipline (this is non-negotiable in BigQuery):

- Never run an unconstrained query against a large table — always add a partition filter or LIMIT
- Always show the dry-run cost estimate before running, especially for queries above ~1GB scanned
- Prefer SELECT specific columns over SELECT * — flag whenever a user asks for SELECT *
- For repeated investigations, suggest a materialised view or a scheduled query

When you write SQL, show it in a code block before running it. Explain non-obvious choices (CTEs, window functions, partition pruning) in one sentence each.
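The "never run an unconstrained query" rule can also be enforced mechanically on your side before a query ever reaches the tool call. A naive sketch — plain string matching, not a SQL parser, and the partition column names are illustrative assumptions:

```python
import re

# Illustrative partition columns; a real check should parse the SQL and
# consult the table's actual partitioning spec via the schema tools.
PARTITION_COLUMNS = ("_PARTITIONDATE", "_PARTITIONTIME", "event_date")

def looks_constrained(sql: str) -> bool:
    """Naive guardrail: True if the query has a LIMIT or appears to
    filter on a known partition column. String matching only."""
    s = sql.upper()
    if re.search(r"\bLIMIT\s+\d+", s):
        return True
    return any(col.upper() in s for col in PARTITION_COLUMNS)

print(looks_constrained("SELECT * FROM big.events"))            # False: flag it
print(looks_constrained("SELECT id FROM big.events LIMIT 10"))  # True
```

Note that `LIMIT` alone does not reduce bytes scanned in BigQuery (the full table is still read unless partitions or clustering prune it), so treat this as a prompt-level nudge, not a cost cap — the dry-run estimate remains the source of truth.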
Open Agent Studio with this template pre-loaded. Add your token, pick any model, and start chatting.
Use this template

Stripe Billing Assistant
Query customers, subscriptions, invoices and payment events without leaving your chat.
View template →

Revenue Ops · Stripe, Linear & Slack
Correlate Stripe billing events with Linear bug reports and alert the team on Slack when revenue metrics change.
View template →

Neon Data Ops
Manage your Neon Postgres database and stream query results and alerts directly to your Slack workspace.
View template →