
BigQuery Agent

Talk to your BigQuery warehouse without writing SQL. The hosted Google MCP endpoint exposes datasets, tables, schemas, the query API and the dry-run cost estimator as MCP tools — this template wires them up with a system prompt tuned for cost-aware analytics. Great for "how many DAU last quarter, broken down by plan tier?"-style questions where you want the answer plus the SQL plus the byte estimate.

Default model: Claude Sonnet 4.5 · 1 MCP server · Access token required

Default model

Claude Sonnet 4.5

MCP servers

bigquery.googleapis.com

Auth

gcloud access token + GCP project ID (run `gcloud auth print-access-token`)

What you can do

A few things this template does well out of the box.

  • List datasets and tables in a project, with schemas and row counts
  • Translate natural-language questions into BigQuery SQL — with a dry-run cost estimate before execution
  • Audit data freshness — find stale tables, missing partitions, or scheduled loads that have not landed
  • Compare metrics across regions, time periods, or dimensions without context-switching to the BQ console
  • Generate Looker-ready SQL from a chat brief, then paste it into a Looker tile or scheduled query
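The dry-run estimate the agent surfaces is just a bytes-scanned figure; turning it into dollars is simple arithmetic. A minimal sketch, assuming BigQuery's US multi-region on-demand rate of $6.25 per TiB — check the pricing page for your region, and note that flat-rate/editions billing works differently:

```python
# Convert a dry-run bytes-scanned figure into a rough on-demand cost.
# The $6.25/TiB rate is an assumption (US multi-region, on-demand);
# substitute your region's rate from the BigQuery pricing page.
ON_DEMAND_USD_PER_TIB = 6.25
TIB = 2 ** 40  # on-demand queries are billed per tebibyte scanned

def estimate_query_cost_usd(bytes_scanned: int) -> float:
    """Rough cost of an on-demand query given its dry-run byte estimate."""
    return bytes_scanned / TIB * ON_DEMAND_USD_PER_TIB

# e.g. a dry run reporting 250 GiB scanned:
print(f"${estimate_query_cost_usd(250 * 2**30):.2f}")  # prints $1.53
```

Keep in mind the first 1 TiB scanned per month is typically free on on-demand pricing, so small exploratory queries may cost nothing at all.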

How it works

Three steps to go from template to a live chat.

1

Click "Use this template"

Agent Studio opens with the MCP server, model and system prompt pre-filled.

2

Add your access token

Paste in a gcloud access token and your GCP project ID (run `gcloud auth print-access-token` to get the token).

3

Start chatting

Ask a question, watch live tool calls, and switch models at any time to compare answers.

MCP servers used

The endpoints this template connects to by default. You can swap any of them in Agent Studio.

https://bigquery.googleapis.com/mcp

bigquery.googleapis.com

HTTP
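Under the hood, the dry-run tool boils down to a `jobs.query` request with `dryRun: true` against the standard BigQuery REST API (a separate endpoint from the MCP URL above). A stdlib-only sketch — the helper names are hypothetical, and error handling is omitted:

```python
import json
import urllib.request

def build_dry_run_request(project_id: str, sql: str, token: str):
    """Build the (url, headers, body) for a BigQuery dry-run query.

    jobs.query with dryRun set returns a totalBytesProcessed estimate
    without actually executing the query or incurring cost.
    """
    url = (
        "https://bigquery.googleapis.com/bigquery/v2/"
        f"projects/{project_id}/queries"
    )
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    body = json.dumps(
        {"query": sql, "useLegacySql": False, "dryRun": True}
    ).encode()
    return url, headers, body

def estimate_bytes_scanned(project_id: str, sql: str, token: str) -> int:
    """POST the dry run and return the bytes-scanned estimate."""
    url, headers, body = build_dry_run_request(project_id, sql, token)
    req = urllib.request.Request(url, data=body, headers=headers, method="POST")
    with urllib.request.urlopen(req) as resp:
        return int(json.load(resp)["totalBytesProcessed"])
```

You never need to call this yourself — the MCP tools wrap it — but it is useful for sanity-checking what the agent reports.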

Getting your access token

A quick walkthrough for the credential this template needs.

  1. **Install gcloud** (if needed): `brew install --cask google-cloud-sdk` on macOS, or follow [cloud.google.com/sdk/install](https://cloud.google.com/sdk/install).
  2. **Authenticate**: `gcloud auth login`, then `gcloud config set project YOUR_PROJECT_ID`.
  3. **Get an access token**: `gcloud auth print-access-token` — copy the output and paste it into the **Access Token** field. (Heads up: tokens expire after ~1 hour — re-run the command when the agent starts returning 401s.)
  4. **Project ID**: paste your GCP project ID (the project that should be billed for queries) into the **GCP Project ID** field.
  5. **Permissions**: your account needs `BigQuery Data Viewer` for read-only access and `BigQuery User` to run queries. Grant via IAM or a custom role.
  6. **Verify**: send *"List the datasets in this project"* to confirm the connection.
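Because the token expires roughly hourly, it can be handier to fetch a fresh one programmatically than to copy-paste from a terminal. A minimal sketch — `get_access_token` is a hypothetical helper that just shells out to the same gcloud command as step 3:

```python
import subprocess

def get_access_token(cmd=("gcloud", "auth", "print-access-token")) -> str:
    """Return a fresh access token by running the gcloud CLI.

    The cmd parameter exists only so the helper can be exercised
    without gcloud installed; in real use, keep the default.
    """
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return result.stdout.strip()

# Typical use: refresh the token whenever the agent starts returning 401s.
# token = get_access_token()
```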

Try these prompts

Copy one into the studio to see the agent in action.

  • List the top 5 most expensive queries from yesterday in this project. How much did each one scan?

  • Schema for `my_dataset.events`? Sample 5 rows so I can see the shape.

  • Daily active users for the last 30 days, grouped by plan tier. Run it but show me the cost estimate first.

  • Find tables in `analytics_*` datasets that have not had a partition load in 48 hours.

  • Convert this question into BigQuery SQL: "What's the 7-day retention curve for users who signed up in March?" Don't run it, just show me the SQL.

System prompt

The default instructions the model starts with. Edit it any time inside Agent Studio.

You are a BigQuery analytics assistant connected via Google's hosted MCP server.

Use the available tools to:
- Discover datasets, tables and views in the connected project
- Inspect table schemas and sample rows before writing queries
- Translate natural-language questions into well-formed BigQuery SQL (Standard SQL dialect)
- Run queries — but always run a dry-run first to surface the bytes-scanned estimate
- Surface stale partitions, slow queries, and obvious cost problems

Cost discipline (this is non-negotiable in BigQuery):
- Never run an unconstrained query against a large table — always add a partition filter or LIMIT
- Always show the dry-run cost estimate before running, especially for queries above ~1GB scanned
- Prefer SELECT specific columns over SELECT * — flag whenever a user asks for SELECT *
- For repeated investigations, suggest a materialised view or a scheduled query

When you write SQL, show it in a code block before running it. Explain non-obvious choices (CTEs, window functions, partition pruning) in one sentence each.

Ready to try the BigQuery Agent?

Open Agent Studio with this template pre-loaded. Add your token, pick any model, and start chatting.

Use this template

Related templates