The official Figma MCP server, available through Figma's Dev Mode, lets AI models read your design files — frames, components, styles, variables and exported assets — directly through the Model Context Protocol. Ask Claude to describe a component, extract design tokens, or summarise a screen's layout without leaving your conversation.
Bring your own
Claude Sonnet 4.5
MCP Playground runs 30+ models on the same workflow: switch anytime, or use Compare mode to run several in parallel and balance quality vs. cost.
Requires a Figma personal access token (create one at figma.com/settings) and the Figma desktop app running; the server connects locally through Dev Mode.
How models use it and what it is built for.
Figma's MCP integration connects through the Figma desktop app, exposing your open design files as structured MCP tool results. Models can read frame hierarchies, inspect component properties and variants, extract colour and typography tokens, list auto-layout constraints, and retrieve asset export URLs. It is designed for developer handoff workflows: instead of a designer writing specs manually, an AI agent can read the Figma file and answer "what font size does the H2 use?" or "list all the components in the Card library." Requires the Figma desktop app and a personal access token.
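Because the bridge is the local desktop app, MCP clients point at a localhost endpoint rather than a cloud URL. A sketch of what that looks like in a client that takes a JSON server configuration — treat the port and path as assumptions, since the exact endpoint varies by Figma desktop app version:

```json
{
  "mcpServers": {
    "figma": {
      "url": "http://127.0.0.1:3845/sse"
    }
  }
}
```

If the connection fails, confirm the desktop app is running with Dev Mode's MCP server enabled and check which local address it reports.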
Typical tools an AI model can call. Exact names vary by version.
Copy any of these into MCP Agent Studio after connecting.
List all the colour styles in this file and their hex values.
Describe the layout and spacing of the "Card — Product" component.
What font families and sizes are used across the design system?
Export the "Hero" frame as an SVG and tell me its dimensions.
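The colour-style prompt above surfaces Figma's raw colour values, which the Figma API represents as per-channel floats in the 0..1 range rather than hex strings. A minimal sketch of the conversion your own glue code might do when post-processing tool results — `figma_rgb_to_hex` is a hypothetical helper name, not part of the server:

```python
def figma_rgb_to_hex(r: float, g: float, b: float) -> str:
    """Convert Figma's per-channel 0..1 floats to a #RRGGBB hex string."""
    def to_byte(channel: float) -> int:
        # Clamp to the valid range, scale to 0..255, round to nearest integer.
        return round(max(0.0, min(1.0, channel)) * 255)
    return "#{:02X}{:02X}{:02X}".format(to_byte(r), to_byte(g), to_byte(b))

print(figma_rgb_to_hex(1.0, 0.6, 0.0))  # "#FF9900"
```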
This is not a single-model product: you get the same MCP connection with 30+ models (Claude, GPT, Gemini, DeepSeek, open-weight, and more), you can switch mid-conversation, and you can open Compare mode to run the same prompt against multiple models at once. The card above is a suggested starting point for this server — not the only choice.
Default pick for Figma
Claude Sonnet 4.5
Claude Sonnet 4.5 excels at structured design-token extraction and component description. Use Claude Opus 4.5 for complex multi-screen layout reasoning.
Open MCP Agent Studio with the connection pre-filled. Add your token, pick any of 30+ models, and start chatting — no install required.
Open Agent Studio
Common questions about connecting, scoping and using it safely.
It is an official Figma integration that exposes your open design files as MCP tools. AI models can read frame layouts, component specs, design tokens and exported assets through a natural language conversation — no more copy-pasting specs from Dev Mode.
Dev Mode (where the MCP runs) is available on Figma Professional, Organisation and Enterprise plans. It is not available on the free Starter plan.
Yes. The server connects through the Figma desktop application running locally. It is not a cloud-hosted endpoint — the desktop app acts as the bridge between the MCP client and the Figma API.
It can read file structure, inspect components and variants, extract design tokens, describe layouts, and retrieve exported images. It cannot yet create or edit Figma files — it is a read-oriented integration focused on developer handoff.
The server uses a personal access token scoped to your own account. Traffic goes through the local desktop app — nothing is sent to a third-party server. Treat the token like a password and revoke it if compromised at figma.com/settings.
GitHub
Drive GitHub repos, PRs and issues with an AI agent.
Notion
Search, summarise and edit your Notion workspace with AI.
Vercel
List deployments, read logs, manage env keys and roll back from natural language.
Playwright
Give AI models real browser control — navigate, click, fill forms and screenshot any page.