MCP Server for Snowflake: Options, Tools & Setup

Emily Winks
Data Governance Expert
Updated: 05/12/2026 | Published: 05/12/2026
9 min read

Key takeaways

  • Snowflake offers two MCP server options: a Snowflake-managed server, and a self-managed Labs server for full control.
  • The Snowflake-managed server supports Cortex tools, direct SQL, and generic user-defined tools.
  • The self-managed Labs server adds custom Python tools, governed SQL, and semantic view support beyond the managed option.
  • Both Snowflake MCP options stop at Snowflake's boundary. Cross-stack context requires a separate enterprise context layer.

What is the MCP server for Snowflake?

The MCP server for Snowflake is an interface that exposes Snowflake databases, schema metadata, and Cortex AI capabilities to external AI agents and tools through the Model Context Protocol. Snowflake released its MCP server in preview in October 2025 and went GA (generally available) in November. Two deployment options exist: a Snowflake-managed server and a self-hosted Snowflake Labs server, each with different trade-offs for infrastructure control and authentication.

Two deployment options:

  • Snowflake-managed MCP server: Snowflake handles infrastructure and authentication while you define tools and per-tool permissions.
  • Self-managed Snowflake Labs MCP server: You control infrastructure, authentication, and authorization end to end.


Why do you need an MCP server for Snowflake?


AI agents are showing up across various workflows in Snowflake, such as:

  • Writing data and ML pipelines.
  • Observing and improving quality.
  • Enforcing governance.
  • Speeding up the path to curated data.
  • Helping you create Streamlit-based reporting and visualization apps.

AI agents work on context, which they can’t get from traditional APIs in a flexible, extensible manner.

What they need is a single protocol for context, tool discovery and invocation, and governance across all the systems connected to your data warehouse. That’s why Snowflake released its MCP server last year.
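To make that protocol concrete, here is a sketch of the JSON-RPC 2.0 messages an MCP client exchanges with a server for tool discovery and invocation. The method names follow the MCP specification; the tool name and query are illustrative:

```python
import json

# MCP is JSON-RPC 2.0 under the hood: a client first lists a server's tools,
# then calls one by name. The tool name and arguments below are illustrative.
list_tools = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

call_tool = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "SYSTEM_EXECUTE_SQL",
        "arguments": {"query": "SELECT CURRENT_VERSION()"},
    },
}

# The client serializes this and sends it over the transport (stdio or HTTP).
print(json.dumps(call_tool, indent=2))
```

The same two methods work against any MCP server, which is exactly what makes the protocol a single interface for context and tool invocation.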


How can you use Snowflake’s managed MCP server with AI agents?


The Snowflake-managed server is like Cortex, Snowpipe, and Openflow in that its infrastructure is managed by Snowflake. The MCP server is an object scoped to a database and a schema. Once you create it, it exposes the tools you define in its YAML specification.

Overall, there are three categories of tools supported by Snowflake:

  • Cortex tools: Provide access to text-to-SQL, semantic search, and agent orchestration via Cortex Analyst, Cortex Search, and Cortex Agents.

  • Direct SQL: Executes SQL via the SYSTEM_EXECUTE_SQL tool, which supports an optional read_only flag to restrict execution to SELECT-only queries.

  • Generic: Wraps a user-defined function or a stored procedure with custom logic as an MCP tool.

You can create a server by running the following command:

CREATE OR REPLACE MCP SERVER <database_name>.<schema_name>.<mcp_server_name>
FROM SPECIFICATION $$
<YAML spec>
$$;
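As an illustration, a minimal specification might declare one tool per category. The field names and tool types below are assumptions based on the categories described above, not a verified schema, so consult Snowflake's documentation for the exact format:

```yaml
# Illustrative spec only: field names and tool types are assumptions.
tools:
  - name: "query-runner"
    type: "SYSTEM_EXECUTE_SQL"   # direct SQL execution
    read_only: true              # restrict to SELECT-only queries
  - name: "kb-search"
    type: "CORTEX_SEARCH"        # Cortex semantic search
    identifier: "mydb.myschema.kb_search_service"
```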

Once you create the MCP server, you can grant usage to specific roles and services to enable invocations. This type of server is best suited for agents that don’t need to go outside Snowflake.
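For example, granting a role access might look like the statements below. The object names are placeholders, and you should verify the exact GRANT syntax for MCP server objects in Snowflake's documentation:

```sql
-- Illustrative grants; all names are placeholders.
GRANT USAGE ON MCP SERVER mydb.myschema.my_mcp_server TO ROLE agent_role;
-- Tools that wrap UDFs or stored procedures need their own privileges too:
GRANT USAGE ON PROCEDURE mydb.myschema.my_tool_proc(VARCHAR) TO ROLE agent_role;
```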

Also note that this server is deliberately scoped to tool execution only; it does not implement MCP's resources, prompts, roots, or sampling capabilities. The self-hosted option built by Snowflake Labs offers a much broader implementation.


How can you use the self-hosted MCP server built by Snowflake Labs?


The self-hosted MCP server by Snowflake Labs is based on FastMCP, the framework most production Python MCP servers are built on. You can use this Snowflake Labs implementation by running the following command:

uvx snowflake-labs-mcp --service-config-file config.yaml

The server's tools are configured at startup: you declare them in the YAML service-config file, which looks something like this:

search_services:
  - service_name: "company_knowledgebase_search"
    description: "To support company-wide semantic and ontological search."
    database_name: "search"
    schema_name: "kb_search"

other_services:
  query_manager: true
  semantic_manager: true
  object_manager: false

sql_statement_permissions:
  Select: true
  Insert: true
  Update: true
  Delete: false
  Drop: false
  Unknown: false
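The sql_statement_permissions block gates AI-generated SQL by statement type. The Labs server classifies statements by parsing them with sqlglot; the stdlib-only sketch below mimics the same gating idea with a naive first-keyword check (everything here is illustrative, not the server's internals):

```python
# Stdlib-only sketch of per-statement-type SQL gating, mirroring the
# sql_statement_permissions config above. The real Labs server classifies
# statements with sqlglot; this naive version checks the first keyword only.
PERMISSIONS = {
    "select": True,
    "insert": True,
    "update": True,
    "delete": False,
    "drop": False,
}

def is_allowed(sql: str) -> bool:
    words = sql.strip().split()
    keyword = words[0].lower() if words else "unknown"
    # Unclassifiable statements fall into the "unknown" bucket and are
    # denied by default, the equivalent of `Unknown: false` in the config.
    return PERMISSIONS.get(keyword, False)

print(is_allowed("SELECT * FROM orders"))  # True
print(is_allowed("DROP TABLE orders"))     # False
```

A real parser matters here because a keyword check is trivially fooled (for example, by comments or CTEs), which is why the Labs server leans on sqlglot.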

This MCP server type supports all the same Cortex services as the Snowflake-managed MCP server. In addition, it supports:

  • Custom Python-defined tools: Probably the most important addition, because it lets you fork the server and define custom tools using FastMCP’s @mcp.tool() decorator. The custom tool then runs inside the server process.

  • Object management: While it is possible to manage objects on the managed server through SYSTEM_EXECUTE_SQL or by wrapping a stored procedure with the GENERIC tool option, the self-managed server allows you to do full object management, as shown in the YAML file example above.

  • Governed SQL: Uses the query_manager tool to run AI-generated SQL queries governed by the SQL permissions defined per-expression-type and parsed by sqlglot. This is somewhat possible with the Snowflake-managed version, but without the granular permissions.

  • Semantic View tools: Allow you to work with Semantic Views via the semantic_manager, which helps with understanding organizational metrics and entities/dimensions.

This server also uses FastMCP transports such as stdio and streamable-http, making it ideal for both local and remote deployments. Using this MCP server makes the most sense when you are writing custom Python-based tools, and especially when you are doing local agent development.
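To illustrate the custom-tool pattern without pulling in FastMCP itself, here is a toy, stdlib-only registry that mimics the shape of @mcp.tool() registration. Every name in it is illustrative; in the real server, FastMCP handles registration and exposes the tool over MCP:

```python
from typing import Callable

# Toy stand-in for FastMCP's tool registry: @mcp.tool() similarly registers
# a function under its name so MCP clients can discover and invoke it.
TOOLS: dict[str, Callable] = {}

def tool(fn: Callable) -> Callable:
    """Register fn as a callable tool, keyed by its function name."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def count_rows(table: str) -> str:
    # A real custom tool would execute SQL against Snowflake inside the
    # server process; here we just return the statement it would run.
    return f"SELECT COUNT(*) FROM {table}"

# A client discovers the tool via tools/list, then invokes it by name:
print(TOOLS["count_rows"]("sales.orders"))  # SELECT COUNT(*) FROM sales.orders
```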

However, none of these options alone can work well for an organization with a data and context ecosystem that includes tools beyond Snowflake. Not having access to all those tools means not having access to all the context.


Why should you extend Snowflake-only MCP servers for the rest of your data and AI estate?


Data stacks in large organizations and enterprises are not limited to a single tool or platform. They include various tools for orchestration, ETL/ELT, visualization, and ML workflows, among others. The business logic is spread across these tools, so the context is as well.

For AI agents to operate effectively and accurately, they need full, relevant context about the organization. That’s something these two MCP server options can’t provide on their own, since both are limited to Snowflake. And while the self-hosted option lets you define custom MCP tools, it doesn’t give you a proper way to manage the context you gather from all those other tools.

What you need in this situation is an enterprise context layer that sits horizontally across your organization, connecting to all the tools in your data stack.

While the Snowflake MCP server handles context within Snowflake, the enterprise context layer extends it by encompassing the rest of your data and AI stack.

That’s where Atlan comes into the picture. It’s a platform built for agentic engineering with context at the core.


How does Atlan enable AI agents with an enterprise context layer?


Atlan builds the context layer by leveraging its Context Lakehouse architecture, which accumulates and organizes context from every tool in your data stack. Atlan can do that because it speaks the protocols that AI agents and humans use to access metadata.

This context, served from Atlan’s Context Lakehouse and made available to other tools via the Atlan MCP server, includes lineage, governance, and quality metadata from across your stack.


Key capabilities of an enterprise context layer like Atlan

You can use Snowflake’s MCP servers alongside Atlan’s MCP server, because typical MCP clients (agentic or human-driven) connect to multiple MCP servers simultaneously, sometimes even acting as a context facilitator between two separate servers. Atlan plays that role as well.

On agentic data engineering and semantic standards, Atlan collaborated with Snowflake as a launch partner for Open Semantic Interchange and Snowflake Intelligence.


Moving forward with MCP server for Snowflake


Snowflake gives you two clear ways to connect your AI agents to your data warehouse: the managed MCP server and the self-hosted Labs server. Both work well within Snowflake. However, organizational context often doesn’t live in Snowflake alone; both options stop at Snowflake’s boundary, and that’s where Atlan comes in.

Atlan’s enterprise context layer is built on its proven Context Lakehouse that powers a knowledge graph for business domains. Atlan’s context capabilities and Atlan MCP server power a unified context layer that’s available to all AI agents, going beyond Snowflake.


FAQs about MCP server for Snowflake


1. How does the Snowflake MCP server work?


Snowflake’s MCP server exposes Snowflake-native services as tools for agents to use. These tools include Cortex services like Analyst, Search, and Agents, along with other SQL execution and generic services for writing tools as custom user-defined functions and stored procedures. MCP clients connect to servers, discover tools, and invoke them when LLM-based orchestration requires it.

2. What’s the difference between the managed and self-hosted Snowflake MCP servers?


The key difference between the two is that the managed Snowflake MCP server runs on Snowflake infrastructure and uses Snowflake’s native authentication and authorization. The self-hosted Snowflake Labs version is a Python server built on FastMCP that lets you add custom Python tools and apply granular SQL governance, among other things.

3. What permissions are required for agents to work with the Snowflake managed MCP server?


The MCP server within Snowflake is just like any other object. First, you have to grant USAGE on the server object to the role of your choice. Second, you need to grant privileges to the individual tools you want the MCP server to access.

4. Can the Snowflake MCP server work with the Atlan MCP server?


Yes, that’s exactly how they are supposed to work: together. AI agents via MCP clients connect to multiple MCP servers simultaneously, including those for both Snowflake and Atlan. The MCP client orchestrates context exchange between the two servers, passing structured metadata like lineage, governance, quality, and classification, along with semi-structured and unstructured content.

5. What MCP clients can connect to the Snowflake and Atlan MCP servers?


Both servers work with clients such as Claude, Cursor, CrewAI, and Amazon Bedrock AgentCore. Essentially, both these MCP servers work with any client that supports MCP.
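For a local MCP client, wiring up both servers is typically a few lines of client configuration. The sketch below follows the common mcpServers JSON convention used by desktop clients such as Claude Desktop; the Atlan entry is a placeholder, so take the exact launch command from each vendor’s documentation:

```json
{
  "mcpServers": {
    "snowflake": {
      "command": "uvx",
      "args": ["snowflake-labs-mcp", "--service-config-file", "config.yaml"]
    },
    "atlan": {
      "command": "<atlan-mcp-launch-command>",
      "args": []
    }
  }
}
```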
