
Snowflake, Model Context Protocol (MCP) Server and Claude Desktop


TL;DR: We’ll set up and run a Model Context Protocol (MCP) server that communicates with Snowflake to run SQL queries. We’ll install it using Smithery (a frictionless way to install MCP servers), then run the server, check the logs, and test a quick read query.

Overview

The mcp-snowflake-server is an MCP server that bridges your environment (e.g., Claude Desktop or any MCP-compatible client) to a Snowflake database. It offers a dynamic resource for storing insights, plus six main tools for querying and managing Snowflake:

  • read_query (SELECT statements)
  • write_query (INSERT, UPDATE, DELETE)
  • create_table
  • list_tables
  • describe_table
  • append_insight

Under the hood, it ensures that you don’t accidentally run write operations without the proper flags, and it can update a memo://insights resource as new insights come in.

Prerequisites

  • Ubuntu 23.10 (or other modern Linux/macOS/Windows).
  • Node.js (for the Smithery CLI).
  • Snowflake account credentials: account name, user, password, warehouse, role, database, schema.

If you don’t have Node.js installed, you can quickly do so on Ubuntu:

sudo apt update
sudo apt install nodejs npm

Step 1: Installation with Smithery

The easiest way to run mcp_snowflake_server is via Smithery.

Install the Smithery CLI:

npm install -g @smithery/cli

Install the mcp_snowflake_server:

npx -y @smithery/cli install mcp_snowflake_server --client claude

What happens?

  • Smithery automatically downloads and sets up mcp_snowflake_server in a location it manages.
  • It generates a configuration prompt or JSON snippet so you can define how the server should be run.

Step 2: Configuring the Server

Smithery reads from a smithery.yaml file or can be configured interactively. It expects the following environment variables for Snowflake:

  • ACCOUNT
  • WAREHOUSE
  • USER
  • PASSWORD
  • ROLE
  • DATABASE
  • SCHEMA

If you want to run the server with a local environment variable configuration, you might do something like:

export ACCOUNT="my_snowflake_acct"
export WAREHOUSE="my_warehouse"
export USER="my_user"
export PASSWORD="my_password"
export ROLE="ACCOUNTADMIN"
export DATABASE="MY_DB"
export SCHEMA="PUBLIC"

If you want to allow write operations, add:

export ALLOW_WRITE="1"
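
Optionally, you can sanity-check these credentials before starting the MCP server at all. The sketch below is not part of mcp_snowflake_server itself; it simply connects with the snowflake-connector-python package (pip install snowflake-connector-python) using the environment variables exported above:

# Optional sanity check: verify the exported Snowflake credentials directly,
# independent of the MCP server. Assumes ACCOUNT, USER, PASSWORD, etc. are set
# as shown above.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["ACCOUNT"],
    user=os.environ["USER"],
    password=os.environ["PASSWORD"],
    warehouse=os.environ["WAREHOUSE"],
    role=os.environ["ROLE"],
    database=os.environ["DATABASE"],
    schema=os.environ["SCHEMA"],
)
try:
    cur = conn.cursor()
    cur.execute("SELECT CURRENT_VERSION()")
    print("Connected to Snowflake, server version:", cur.fetchone()[0])
finally:
    conn.close()

If this prints a version number, the same credentials should work for the MCP server.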

Step 3: Running the Server

After installing with Smithery, you can typically run:

smithery start mcp_snowflake_server

If your smithery.yaml has the commandFunction properly set, it will start the MCP server on stdio. You might see logs like:

Starting Snowflake MCP Server...
Connected to Snowflake account: my_snowflake_acct
Listening for MCP requests on stdio...

Verifying the Logs

If you enabled logging (for example, by specifying LOG_DIR or LOG_LEVEL in your environment), you might see:

[INFO] Attempting Snowflake connection...
[INFO] Connection established successfully.
[DEBUG] Tools loaded: read_query, list_tables, ...
[INFO] MCP Server listening on stdio

Tip: If you provided LOG_DIR="/var/log/snowflake_mcp" in your environment, the logs will be written to that directory.

Step 4: Testing a Query

With the server running, any MCP-compatible client can talk to it. Below are conceptual examples of the tool calls such a client might send:

Example: list_tables

{
  "tool": "list_tables",
  "input": {}
}

Example Response:

[
  "MY_TABLE_1",
  "MY_TABLE_2",
  "ANOTHER_TABLE"
]

Example: read_query

{
  "tool": "read_query",
  "input": {
    "query": "SELECT * FROM MY_TABLE_1 LIMIT 5"
  }
}

Example Response:

[
  { "ID": 1, "NAME": "Alice", "CREATED_AT": "2025-01-02" },
  { "ID": 2, "NAME": "Bob",   "CREATED_AT": "2025-01-05" },
  ...
]

Example: append_insight

{
  "tool": "append_insight",
  "input": {
    "insight": "Table MY_TABLE_1 has 50,000 rows"
  }
}

Example Response:

{
  "message": "Insight appended successfully."
}

This also updates the memo://insights resource with your new data insight.
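
If you want to drive these tool calls programmatically rather than from Claude Desktop, here is a minimal sketch using the official mcp Python SDK (pip install mcp). The launch command mirrors the local uv setup shown under "Alternate Approaches" below and is an assumption about your environment; adjust the command, arguments, and credentials to however you actually run the server.

# Minimal sketch of an MCP client talking to mcp_snowflake_server over stdio.
# The command/args below assume the local uv-based setup described later in
# this post; replace them with your own launch command and credentials.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(
    command="uv",
    args=[
        "--directory", "/absolute/path/to/mcp_snowflake_server",
        "run", "mcp_snowflake_server",
        "--account", "my_snowflake_acct",
        "--warehouse", "my_warehouse",
        "--user", "my_user",
        "--password", "my_password",
        "--role", "ACCOUNTADMIN",
        "--database", "MY_DB",
        "--schema", "PUBLIC",
    ],
)

async def main() -> None:
    # Spawn the server and open an MCP session over its stdin/stdout.
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Equivalent of the list_tables example above.
            tables = await session.call_tool("list_tables", arguments={})
            print(tables.content)

            # Equivalent of the read_query example above.
            rows = await session.call_tool(
                "read_query",
                arguments={"query": "SELECT * FROM MY_TABLE_1 LIMIT 5"},
            )
            print(rows.content)

asyncio.run(main())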

Alternate Approaches

1. Running via Docker

If you prefer Docker, you can use the public Docker image and pass your Snowflake connection as a DSN-like string. For instance, in your claude_desktop_config.json:

{
  "mcpServers": {
    "snowflake": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "ajeetraina777/snowflake-mcp-server",
        "snowflake://user:password@UGB57389/TASTY_BYTES_SAMPLE_DATA/<SCHEMA>?warehouse=<WAREHOUSE>&role=ACCOUNTADMIN"
      ]
    }
  }
}

What to expect: The container will spin up, connect to Snowflake, and respond to MCP commands on stdin/stdout. Stopping the container will kill the server session.

2. Installing Locally (Python + uv tool)

You can also clone the repo locally and run it in a Python virtual environment. For example:

git clone https://github.com/yourorg/mcp-snowflake-server.git
cd mcp-snowflake-server
python3 -m venv .venv
source .venv/bin/activate
pip install .

Then configure your claude_desktop_config.json (or equivalent MCP config) with something like:

"mcpServers": {
  "snowflake_local": {
      "command": "uv",
      "args": [
          "--directory",
          "/absolute/path/to/mcp_snowflake_server",
          "run",
          "mcp_snowflake_server",
          "--account",
          "the_account",
          "--warehouse",
          "the_warehouse",
          "--user",
          "the_user",
          "--password",
          "their_password",
          "--role",
          "the_role",
          "--database",
          "the_database",
          "--schema",
          "--allow_write"
      ]
  }
}

And run:

uv --directory /absolute/path/to/mcp_snowflake_server \
   run mcp_snowflake_server \
   --account my_snowflake_acct \
   --warehouse my_warehouse \
   --user my_user \
   --password my_password \
   --role ACCOUNTADMIN \
   --database MY_DB \
   --schema PUBLIC

What to expect: You’ll see console output from the server. Log messages will appear either in the console or in the directory you specify via --log_dir.

Advanced: Write Detection

The server’s code includes a write_detector.py module that looks for SQL keywords like INSERT, UPDATE, DELETE, CREATE, etc. If you haven’t allowed writes (--allow_write or ALLOW_WRITE=1), the server will reject such queries. This is a useful safeguard in data analysis environments.
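
For illustration only (this is a rough sketch, not the actual write_detector.py code from the repo), keyword-based write detection can look something like this:

# Illustrative sketch of keyword-based write detection: flag a statement as a
# write if it contains a data- or schema-modifying SQL keyword. Deliberately
# conservative; the real module in the repo may differ.
import re

WRITE_KEYWORDS = {"INSERT", "UPDATE", "DELETE", "CREATE", "DROP", "ALTER", "TRUNCATE", "MERGE"}

def is_write_query(sql: str) -> bool:
    # Strip SQL comments, then scan the remaining tokens for write keywords.
    cleaned = re.sub(r"--.*?$|/\*.*?\*/", " ", sql, flags=re.MULTILINE | re.DOTALL)
    tokens = re.findall(r"[A-Za-z_]+", cleaned.upper())
    return any(token in WRITE_KEYWORDS for token in tokens)

# Example: reject writes unless explicitly allowed.
allow_write = False
query = "DELETE FROM MY_TABLE_1 WHERE ID = 1"
if is_write_query(query) and not allow_write:
    raise PermissionError("Write queries are disabled; start the server with --allow_write")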

Wrapping Up

This blog covered:

  • Installation via Smithery (quick approach).
  • Configuration (setting environment variables for your Snowflake credentials).
  • Running the server and verifying logs.
  • Testing with a few tool commands to confirm it can query Snowflake.
  • Alternative Docker and local Python usage for more advanced setups.

Using mcp_snowflake_server, you can easily integrate Snowflake queries into any workflow that supports Model Context Protocol servers—whether that’s Claude Desktop, UVX/UV, or other automation tools. Enjoy your new capabilities to analyze data, discover insights, and keep a rolling memo of observations—all from within a single MCP server interface!

Thanks for reading and happy querying on Ubuntu 23.10! If you have any questions or feedback, feel free to open an issue on GitHub or visit the Smithery documentation.

Have Queries? Join https://launchpass.com/collabnix

Adesoji Alu brings a proven ability to apply machine learning (ML) and data science techniques to solve real-world problems. He has experience working with a variety of cloud platforms, including AWS, Azure, and Google Cloud Platform. He has strong skills in software engineering, data science, and machine learning, and he is passionate about using technology to make a positive impact on the world.