
PostgreSQL / MySQL / BigQuery / Snowflake

In this quick start guide, we will write our first script in SQL. We will see how to connect a Windmill instance to an external SQL service and then send queries to the database using Windmill Scripts.

Windmill & PostgreSQL, MySQL, BigQuery and Snowflake


Windmill supports PostgreSQL, MySQL, BigQuery and Snowflake. In each case, connecting requires creating a dedicated Resource.

Create Resource

Windmill provides integrations with many different apps and services through Resources. Resources are rich JSON objects that store configuration and credentials.

Each Resource has a Resource Type (PostgreSQL, MySQL, BigQuery, Snowflake) that defines the schema that the resource of this type needs to implement. Schemas implement the JSON Schema specification.
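For illustration, here is what a PostgreSQL resource value could look like, written as a TypeScript object literal. This is a hand-written sketch with placeholder values; the field names match the Postgresql type used in the Raw Queries example later in this guide.

const examplePostgresResource = {
  host: 'db.example.com', // placeholder instance host
  port: 5432, // default PostgreSQL port
  user: 'app_user',
  dbname: 'app_db',
  sslmode: 'require',
  password: '********' // placeholder; keep real credentials out of code
};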

tip

You can find a list of all the officially supported Resource Types on Windmill Hub.

PostgreSQL

To be able to connect to a PostgreSQL instance (Supabase, Neon.tech, etc.), we'll need to define a Resource with the PostgreSQL Resource Type first.

Head to the Resources page in the Windmill app, click on "Add a resource/API" in the top right corner and select the PostgreSQL type.

Select PostgreSQL Resource Type

Fill out the form with the details of your PostgreSQL instance and "Test connection" if needed.

Paste in Resource Values

tip

For testing purposes, you can use the sample PostgreSQL Resource provided to every user. It is available under the path f/examples/demo_windmillshowcases.

PostgreSQL: Add a Supabase Database

Windmill provides a wizard to easily add a Supabase database through PostgreSQL.


When creating a new PostgreSQL resource, click "Add a Supabase DB". This opens a Supabase page where you pick your organization; back in Windmill, pick a database, fill in the database password, and you are done.

Use SQL to build on external APIs using Sequin

With Sequin, developers can build on top of third-party services like Salesforce or HubSpot using SQL.

MySQL

To be able to connect to a MySQL instance, we'll need to define a Resource with the MySQL Resource Type first.

Head to the Resources page in the Windmill app, click on "Add a resource/API" in the top right corner and select the MySQL type.

Select MySQL Resource Type

Fill out the form with the details of your MySQL instance and "Test connection" if needed.

Paste in Resource Values

| Property | Type | Description | Default | Required | Where to Find |
|----------|------|-------------|---------|----------|---------------|
| host | string | Instance host | | false | Your hosting provider's control panel or your server's MySQL configuration file |
| port | number | Instance port | 3306 | false | Your hosting provider's control panel or your server's MySQL configuration file |
| user | string | Username | | true | Created in MySQL (e.g., via phpMyAdmin or MySQL Workbench) or provided by your hosting |
| database | string | Database name | | true | Created in MySQL (e.g., via phpMyAdmin or MySQL Workbench) or provided by your hosting |
| password | string | User's password | | true | Created in MySQL (e.g., via phpMyAdmin or MySQL Workbench) or provided by your hosting |

BigQuery

To be able to connect to a BigQuery instance, we'll need to define a Resource with the BigQuery Resource Type first.

Head to the Resources page in the Windmill app, click on "Add a resource/API" in the top right corner and select the BigQuery type.

Select BigQuery Resource Type

| Property | Type | Description | Required |
|----------|------|-------------|----------|
| auth_provider_x509_cert_url | string | Auth provider X.509 certificate URL. | false |
| client_x509_cert_url | string | Client X.509 certificate URL. | false |
| private_key_id | string | ID of the private key used for authentication. | false |
| client_email | string | Email associated with the service account. | false |
| private_key | string | Private key used for authentication. | false |
| project_id | string | Google Cloud project ID. | true |
| token_uri | string | OAuth 2.0 token URI. | false |
| client_id | string | Client ID used for OAuth 2.0 authentication. | false |
| auth_uri | string | OAuth 2.0 authorization URI. | false |
| type | string | Type of the authentication method. | false |

Here's a step-by-step guide on where to find each detail.

  1. Service Account Creation:

    • Go to the Google Cloud Console.
    • Select the appropriate project from the top menu.
    • In the left navigation pane, go to "IAM & Admin" > "Service accounts".
    • Click on the "+ CREATE SERVICE ACCOUNT" button.
    • Provide a name and optional description for the service account.
    • Click "Create".
  2. Assign Roles:

    • After creating the service account, you'll be prompted to grant roles to it. Select "BigQuery" roles such as "BigQuery Admin" or "BigQuery Data Editor" based on your needs.
    • Click "Continue" and "Done" to create the service account.
  3. Generate Key:

    • In the "Service accounts" section, find the newly created service account in the list.
    • Click on the three dots on the right and select "Manage keys", then "Add Key".
    • Choose the key type as "JSON" and click "Create".
  4. Properties Details:

    Once you've generated the key, the downloaded JSON file will contain all the required properties.
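    For reference, here is a sketch of the shape of that key file, written as a TypeScript object literal. All values are placeholders; the fields map one-to-one onto the resource properties listed in the table above.

    const exampleServiceAccountKey = {
      type: 'service_account',
      project_id: 'my-gcp-project',
      private_key_id: '0123456789abcdef', // placeholder
      private_key: '-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n',
      client_email: 'my-sa@my-gcp-project.iam.gserviceaccount.com',
      client_id: '123456789012345678901', // placeholder
      auth_uri: 'https://accounts.google.com/o/oauth2/auth',
      token_uri: 'https://oauth2.googleapis.com/token',
      auth_provider_x509_cert_url: 'https://www.googleapis.com/oauth2/v1/certs',
      client_x509_cert_url: 'https://www.googleapis.com/robot/v1/metadata/x509/my-sa%40my-gcp-project.iam.gserviceaccount.com'
    };

    Copy each field into the matching field of the BigQuery resource form.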


You can directly "Test connection" if needed.

Snowflake

To be able to connect to a Snowflake instance, we'll need to define a Resource with the Snowflake Resource Type first.

Head to the Resources page in the Windmill app, click on "Add a resource/API" in the top right corner and select the Snowflake type.

Select Snowflake Resource Type

| Property | Type | Description | Required |
|----------|------|-------------|----------|
| account_identifier | string | Snowflake account identifier in the format <orgname>-<account_name>. | true |
| private_key | string | Private key used for authentication. | true |
| public_key | string | Public key used for authentication. | true |
| warehouse | string | Snowflake warehouse to be used for queries. | false |
| username | string | Username for Snowflake login. | true |
| database | string | Name of the Snowflake database to connect to. | true |
| schema | string | Schema within the Snowflake database. | false |
| role | string | Role to be assumed upon connection. | false |

Here's a step-by-step guide on where to find each detail.

  1. Account Identifier:

    The account identifier typically follows the format: <orgname>-<account_name>. You can find it in the Snowflake web interface:

    • Log in to your Snowflake account.
    • The account identifier can often be found in the URL or at the top of the Snowflake interface after you log in (in the format https://app.snowflake.com/orgname/account_name/).

    Snowflake Documentation on Account Identifier

  2. Username:

    The username is the Snowflake user you will use to connect to the database. You will need to create a user if you don't have one:

    • In the Snowflake web interface, go to the "ACCOUNT" tab.
    • Select "Users" from the left navigation pane.
    • Click the "+ CREATE USER" button to create a new user with a username and password.
  3. Public Key and Private Key:

    To create the public and private keys, you will need to generate them using a tool like OpenSSL:

    • Open a terminal window.
    • Use OpenSSL to generate a private/public key pair. The exact commands may vary based on your operating system.
    • For example, generate a private key with openssl genrsa -out private_key.pem 2048, then derive the public key from it with openssl rsa -pubout -in private_key.pem -out public_key.pem.

    Once you have the keys, copy their contents and paste them into the respective fields in your configuration.

    Snowflake Documentation on Key Pair Authentication & Key Pair Rotation

  4. Warehouse, Schema, Database, and Role:

    These parameters are specific to your Snowflake environment and will depend on how your Snowflake instance is configured:

    • warehouse: The name of the Snowflake warehouse you want to connect to.
    • schema: The name of the Snowflake schema you want to use.
    • database: The name of the Snowflake database you want to connect to.
    • role: The role you want to use for authentication.

    You can find these details in the Snowflake web interface:

    • Log in to your Snowflake account.
    • You can find the names of warehouses, schemas, databases, and roles in the interface or by running SQL queries.

You can directly "Test connection" if needed.

Create Script

Next, let's create a script that will use the newly created Resource. From the Home page, click on the "+Script" button. Name the Script, give it a summary, and select your preferred language: "PostgreSQL", "MySQL", "BigQuery" or "Snowflake".

Script creation first step

You can also add more details to your script in the settings section; you can come back to it at any point later.

PostgreSQL

Arguments need to be passed in the given format:

-- $1 name1 = default arg
-- $2 name2
INSERT INTO demo VALUES ($1::TEXT, $2::INT) RETURNING *

"name1", "name2" being the names of the arguments, and "default arg" the optional default value.

You can then write your prepared statement.

MySQL

Arguments need to be passed in the given format:

-- ? name1 (text) = default arg
-- ? name2 (int)
INSERT INTO demo VALUES (?, ?)

"name1", "name2" being the names of the arguments, and "default arg" the optional default value.

You can then write your prepared statement.

MySQL statement

BigQuery

Arguments need to be passed in the given format:

-- @name1 (string) = default arg
-- @name2 (integer)
-- @name3 (string[])
INSERT INTO `demodb.demo` VALUES (@name1, @name2, @name3)

"name1", "name2", "name3" being the names of the arguments, "default arg" the optional default value and string, integer and string[] the types.

You can then write your prepared statement.

Snowflake

Arguments need to be passed in the given format:

-- ? name1 (varchar) = default arg
-- ? name2 (int)
INSERT INTO demo VALUES (?, ?)

"name1", "name2" being the names of the arguments, "default arg" the optional default value and varchar & int the types.

You can then write your prepared statement.

Snowflake statement

Raw Queries

A more convenient but less secure option is to execute raw queries with a TypeScript (Deno) client. This gives you more flexibility than SQL prepared statements. You can, for instance, use string interpolation to make the name of the table a parameter of your script: SELECT * FROM ${table}. However, this is dangerous, since the string is interpolated directly, which opens the door to SQL injection. Use with care, and only in a trusted environment.

We show below an example using a TypeScript script: "Execute Query and return results".

import { pgClient, type Sql } from 'https://deno.land/x/windmill/mod.ts';

// Shape of the PostgreSQL resource passed in by Windmill.
type Postgresql = {
  host: string;
  port: number;
  user: string;
  dbname: string;
  sslmode: string;
  password: string;
};

// Runs the raw query as-is against the database and returns the rows.
export async function main(db: Postgresql, query: Sql = 'SELECT * FROM demo;') {
  if (!query) {
    throw Error('Query must not be empty.');
  }
  const { rows } = await pgClient(db).queryObject(query);
  return rows;
}

This allows you to execute any command from a single script, but that is also its vulnerability: it opens the door to harmful commands. To make it more secure, do not take the query itself as input; take its parameters instead and sanitize them properly so that SQL injection is mitigated, unless you are in a trusted environment.
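Below is a sketch of that safer pattern: instead of the full query, the script takes the table name and a value as separate inputs. It assumes, like the example above, that pgClient returns a deno-postgres Client, whose queryObject also accepts a { text, args } options object so the value is passed as a bound parameter. The demo tables, the value column and the allow-list are hypothetical.

import { pgClient } from 'https://deno.land/x/windmill/mod.ts';

type Postgresql = {
  host: string;
  port: number;
  user: string;
  dbname: string;
  sslmode: string;
  password: string;
};

// Hypothetical allow-list: identifiers cannot be bound as parameters,
// so the table name is validated against a fixed set instead.
const ALLOWED_TABLES = ['demo', 'demo_archive'];

export async function main(db: Postgresql, table: string, minValue: number) {
  if (!ALLOWED_TABLES.includes(table)) {
    throw Error(`Unknown table: ${table}`);
  }
  // The value is passed as a bound argument ($1), never interpolated
  // into the SQL string, which mitigates SQL injection.
  const { rows } = await pgClient(db).queryObject({
    text: `SELECT * FROM ${table} WHERE value >= $1`,
    args: [minValue],
  });
  return rows;
}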


tip

You can find more Script examples related to PostgreSQL on Windmill Hub.

After you're done, click on "Deploy", which will save the Script to your workspace. You can now use it in your Flows and Apps, or as a standalone Script.

Customize your script

Feel free to customize your script's metadata (path, name, description), runtime (concurrency limits, worker group, cache, dedicated workers) and generated UI.

Customize SQL

What's next?

Those scripts are minimal working examples, but there are a few more steps that can be useful in a real-world use case:

Scripts are immutable and there is a hash for each deployment of a given script. Scripts are never overwritten, and referring to a script by path means referring to the latest deployed hash at that path.

For each script, a UI is autogenerated from the JSON Schema inferred from the script signature. It can be customized further, used standalone, or embedded into rich UIs using the App builder.
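As an illustration, for the PostgreSQL example above the inferred schema could conceptually look like the following sketch (hand-written, not Windmill's exact output):

const inferredUiSchemaSketch = {
  type: 'object',
  properties: {
    name1: { type: 'string', default: 'default arg' },
    name2: { type: 'integer' }
  },
  required: ['name2'] // assumption: arguments without a default are required
};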

In addition to the UI, sync and async webhooks are generated for each deployment.
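For example, assuming a script deployed at the hypothetical path u/me/my_sql_script in workspace my_workspace, calling its async webhook from TypeScript could look like the sketch below; see the Windmill webhooks documentation for the exact URL scheme and authentication options.

// Minimal sketch: trigger the async webhook of a deployed script.
// The workspace, path, token and arguments are all placeholders.
const res = await fetch(
  'https://app.windmill.dev/api/w/my_workspace/jobs/run/p/u/me/my_sql_script',
  {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer <your-token>',
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({ name1: 'some text', name2: 42 })
  }
);
console.log(await res.text()); // async runs return the queued job's uuid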