Learn how to set up Snowflake Openflow using Snowpark Container Services (SPCS) in about 25 minutes. You'll create the foundation needed to start ingesting data from external sources using pre-built connectors.

What is Openflow

Openflow is Snowflake's managed service for building data pipelines in Snowpark Container Services (SPCS). It provides pre-built connectors that make it easy to ingest data from various sources into Snowflake.

Openflow SPCS Overview

Key Benefits:

Available Connectors

Openflow supports 19+ connectors including:

For a complete list with descriptions, see Openflow connectors.

What You Will Learn

What You Will Build

Prerequisites

Before starting, ensure you have:

Setup Overview

Setting up Openflow involves four main tasks:

| Step | Task | Persona | Duration |
| --- | --- | --- | --- |
| 1 | Setup Core Snowflake | Snowflake Administrator | 10 min |
| 2 | Create Deployment | Deployment Engineer / Administrator | 5 min |
| 3 | Create Runtime Role | Data Engineer | 5 min |
| 4 | Create Runtime | Data Engineer | 5 min |

What Happens After This Setup

Once you complete this 25-minute setup, you'll have a production-ready Openflow environment. Here's what you can do immediately:

✅ Start Ingesting Data

✅ No Code Required

✅ Production-Ready Infrastructure

Before creating a deployment, you need to configure core Snowflake components including the Openflow admin role, required privileges, and network configuration.

Companion Repository

This quickstart references the companion GitHub repository, which includes Jupyter notebooks and SQL scripts for various Openflow connectors:

Repository: https://github.com/Snowflake-Labs/sfguide-getting-started-with-openflow-spcs

SQL Scripts:

All SQL commands in this quickstart are available as downloadable scripts:

Key Notebooks:

How to Use SQL Scripts in Snowsight

If you downloaded the SQL scripts from the companion repository, here's how to import and execute them in Snowsight:

Using Projects (Recommended):

  1. Navigate to Projects: In Snowsight, go to Projects → Worksheets (left sidebar)
  2. Create or Open Project: Click "+ Project" to create a new project, or open an existing one
  3. Add SQL File: Click "..." (more options) → "Import SQL file"
  4. Select Downloaded Script: Choose the .sql file you downloaded (e.g., quickstart_setup_core.sql)
  5. Execute Commands:
    • Run All: Click the ▶ Run All button to execute the entire script
    • Step-by-Step: Select individual SQL statements and click ▶ Run for granular control

Create Openflow Admin Role

The OPENFLOW_ADMIN role is the primary administrative role for managing Openflow deployments.

Log into Snowsight and open a SQL worksheet.

Run the following SQL commands to create the Openflow admin role:

-- Create the Openflow admin role (requires ACCOUNTADMIN or equivalent privileges)
USE ROLE ACCOUNTADMIN;
CREATE ROLE IF NOT EXISTS OPENFLOW_ADMIN;

-- Grant necessary privileges
GRANT CREATE DATABASE ON ACCOUNT TO ROLE OPENFLOW_ADMIN;
GRANT CREATE COMPUTE POOL ON ACCOUNT TO ROLE OPENFLOW_ADMIN;
GRANT CREATE INTEGRATION ON ACCOUNT TO ROLE OPENFLOW_ADMIN;
GRANT BIND SERVICE ENDPOINT ON ACCOUNT TO ROLE OPENFLOW_ADMIN;

-- Grant role to current user and ACCOUNTADMIN
GRANT ROLE OPENFLOW_ADMIN TO ROLE ACCOUNTADMIN;
GRANT ROLE OPENFLOW_ADMIN TO USER IDENTIFIER(CURRENT_USER());

This creates the admin role and grants it the necessary permissions to create and manage Openflow deployments.
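Some Snowflake versions reject a function call inside IDENTIFIER() in a GRANT statement. If the inline form above errors, a session variable achieves the same result (the variable name current_user_name is arbitrary):

```sql
-- Capture the current user into a session variable,
-- then grant the role via IDENTIFIER
SET current_user_name = (SELECT CURRENT_USER());
GRANT ROLE OPENFLOW_ADMIN TO USER IDENTIFIER($current_user_name);
```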

Enable BCR Bundle 2025_06 for Integration-level Network Policy

Step 1: Check if BCR Bundle 2025_06 is already enabled

-- Check the bundle status
USE ROLE ACCOUNTADMIN;
CALL SYSTEM$BEHAVIOR_CHANGE_BUNDLE_STATUS('2025_06');

Step 2: Enable the bundle if status shows DISABLED

If the result shows DISABLED, enable the bundle:

-- Enable BCR Bundle 2025_06
CALL SYSTEM$ENABLE_BEHAVIOR_CHANGE_BUNDLE('2025_06');
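After enabling, you can re-run the same status check to confirm the change took effect:

```sql
-- Re-check the bundle status; it should now report ENABLED
USE ROLE ACCOUNTADMIN;
CALL SYSTEM$BEHAVIOR_CHANGE_BUNDLE_STATUS('2025_06');
```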

Verify Setup

Check that all core resources were created successfully:

-- Verify role exists
SHOW ROLES LIKE 'OPENFLOW_ADMIN';

-- Verify grants
SHOW GRANTS TO ROLE OPENFLOW_ADMIN;

-- Verify network rule
DESC NETWORK RULE snowflake_deployment_network_rule;

After configuring core Snowflake, create an Openflow deployment. This is the container environment where Openflow will run.

Access Openflow in Snowsight

  1. Navigate to Openflow: Go to Work with data → Ingestion → Openflow
  2. Openflow Interface: You'll see three tabs:
    • Overview - List of available connectors and documentation
    • Runtimes - Manage your runtime environments
    • Deployments - Create and manage Openflow deployments

Create Your First Deployment

  1. Navigate to Deployments Tab: Click on the Deployments tab
  2. Create Deployment: Click Create Deployment button
  3. Enter Deployment Name: QUICKSTART_DEPLOYMENT
  4. Complete the wizard: Follow the prompts shown in the GIF below

Create Openflow Deployment

Verify Deployment Status

Check that your deployment is running via the Snowsight UI:

  1. Navigate to Deployments Tab: Go to Work with data → Ingestion → Openflow → Deployments
  2. Check Status: Look for your deployment with status ACTIVE

Openflow Deployment Active Status

Expected status: ACTIVE

Create a runtime role that will be used by your Openflow runtime. This role needs access to databases, schemas, and warehouses for data ingestion.

Step 1: Create the Runtime Role and Resources

-- Create runtime role
USE ROLE ACCOUNTADMIN;
CREATE ROLE IF NOT EXISTS QUICKSTART_ROLE;

-- Create database for Openflow resources
CREATE DATABASE IF NOT EXISTS QUICKSTART_DATABASE;

-- Create warehouse for data processing
CREATE WAREHOUSE IF NOT EXISTS QUICKSTART_WH
  WAREHOUSE_SIZE = MEDIUM
  AUTO_SUSPEND = 300
  AUTO_RESUME = TRUE;

-- Grant privileges to runtime role
GRANT USAGE ON DATABASE QUICKSTART_DATABASE TO ROLE QUICKSTART_ROLE;
GRANT USAGE ON WAREHOUSE QUICKSTART_WH TO ROLE QUICKSTART_ROLE;

-- Grant runtime role to Openflow admin
GRANT ROLE QUICKSTART_ROLE TO ROLE OPENFLOW_ADMIN;
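Depending on the connector, the runtime role usually also needs to create schemas and tables in the target database so it can land ingested data. A minimal sketch; the schema name QUICKSTART_SCHEMA is an example, not part of this quickstart's required setup:

```sql
-- Option A: let the runtime role create its own schemas
GRANT CREATE SCHEMA ON DATABASE QUICKSTART_DATABASE TO ROLE QUICKSTART_ROLE;

-- Option B: pre-create a landing schema and grant object-level privileges
CREATE SCHEMA IF NOT EXISTS QUICKSTART_DATABASE.QUICKSTART_SCHEMA;
GRANT USAGE, CREATE TABLE ON SCHEMA QUICKSTART_DATABASE.QUICKSTART_SCHEMA
  TO ROLE QUICKSTART_ROLE;
```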

Step 2: Create External Access Integration

External Access Integrations allow your runtime to connect to external data sources. This quickstart creates one integration with network rules for both Google Drive and PostgreSQL connectors.

-- Create schema for network rules
USE ROLE ACCOUNTADMIN;
CREATE SCHEMA IF NOT EXISTS QUICKSTART_DATABASE.NETWORKS;

-- Create network rule for Google APIs
CREATE OR REPLACE NETWORK RULE google_api_network_rule
  MODE = EGRESS
  TYPE = HOST_PORT
  VALUE_LIST = (
    'admin.googleapis.com',
    'oauth2.googleapis.com',
    'www.googleapis.com',
    'google.com'
  );

-- Create network rule for your Google Workspace domain (optional)
-- Replace 'your-domain.com' with your actual domain
CREATE OR REPLACE NETWORK RULE workspace_domain_network_rule
  MODE = EGRESS
  TYPE = HOST_PORT
  VALUE_LIST = ('your-domain.com');

-- Create network rule for PostgreSQL endpoint
-- Replace 'your-postgres-host.com:5432' with your actual endpoint
CREATE OR REPLACE NETWORK RULE postgres_network_rule
  MODE = EGRESS
  TYPE = HOST_PORT
  VALUE_LIST = ('your-postgres-host.com:5432');

-- Create ONE external access integration with ALL network rules
CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION quickstart_access
  ALLOWED_NETWORK_RULES = (
    QUICKSTART_DATABASE.NETWORKS.google_api_network_rule,
    QUICKSTART_DATABASE.NETWORKS.workspace_domain_network_rule,
    QUICKSTART_DATABASE.NETWORKS.postgres_network_rule
  )
  ENABLED = TRUE
  COMMENT = 'Openflow SPCS runtime access for Google Drive and PostgreSQL connectors';

-- Grant usage to runtime role
GRANT USAGE ON INTEGRATION quickstart_access TO ROLE QUICKSTART_ROLE;
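If you add connectors later, you can extend this integration with additional network rules rather than creating a new one. Note that SET ALLOWED_NETWORK_RULES replaces the whole list, so include the existing rules as well (new_connector_network_rule below is a hypothetical rule you would create first):

```sql
-- Replace the integration's rule list, keeping the existing rules
ALTER EXTERNAL ACCESS INTEGRATION quickstart_access
  SET ALLOWED_NETWORK_RULES = (
    QUICKSTART_DATABASE.NETWORKS.google_api_network_rule,
    QUICKSTART_DATABASE.NETWORKS.workspace_domain_network_rule,
    QUICKSTART_DATABASE.NETWORKS.postgres_network_rule,
    QUICKSTART_DATABASE.NETWORKS.new_connector_network_rule
  );
```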

Verify Setup

-- Verify role and grants
SHOW ROLES LIKE 'QUICKSTART_ROLE';
SHOW GRANTS TO ROLE QUICKSTART_ROLE;

-- Verify integration
SHOW INTEGRATIONS LIKE 'quickstart_access';
DESC INTEGRATION quickstart_access;

Create a runtime associated with the previously created runtime role. A runtime is the execution environment for your Openflow connectors.

Create Runtime via Snowsight

Follow these steps to create your runtime:

  1. Navigate to Runtimes: Go to Work with data → Ingestion → Openflow → Runtimes tab
  2. Click Create Runtime: Click the Create Runtime button in the top right
  3. Enter Runtime Name: QUICKSTART_RUNTIME
  4. Select Runtime Role: Choose QUICKSTART_ROLE from the dropdown
  5. Select External Access Integration: Choose quickstart_access from the dropdown
  6. Select Compute Pool: Choose an existing compute pool from the list
  7. Click Create: Complete the runtime creation

Create Openflow Runtime

Verify Runtime Status

Check that your runtime is active:

  1. Navigate to Runtimes Tab: Go to Work with data → Ingestion → Openflow → Runtimes
  2. Check Status: Look for QUICKSTART_RUNTIME with status ACTIVE

Openflow Runtime Active Status

Expected status: ACTIVE

Access the Runtime Canvas

Once your runtime is active, you can access the Openflow canvas to add and configure connectors:

Click on the runtime name (QUICKSTART_RUNTIME) to open the canvas where you can add connectors and build data pipelines.

Openflow Runtime Empty Canvas

With your Openflow SPCS infrastructure set up, you're ready to configure connectors to ingest data from external sources.

Summary of What You Built

After completing this guide, you have:

You're now ready to configure connectors and build data pipelines!

Choose Your Next Path

Now that your infrastructure is ready, here are your next options:

Configure Connectors

Add and configure connectors for your data sources:

Explore Companion Notebooks

Use the notebooks from the companion repository for detailed connector setup:

Build Data Pipelines

Try these common use cases:

Deployment Not Starting

Issue: Deployment status stuck in CREATING or shows ERROR

Solutions:

  1. Verify Role Privileges:
    -- Check OPENFLOW_ADMIN has required privileges
    SHOW GRANTS TO ROLE OPENFLOW_ADMIN;
    
  2. Review Event Table Logs (if enabled):
    -- Query event table for errors
    SELECT * FROM OPENFLOW_CONFIG.EVENTS.openflow_events
    WHERE RECORD:severity::STRING = 'ERROR'
    ORDER BY TIMESTAMP DESC
    LIMIT 100;
    
  3. Check Network Rule:
    -- Verify network rule exists
    DESC NETWORK RULE snowflake_deployment_network_rule;
    

Runtime Cannot Access External Services

Issue: Connector fails to connect to external services (e.g., Google Drive, databases)

Solutions:

  1. Verify External Access Integration:
    -- Check integration exists and is enabled
    SHOW INTEGRATIONS LIKE 'openflow_%';
    DESC INTEGRATION your_integration_name;
    
  2. Check Network Rules:
    -- Verify all required hostnames are in network rules
    DESC NETWORK RULE your_network_rule;
    
  3. Test Network Connectivity:
    • Ensure the external service endpoint is correct
    • Verify firewall rules allow traffic from Snowflake IP ranges
    • Check service credentials are valid
  4. Review Integration Grants:
    -- Verify runtime role has access
    SHOW GRANTS ON INTEGRATION your_integration_name;
    

Permission Errors

Issue: "Insufficient privileges" errors when creating resources

Solutions:

  1. Verify Current Role:
    -- Check which role you're using
    SELECT CURRENT_ROLE();
    
    -- Switch to appropriate role
    USE ROLE OPENFLOW_ADMIN;
    -- or
    USE ROLE ACCOUNTADMIN;
    
  2. Check Role Hierarchy:
    -- Verify role grants
    SHOW GRANTS TO ROLE OPENFLOW_ADMIN;
    SHOW GRANTS TO USER your_username;
    
  3. Grant Missing Privileges:
    -- Example: Grant database creation if missing
    USE ROLE ACCOUNTADMIN;
    GRANT CREATE DATABASE ON ACCOUNT TO ROLE OPENFLOW_ADMIN;
    

Getting Help

If you continue experiencing issues:

  1. Check Documentation: Openflow Documentation
  2. Community Support: Snowflake Community
  3. GitHub Issues: Companion Repository Issues

When you're finished with the quickstart or want to remove resources, use the Snowsight UI to clean up.

Remove Deployments and Runtimes

Via Snowsight:

  1. Navigate to Openflow: Go to Work with data → Ingestion → Openflow
  2. Remove Runtime:
    • Go to Runtimes tab
    • Find QUICKSTART_RUNTIME
    • Click on the runtime and select Delete
  3. Remove Deployment:
    • Go to Deployments tab
    • Find QUICKSTART_DEPLOYMENT
    • Click on the deployment and select Delete

Remove Supporting Resources

After removing deployments and runtimes, clean up the supporting resources:

-- Switch to ACCOUNTADMIN
USE ROLE ACCOUNTADMIN;

-- Drop external access integration
DROP INTEGRATION IF EXISTS quickstart_access;

-- Drop network rules
DROP NETWORK RULE IF EXISTS google_api_network_rule;
DROP NETWORK RULE IF EXISTS workspace_domain_network_rule;
DROP NETWORK RULE IF EXISTS postgres_network_rule;

-- Drop warehouse
DROP WAREHOUSE IF EXISTS QUICKSTART_WH;

-- Drop database
DROP DATABASE IF EXISTS QUICKSTART_DATABASE;

-- Drop role
DROP ROLE IF EXISTS QUICKSTART_ROLE;
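Optionally, confirm the cleanup worked; after the drops above, each of these SHOW commands should return an empty result:

```sql
-- All of these should return zero rows after cleanup
SHOW ROLES LIKE 'QUICKSTART_ROLE';
SHOW WAREHOUSES LIKE 'QUICKSTART_WH';
SHOW DATABASES LIKE 'QUICKSTART_DATABASE';
SHOW INTEGRATIONS LIKE 'quickstart_access';
```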

Congratulations! You've successfully set up Snowflake Openflow using Snowpark Container Services (SPCS). You now have a fully functional data integration platform ready to connect to external data sources.

What You Accomplished

Key Capabilities Enabled

Next Steps

  1. Configure Your First Connector: Use the companion notebooks (EAI_GDRIVE.ipynb or EAI_POSTGRES.ipynb) to set up Google Drive or PostgreSQL
  2. Build Data Pipelines: Set up automated data ingestion from your chosen source
  3. Add More Connectors: Explore the 19+ available connectors for different data sources
  4. Learn Best Practices: Review the official Openflow documentation for tips and patterns

Related Resources

Official Documentation:

Code and Notebooks:

Management and Operations:

Community and Support

We would love your feedback on this QuickStart Guide! Please submit your feedback using the GitHub issues link.