By completing this guide, you will be able to go from raw data to a machine learning model that helps predict house prices.

If you are new to some of the technologies used in this guide, here's a quick summary with links to documentation.

What is Snowpark?

The Snowpark API provides an intuitive library for querying and processing data at scale in Snowflake. Using a library for any of three languages, you can build applications that process data in Snowflake without moving data to the system where your application code runs, and process at scale as part of the elastic and serverless Snowflake engine.

Snowflake currently provides Snowpark libraries for three languages: Java, Python, and Scala.

Learn more about Snowpark.
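
As a quick, hypothetical taste of the API (the connection values and the HOUSING table used later in this lab are placeholders), the filter and aggregation below are built lazily on the client and executed entirely inside Snowflake:

from snowflake.snowpark import Session
from snowflake.snowpark.functions import avg, col

# All connection values are placeholders; supply your own account details.
session = Session.builder.configs({
    "account": "<account_identifier>",
    "user": "<username>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}).create()

# The query is built lazily; execution happens in Snowflake, not on the client.
df = session.table("HOUSING").filter(col("MEDIAN_HOUSE_VALUE") > 100000)
df.group_by("OCEAN_PROXIMITY").agg(avg("MEDIAN_HOUSE_VALUE")).show()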

What is scikit-learn?

Scikit-learn is one of the most popular open source machine learning libraries for Python. It also happens to be pre-installed and available to developers in Snowpark for Python via the Snowflake Anaconda channel, which means you can use it in Snowpark for Python user-defined functions (UDFs) and stored procedures without having to manually install it and manage all of its dependencies.
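
For example, scikit-learn can be declared as a package when registering a UDF, and Snowflake resolves it server-side from the Anaconda channel. A minimal sketch, reusing the session from the previous snippet (the UDF simply reports the library version):

from snowflake.snowpark.functions import udf

# `session` is the snowflake.snowpark.Session created in the earlier sketch.
@udf(name="sklearn_version", packages=["scikit-learn"], replace=True, session=session)
def sklearn_version() -> str:
    import sklearn  # resolved from the Snowflake Anaconda channel; no pip install needed
    return sklearn.__version__

session.sql("SELECT sklearn_version()").show()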

What You'll Learn

  - How to load data into a Snowflake table from a pandas DataFrame
  - How to explore and visualise data using Snowpark DataFrames
  - How to prepare features, train a model in a Python stored procedure, and log it to the Snowflake Model Registry

Prerequisites

This section covers cloning the GitHub repository and creating a Python 3.10 environment.

  1. Clone the GitHub repository.
  2. Download the miniconda installer from https://conda.io/miniconda.html. (Or, you may use any other Python environment with Python 3.10.)
  3. Open environment.yml and paste in the following config:
name: snowpark_scikit_learn
channels:
  - https://repo.anaconda.com/pkgs/snowflake/
  - nodefaults
dependencies:
  - python=3.10
  - pip
  - snowflake-snowpark-python==1.23.0
  - snowflake-ml-python==1.6.4
  - snowflake==1.0.0
  - ipykernel
  - matplotlib
  - seaborn
  4. From the root folder, create the conda environment by running the commands below.
conda env create -f environment.yml
conda activate snowpark_scikit_learn
  5. Download and install VS Code, or use Jupyter Notebook or any other IDE of your choice.
  6. Update config.py with your Snowflake account details and credentials.
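
config.py itself is not shown in this guide; a minimal, hypothetical layout might look like the following (all values are placeholders, and the notebooks may expect a different structure):

# config.py -- hypothetical layout; all values are placeholders.
connection_parameters = {
    "account": "<account_identifier>",
    "user": "<username>",
    "password": "<password>",
    "role": "<role>",
    "warehouse": "<warehouse>",  # can be created in the data ingest notebook
    "database": "<database>",    # can be created in the data ingest notebook
    "schema": "<schema>",
}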

Troubleshooting pyarrow related issues

If you run into pyarrow related errors, make sure you do not have a different version of pyarrow installed in the environment; the snowflake-snowpark-python package installs the pyarrow version it requires.

The notebook linked below covers the following data ingestion tasks (a condensed sketch follows the list).

  1. Download the data file to be used in the lab
  2. Read the downloaded data as a pandas DataFrame
  3. Connect to Snowflake using the Session object
  4. Create a database, schema and warehouse
  5. Load the pandas DataFrame into a Snowflake table
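
A condensed, hypothetical sketch of those steps (file, database, warehouse and table names are illustrative, and config.py is the layout sketched in the setup section; the notebook is the source of truth):

import pandas as pd
from snowflake.snowpark import Session
from config import connection_parameters  # hypothetical config.py from the setup step

# 1-2. Read the downloaded data file into a pandas DataFrame
housing_df = pd.read_csv("housing.csv")
# Uppercase column names so they behave as regular Snowflake identifiers
housing_df.columns = [c.upper() for c in housing_df.columns]

# 3. Connect to Snowflake
session = Session.builder.configs(connection_parameters).create()

# 4. Create database, schema and warehouse
session.sql("CREATE DATABASE IF NOT EXISTS HOUSING_DB").collect()
session.sql("CREATE SCHEMA IF NOT EXISTS HOUSING_DB.HOUSING_SCHEMA").collect()
session.sql("CREATE WAREHOUSE IF NOT EXISTS HOUSING_WH WAREHOUSE_SIZE = XSMALL").collect()
session.use_database("HOUSING_DB")
session.use_schema("HOUSING_SCHEMA")
session.use_warehouse("HOUSING_WH")

# 5. Load the pandas DataFrame into a Snowflake table
session.write_pandas(housing_df, table_name="HOUSING",
                     auto_create_table=True, overwrite=True)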

Data Ingest Notebook in Jupyter or Visual Studio Code

To get started, follow these steps:

  1. In a terminal window, browse to this folder and run jupyter notebook at the command line. (You may also use other tools and IDEs such as Visual Studio Code.)
  2. Open and run through the cells in 1_snowpark_housing_data_ingest.ipynb

The notebook linked below covers the following data exploration tasks (a brief sketch follows the list).

  1. Establish a secure connection from Snowpark Python to Snowflake
  2. Compare a Snowpark DataFrame to a pandas DataFrame
  3. Use the describe function to understand the data
  4. Build some visualisations using seaborn and pyplot
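
A brief, hypothetical sketch of that flow (table and column names are illustrative):

import seaborn as sns
import matplotlib.pyplot as plt
from snowflake.snowpark import Session
from config import connection_parameters  # hypothetical config.py from the setup step

session = Session.builder.configs(connection_parameters).create()

# Snowpark DataFrame: lazily evaluated, with computation pushed into Snowflake
snow_df = session.table("HOUSING")
snow_df.describe().show()  # summary statistics computed server-side

# Bring a sample back to the client as a pandas DataFrame for plotting
pdf = snow_df.limit(1000).to_pandas()
sns.histplot(data=pdf, x="MEDIAN_HOUSE_VALUE")
plt.show()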

Data Exploration Notebook in Jupyter or Visual Studio Code

To get started, follow these steps:

  1. If not done already, in a terminal window, browse to this folder and run jupyter notebook at the command line. (You may also use other tools and IDEs such as Visual Studio Code.)
  2. Open and run through the cells in 2_data_exploration_transformation.ipynb

The notebook linked below covers the following machine learning tasks (a condensed sketch follows the list).

  1. Establish a secure connection from Snowpark Python to Snowflake
  2. Get features and target from the Snowflake table into a Snowpark DataFrame
  3. Create a Snowflake stage for the Python stored procedure code
  4. Prepare features using scikit-learn for model training
  5. Create a Python stored procedure to deploy model training code on Snowflake
  6. Optionally use a Snowpark-optimised warehouse for model training
  7. Log the model to the Snowflake Model Registry and use it for inference on new data points
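
A heavily condensed, hypothetical sketch of that flow (stage, table, column and model names are illustrative, and the notebook prepares features with scikit-learn rather than simply dropping the categorical column):

from snowflake.snowpark import Session
from snowflake.snowpark.functions import sproc
from config import connection_parameters  # hypothetical config.py from the setup step

session = Session.builder.configs(connection_parameters).create()

# Stage to hold the stored procedure's code, plus the packages it needs
session.sql("CREATE STAGE IF NOT EXISTS ML_MODELS").collect()
session.add_packages("snowflake-snowpark-python", "snowflake-ml-python",
                     "scikit-learn", "pandas")

@sproc(name="train_housing_model", stage_location="@ML_MODELS",
       is_permanent=True, replace=True, session=session)
def train_housing_model(session: Session) -> float:
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split
    from snowflake.ml.registry import Registry

    # Features and target from the Snowflake table (categorical column dropped
    # and rows with missing values removed, for brevity)
    df = session.table("HOUSING").drop("OCEAN_PROXIMITY").to_pandas().dropna()
    X = df.drop(columns=["MEDIAN_HOUSE_VALUE"])
    y = df["MEDIAN_HOUSE_VALUE"]
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

    model = LinearRegression().fit(X_train, y_train)

    # Log the trained model to the Snowflake Model Registry for later inference
    reg = Registry(session=session)
    reg.log_model(model, model_name="HOUSING_PRICE_MODEL",
                  version_name="V1", sample_input_data=X_train)

    return model.score(X_test, y_test)

# Training runs inside Snowflake; switch to a Snowpark-optimised warehouse first
# if the dataset needs more memory than a standard warehouse provides.
print(session.call("train_housing_model"))

Because training runs as a stored procedure, the whole pipeline executes next to the data in Snowflake rather than on your local machine.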

Machine Learning Notebook in Jupyter or Visual Studio Code

To get started, follow these steps:

  1. If not done already, in a terminal window, browse to this folder and run jupyter notebook at the command line. (You may also use other tools and IDEs such as Visual Studio Code.)
  2. Open and run through the cells in 3_snowpark_end_to_end_ml.ipynb

Congratulations! You've successfully completed the lab using Snowpark for Python and scikit-learn.

What You Learned

You learned how to ingest data into Snowflake from a pandas DataFrame, explore and transform it using Snowpark DataFrames, and train, deploy and log a scikit-learn model using a Python stored procedure and the Snowflake Model Registry.
