In the gaming industry, player data is often fragmented across multiple sources, including event streams, purchase histories, and platform-specific behaviors. This disjointed data makes it challenging for gaming companies, and key personas such as marketing and product, to gain a holistic understanding of individual players and overall game performance. In this demo, we will create a "Player 360" dashboard that integrates diverse player data sources and visualizes them in a single, dynamic "Player Card." This card offers insight into player activity and interactions (behavior, buying habits, etc.), and this preliminary use case surfaces the players at highest risk of churn. Similarly, we will create a "Game 360" dashboard that aggregates game-level insights across the player base, focusing on key KPIs, metrics, and overall user behavior.
By aggregating these data streams in a unified system and leveraging Snowflake features such as Snowpark ML, Snowflake Notebooks, and Streamlit in Snowflake, companies can more effectively understand player/customer journeys, tailor in-game experiences, and optimize monetization strategies.
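As a toy illustration of that aggregation, the sketch below folds hypothetical event, purchase, and profile fragments into one record per player, the shape of data a "Player Card" would present. All field names and values here are invented for illustration; in the demo itself this consolidation happens inside Snowflake.

```python
from collections import defaultdict

# Hypothetical fragments of player data, as they might arrive from
# separate sources (event stream, purchase history, platform profile).
events = [
    {"player_id": 1, "event": "login"},
    {"player_id": 1, "event": "match_start"},
    {"player_id": 2, "event": "login"},
]
purchases = [
    {"player_id": 1, "item": "skin", "amount": 4.99},
]
profiles = {1: {"platform": "PC"}, 2: {"platform": "Console"}}

def build_player_cards(events, purchases, profiles):
    """Fold the separate sources into one unified record per player."""
    cards = defaultdict(lambda: {"events": 0, "total_spend": 0.0, "platform": None})
    for e in events:
        cards[e["player_id"]]["events"] += 1
    for p in purchases:
        cards[p["player_id"]]["total_spend"] += p["amount"]
    for pid, prof in profiles.items():
        cards[pid]["platform"] = prof["platform"]
    return dict(cards)

cards = build_player_cards(events, purchases, profiles)
print(cards[1])  # {'events': 2, 'total_spend': 4.99, 'platform': 'PC'}
```

The same idea scales up in the demo: each source lands in its own raw table, and the "card" is a joined, aggregated view over them.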
Architecture Diagram:
Step 1 - Copy the setup script and run it in a Snowsight SQL worksheet.
Step 2 - Copy the first build script below and run it in a Snowsight SQL worksheet. It populates the database with the RAW data.
Step 3 - Copy the second build script and run it in a Snowsight SQL worksheet. It populates the analytics schema.
Step 4 - Upload Notebook and Streamlit files

In the PLAYER_360.ANALYTIC schema, click '+ Files' in the top right of the stage and upload the files you downloaded from GitHub. Make sure the files match the following:
- 0_start_here.ipynb notebook file and environment.yml file to the @PLAYER_360.ANALYTIC.notebook_static_churn_prediction stage
- 1_Rolling_Churn_Prediction_Model.ipynb notebook file (excluding environment.yml) to the @PLAYER_360.ANALYTIC.notebook_rolling_churn_prediction stage

In the PLAYER_360.APP schema, click '+ Files' in the top right of the stage and upload the files you downloaded from GitHub. Make sure the files match the following:
- PLAYER_360.py, GAME_360.py, and environment.yml (in the streamlit folder) to the @PLAYER_360.APP.streamlit_player360 stage. Note that the images subfolder does not need to be uploaded.

Step 1: Run the following 0_notebook_setup.sql script in a Snowsight SQL worksheet.
use role SYSADMIN;
CREATE OR REPLACE NOTEBOOK PLAYER_360.ANALYTIC.PLAYER_360_static_churn_prediction
FROM '@PLAYER_360.ANALYTIC.notebook_static_churn_prediction'
MAIN_FILE = '0_start_here.ipynb'
QUERY_WAREHOUSE = 'PLAYER_360_DATA_APP_WH';
Step 2: The notebook has now been created in your Snowflake account! All packages and Python setup have already been completed.
To access it, navigate to Snowsight, select the SYSADMIN role, go to Projects, and click the Notebooks tab. Open the PLAYER_360_static_churn_prediction notebook and run each of the cells.
Within this notebook, you'll visualize player segments and analyze key churn features, perform feature engineering using the Snowflake Preprocessing Pipelines, and use the Model Registry to manage and log your ML model.
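As a rough, library-free sketch of what such a preprocessing step does before model training (the notebook itself uses Snowflake's preprocessing pipelines; the feature names below are hypothetical):

```python
# Minimal sketch of two common preprocessing transforms for churn features:
# min-max scaling for numeric columns and one-hot encoding for categoricals.

def min_max_scale(values):
    """Rescale numeric values to the [0, 1] range."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero on constant columns
    return [(v - lo) / span for v in values]

def one_hot(values):
    """Encode each category as a 0/1 indicator vector."""
    categories = sorted(set(values))
    return [[1 if v == c else 0 for c in categories] for v in values]

# Hypothetical feature columns for three players.
sessions_per_week = [2, 10, 6]
platform = ["PC", "Console", "PC"]

scaled = min_max_scale(sessions_per_week)
encoded = one_hot(platform)
print(scaled)   # [0.0, 1.0, 0.5]
print(encoded)  # [[0, 1], [1, 0], [0, 1]]  (columns: Console, PC)
```

The preprocessing pipeline in the notebook applies the equivalent transforms at scale, directly on Snowflake tables.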
Step 1: Run the 1_notebook_setup.sql script in a Snowsight SQL worksheet. This sets up compute pools and external access integrations, and programmatically creates the Snowflake Container Runtime notebook for notebook 2.
ALTER SESSION SET query_tag = '{"origin":"sf_sit-is", "name":"Player_360", "version":{"major":1, "minor":0}, "attributes":{"is_quickstart":1, "source":"sql"}}';
USE ROLE accountadmin;
USE DATABASE PLAYER_360;
USE SCHEMA ANALYTIC;
-- Create and grant access to compute pools
CREATE COMPUTE POOL IF NOT EXISTS PLAYER_360_cpu_xs_5_nodes
MIN_NODES = 1
MAX_NODES = 5
INSTANCE_FAMILY = CPU_X64_XS;
CREATE COMPUTE POOL IF NOT EXISTS PLAYER_360_gpu_s_5_nodes
MIN_NODES = 1
MAX_NODES = 5
INSTANCE_FAMILY = GPU_NV_S;
GRANT USAGE ON COMPUTE POOL PLAYER_360_cpu_xs_5_nodes TO ROLE SYSADMIN;
GRANT USAGE ON COMPUTE POOL PLAYER_360_gpu_s_5_nodes TO ROLE SYSADMIN;
-- Create and grant access to EAIs
-- Substep #1: create network rules (these are schema-level objects; end users do not need direct access to the network rules)
create or replace network rule PLAYER_360_allow_all_rule
TYPE = 'HOST_PORT'
MODE= 'EGRESS'
VALUE_LIST = ('0.0.0.0:443','0.0.0.0:80');
-- Substep #2: create external access integration (these are account-level objects; end users need access to this to access the public internet with endpoints defined in network rules)
CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION PLAYER_360_allow_all_integration
ALLOWED_NETWORK_RULES = (PLAYER_360_allow_all_rule)
ENABLED = true;
CREATE OR REPLACE NETWORK RULE PLAYER_360_pypi_network_rule
MODE = EGRESS
TYPE = HOST_PORT
VALUE_LIST = ('pypi.org', 'pypi.python.org', 'pythonhosted.org', 'files.pythonhosted.org');
CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION PLAYER_360_pypi_access_integration
ALLOWED_NETWORK_RULES = (PLAYER_360_pypi_network_rule)
ENABLED = true;
GRANT ALL PRIVILEGES ON INTEGRATION PLAYER_360_allow_all_integration TO ROLE SYSADMIN;
GRANT ALL PRIVILEGES ON INTEGRATION PLAYER_360_pypi_access_integration TO ROLE SYSADMIN;
USE ROLE SYSADMIN;
USE WAREHOUSE PLAYER_360_DATA_APP_WH;
USE DATABASE PLAYER_360;
USE SCHEMA ANALYTIC;
CREATE OR REPLACE NOTEBOOK PLAYER_360.ANALYTIC.PLAYER_360_rolling_churn_prediction
FROM '@PLAYER_360.ANALYTIC.notebook_rolling_churn_prediction'
MAIN_FILE = '1_Rolling_Churn_Prediction_Model.ipynb'
QUERY_WAREHOUSE = 'PLAYER_360_DATA_APP_WH'
COMPUTE_POOL='PLAYER_360_gpu_s_5_nodes'
RUNTIME_NAME='SYSTEM$GPU_RUNTIME';
ALTER NOTEBOOK PLAYER_360_rolling_churn_prediction ADD LIVE VERSION FROM LAST;
ALTER NOTEBOOK PLAYER_360_rolling_churn_prediction SET EXTERNAL_ACCESS_INTEGRATIONS = (PLAYER_360_pypi_access_integration, PLAYER_360_allow_all_integration);
Step 2: The notebook has now been created in your Snowflake account! All the setup has been completed, including the compute pools, External Access Integrations, and container runtimes!
To access it, navigate to Snowsight, select the SYSADMIN role, go to Projects, and click the Notebooks tab. Open the PLAYER_360_rolling_churn_prediction notebook and run each of the cells.
This notebook creates an ML model that can run inference on incoming near-real-time streaming data, representing the use case of predicting churn from a player's recent behavior rather than their overall history. Within this notebook, you'll explore how to use Snowflake Notebooks on Container Runtime, perform further data and feature engineering, and utilize the Model Registry.
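The key idea behind the rolling model is that features are computed over a trailing time window rather than a player's full history. A minimal sketch of that windowing, assuming an illustrative 7-day window and invented timestamps:

```python
from datetime import datetime, timedelta

# Sketch of rolling-window feature computation: churn signals come from a
# player's *recent* activity, not their lifetime totals. The window length
# and session timestamps below are illustrative assumptions.
WINDOW = timedelta(days=7)

def rolling_session_count(timestamps, now):
    """Count sessions that fall inside the trailing window ending at `now`."""
    return sum(1 for t in timestamps if timedelta(0) <= now - t <= WINDOW)

now = datetime(2024, 6, 15)
sessions = [now - timedelta(days=d) for d in (1, 3, 10, 20)]
recent = rolling_session_count(sessions, now)
print(recent)  # 2 sessions inside the last 7 days
```

In the notebook, the equivalent windowed aggregates are recomputed as new events stream in, so the model always scores players on up-to-date behavior.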
Step 1: Run the following streamlit_setup.sql script in a Snowsight SQL worksheet.
use role SYSADMIN;
CREATE OR REPLACE STREAMLIT PLAYER_360.APP.PLAYER_360_streamlit
ROOT_LOCATION = '@PLAYER_360.APP.streamlit_player360'
MAIN_FILE = 'PLAYER_360.py'
QUERY_WAREHOUSE = 'PLAYER_360_DATA_APP_WH'
COMMENT = '{"origin":"sf_sit-is","name":"player_360","version":{"major":1, "minor":0},"attributes":{"is_quickstart":1, "source":"streamlit"}}';
Step 2: The Streamlit app has now been created in your Snowflake account! All the setup for the package imports has been completed, and the app is ready to go! To access it, navigate to Snowsight, select the SYSADMIN role, go to Projects, and click the Streamlit tab. Open the PLAYER_360_streamlit app and explore the different features!
2a) Explore Player 360 page
Recommended Actions:
- Explore Points, Sessions, and Purchases
2b) Explore Game 360 page
Recommended Actions:
- Static Demographics: perform Exploratory Data Analysis (EDA) for key demographics (Location, Gender, Age, Rank) and display the population segment breakdown.
- Static Game Metrics: zoom into specific time periods for deeper analysis of key KPIs.
- Dynamic Demographics: compare the correlation matrix, noting significant differences (e.g., red flags in correlations).
- Dynamic Churn Likelihood: extract the demographic features most relevant for predicting churn.
- Filtered Dataframe: sort the data and export the filtered dataset.

This demo has showcased how Snowflake enables the creation of unified "Player 360" and "Game 360" dashboards by aggregating diverse player and game data streams into a single, actionable view. Through the use of Snowpark ML, Snowflake Notebooks, and Streamlit, we've demonstrated how companies can leverage these integrated tools to gain deeper insights into player behavior, churn risk, and overall game performance. By consolidating data sources and applying advanced analytics, gaming companies can optimize player engagement and refine monetization strategies, all within a governed and scalable environment.