Microsoft Copilot Studio is a conversational AI platform for creating and customizing copilots through a low-code, graphical interface. It lets users build agents for a wide range of scenarios and deploy them inside Microsoft 365 Copilot or as standalone applications.
Cortex Agents orchestrate across both structured and unstructured data sources to deliver insights. They plan tasks, use tools to execute those tasks, and generate responses. Agents use Cortex Analyst (structured) and Cortex Search (unstructured) as tools, along with LLMs, to analyze data: Cortex Search extracts insights from unstructured sources, while Cortex Analyst generates SQL to process structured data. Comprehensive support for tool identification and tool execution enables sophisticated applications grounded in enterprise data.
Using the two services together lets users build a copilot in Microsoft with all of their Microsoft data and tooling, alongside Snowflake Cortex services on top of their Snowflake data, efficiently and securely (with OAuth support).
In this use case we will build two data sources: one with structured sales data and another with unstructured sales call transcripts. We will then create Cortex Search (for the unstructured data) and Cortex Analyst (for the structured data) services and wrap a Cortex Agent around them, combining both services in a unified agentic experience. The agent can then be invoked from Microsoft Copilot, using OAuth authentication and triggered by a simple phrase, to answer plain-text questions about sales data.
Snowflake Cortex has proven to be a best-in-class platform for building GenAI services and agents on your data. While there is some overlap with Microsoft Copilot Studio's functionality, we see many customers who want to build GenAI services in Snowflake Cortex and then surface those services in MS Copilot, letting MS Copilot serve as the single interface for all of their GenAI services and agents.
To add a Snowflake Role as a Role for OAuth flows where the programmatic client requests an access token for itself:
"appRoles":[
{
"allowedMemberTypes": [ "Application"],
"description": "Analyst.",
"displayName": "Analyst",
"id": "3ea51f40-2ad7-4e79-aa18-12c45156dc6a",
"isEnabled": true,
"origin": "Application",
"value": "session:role:analyst"
}
]
Click on API Permissions.
Collect Azure AD information for Snowflake
We recommend testing the configuration at this point. Use the curl command below (or any API testing tool, such as Insomnia) and check whether Entra ID is issuing a token.
curl -X POST -H "Content-Type: application/x-www-form-urlencoded;charset=UTF-8" --data-urlencode "client_id=<client_id from above>" --data-urlencode "client_secret=<Secret from above>" --data-urlencode "grant_type=client_credentials" --data-urlencode "scope=api://<Appl_URI_ID>/.default" 'https://login.microsoftonline.com/<tenant_id>/oauth2/v2.0/token'
Take the token returned by the curl command and decode it at jwt.ms. Extract the "sub" value; this will be the login name for the Snowflake user you create for the Azure service principal.
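If you prefer to script this step, the sketch below requests the token and extracts the "sub" claim without pasting anything into jwt.ms. It is a minimal example assuming the requests package and the same placeholder values as the curl command above:

import base64
import json

import requests  # pip install requests

# Placeholders mirror the curl command above; fill in your own values.
TENANT_ID = "<tenant_id>"
CLIENT_ID = "<client_id from above>"
CLIENT_SECRET = "<Secret from above>"
APP_URI_ID = "<Appl_URI_ID>"

resp = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
    data={
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "grant_type": "client_credentials",
        "scope": f"api://{APP_URI_ID}/.default",
    },
)
resp.raise_for_status()
token = resp.json()["access_token"]

# Decode the JWT payload (second segment), padding the base64 as needed.
payload = token.split(".")[1]
payload += "=" * (-len(payload) % 4)
claims = json.loads(base64.urlsafe_b64decode(payload))
print("sub:", claims["sub"])  # login name for the Snowflake service principal user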
The external_oauth_audience_list parameter of the security integration must match the Application ID URI that you specified while configuring Microsoft Entra ID.
Service Principal Authentication:
create security integration external_oauth_azure
    type = external_oauth
    enabled = true
    external_oauth_type = azure
    external_oauth_issuer = '<AZURE_AD_ISSUER>'
    external_oauth_jws_keys_url = '<AZURE_AD_JWS_KEY_ENDPOINT>'
    external_oauth_audience_list = ('<SNOWFLAKE_APPLICATION_ID_URI>')
    external_oauth_token_user_mapping_claim = 'sub'
    external_oauth_snowflake_user_mapping_attribute = 'login_name';
Create a user for Service Principal based connection:
The sub value should be mapped to a user in Snowflake; avoid using high-privilege roles such as ACCOUNTADMIN, ORGADMIN, or SECURITYADMIN.
CREATE OR REPLACE USER SNOWSQL_OAUTH_USER
    LOGIN_NAME = '<sub value from decoded token>'
    DISPLAY_NAME = 'SnowSQL OAuth User'
    COMMENT = 'A system user for SnowSQL client to be used for OAuth based connectivity';
CREATE ROLE ANALYST;
GRANT ROLE ANALYST TO USER SNOWSQL_OAUTH_USER;
-- To validate the token, execute the command below in Snowflake
select system$verify_external_oauth_token('<token>');
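Once the token validates, you can optionally confirm end-to-end connectivity from a client. Below is a minimal sketch using the Snowflake Python connector's OAuth authenticator; the environment variable names and account identifier are assumptions for illustration:

import os

import snowflake.connector  # pip install snowflake-connector-python

# Connect with the Entra-issued access token instead of a password.
# SNOWFLAKE_ACCOUNT and ENTRA_ACCESS_TOKEN are assumed environment variables;
# some connector versions may also want user=<login name from the sub claim>.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],  # e.g. xy12345.east-us-2.azure
    authenticator="oauth",
    token=os.environ["ENTRA_ACCESS_TOKEN"],   # token from the curl call above
)
cur = conn.cursor()
print(cur.execute("SELECT CURRENT_USER(), CURRENT_ROLE()").fetchone())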
Copy and paste the PowerShell script below into a text editor.
$modules = @(
    @{ Name = "Microsoft.Graph.Authentication"; InstallCommand = "Install-Module -Name Microsoft.Graph.Authentication -Force -Scope CurrentUser -Verbose" },
    @{ Name = "Microsoft.Graph.Applications"; InstallCommand = "Install-Module -Name Microsoft.Graph.Applications -Force -Scope CurrentUser -Verbose" },
    @{ Name = "Microsoft.Graph.Identity.SignIns"; InstallCommand = "Install-Module -Name Microsoft.Graph.Identity.SignIns -Force -Scope CurrentUser -Verbose" }
)
foreach ($module in $modules) {
    if (-not (Get-Module -ListAvailable -Name $module.Name)) {
        try {
            Invoke-Expression $module.InstallCommand
        } catch {
            Write-Host "Warning: Failed to install $($module.Name). Continuing..."
        }
    }
}
# Connect to Microsoft Graph with required scopes
Connect-MgGraph -Scopes "Application.ReadWrite.All", "Directory.ReadWrite.All" -TenantId "< tenant id >"
$resourceAppName = Read-Host "Enter Resource App Name"
$clientAppName = Read-Host "Enter Client App Name"
$roleName = Read-Host "Enter App Role Name"
$tenantId = Read-Host "Enter Tenant ID"
# Create resource application
$resourceAppParams = @{
    DisplayName = $resourceAppName
    SignInAudience = "AzureADMyOrg"
}
$resourceApp = New-MgApplication @resourceAppParams
Start-Sleep -Seconds 10
$resourceAppId = $resourceApp.AppId
$resourceObjectId = $resourceApp.Id
Write-Host "Resource App created: $resourceAppId"
# Create client application
$clientAppParams = @{
    DisplayName = $clientAppName
    SignInAudience = "AzureADMyOrg"
}
$clientApp = New-MgApplication @clientAppParams
Start-Sleep -Seconds 10
$clientAppId = $clientApp.AppId
$clientObjectId = $clientApp.Id
Write-Host "Client App created: $clientAppId"
# Set identifier URI for resource app
$identifierUri = "api://$resourceAppId"
Update-MgApplication -ApplicationId $resourceApp.Id -IdentifierUris @($identifierUri)
# Create client secret
$passwordCredential = @{
    DisplayName = "SnowflakeSecret"
    EndDateTime = (Get-Date).AddYears(1)
}
$secret = Add-MgApplicationPassword -ApplicationId $clientApp.Id -PasswordCredential $passwordCredential
Write-Host "Secret created = $($secret.SecretText)"
# Create app role
$appRole = @{
    AllowedMemberTypes = @("Application")
    Description = "Snowflake app role"
    DisplayName = $roleName
    Id = [Guid]::NewGuid()
    IsEnabled = $true
    Value = "session:role:$roleName"
}
# Get existing app roles and add new one
$existingApp = Get-MgApplication -ApplicationId $resourceApp.Id
$appRoles = $existingApp.AppRoles + $appRole
Update-MgApplication -ApplicationId $resourceApp.Id -AppRoles $appRoles
# Create API permission scope
$scope = @{
    Id = [Guid]::NewGuid()
    Value = "session:scope:$roleName"
    AdminConsentDescription = "Can administer the Snowflake account"
    AdminConsentDisplayName = "Account Admin"
    IsEnabled = $true
    Type = "Admin"
}
$api = @{
    Oauth2PermissionScopes = @($scope)
}
Update-MgApplication -ApplicationId $resourceApp.Id -Api $api
# Create service principals
$resourceSP = New-MgServicePrincipal -AppId $resourceAppId
Start-Sleep -Seconds 10
$clientSP = New-MgServicePrincipal -AppId $clientAppId
Start-Sleep -Seconds 10
# Assign app role
$appRoleAssignment = @{
    PrincipalId = $clientSP.Id
    ResourceId = $resourceSP.Id
    AppRoleId = $appRole.Id
}
New-MgServicePrincipalAppRoleAssignment -ServicePrincipalId $clientSP.Id -BodyParameter $appRoleAssignment
# Grant OAuth2 permission
$permissionGrant = @{
    ClientId = $clientSP.Id
    ConsentType = "AllPrincipals"
    ResourceId = $resourceSP.Id
    Scope = $scope.Value
}
New-MgOauth2PermissionGrant -BodyParameter $permissionGrant
# Configure required resource access
$requiredResourceAccess = @{
    ResourceAppId = $resourceAppId
    ResourceAccess = @(
        @{
            Id = $scope.Id
            Type = "Scope"
        },
        @{
            Id = $appRole.Id
            Type = "Role"
        }
    )
}
Update-MgApplication -ApplicationId $clientApp.Id -RequiredResourceAccess @($requiredResourceAccess)
# Get access token
$tokenUrl = "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token"
$body = @{
    client_id = $clientAppId
    client_secret = $secret.SecretText
    scope = "$resourceAppId/.default"
    grant_type = "client_credentials"
}
$tokenResponse = Invoke-RestMethod -Method Post -Uri $tokenUrl -Body $body
$accessToken = $tokenResponse.access_token
Write-Host "Access Token: $accessToken"
# Decode token
$tokenParts = $accessToken -split '\.'
$payload = $tokenParts[1]
# Add padding to the payload
switch ($payload.Length % 4) {
    2 { $payload += '==' }
    3 { $payload += '=' }
}
$decodedPayload = [System.Text.Encoding]::UTF8.GetString([System.Convert]::FromBase64String($payload))
$payloadObject = $decodedPayload | ConvertFrom-Json
$sub = $payloadObject.sub
# Save output to file
$outputFile = "snowflakeinfo.txt"
$outputContent = @"
============================================================================================================================
Snowflake power platform connection info
============================================================================================================================
Tenant: $TenantId
Client ID: $clientAppId
Client Secret: $($secret.SecretText)
Resource URL: $resourceAppId
============================================================================================================================
Snowflake SQL Commands. Please execute these commands on Snowflake for adding Security Integration and Creating User, Role
============================================================================================================================
create security integration external_azure
    type = external_oauth
    enabled = true
    external_oauth_any_role_mode = 'ENABLE'
    external_oauth_type = azure
    external_oauth_issuer = 'https://sts.windows.net/$TenantId/'
    external_oauth_jws_keys_url = 'https://login.microsoftonline.com/$TenantId/discovery/v2.0/keys'
    external_oauth_audience_list = ('$resourceAppId')
    external_oauth_token_user_mapping_claim = ('upn', 'sub')
    external_oauth_snowflake_user_mapping_attribute = 'login_name';
ALTER ACCOUNT SET EXTERNAL_OAUTH_ADD_PRIVILEGED_ROLES_TO_BLOCKED_LIST = FALSE;
CREATE OR REPLACE USER AAD_SP_USER
LOGIN_NAME = '$sub'
DISPLAY_NAME = 'AAD_SP_USER'
COMMENT = 'Snowflake User';
CREATE ROLE IF NOT EXISTS $roleName;
GRANT ROLE $roleName TO USER AAD_SP_USER;
"@
$outputContent | Out-File -FilePath $outputFile -Encoding UTF8
Write-Host "Info saved to: $outputFile"
Go to your Azure Portal, open a PowerShell terminal (Cloud Shell), and upload the PowerShell script.
After uploading the script you will execute it by running:
./AppCreationAndConfig.ps1
You will be prompted to name your registrations and provide your tenant ID again. This should take about 1-2 minutes and performs everything that is done manually in Option 1.
You will then download the snowflakeinfo.txt file and use the SQL it contains to create your security integration and user in Snowflake.
-- Create database and schema
CREATE OR REPLACE DATABASE sales_intelligence;
CREATE OR REPLACE SCHEMA sales_intelligence.data;
CREATE OR REPLACE WAREHOUSE sales_intelligence_wh;
-- Create tables for sales data
CREATE TABLE sales_conversations (
conversation_id VARCHAR,
transcript_text TEXT,
customer_name VARCHAR,
deal_stage VARCHAR,
sales_rep VARCHAR,
conversation_date TIMESTAMP,
deal_value FLOAT,
product_line VARCHAR
);
CREATE TABLE sales_metrics (
deal_id FLOAT PRIMARY KEY,
customer_name VARCHAR,
deal_value FLOAT,
close_date DATE,
sales_stage VARCHAR,
win_status BOOLEAN,
sales_rep VARCHAR,
product_line VARCHAR
);
-- First, let's insert data into sales_conversations
INSERT INTO sales_conversations
(conversation_id, transcript_text, customer_name, deal_stage, sales_rep, conversation_date, deal_value, product_line)
VALUES
('CONV001', 'Initial discovery call with TechCorp Inc''s IT Director and Solutions Architect. Client showed strong interest in our enterprise solution features, particularly the automated workflow capabilities. Main discussion centered around integration timeline and complexity. They currently use Legacy System X for their core operations and expressed concerns about potential disruption during migration. Team asked detailed questions about API compatibility and data migration tools. Action items: 1) Provide detailed integration timeline document 2) Schedule technical deep-dive with their infrastructure team 3) Share case studies of similar Legacy System X migrations. Client mentioned Q2 budget allocation for digital transformation initiatives. Overall positive engagement with clear next steps.', 'TechCorp Inc', 'Discovery', 'Sarah Johnson', '2024-01-15 10:30:00', 75000, 'Enterprise Suite'),
('CONV002', 'Follow-up call with SmallBiz Solutions'' Operations Manager and Finance Director. Primary focus was on pricing structure and ROI timeline. They compared our Basic Package pricing with Competitor Y''s small business offering. Key discussion points included: monthly vs. annual billing options, user license limitations, and potential cost savings from process automation. Client requested detailed ROI analysis focusing on: 1) Time saved in daily operations 2) Resource allocation improvements 3) Projected efficiency gains. Budget constraints were clearly communicated - they have a maximum budget of $30K for this year. Showed interest in starting with basic package with room for potential upgrade in Q4. Need to provide competitive analysis and customized ROI calculator by next week.', 'SmallBiz Solutions', 'Negotiation', 'Mike Chen', '2024-01-16 14:45:00', 25000, 'Basic Package'),
('CONV003', 'Strategy session with SecureBank Ltd''s CISO and Security Operations team. Extremely positive 90-minute deep dive into our Premium Security package. Customer emphasized immediate need for implementation due to recent industry compliance updates. Our advanced security features, especially multi-factor authentication and encryption protocols, were identified as perfect fits for their requirements. Technical team was particularly impressed with our zero-trust architecture approach and real-time threat monitoring capabilities. They''ve already secured budget approval and have executive buy-in. Compliance documentation is ready for review. Action items include: finalizing implementation timeline, scheduling security audit, and preparing necessary documentation for their risk assessment team. Client ready to move forward with contract discussions.', 'SecureBank Ltd', 'Closing', 'Rachel Torres', '2024-01-17 11:20:00', 150000, 'Premium Security'),
('CONV004', 'Comprehensive discovery call with GrowthStart Up''s CTO and Department Heads. Team of 500+ employees across 3 continents discussed current challenges with their existing solution. Major pain points identified: system crashes during peak usage, limited cross-department reporting capabilities, and poor scalability for remote teams. Deep dive into their current workflow revealed bottlenecks in data sharing and collaboration. Technical requirements gathered for each department. Platform demo focused on scalability features and global team management capabilities. Client particularly interested in our API ecosystem and custom reporting engine. Next steps: schedule department-specific workflow analysis and prepare detailed platform migration plan.', 'GrowthStart Up', 'Discovery', 'Sarah Johnson', '2024-01-18 09:15:00', 100000, 'Enterprise Suite'),
('CONV005', 'In-depth demo session with DataDriven Co''s Analytics team and Business Intelligence managers. Showcase focused on advanced analytics capabilities, custom dashboard creation, and real-time data processing features. Team was particularly impressed with our machine learning integration and predictive analytics models. Competitor comparison requested specifically against Market Leader Z and Innovative Start-up X. Price point falls within their allocated budget range, but team expressed interest in multi-year commitment with corresponding discount structure. Technical questions centered around data warehouse integration and custom visualization capabilities. Action items: prepare detailed competitor feature comparison matrix and draft multi-year pricing proposals with various discount scenarios.', 'DataDriven Co', 'Demo', 'James Wilson', '2024-01-19 13:30:00', 85000, 'Analytics Pro'),
('CONV006', 'Extended technical deep dive with HealthTech Solutions'' IT Security team, Compliance Officer, and System Architects. Four-hour session focused on API infrastructure, data security protocols, and compliance requirements. Team raised specific concerns about HIPAA compliance, data encryption standards, and API rate limiting. Detailed discussion of our security architecture, including: end-to-end encryption, audit logging, and disaster recovery protocols. Client requires extensive documentation on compliance certifications, particularly SOC 2 and HITRUST. Security team performed initial architecture review and requested additional information about: database segregation, backup procedures, and incident response protocols. Follow-up session scheduled with their compliance team next week.', 'HealthTech Solutions', 'Technical Review', 'Rachel Torres', '2024-01-20 15:45:00', 120000, 'Premium Security'),
('CONV007', 'Contract review meeting with LegalEase Corp''s General Counsel, Procurement Director, and IT Manager. Detailed analysis of SLA terms, focusing on uptime guarantees and support response times. Legal team requested specific modifications to liability clauses and data handling agreements. Procurement raised questions about payment terms and service credit structure. Key discussion points included: disaster recovery commitments, data retention policies, and exit clause specifications. IT Manager confirmed technical requirements are met pending final security assessment. Agreement reached on most terms, with only SLA modifications remaining for discussion. Legal team to provide revised contract language by end of week. Overall positive session with clear path to closing.', 'LegalEase Corp', 'Negotiation', 'Mike Chen', '2024-01-21 10:00:00', 95000, 'Enterprise Suite'),
('CONV008', 'Quarterly business review with GlobalTrade Inc''s current implementation team and potential expansion stakeholders. Current implementation in Finance department showcasing strong adoption rates and 40% improvement in processing times. Discussion focused on expanding solution to Operations and HR departments. Users highlighted positive experiences with customer support and platform stability. Challenges identified in current usage: need for additional custom reports and increased automation in workflow processes. Expansion requirements gathered from Operations Director: inventory management integration, supplier portal access, and enhanced tracking capabilities. HR team interested in recruitment and onboarding workflow automation. Next steps: prepare department-specific implementation plans and ROI analysis for expansion.', 'GlobalTrade Inc', 'Expansion', 'James Wilson', '2024-01-22 14:20:00', 45000, 'Basic Package'),
('CONV009', 'Emergency planning session with FastTrack Ltd''s Executive team and Project Managers. Critical need for rapid implementation due to current system failure. Team willing to pay premium for expedited deployment and dedicated support team. Detailed discussion of accelerated implementation timeline and resource requirements. Key requirements: minimal disruption to operations, phased data migration, and emergency support protocols. Technical team confident in meeting aggressive timeline with additional resources. Executive sponsor emphasized importance of going live within 30 days. Immediate next steps: finalize expedited implementation plan, assign dedicated support team, and begin emergency onboarding procedures. Team to reconvene daily for progress updates.', 'FastTrack Ltd', 'Closing', 'Sarah Johnson', '2024-01-23 16:30:00', 180000, 'Premium Security'),
('CONV010', 'Quarterly strategic review with UpgradeNow Corp''s Department Heads and Analytics team. Current implementation meeting basic needs but team requiring more sophisticated analytics capabilities. Deep dive into current usage patterns revealed opportunities for workflow optimization and advanced reporting needs. Users expressed strong satisfaction with platform stability and basic features, but requiring enhanced data visualization and predictive analytics capabilities. Analytics team presented specific requirements: custom dashboard creation, advanced data modeling tools, and integrated BI features. Discussion about upgrade path from current package to Analytics Pro tier. ROI analysis presented showing potential 60% improvement in reporting efficiency. Team to present upgrade proposal to executive committee next month.', 'UpgradeNow Corp', 'Expansion', 'Rachel Torres', '2024-01-24 11:45:00', 65000, 'Analytics Pro');
-- Now, let's insert corresponding data into sales_metrics
INSERT INTO sales_metrics
(deal_id, customer_name, deal_value, close_date, sales_stage, win_status, sales_rep, product_line)
VALUES
('001', 'TechCorp Inc', 75000, '2024-02-15', 'Closed', true, 'Sarah Johnson', 'Enterprise Suite'),
('002', 'SmallBiz Solutions', 25000, '2024-02-01', 'Lost', false, 'Mike Chen', 'Basic Package'),
('003', 'SecureBank Ltd', 150000, '2024-01-30', 'Closed', true, 'Rachel Torres', 'Premium Security'),
('004', 'GrowthStart Up', 100000, '2024-02-10', 'Pending', false, 'Sarah Johnson', 'Enterprise Suite'),
('005', 'DataDriven Co', 85000, '2024-02-05', 'Closed', true, 'James Wilson', 'Analytics Pro'),
('006', 'HealthTech Solutions', 120000, '2024-02-20', 'Pending', false, 'Rachel Torres', 'Premium Security'),
('007', 'LegalEase Corp', 95000, '2024-01-25', 'Closed', true, 'Mike Chen', 'Enterprise Suite'),
('008', 'GlobalTrade Inc', 45000, '2024-02-08', 'Closed', true, 'James Wilson', 'Basic Package'),
('009', 'FastTrack Ltd', 180000, '2024-02-12', 'Closed', true, 'Sarah Johnson', 'Premium Security'),
('010', 'UpgradeNow Corp', 65000, '2024-02-18', 'Pending', false, 'Rachel Torres', 'Analytics Pro');
-- Enable change tracking
ALTER TABLE sales_conversations SET CHANGE_TRACKING = TRUE;
-- Create the search service
CREATE OR REPLACE CORTEX SEARCH SERVICE sales_conversation_search
ON transcript_text
ATTRIBUTES customer_name, deal_stage, sales_rep
WAREHOUSE = sales_intelligence_wh
TARGET_LAG = '1 hour'
AS (
SELECT
transcript_text,
customer_name,
deal_stage,
sales_rep,
conversation_date
FROM sales_conversations
WHERE conversation_date >= '2024-01-01'
);
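Before wiring the service into an agent, you can optionally sanity-check it from Python. The sketch below assumes the snowflake-core package and an existing Snowpark session built from your own connection_parameters; treat the exact client API as an assumption and adjust to your environment:

from snowflake.core import Root  # pip install snowflake-core
from snowflake.snowpark import Session

# connection_parameters is assumed to be a dict with your account credentials.
session = Session.builder.configs(connection_parameters).create()

# Look up the search service we just created and run a test query.
svc = (
    Root(session)
    .databases["sales_intelligence"]
    .schemas["data"]
    .cortex_search_services["sales_conversation_search"]
)
resp = svc.search(
    query="security compliance concerns",
    columns=["transcript_text", "customer_name"],
    limit=3,
)
print(resp.results)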
CREATE OR REPLACE STAGE models DIRECTORY = (ENABLE = TRUE);
To set up Cortex Analyst you will have to upload a semantic model file.
Cortex Analyst is a highly accurate text-to-SQL generator, and to produce accurate results it requires a semantic model file such as this one. Cortex Analyst uses the semantic model together with user prompts to generate accurate SQL.
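One way to upload the semantic model to the models stage is from a Snowpark Python session. A minimal sketch, assuming sales_metrics_model.yaml is in your working directory and session is an existing Snowpark Session (you can also upload via Snowsight):

# Upload the semantic model file to the stage Cortex Analyst will read from.
# auto_compress=False keeps the file as plain YAML on the stage.
session.file.put(
    "sales_metrics_model.yaml",
    "@sales_intelligence.data.models",
    auto_compress=False,
    overwrite=True,
)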
Now run the code below in the same SQL worksheet to create a stored procedure that calls the Snowflake Cortex Agents API and uses a Llama model to determine whether to use Cortex Search or Cortex Analyst.
CREATE OR REPLACE PROCEDURE call_cortex_agent_proc(query STRING, limit INT)
RETURNS VARIANT
LANGUAGE PYTHON
RUNTIME_VERSION = '3.8'
PACKAGES = ('snowflake-snowpark-python')
HANDLER = 'call_cortex_agent_proc'
AS $$
import json
import _snowflake
import re
from snowflake.snowpark.context import get_active_session

def call_cortex_agent_proc(query: str, limit: int = 10):
    session = get_active_session()
    API_ENDPOINT = "/api/v2/cortex/agent:run"
    API_TIMEOUT = 50000

    CORTEX_SEARCH_SERVICES = "sales_intelligence.data.sales_conversation_search"
    SEMANTIC_MODELS = "@sales_intelligence.data.models/sales_metrics_model.yaml"

    payload = {
        "model": "llama3.1-70b",
        "messages": [{"role": "user", "content": [{"type": "text", "text": query}]}],
        "tools": [
            {"tool_spec": {"type": "cortex_analyst_text_to_sql", "name": "analyst1"}},
            {"tool_spec": {"type": "cortex_search", "name": "search1"}}
        ],
        "tool_resources": {
            "analyst1": {"semantic_model_file": SEMANTIC_MODELS},
            "search1": {"name": CORTEX_SEARCH_SERVICES, "max_results": limit}
        }
    }

    try:
        resp = _snowflake.send_snow_api_request(
            "POST", API_ENDPOINT, {}, {}, payload, None, API_TIMEOUT
        )
        if resp["status"] != 200:
            return {"error": "API call failed"}

        response_content = json.loads(resp["content"])
        return process_cortex_response(response_content, session)
    except Exception as e:
        return {"error": str(e)}

def clean_text(text):
    """Cleans up unwanted characters and symbols from search results."""
    text = re.sub(r'[\u3010\u3011\u2020\u2021]', '', text)  # Remove unwanted symbols
    text = re.sub(r'^\s*ns\s+\d+\.*', '', text)  # Remove prefixes like "ns 1."
    return text.strip()  # Trim whitespace

def process_cortex_response(response, session):
    """Parses the Cortex response and executes SQL if provided."""
    result = {"type": "unknown", "text": None, "sql": None, "query_results": None}
    full_text_response = []  # Stores formatted search responses

    for event in response:
        if event.get("event") == "message.delta":
            data = event.get("data", {})
            delta = data.get("delta", {})

            for content_item in delta.get("content", []):
                content_type = content_item.get("type")

                if content_type == "tool_results":
                    tool_results = content_item.get("tool_results", {})
                    for result_item in tool_results.get("content", []):
                        if result_item.get("type") == "json":
                            json_data = result_item.get("json", {})

                            if "sql" in json_data:
                                result["type"] = "cortex_analyst"
                                result["sql"] = json_data["sql"]
                                result["text"] = json_data.get("text", "")

                                # Execute the generated SQL query in Snowflake
                                try:
                                    query_results = session.sql(result["sql"]).collect()
                                    result["query_results"] = [row.as_dict() for row in query_results]
                                except Exception as e:
                                    result["query_results"] = {"error": str(e)}

                            elif "searchResults" in json_data:
                                result["type"] = "cortex_search"
                                formatted_results = []
                                for sr in json_data.get("searchResults", []):
                                    search_text = clean_text(sr.get("text", "").strip())
                                    citation = sr.get("citation", "").strip()
                                    if search_text:
                                        if citation:
                                            formatted_results.append(f"- {search_text} (Source: {citation})")
                                        else:
                                            formatted_results.append(f"- {search_text}")
                                if formatted_results:
                                    full_text_response.extend(formatted_results)

                elif content_type == "text":
                    text_piece = clean_text(content_item.get("text", "").strip())
                    if text_piece:
                        full_text_response.append(text_piece)

    # Assemble the final text, preserving any analyst-provided text if no search output exists
    if full_text_response:
        result["text"] = "\n".join(full_text_response)
    elif not result["text"]:
        result["text"] = "No relevant search results found."
    return result
$$;
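To quickly verify the procedure, you can call it through an existing Snowpark Python session. This is a sketch; the question is just an example against the data loaded above:

# Call the stored procedure with a sample question about the sales data.
result = session.call(
    "sales_intelligence.data.call_cortex_agent_proc",
    "What security concerns did HealthTech Solutions raise?",
    5,
)
print(result)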
Finally, run the script below to grant the appropriate privileges to the ANALYST role we created earlier.
GRANT USAGE ON DATABASE SALES_INTELLIGENCE TO ROLE ANALYST;
GRANT USAGE ON SCHEMA SALES_INTELLIGENCE.DATA TO ROLE ANALYST;
GRANT USAGE ON CORTEX SEARCH SERVICE SALES_INTELLIGENCE.DATA.SALES_CONVERSATION_SEARCH TO ROLE ANALYST;
GRANT USAGE ON WAREHOUSE SALES_INTELLIGENCE_WH TO ROLE ANALYST;
GRANT READ ON STAGE SALES_INTELLIGENCE.DATA.MODELS TO ROLE ANALYST;
GRANT USAGE ON PROCEDURE SALES_INTELLIGENCE.DATA.CALL_CORTEX_AGENT_PROC(VARCHAR, NUMBER) TO ROLE ANALYST;
Go to your Microsoft Copilot Studio.
After several seconds you should see your new Sales Agent.
Go to Topics in the top menu bar.
Select the plus sign under the Question
A new browser window will open with the Flow
Enter the Snowflake instance URL here, the same as the SaaS URL used in the last step.
CALL sales_intelligence.data.call_cortex_agent_proc('<>', 1);
Go back to the Copilot Studio browser window.
Go back to the Flow window and select Submit SQL Statement for Execution
Now let's test the Agent!
Feel free to play around with different questions based on the data we loaded earlier in the quickstart, for example: "Which sales rep closed the highest-value deal?" or "What compliance concerns did HealthTech Solutions raise?"
This quickstart showed you how to set up the connection to Cortex Agents from MS Copilot. To make your Copilot more robust and versatile, you will likely add additional triggers to other agents or knowledge sources so that it can act on specific prompts.
By following this quickstart, you learned how to: