From 88ad7da753c458568e2558d1f5037c9efae4a571 Mon Sep 17 00:00:00 2001 From: Alex Dinsmoor Date: Mon, 1 Dec 2025 10:37:55 +1100 Subject: [PATCH 1/4] add GPT Action for Snowflake Cortex Analyst --- .../gpt_action_snowflake_cortex_analyst.ipynb | 553 ++++++++++++++++++ 1 file changed, 553 insertions(+) create mode 100644 examples/chatgpt/gpt_actions_library/gpt_action_snowflake_cortex_analyst.ipynb diff --git a/examples/chatgpt/gpt_actions_library/gpt_action_snowflake_cortex_analyst.ipynb b/examples/chatgpt/gpt_actions_library/gpt_action_snowflake_cortex_analyst.ipynb new file mode 100644 index 0000000000..cc1c692a9f --- /dev/null +++ b/examples/chatgpt/gpt_actions_library/gpt_action_snowflake_cortex_analyst.ipynb @@ -0,0 +1,553 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# GPT Actions - Snowflake direct\n", + "## Introduction" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This page provides an instruction & guide for developers building a GPT Action for a specific application. Before you proceed, make sure to first familiarize yourself with the following information:\n", + "\n", + "\n", + "\n", + "* [Introduction to GPT Actions](https://platform.openai.com/docs/actions)\n", + "* [Introduction to GPT Actions Library](https://platform.openai.com/docs/actions/actions-library)\n", + "* [Example of Building a GPT Action from Scratch](https://platform.openai.com/docs/actions/getting-started)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This particular GPT Action provides an overview of how to connect to a Snowflake Data Warehouse. This Action takes a user’s question, scans the relevant tables to gather the data schema, then writes a SQL query to answer the user’s question.\n", + "\n", + "Note: This cookbook returns back a [ResultSet SQL statement](https://docs.snowflake.com/en/developer-guide/sql-api/handling-responses#getting-the-data-from-the-results), rather than the full result that is not limited by GPT Actions application/json payload limit. For production and advanced use-case, a middleware is required to return back a CSV file. You can follow instructions in the [GPT Actions - Snowflake Middleware cookbook](../gpt_actions_library/gpt_action_snowflake_middleware) to implement this flow instead." 
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Value + Example Business Use Cases" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "Value: Users can now leverage ChatGPT's natural language capability to connect directly to Snowflake’s Data Warehouse.\n", + "\n", + "Example Use Cases:\n", + "\n", + "\n", + "\n", + "* Data scientists can connect to tables and run data analyses using ChatGPT's Data Analysis\n", + "* Citizen data users can ask basic questions of their transactional data\n", + "* Users gain more visibility into their data & potential anomalies" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Application Information" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Application Key Links" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "Check out these links from the application before you get started:\n", + "\n", + "* Application Website: [https://app.snowflake.com/](https://app.snowflake.com/)\n", + "* Application API Documentation: [https://docs.snowflake.com/en/developer-guide/sql-api/intro](https://docs.snowflake.com/en/developer-guide/sql-api/intro)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Application Prerequisites" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Before you get started, make sure you go through the following steps in your application environment:\n", + "\n", + "* Provision a Snowflake Data Warehouse\n", + "* Ensure that the user authenticating into Snowflake via ChatGPT has access to the database, schemas, and tables with the necessary role" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 1. Configure the Custom GPT" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "### Set GPT Instructions\n", + "\n", + "Once you've created a Custom GPT, copy the text below in the Instructions panel. Have questions? Check out [Getting Started Example](https://platform.openai.com/docs/actions/getting-started) to see how this step works in more detail." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "vscode": { + "languageId": "plaintext" + } + }, + "outputs": [], + "source": [ + "**Context**: You are an expert at writing Snowflake SQL queries. A user is going to ask you a question. \n", + "\n", + "**Instructions**:\n", + "1. No matter the user's question, start by running `runQuery` operation using this query: \"SELECT column_name, table_name, data_type, comment FROM {database}.INFORMATION_SCHEMA.COLUMNS\" \n", + "-- Assume warehouse = \"\", database = \"\", unless the user provides different values \n", + "2. Convert the user's question into a SQL statement that leverages the step above and run the `runQuery` operation on that SQL statement to confirm the query works. Add a limit of 100 rows\n", + "3. Now remove the limit of 100 rows and return back the query for the user to see\n", + "4. Use the role when querying Snowflake\n", + "5. Run each step in sequence. Explain what you are doing in a few sentences, run the action, and then explain what you learned. This will help the user understand the reason behind your workflow. \n", + "\n", + "**Additional Notes**: If the user says \"Let's get started\", explain that the user can provide a project or dataset, along with a question they want answered. 
If the user has no ideas, suggest that we have a sample flights dataset they can query - ask if they want you to query that" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "### OpenAPI Schema\n", + "\n", + "Once you've created a Custom GPT, copy the text below in the Actions panel. Update the servers url to match your Snowflake Account Name url plus `/api/v2` as described [here](https://docs.snowflake.com/en/user-guide/organizations-connect#standard-account-urls). Have questions? Check out [Getting Started Example](https://platform.openai.com/docs/actions/getting-started) to see how this step works in more detail." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "vscode": { + "languageId": "yaml" + } + }, + "outputs": [], + "source": [ + "openapi: 3.1.0\n", + "info:\n", + " title: Snowflake Statements API\n", + " version: 1.0.0\n", + " description: API for executing statements in Snowflake with specific warehouse and role settings.\n", + "servers:\n", + " - url: 'https://-.snowflakecomputing.com/api/v2'\n", + "\n", + "\n", + "paths:\n", + " /statements:\n", + " post:\n", + " summary: Execute a SQL statement in Snowflake\n", + " description: This endpoint allows users to execute a SQL statement in Snowflake, specifying the warehouse and roles to use.\n", + " operationId: runQuery\n", + " tags:\n", + " - Statements\n", + " requestBody:\n", + " required: true\n", + " content:\n", + " application/json:\n", + " schema:\n", + " type: object\n", + " properties:\n", + " warehouse:\n", + " type: string\n", + " description: The name of the Snowflake warehouse to use for the statement execution.\n", + " role:\n", + " type: string\n", + " description: The Snowflake role to assume for the statement execution.\n", + " statement:\n", + " type: string\n", + " description: The SQL statement to execute.\n", + " required:\n", + " - warehouse\n", + " - role\n", + " - statement\n", + " responses:\n", + " '200':\n", + " description: Successful execution of the SQL statement.\n", + " content:\n", + " application/json:\n", + " schema:\n", + " type: object\n", + " properties:\n", + " status:\n", + " type: string\n", + " data:\n", + " type: object\n", + " additionalProperties: true\n", + " '400':\n", + " description: Bad request, e.g., invalid SQL statement or missing parameters.\n", + " '401':\n", + " description: Authentication error, invalid API credentials.\n", + " '500':\n", + " description: Internal server error." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 2. Configure Snowflake Integration" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Below are instructions on setting up authentication with this 3rd party application. Have questions? Check out [Getting Started Example](https://platform.openai.com/docs/actions/getting-started) to see how this step works in more detail." 
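+ ,"\n",
+ "\n",
+ "Once you've completed the steps below, a quick way to confirm that the objects were created (a minimal sanity check; the object names assume the examples used in this section) is to run the following from a Snowflake worksheet:\n",
+ "\n",
+ "```sql\n",
+ "SHOW NETWORK POLICIES;\n",
+ "SHOW SECURITY INTEGRATIONS LIKE 'CHATGPT_INTEGRATION';\n",
+ "```"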
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Configure IP Whitelisting for ChatGPT\n", + "Snowflake accounts with network policies that limit connections by IP, may require exceptions to be added for ChatGPT.\n", + "* Review the Snowflake documentation on [Network Policies](https://docs.snowflake.com/en/user-guide/network-policies)\n", + "* Go to the Snowflake Worksheets\n", + "* Create a network rule with the ChatGPT IP egress ranges listed [here](https://platform.openai.com/docs/actions/production/ip-egress-ranges#ip-egress-ranges)\n", + "* Create a corresponding Network Policy" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "vscode": { + "languageId": "yaml" + } + }, + "outputs": [], + "source": [ + "## ChatGPT IP ranges available at https://openai.com/chatgpt-actions.json\n", + "CREATE NETWORK RULE chatgpt_network_rule\n", + " MODE = INGRESS\n", + " TYPE = IPV4\n", + " VALUE_LIST = ('23.102.140.112/28',...,'40.84.180.128/28');\n", + "\n", + "CREATE NETWORK POLICY chatgpt_network_policy\n", + " ALLOWED_NETWORK_RULE_LIST = ('chatgpt_network_rule');" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Network policies can be applied at the account, security integration, and user level. The most specific network policy overrides the more general network policies. Depending on how these policies are applied, you may need to alter the policies for individual users in addition to the security integration. If you face this issue, you may encounter Snowflake's error code 390422 or a generic \"Invalid Client\" error." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Create the Security Integration\n", + "* Review the Snowflake OAuth Overview: [https://docs.snowflake.com/en/user-guide/oauth-snowflake-overview](https://docs.snowflake.com/en/user-guide/oauth-snowflake-overview)\n", + "* Create new OAuth credentials through a [Security Integration](https://docs.snowflake.com/en/sql-reference/sql/create-security-integration-oauth-snowflake) - you will need a new one for each OAuth app/custom GPT since Snowflake Redirect URIs are 1-1 mapped to Security Integrations" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "vscode": { + "languageId": "yaml" + } + }, + "outputs": [], + "source": [ + "CREATE SECURITY INTEGRATION CHATGPT_INTEGRATION\n", + " TYPE = OAUTH\n", + " ENABLED = TRUE\n", + " OAUTH_CLIENT = CUSTOM\n", + " OAUTH_CLIENT_TYPE = 'CONFIDENTIAL'\n", + " OAUTH_REDIRECT_URI = 'https://oauth.pstmn.io/v1/callback' --- // this is a temporary value while testing your integration. You will replace this with the value your GPT provides\n", + " OAUTH_ISSUE_REFRESH_TOKENS = TRUE\n", + " OAUTH_REFRESH_TOKEN_VALIDITY = 7776000\n", + " NETWORK_POLICY = chatgpt_network_policy; --- // this line should only be included if you followed step 1 above" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
\n", + " Optional: Automate Network Rule Configuration\n", + " \n", + " There are now over 100 egress IP addresses used by ChatGPT. The list updates irregularly and without announcement. To keep up to date with it, we can fetch the list on a daily basis and apply it to our network rule.\n", + "\n", + " ### Network rule to allow outbound traffic to OpenAI\n", + " ```sql\n", + " CREATE OR REPLACE NETWORK RULE chatgpt_actions_rule\n", + " MODE = EGRESS -- outbound\n", + " TYPE = HOST_PORT\n", + " VALUE_LIST = ('openai.com:443');\n", + " ```\n", + " ### Access Integration to apply the rule\n", + " ```sql\n", + " CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION chatgpt_actions_integration\n", + " ALLOWED_NETWORK_RULES = (chatgpt_actions_rule)\n", + " ENABLED = TRUE;\n", + " ```\n", + "\n", + " ### UDF to Fetch the IP ranges\n", + " ```sql\n", + " CREATE OR REPLACE FUNCTION getChatGPTActionsAddresses()\n", + " RETURNS ARRAY -- array\n", + " LANGUAGE PYTHON\n", + " RUNTIME_VERSION = 3.10\n", + " PACKAGES = ('requests')\n", + " EXTERNAL_ACCESS_INTEGRATIONS = (chatgpt_actions_integration)\n", + " HANDLER = 'get_ip_address_ranges'\n", + "AS\n", + "$$\n", + "import requests\n", + "\n", + "def get_ip_address_ranges():\n", + " resp = requests.get(\"https://openai.com/chatgpt-actions.json\", timeout=10)\n", + " resp.raise_for_status()\n", + " data = [entry[\"ipv4Prefix\"] for entry in resp.json().get(\"prefixes\", []) if \"ipv4Prefix\" in entry]\n", + " return data\n", + "$$;\n", + " ```\n", + " ### Procedure to update the network rule\n", + " ```sql\n", + " CREATE OR REPLACE PROCEDURE update_chatgpt_network_rule()\n", + " RETURNS STRING\n", + " LANGUAGE SQL\n", + "AS\n", + "$$\n", + "DECLARE\n", + " ip_list STRING;\n", + "BEGIN\n", + " -- Properly quote the IPs for use in VALUE_LIST\n", + " ip_list := '''' || ARRAY_TO_STRING(getChatGPTActionsAddresses(), ''',''') || '''';\n", + "\n", + " -- Run the dynamic SQL to update the rule\n", + " EXECUTE IMMEDIATE\n", + " 'ALTER NETWORK RULE chatgpt_network_rule SET VALUE_LIST = (' || ip_list || ')';\n", + "\n", + " RETURN 'chatgpt_network_rule updated with ' || ARRAY_SIZE(getChatGPTActionsAddresses()) || ' entries';\n", + "END;\n", + "$$;\n", + " ```\n", + "\n", + " ### Call the procedure\n", + " ```sql\n", + " CALL update_chatgpt_network_rule();\n", + " ```\n", + "\n", + " ### Run the procedure every day at 6AM Pacific Time\n", + " ```sql\n", + " CREATE OR REPLACE TASK auto_update_chatgpt_network_rule\n", + " WAREHOUSE = COMPUTE_WH\n", + " SCHEDULE = 'USING CRON 0 6 * * * America/Los_Angeles'\n", + "AS\n", + " CALL update_chatgpt_network_rule();\n", + " ```\n", + "
" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 3. Configure GPT Action Authentication\n", + "### Gather key information from Snowflake\n", + "* Retrieve your OAuth Client ID, Auth URL, and Token URL\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "vscode": { + "languageId": "yaml" + } + }, + "outputs": [], + "source": [ + "DESCRIBE SECURITY INTEGRATION CHATGPT_INTEGRATION;" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "You’ll find the required information in these 3 rows:" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "![../../../images/snowflake_direct_oauth.png](../../../images/snowflake_direct_oauth.png)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "* Retrieve your OAuth Client Secret using SHOW_OAUTH_CLIENT_SECRETS" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "vscode": { + "languageId": "yaml" + } + }, + "outputs": [], + "source": [ + "SELECT \n", + "trim(parse_json(SYSTEM$SHOW_OAUTH_CLIENT_SECRETS('CHATGPT_INTEGRATION')):OAUTH_CLIENT_ID) AS OAUTH_CLIENT_ID\n", + ", trim(parse_json(SYSTEM$SHOW_OAUTH_CLIENT_SECRETS('CHATGPT_INTEGRATION')):OAUTH_CLIENT_SECRET) AS OAUTH_CLIENT_SECRET;" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now is a good time to [test your Snowflake integration in Postman](https://community.snowflake.com/s/article/How-to-configure-postman-for-testing-SQL-API-with-OAuth). If you configured a network policy for your security integration, ensure that it includes the IP of the machine you're using to test." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Set OAuth Values in GPT Action Authentication" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In ChatGPT, click on \"Authentication\" and choose \"OAuth\". Enter in the information below.\n", + "\n", + "| Form Field | Value |\n", + "| -------- | -------- |\n", + "| Authentication Type | OAuth |\n", + "| Client ID | OAUTH_CLIENT_ID from SHOW_OAUTH_CLIENT_SECRETS |\n", + "| Client Secret | OAUTH_CLIENT_SECRET from SHOW_OAUTH_CLIENT_SECRETS |\n", + "| Authorization URL | OAUTH_AUTHORIZATION_ENDPOINT from DESCRIBE SECURITY INTEGRATION |\n", + "| Token URL | OAUTH_TOKEN_ENDPOINT from DESCRIBE SECURITY INTEGRATION |\n", + "| Scope | session:role:CHATGPT_INTEGRATION_ROLE* |\n", + "| Token Exchange Method | Default (POST Request) |\n", + "\n", + "\n", + "*Snowflake scopes pass the role in the format `session:role:` for example `session:role:CHATGPT_INTEGRATION_ROLE`. You can optionally leave this field empty and specify the role in the GPT instructions, but by adding it here it becomes included in OAuth Consent Request which can sometimes be more reliable. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 4. 
Update the Snowflake Integration Redirect URI"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "\n",
+ "Once you've set up authentication in ChatGPT, follow the steps below in the application to finalize the Action.\n",
+ "\n",
+ "* Copy the callback URL from the GPT Action\n",
+ "* Update the Redirect URI in your Security Integration to the callback URL provided in ChatGPT.\n"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "vscode": {
+ "languageId": "yaml"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "ALTER SECURITY INTEGRATION CHATGPT_INTEGRATION SET OAUTH_REDIRECT_URI='https://chat.openai.com/aip//oauth/callback';"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### FAQ & Troubleshooting"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "* This guide is intended to illustrate general concepts and is provided for reference purposes only. We are unable to provide full support for the third-party API integration. \n",
+ "* The callback URL can change if you update the YAML, so double-check that it is correct when making changes.\n",
+ "* _Callback URL Error:_ If you get a callback URL error in ChatGPT, pay close attention to the Post-Action Steps above. You need to add the callback URL directly into your Security Integration for the action to authenticate correctly.\n",
+ "* _Schema calls the wrong warehouse or database:_ If ChatGPT calls the wrong warehouse or database, consider updating your instructions to make it more explicit either (a) which warehouse / database should be called or (b) to require the user to provide those exact details before it runs the query.\n"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "\n",
+ "_Are there integrations that you’d like us to prioritize? Are there errors in our integrations? 
File a PR or issue in our github, and we’ll take a look._\n" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.11.6" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} From ef2649edff69a88719c4dfe518a1a192f4bbbe91 Mon Sep 17 00:00:00 2001 From: Alex Dinsmoor Date: Mon, 1 Dec 2025 10:42:35 +1100 Subject: [PATCH 2/4] sync updates --- .../gpt_action_snowflake_cortex_analyst.ipynb | 128 +++++++++++------- 1 file changed, 77 insertions(+), 51 deletions(-) diff --git a/examples/chatgpt/gpt_actions_library/gpt_action_snowflake_cortex_analyst.ipynb b/examples/chatgpt/gpt_actions_library/gpt_action_snowflake_cortex_analyst.ipynb index cc1c692a9f..10164a2b62 100644 --- a/examples/chatgpt/gpt_actions_library/gpt_action_snowflake_cortex_analyst.ipynb +++ b/examples/chatgpt/gpt_actions_library/gpt_action_snowflake_cortex_analyst.ipynb @@ -4,7 +4,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# GPT Actions - Snowflake direct\n", + "# GPT Actions - Snowflake Cortex Analyst\n", "## Introduction" ] }, @@ -21,13 +21,24 @@ "* [Example of Building a GPT Action from Scratch](https://platform.openai.com/docs/actions/getting-started)" ] }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "vscode": { + "languageId": "plaintext" + } + }, + "outputs": [], + "source": [] + }, { "cell_type": "markdown", "metadata": {}, "source": [ - "This particular GPT Action provides an overview of how to connect to a Snowflake Data Warehouse. This Action takes a user’s question, scans the relevant tables to gather the data schema, then writes a SQL query to answer the user’s question.\n", + "This particular GPT Action provides an overview of how to connect to the Snowflake Cortex Analyst REST API. This Action takes a user's question and submits it to Snowflake Cortex Analyst to draft the appropriate SQL logic based on semantic view definitions in Snowflake. \n", "\n", - "Note: This cookbook returns back a [ResultSet SQL statement](https://docs.snowflake.com/en/developer-guide/sql-api/handling-responses#getting-the-data-from-the-results), rather than the full result that is not limited by GPT Actions application/json payload limit. For production and advanced use-case, a middleware is required to return back a CSV file. You can follow instructions in the [GPT Actions - Snowflake Middleware cookbook](../gpt_actions_library/gpt_action_snowflake_middleware) to implement this flow instead." + "Note: This cookbook returns back a message object that includes message text and SQL (if applicable). See the [documentation here for more details](https://docs.snowflake.com/en/user-guide/snowflake-cortex/cortex-analyst/rest-api). It does not executed the SQL statement. For that capability, consider using the [GPT Action - Snowlfake Direct](https://cookbook.openai.com/examples/chatgpt/gpt_actions_library/gpt_action_snowflake_direct) to then execute SQL statements. 
" ] }, { @@ -42,15 +53,12 @@ "metadata": {}, "source": [ "\n", - "Value: Users can now leverage ChatGPT's natural language capability to connect directly to Snowflake’s Data Warehouse.\n", + "Value: Users can now leverage ChatGPT's capabilities (natural language, Advanced Data Analysis, other tools) alongside Snowflake semantic views to ensure \n", "\n", "Example Use Cases:\n", - "\n", - "\n", - "\n", - "* Data scientists can connect to tables and run data analyses using ChatGPT's Data Analysis\n", - "* Citizen data users can ask basic questions of their transactional data\n", - "* Users gain more visibility into their data & potential anomalies" + "* Users gain more visibility into their data & metric definitions\n", + "* Central data teams can set guardrails to ensure only governed metric definitions are used in conversational analytics use cases \n", + "* Citizen data users can ask basic questions of their transactional data" ] }, { @@ -75,7 +83,8 @@ "Check out these links from the application before you get started:\n", "\n", "* Application Website: [https://app.snowflake.com/](https://app.snowflake.com/)\n", - "* Application API Documentation: [https://docs.snowflake.com/en/developer-guide/sql-api/intro](https://docs.snowflake.com/en/developer-guide/sql-api/intro)" + "* Snowflake semantic view overview [https://docs.snowflake.com/en/user-guide/views-semantic/overview](https://docs.snowflake.com/en/user-guide/views-semantic/overview)\n", + "* Application API Documentation: [https://docs.snowflake.com/en/user-guide/snowflake-cortex/cortex-analyst/rest-api](https://docs.snowflake.com/en/user-guide/snowflake-cortex/cortex-analyst/rest-api)" ] }, { @@ -109,7 +118,9 @@ "\n", "### Set GPT Instructions\n", "\n", - "Once you've created a Custom GPT, copy the text below in the Instructions panel. Have questions? Check out [Getting Started Example](https://platform.openai.com/docs/actions/getting-started) to see how this step works in more detail." + "Once you've created a Custom GPT, add instructions based on the behavior you'd like to set. Have questions? Check out [Getting Started Example](https://platform.openai.com/docs/actions/getting-started) to see how this step works in more detail.\n", + "\n", + "See below for an example:" ] }, { @@ -122,17 +133,14 @@ }, "outputs": [], "source": [ - "**Context**: You are an expert at writing Snowflake SQL queries. A user is going to ask you a question. \n", + "**Context**: You are an expert at writing Snowflake Semantic View requests and explaining SQL queries + their results. A user is going to ask you a question. \n", "\n", "**Instructions**:\n", - "1. No matter the user's question, start by running `runQuery` operation using this query: \"SELECT column_name, table_name, data_type, comment FROM {database}.INFORMATION_SCHEMA.COLUMNS\" \n", - "-- Assume warehouse = \"\", database = \"\", unless the user provides different values \n", - "2. Convert the user's question into a SQL statement that leverages the step above and run the `runQuery` operation on that SQL statement to confirm the query works. Add a limit of 100 rows\n", - "3. Now remove the limit of 100 rows and return back the query for the user to see\n", - "4. Use the role when querying Snowflake\n", - "5. Run each step in sequence. Explain what you are doing in a few sentences, run the action, and then explain what you learned. This will help the user understand the reason behind your workflow. \n", + "1. 
If the user's question could be answered with a SQL query, try sending to the sendCortexAnalystMessage action to request that information from Snowflake. ALWAYS use this action first before trying to write your own SQL. Use semantic_view: {database}.{schema}.{your_semantic_view} by default. \n", + "2. ALWAYS show the generated SQL to the user, explain what info it retrieved succinctly, and ask if they'd like to execute it.\n", + "3. If so, call the runQuery action. Use the role when querying Snowflake\n", "\n", - "**Additional Notes**: If the user says \"Let's get started\", explain that the user can provide a project or dataset, along with a question they want answered. If the user has no ideas, suggest that we have a sample flights dataset they can query - ask if they want you to query that" + "**Additional Notes**: If the user says \"Let's get started\", explain that the user can provide a project or dataset, along with a question they want answered. If the user has no ideas, suggest that we have a semantic model on food truck order data that they can explore. Additionally, the data spans only 2019-2022." ] }, { @@ -157,21 +165,17 @@ "source": [ "openapi: 3.1.0\n", "info:\n", - " title: Snowflake Statements API\n", + " title: Snowflake Cortex Analyst\n", " version: 1.0.0\n", - " description: API for executing statements in Snowflake with specific warehouse and role settings.\n", + " description: >\n", + " OpenAPI specification exposing the capability to query semantic models via Cortex Analyst\n", "servers:\n", - " - url: 'https://-.snowflakecomputing.com/api/v2'\n", - "\n", - "\n", + " - url: https://-.snowflakecomputing.com\n", "paths:\n", - " /statements:\n", + " /api/v2/cortex/analyst/message:\n", " post:\n", - " summary: Execute a SQL statement in Snowflake\n", - " description: This endpoint allows users to execute a SQL statement in Snowflake, specifying the warehouse and roles to use.\n", - " operationId: runQuery\n", - " tags:\n", - " - Statements\n", + " operationId: sendCortexAnalystMessage\n", + " summary: Send a message to the Cortex Analyst for semantic model querying\n", " requestBody:\n", " required: true\n", " content:\n", @@ -179,38 +183,60 @@ " schema:\n", " type: object\n", " properties:\n", - " warehouse:\n", - " type: string\n", - " description: The name of the Snowflake warehouse to use for the statement execution.\n", - " role:\n", - " type: string\n", - " description: The Snowflake role to assume for the statement execution.\n", - " statement:\n", + " messages:\n", + " type: array\n", + " items:\n", + " type: object\n", + " properties:\n", + " role:\n", + " type: string\n", + " enum:\n", + " - analyst\n", + " - user\n", + " content:\n", + " type: array\n", + " items:\n", + " type: object\n", + " properties:\n", + " type:\n", + " type: string\n", + " enum:\n", + " - sql\n", + " - text\n", + " required:\n", + " - type\n", + " required:\n", + " - role\n", + " - content\n", + " semantic_view:\n", " type: string\n", - " description: The SQL statement to execute.\n", + " description: Name of the semantic model to use\n", + " stream:\n", + " type: boolean\n", + " default: false\n", " required:\n", - " - warehouse\n", - " - role\n", - " - statement\n", + " - messages\n", + " - semantic_view\n", " responses:\n", - " '200':\n", - " description: Successful execution of the SQL statement.\n", + " \"200\":\n", + " description: Successful response from Cortex Analyst\n", " content:\n", " application/json:\n", " schema:\n", " type: object\n", + " description: The full response 
from Cortex Analyst\n",
 " properties:\n",
- " status:\n",
+ " message:\n",
 " type: string\n",
 " data:\n",
 " type: object\n",
 " additionalProperties: true\n",
- " '400':\n",
- " description: Bad request, e.g., invalid SQL statement or missing parameters.\n",
- " '401':\n",
- " description: Authentication error, invalid API credentials.\n",
- " '500':\n",
- " description: Internal server error."
+ " \"400\":\n",
+ " description: Bad request – invalid input or model\n",
+ " \"401\":\n",
+ " description: Unauthorized – check authentication headers\n",
+ " \"500\":\n",
+ " description: Internal server error"
 ]
 },
 {

From 655ec75a1fde7f490f3aca7c6b80640893d95408 Mon Sep 17 00:00:00 2001
From: Alex Dinsmoor
Date: Mon, 1 Dec 2025 10:46:21 +1100
Subject: [PATCH 3/4] update value statement and correct typo

---
 .../gpt_action_snowflake_cortex_analyst.ipynb | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/examples/chatgpt/gpt_actions_library/gpt_action_snowflake_cortex_analyst.ipynb b/examples/chatgpt/gpt_actions_library/gpt_action_snowflake_cortex_analyst.ipynb
index 10164a2b62..2604648bbd 100644
--- a/examples/chatgpt/gpt_actions_library/gpt_action_snowflake_cortex_analyst.ipynb
+++ b/examples/chatgpt/gpt_actions_library/gpt_action_snowflake_cortex_analyst.ipynb
@@ -38,7 +38,7 @@
 "source": [
 "This particular GPT Action provides an overview of how to connect to the Snowflake Cortex Analyst REST API. This Action takes a user's question and submits it to Snowflake Cortex Analyst to draft the appropriate SQL logic based on semantic view definitions in Snowflake. \n",
 "\n",
- "Note: This cookbook returns back a message object that includes message text and SQL (if applicable). See the [documentation here for more details](https://docs.snowflake.com/en/user-guide/snowflake-cortex/cortex-analyst/rest-api). It does not executed the SQL statement. For that capability, consider using the [GPT Action - Snowlfake Direct](https://cookbook.openai.com/examples/chatgpt/gpt_actions_library/gpt_action_snowflake_direct) to then execute SQL statements. "
+ "Note: This cookbook returns back a message object that includes message text and SQL (if applicable). See the [documentation here for more details](https://docs.snowflake.com/en/user-guide/snowflake-cortex/cortex-analyst/rest-api). It does not execute the SQL statement. For that capability, consider using the [GPT Action - Snowflake Direct](https://cookbook.openai.com/examples/chatgpt/gpt_actions_library/gpt_action_snowflake_direct) to then execute SQL statements. 
" ] }, { @@ -53,7 +53,7 @@ "metadata": {}, "source": [ "\n", - "Value: Users can now leverage ChatGPT's capabilities (natural language, Advanced Data Analysis, other tools) alongside Snowflake semantic views to ensure \n", + "Value: Users can now leverage ChatGPT's capabilities (natural language, Advanced Data Analysis, other tools) alongside Snowflake semantic views to enable trusted conversational analytics.\n", "\n", "Example Use Cases:\n", "* Users gain more visibility into their data & metric definitions\n", From c6e3ce1738c895cf5ac7920f506dac054737fae9 Mon Sep 17 00:00:00 2001 From: Alex Dinsmoor Date: Mon, 1 Dec 2025 10:58:51 +1100 Subject: [PATCH 4/4] add authors.yaml and registry.yaml entry --- authors.yaml | 5 +++++ registry.yaml | 10 ++++++++++ 2 files changed, 15 insertions(+) diff --git a/authors.yaml b/authors.yaml index 92b89168e9..1ff455d528 100644 --- a/authors.yaml +++ b/authors.yaml @@ -547,3 +547,8 @@ derrickchoi-openai: name: "Derrick Choi" website: "https://www.linkedin.com/in/derrickchoi/" avatar: "https://avatars.githubusercontent.com/u/211427900" + +alex-dinsmoor-openai: + name: "Alex Dinsmoor" + website: "https://www.linkedin.com/in/alex-dinsmoor-75001158/" + avatar: "https://avatars.githubusercontent.com/u/245790601" \ No newline at end of file diff --git a/registry.yaml b/registry.yaml index bfebed7d39..62bb2a2af4 100644 --- a/registry.yaml +++ b/registry.yaml @@ -2196,6 +2196,16 @@ - chatgpt - chatgpt-data +- title: GPT Actions library - Snowflake Cortex Analyst + path: examples/chatgpt/gpt_actions_library/gpt_action_snowflake_cortex_analyst.ipynb + date: 2025-12-01 + authors: + - alex-dinsmoor-openai + tags: + - gpt-actions-library + - chatgpt + - chatgpt-data + - title: GPT Actions library - Retool Workflow path: examples/chatgpt/gpt_actions_library/gpt_action_retool_workflow.md date: 2024-08-28