Snowflake has recently expanded its alerting capabilities, allowing alerts to be sent via email or webhook. Webhook alerts are especially powerful because they enable sending alerts to Slack, Microsoft Teams, PagerDuty, and more. This is useful for a variety of use cases, including pipeline failure alerts and data-driven alerts.
This post provides a step-by-step guide to configuring Snowflake alerts in Slack, including a detailed example that sends an alert when warehouse usage spikes.
Log in to your Slack account, then go to the Slack apps page: https://api.slack.com/apps
Click Create an App
Choose “From Scratch”
On the next screen, give the app a name, choose a workspace, then click Create App
The next screen will present your Application Credentials.
Click Incoming Webhooks in the left sidebar:
Toggle “Active Incoming Webhooks” to “On”
After you change the toggle, scroll down to the bottom of that page and click “Add New Webhook to Workspace”.
Choose which channel you want the Snowflake alerts to go to. I created a new channel called “snowflake-alerts”, then I refreshed this page to get it to show in the dropdown.
Click Allow.
After you click Allow, the next page will give you your Webhook URL and a test cURL command.
Copy and paste the test command into your terminal. If you’re on Windows, use Git Bash.
It should instantly send a message to your new Slack channel.
Congrats, your new Webhook is now working!
The Webhook URL contains a secret. Replace the secret string below with your secret string, then run this command in Snowflake.
Reminder: Secrets are schema level objects. Be careful about your worksheet database and schema context, or use fully qualified names.
In my case, I created a secret called gmds_slack_secret in the public schema of the analytics database.
use schema analytics.public;

CREATE OR REPLACE SECRET gmds_slack_secret
  TYPE = GENERIC_STRING
  SECRET_STRING = 'T00000000/B00000000/XXXXXXXXXXXXXXXXXXXXXXXX';
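As a quick sanity check on your schema context, you can list the secret you just created (SHOW SECRETS is standard Snowflake syntax; the schema name matches the one used above):

SHOW SECRETS IN SCHEMA analytics.public;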
Now that we’ve successfully created the secret, let’s create the Notification Integration:
CREATE OR REPLACE NOTIFICATION INTEGRATION gmds_slack_webhook_integration
  TYPE=WEBHOOK
  ENABLED=TRUE
  WEBHOOK_URL='https://hooks.slack.com/services/SNOWFLAKE_WEBHOOK_SECRET'
  WEBHOOK_SECRET=analytics.public.gmds_slack_secret
  WEBHOOK_BODY_TEMPLATE='{"text": "SNOWFLAKE_WEBHOOK_MESSAGE"}'
  WEBHOOK_HEADERS=('Content-Type'='application/json');
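Before wiring up alerts, you can confirm the integration exists and is enabled (standard Snowflake commands; the name matches the integration created above):

SHOW NOTIFICATION INTEGRATIONS;
DESC NOTIFICATION INTEGRATION gmds_slack_webhook_integration;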
To send a notification, we use the built-in SYSTEM$SEND_SNOWFLAKE_NOTIFICATION stored procedure. We must wrap the message in the SANITIZE_WEBHOOK_CONTENT function to remove placeholders (i.e. SNOWFLAKE_WEBHOOK_SECRET) from the message.
Here is an example call (the message text is just a placeholder):

CALL SYSTEM$SEND_SNOWFLAKE_NOTIFICATION(
    SNOWFLAKE.NOTIFICATION.TEXT_PLAIN(
        SNOWFLAKE.NOTIFICATION.SANITIZE_WEBHOOK_CONTENT('Hello from Snowflake!')
    ),
    SNOWFLAKE.NOTIFICATION.INTEGRATION('gmds_slack_webhook_integration')
);
The notification arrived in Slack instantly!
Now that we have the basic gears in place, let’s move on to a real-world example.
The following query compares recent usage (the last completed hour) to the average hourly usage of each warehouse over the last month. In this case, we will flag warehouses whose usage increased by 50% or more.
We are using a serverless task because it will save us money! Just omit the warehouse name to make the task serverless.
For example purposes, I’m going to union a dummy record to ensure that every execution of the task produces a row and an alert.
CREATE OR REPLACE TASK monitor_warehouse_spikes
SCHEDULE = 'USING CRON 2 * * * * America/New_York'
SERVERLESS_TASK_MIN_STATEMENT_SIZE = 'XSMALL'
SERVERLESS_TASK_MAX_STATEMENT_SIZE = 'XSMALL'
AS
INSERT INTO usage_spike_alerts (warehouse_name, last_hour_credits, avg_monthly_credits, credit_diff, percent_increase)
WITH last_hour_usage AS (
    SELECT warehouse_name, SUM(credits_used) AS last_hour_credits
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD(hour, -2, CURRENT_TIMESTAMP)
      AND end_time <= CURRENT_TIMESTAMP
    GROUP BY warehouse_name
),
-- the rest of the query is reconstructed from the description above
monthly_avg_usage AS (
    SELECT warehouse_name, AVG(credits_used) AS avg_monthly_credits -- metering rows are hourly
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD(month, -1, CURRENT_TIMESTAMP)
    GROUP BY warehouse_name
)
SELECT l.warehouse_name, l.last_hour_credits, m.avg_monthly_credits,
       l.last_hour_credits - m.avg_monthly_credits,
       (l.last_hour_credits - m.avg_monthly_credits) / m.avg_monthly_credits * 100
FROM last_hour_usage l
JOIN monthly_avg_usage m USING (warehouse_name)
WHERE l.last_hour_credits >= m.avg_monthly_credits * 1.5
UNION ALL
SELECT 'DUMMY_WAREHOUSE', 1.5, 1.0, 0.5, 50.0; -- dummy record so every run produces an alert
The task inserts into the following table, which needs to exist before the task runs:

CREATE OR REPLACE TABLE usage_spike_alerts (
    alert_id INT AUTOINCREMENT PRIMARY KEY,
    warehouse_name STRING NOT NULL,
    last_hour_credits FLOAT NOT NULL,
    avg_monthly_credits FLOAT NOT NULL,
    credit_diff FLOAT NOT NULL,
    percent_increase FLOAT NOT NULL,
    inserted_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    alert_sent BOOLEAN DEFAULT FALSE
);
Note this table has a few helper columns with default values that are not included in the query results: alert_id, inserted_at, and alert_sent.
Run the task and review the results:

execute task monitor_warehouse_spikes;
select * from usage_spike_alerts where not alert_sent;
We want to do the following: query the usage_spike_alerts table for unsent records (alert_sent = false), send a Slack message for each one, then mark those records as sent. Here's the stored procedure:
CREATE OR REPLACE PROCEDURE send_usage_spike_alerts()
RETURNS STRING
LANGUAGE PYTHON
RUNTIME_VERSION = '3.8'
PACKAGES = ('snowflake-snowpark-python')
HANDLER = 'send_alerts'
AS $$
import snowflake.snowpark as snowpark

def send_alerts(session):
    query = """
        SELECT warehouse_name, last_hour_credits, avg_monthly_credits, credit_diff, percent_increase
        FROM usage_spike_alerts
        WHERE alert_sent = FALSE
    """
    rows = session.sql(query).collect()
    # Send one Slack message per unsent row (the message format here is illustrative), then mark rows sent.
    for row in rows:
        msg = (f"Warehouse {row['WAREHOUSE_NAME']}: {row['LAST_HOUR_CREDITS']} credits last hour vs "
               f"{row['AVG_MONTHLY_CREDITS']} hourly average ({row['PERCENT_INCREASE']}% increase)")
        session.sql(f"""
            CALL SYSTEM$SEND_SNOWFLAKE_NOTIFICATION(
                SNOWFLAKE.NOTIFICATION.TEXT_PLAIN(SNOWFLAKE.NOTIFICATION.SANITIZE_WEBHOOK_CONTENT('{msg}')),
                SNOWFLAKE.NOTIFICATION.INTEGRATION('gmds_slack_webhook_integration'))
        """).collect()
    session.sql("UPDATE usage_spike_alerts SET alert_sent = TRUE WHERE alert_sent = FALSE").collect()
    return f"Sent {len(rows)} alert(s)"
$$;
I experienced challenges sending new lines (\n) to Slack via the SANITIZE_WEBHOOK_CONTENT function. Slack requires a text string literal with a double backslash (\\n), so a Python string of \\\\n should work. However, when Python passes this string literal to the SQL function, something goes wrong and the alert is not sent. I was not able to get Python-based alerts with new lines working, although this works when run manually: SANITIZE_WEBHOOK_CONTENT('line1\\nline2')
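For reference, here is the manual call with a new line that does send successfully, using the same integration created earlier in this post:

CALL SYSTEM$SEND_SNOWFLAKE_NOTIFICATION(
    SNOWFLAKE.NOTIFICATION.TEXT_PLAIN(
        SNOWFLAKE.NOTIFICATION.SANITIZE_WEBHOOK_CONTENT('line1\\nline2')
    ),
    SNOWFLAKE.NOTIFICATION.INTEGRATION('gmds_slack_webhook_integration')
);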
We’ve already inserted a test row into the “spikes” table. Let’s run the sproc and confirm the row gets marked as sent (alert_sent = true):
execute task monitor_warehouse_spikes; -- if you haven't already...
select * from usage_spike_alerts where not alert_sent; -- review it: alert_sent == false
CALL send_usage_spike_alerts(); -- send the alert
-- wait for the alert to come
select * from usage_spike_alerts where not alert_sent; -- 0 rows
You should have received an alert in Slack!
We want the hourly schedule of the monitor_warehouse_spikes task to handle the end-to-end process, so let's make sure that after the monitor_warehouse_spikes task runs, the procedure that sends the alert runs as well.
First, wrap the procedure in a serverless task:
CREATE TASK send_usage_spike_alerts_task
AS
CALL send_usage_spike_alerts();
Next, chain the tasks together, then resume the new task (a task must be suspended while you modify it, which is why the resume comes after the ADD AFTER):
ALTER TASK send_usage_spike_alerts_task
ADD AFTER monitor_warehouse_spikes;

alter task send_usage_spike_alerts_task resume;

-- when you're ready for the hourly schedule to run unattended, also resume the root task:
-- alter task monitor_warehouse_spikes resume;
Test it:
execute task monitor_warehouse_spikes;
This task should now add the new records to the table and send the alert(s) to Slack!
When calling the send_usage_spike_alerts() procedure directly, the message appears in Slack right away. When send_usage_spike_alerts_task is chained to the monitor_warehouse_spikes task, it takes up to 3 minutes for the alert to arrive.
From the Data sidebar menu in the Snowsight UI (database icon in the sidebar), navigate to the database and schema that contains your tasks. Expand the tasks container and select either task we created. Then click the “Graph” tab in the main frame. Here you can manually trigger the task.
You can also review the task history on the “Run History” tab:
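If you prefer SQL, the same information is available from the standard TASK_HISTORY table function (the task names here match the ones created in this post):

SELECT name, state, scheduled_time, completed_time
FROM TABLE(information_schema.task_history())
WHERE name IN ('MONITOR_WAREHOUSE_SPIKES', 'SEND_USAGE_SPIKE_ALERTS_TASK')
ORDER BY scheduled_time DESC;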
Checking for spikes and sending the alert could have been done in a single task, and one may argue that chaining two tasks adds unnecessary complexity. However, I like this design: the table provides a persistent log of every spike, and the alert_sent flag lets you audit or retry alert delivery independently of detection.
Sending Snowflake alerts to Slack is incredibly useful. The use cases for this type of alerting are endless!
This article has equipped you with the knowledge to create a Slack incoming webhook, store its secret in Snowflake, configure a webhook notification integration, and automate data-driven alerts with serverless tasks and a stored procedure.
I’m looking forward to hearing about the use cases you come up with! 🥂
Jeff Skoldberg is a Sales Engineer at SELECT, helping customers get maximum value out of the SELECT app to reduce their Snowflake spend. Prior to joining SELECT, Jeff was a Data and Analytics Consultant with 15+ years experience in automating insights and using data to control business processes. From a technology standpoint, he specializes in Snowflake + dbt + Tableau. From a business topic standpoint, he has experience in Public Utility, Clinical Trials, Publishing, CPG, and Manufacturing.