Analyze Cortex Function Spend

Morgan Sadr-Hashemi, VP of Engineering at SELECT

With the explosion in LLM usage, Snowflake has released what they're calling Snowflake Cortex.

For a full breakdown of what Cortex is, you can check out their documentation, but in short:

Snowflake Cortex gives you instant access to industry-leading large language models (LLMs) trained by researchers at companies like Anthropic, Mistral, Reka, Meta, and Google, including Snowflake Arctic, an open enterprise-grade model developed by Snowflake.

Because these LLMs are fully hosted and managed by Snowflake, using them requires no setup. Your data stays within Snowflake, giving you the performance, scalability, and governance you expect.
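To illustrate that zero-setup experience, a Cortex LLM function can be called like any other SQL function. A minimal sketch, assuming the COMPLETE function and the mistral-large model are available in your region (the table and column names here are hypothetical):

```sql
-- Call a Cortex LLM function directly in SQL; no infrastructure setup.
-- Model availability varies by region; 'mistral-large' is one example.
SELECT SNOWFLAKE.CORTEX.COMPLETE(
    'mistral-large',
    'Summarize in one sentence: ' || review_text
) AS summary
FROM product_reviews
LIMIT 10;
```

Each call like this consumes Cortex Function Credits on top of the warehouse compute for the query itself, which is what the rest of this post is about.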

Naturally, many of our customers are now using Cortex functions to run the world's most cutting-edge models against their data. However, Cortex functions carry an additional cost beyond a standard query. Because they run inside a regular query, you still pay the usual warehouse compute and storage costs, but on top of those you now pay a new charge: Cortex Function Credits.

Snowflake Cortex LLM functions incur compute cost based on the number of tokens processed. Refer to the Snowflake Service Consumption Table for each function’s cost in credits per million tokens.

A token is the smallest unit of text processed by Snowflake Cortex LLM functions, approximately equal to four characters. The equivalence of raw input or output text to tokens can vary by model.
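Putting those two quotes together, a rough back-of-envelope estimate is: credits ≈ (characters ÷ 4) ÷ 1,000,000 × the per-model rate. A sketch, assuming a placeholder rate of 1.2 credits per million tokens and a hypothetical my_prompts table (look up the real per-model rate in the Snowflake Service Consumption Table):

```sql
-- Approximate token count (~4 characters per token) and credit cost.
-- 1.2 credits per million tokens is a placeholder, not a real price.
SELECT
    LENGTH(prompt_text)                  AS chars,
    LENGTH(prompt_text) / 4              AS approx_tokens,
    LENGTH(prompt_text) / 4 / 1e6 * 1.2  AS approx_credits
FROM my_prompts;
```

For an exact count, Snowflake also exposes a COUNT_TOKENS function in the SNOWFLAKE.CORTEX schema that tokenizes text for a given model, since the characters-to-tokens ratio varies by model.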

Very complicated. With the help of SELECT though, we can make it simpler for you to track and manage this new cost to your business.
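If you want to check the raw numbers yourself, the underlying spend data lives in Snowflake's account usage share. A sketch, assuming the CORTEX_FUNCTIONS_USAGE_HISTORY view and its TOKENS and TOKEN_CREDITS columns are available in your account:

```sql
-- Credits consumed by Cortex LLM functions, grouped by function and model.
SELECT
    function_name,
    model_name,
    SUM(tokens)        AS total_tokens,
    SUM(token_credits) AS total_credits
FROM snowflake.account_usage.cortex_functions_usage_history
GROUP BY function_name, model_name
ORDER BY total_credits DESC;
```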

We’ve added Cortex function cost to your Workloads in SELECT so you can see what they’re costing you there:

Screenshot: Cortex function cost in SELECT Workloads.

We have also enabled a metric for this cost type so you can see it over time in a graphical trend:

Screenshot: Cortex function cost trend over time.

If you would prefer, you can do the same with Credits as well instead of the monetary value.

The eagle-eyed among you may have spotted it in the screenshot above, but we've also added the Cortex models and function names used to your workloads, so you can see at a glance in the table which workloads are using which.

All of this should help you understand the trade-off of using different models, using different functions, and using LLMs vs. not using them as a business case. 🥳

Other Things We Shipped

  • 🚀 While we were at it, we did the same for Document AI: see your costs and credit usage for queries running Document AI pipelines.
