
Snowflake Cloud Services: Pricing and Monitoring

Jeff Skoldberg
Saturday, November 15, 2025

Snowflake's architecture separates into three distinct layers. The storage layer holds your data in cloud object storage (S3, Azure Blob, or GCS). The compute layer consists of virtual warehouses that execute your queries and process data. The Cloud Services layer orchestrates these components, handling authentication, metadata management, query compilation, optimization, access control, security, and infrastructure management. It runs on Snowflake-managed compute resources across multiple availability zones. Unlike virtual warehouses where you control the size and runtime, cloud services scales automatically based on workload demands.

Cloud Services billing is fairly unique: you only pay for it if your daily cloud services consumption exceeds 10% of your daily virtual warehouse usage. Because of this 10% adjustment, many customers never see cloud services charges on their bill. But when those charges do appear, they often signal usage patterns worth investigating.

This article explains how cloud services billing works, how to monitor it, and which patterns drive unexpected costs.

How the 10% Adjustment Works

The calculation happens daily, in UTC. Snowflake totals your warehouse compute credits and cloud services credits for the day. The adjustment equals the lesser of 10% of warehouse credits or the cloud services credits used. Your billable amount is cloud services credits minus the adjustment.

Example 1: Under the threshold (no charge)

  • Warehouse compute: 100 credits
  • Cloud services: 8 credits
  • Adjustment: 8 credits (cloud services is less than 10% threshold)
  • Cloud Services Billed: 0 credits
  • Total Billed: 100 credits

Example 2: Over the threshold (partial charge)

  • Warehouse compute: 100 credits
  • Cloud services: 20 credits
  • Adjustment: 10 credits (the 10% threshold amount)
  • Cloud Services Billed: 10 credits
  • Total Billed: 110 credits
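You can reproduce the daily math with a simple query. The credit figures below are the illustrative values from the two examples above, not real billing data:

```sql
-- The adjustment is the lesser of (10% of warehouse credits)
-- and (cloud services credits used)
SELECT
    warehouse_credits,
    cloud_services_credits,
    LEAST(0.10 * warehouse_credits, cloud_services_credits) AS adjustment,
    cloud_services_credits
      - LEAST(0.10 * warehouse_credits, cloud_services_credits) AS billed_cloud_services
FROM (VALUES (100, 8), (100, 20)) AS t (warehouse_credits, cloud_services_credits);
-- Row 1 bills 0 credits; row 2 bills 10 credits
```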

Important exception: Serverless features like Snowpipe, Automatic Clustering, and Materialized Views do NOT count toward the 10% adjustment. They have separate line-item billing.

What Consumes Snowflake Cloud Services Credits?

Cloud services credits are consumed by activities in Snowflake's services layer:

  • Query compilation and optimization - Every query goes through compilation before execution
  • Metadata operations - DDL commands (CREATE, ALTER, DROP), zero-copy cloning, SHOW commands, INFORMATION_SCHEMA queries
  • Authentication and access control - User login, role switching, permission checks
  • Query result caching - Managing and serving cached results
  • File listing operations - COPY commands list files from object storage before loading

Monitoring Snowflake Cloud Services Consumption

Snowflake's ACCOUNT_USAGE schema provides views for tracking cloud services with 2-hour latency and 365-day retention.

Daily Cloud Services Billing

Check which days resulted in actual charges:

SELECT
    usage_date,
    credits_used_cloud_services,
    credits_adjustment_cloud_services,
    credits_used_cloud_services + credits_adjustment_cloud_services AS billed_cloud_services,
    credits_used_compute,
    ROUND(credits_used_cloud_services / NULLIF(credits_used_compute, 0) * 100, 2) AS cs_pct_of_compute
FROM snowflake.account_usage.metering_daily_history
WHERE usage_date >= DATEADD(month, -1, CURRENT_TIMESTAMP())
    AND credits_used_cloud_services > 0
ORDER BY billed_cloud_services DESC;

Any positive billed_cloud_services value means you were charged that day.

Cloud Services by Query Type

Snowflake’s query_history view provides a column called credits_used_cloud_services, which is very useful for identifying which query types consume the most cloud services:

SELECT
    query_type,
    SUM(credits_used_cloud_services) AS total_cs_credits,
    COUNT(1) AS num_queries,
    AVG(credits_used_cloud_services) AS avg_cs_per_query
FROM snowflake.account_usage.query_history
WHERE start_time >= DATEADD(day, -7, CURRENT_TIMESTAMP())
    AND credits_used_cloud_services > 0
GROUP BY query_type
ORDER BY total_cs_credits DESC;

High Cloud Services Queries

We can use the same approach to isolate individual queries with high Cloud Services spend.

SELECT
    query_id,
    user_name,
    warehouse_name,
    query_type,
    credits_used_cloud_services,
    SUBSTRING(query_text, 1, 100) AS query_snippet
FROM snowflake.account_usage.query_history
WHERE start_time >= DATEADD(day, -7, CURRENT_TIMESTAMP())
    AND credits_used_cloud_services > 0.001
ORDER BY credits_used_cloud_services DESC
LIMIT 100;

Common Cost Drivers

In our work with Snowflake customers, certain patterns consistently emerge as cloud services cost drivers.

1. High-Frequency Metadata Queries

Running simple queries like SELECT 1, SELECT CURRENT_SESSION(), or SHOW commands tens of thousands of times per day adds up. Queries against the INFORMATION_SCHEMA also run entirely in cloud services (no warehouse compute).

Fix: Reduce the frequency of these health-check queries if your query history shows them running up the bill. You can switch to the account_usage schema instead of information_schema if its latency is acceptable, but do this with some caution: account_usage queries require a running warehouse, and it usually makes more sense to avoid turning one on.
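To check whether repeated lightweight statements are the culprit, you can group query_history by its query_hash column to surface the most-repeated query texts. The 7-day window and 25-row limit below are arbitrary choices:

```sql
SELECT
    query_hash,
    ANY_VALUE(SUBSTRING(query_text, 1, 80)) AS sample_text,
    COUNT(1) AS executions,
    SUM(credits_used_cloud_services) AS total_cs_credits
FROM snowflake.account_usage.query_history
WHERE start_time >= DATEADD(day, -7, CURRENT_TIMESTAMP())
GROUP BY query_hash
ORDER BY executions DESC
LIMIT 25;
```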

2. Excessive DDL and Cloning

DDL operations are entirely metadata operations. Frequently creating/dropping large schemas or cloning entire databases for backup drives cloud services consumption.

Fix: Clone at the most granular level needed. Clone individual tables instead of schemas, schemas instead of databases. Reduce cloning frequency where possible. Also, ensure only necessary DDL is running in your account.

3. Single-Row Inserts and Schema Fragmentation

Snowflake isn't an OLTP system. Single-row inserts consume significant cloud services resources because these operations often rewrite entire micro-partitions. Additionally, multi-tenant architectures with one schema per customer create excessive metadata.

Fix: Use batch or bulk loads. For multi-tenant apps, when possible, Snowflake recommends using shared schemas with tables clustered on customer_id and secure views for isolation.
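A minimal sketch of that shared-schema pattern, with made-up names throughout (events, customer_id, and the tenant_role_map lookup table are all illustrative assumptions):

```sql
-- One shared table for all tenants, clustered on the tenant key
CREATE TABLE IF NOT EXISTS events (
    customer_id VARCHAR,
    event_ts    TIMESTAMP_NTZ,
    payload     VARIANT
) CLUSTER BY (customer_id);

-- Secure view exposing only the current tenant's rows.
-- Assumes a hypothetical tenant_role_map table mapping roles to customers.
CREATE OR REPLACE SECURE VIEW events_for_tenant AS
SELECT e.*
FROM events e
WHERE e.customer_id = (
    SELECT m.customer_id
    FROM tenant_role_map m
    WHERE m.role_name = CURRENT_ROLE()
);
```

Each tenant's role then queries events_for_tenant instead of the base table, so isolation lives in one view definition rather than thousands of per-customer schemas.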

4. COPY Commands with Poor File Selectivity

COPY commands list files from object storage, which uses cloud services compute. Listing thousands of files to load just a few drives high consumption.

Fix: Structure object storage with date prefixes. Use precise file patterns in COPY commands.

-- Instead of listing everything
COPY INTO target FROM @stage/raw_data/;

-- List only specific paths: year/month/day
COPY INTO target FROM @stage/raw_data/2025/10/24/;
5. Complex Query Compilation

Queries with excessive joins, large set operations (IN, NOT IN, EXISTS), or Cartesian products consume significant cloud services during compilation.

Fix: Simplify query structure. Replace large IN lists with temp tables and JOINs. Break complex queries into CTEs.
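As a sketch, replacing a long literal IN list with a temporary table join might look like this (orders, customer_id, and target_customers are placeholder names):

```sql
-- Before: the compiler must expand a multi-thousand-element IN list
-- SELECT * FROM orders WHERE customer_id IN ('c001', 'c002', /* thousands more */ 'c999');

-- After: load the ids once, then join
CREATE TEMPORARY TABLE target_customers (customer_id VARCHAR);
-- ...INSERT or COPY the id list into target_customers...

SELECT o.*
FROM orders o
JOIN target_customers t
  ON o.customer_id = t.customer_id;
```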

Best Practices

Monitor regularly. Set up weekly reviews of cloud services consumption. Establish your baseline so you can spot anomalies quickly.

Batch operations. Whether DDL, DML, or data loading, batching is almost always more efficient than individual operations.

Review third-party tool queries. BI tools, ETL platforms, and orchestration systems often generate metadata query patterns you don't directly control. Configuration changes can significantly reduce unnecessary queries.

Set up alerts. Wrap monitoring queries in scheduled tasks with notification integrations. Better yet, use SELECT monitors for turnkey alerting.
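For a native option, a Snowflake alert can poll the billing view on a schedule. In this sketch, the warehouse name, 5-credit threshold, notification integration, and email address are all placeholder assumptions:

```sql
CREATE OR REPLACE ALERT cloud_services_spend_alert
  WAREHOUSE = monitor_wh                  -- placeholder warehouse
  SCHEDULE = 'USING CRON 0 8 * * * UTC'   -- daily at 08:00 UTC
  IF (EXISTS (
      SELECT 1
      FROM snowflake.account_usage.metering_daily_history
      WHERE usage_date = CURRENT_DATE - 1
        AND credits_used_cloud_services + credits_adjustment_cloud_services > 5  -- placeholder threshold
  ))
  THEN CALL SYSTEM$SEND_EMAIL(
      'my_notification_integration',      -- placeholder integration
      'data-team@example.com',
      'Snowflake cloud services charge detected',
      'Review metering_daily_history for yesterday.'
  );

-- Alerts are created suspended; resume to activate
ALTER ALERT cloud_services_spend_alert RESUME;
```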

Wrapping Up

Cloud services billing is simple in concept: stay under 10% of warehouse usage and you don't pay for it. But the patterns that drive cloud services consumption are often invisible until they hit your bill.

Many customers never pay for cloud services. If you're seeing consistent charges, start by identifying which pattern applies using the monitoring queries provided. The fixes are usually straightforward once you pinpoint the source.

Let us know if you’re having any challenges isolating or optimizing your Snowflake Cloud Services spend.

Author
Jeff Skoldberg, Sales Engineer at SELECT

Jeff Skoldberg is a Sales Engineer at SELECT, helping customers get maximum value out of the SELECT app to reduce their Snowflake spend. Prior to joining SELECT, Jeff was a Data and Analytics Consultant with 15+ years experience in automating insights and using data to control business processes. From a technology standpoint, he specializes in Snowflake + dbt + Tableau. From a business topic standpoint, he has experience in Public Utility, Clinical Trials, Publishing, CPG, and Manufacturing.
