I am trying to understand whether Snowflake can run petabyte-scale database workloads. What is the maximum size of a single Snowflake database?
- I'd recommend reading about the Snowflake architecture. A database in Snowflake references the amount of data storage, which is limited only by the cloud provider's blob storage (near infinite). You may want to better understand the warehouse/compute side of things. If you are looking for petabyte scale, I recommend reaching out to a sales rep at Snowflake who can help set up a POC for you to evaluate. – Mike Walton Aug 11 '22 at 01:34
2 Answers
Snowflake sets no limit on storage.
When data is loaded into Snowflake, Snowflake reorganizes that data into its internal optimized, compressed, columnar format and stores it in cloud storage. Snowflake manages all aspects of how this data is stored: the organization, file size, structure, compression, metadata, statistics, and other aspects of data storage are handled by Snowflake. The data objects stored by Snowflake are not directly visible or accessible to customers; they are accessible only through SQL query operations run using Snowflake.
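As an illustration of that last point, the storage footprint itself is also inspected through SQL. A minimal sketch against the SNOWFLAKE.ACCOUNT_USAGE share (a documented view; note that it can lag live data by an hour or more):

```sql
-- Sketch: largest tables by active storage, via the shared SNOWFLAKE database.
SELECT
    table_catalog,
    table_schema,
    table_name,
    active_bytes      / POWER(1024, 4) AS active_tb,
    time_travel_bytes / POWER(1024, 4) AS time_travel_tb,
    failsafe_bytes    / POWER(1024, 4) AS failsafe_tb
FROM snowflake.account_usage.table_storage_metrics
WHERE NOT deleted
ORDER BY active_bytes DESC
LIMIT 20;
```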
Documentation links for the architecture:
https://docs.snowflake.com/en/user-guide/intro-key-concepts.html#snowflake-architecture
https://docs.snowflake.com/en/user-guide/intro-key-concepts.html#database-storage
You can refer to the documentation links below for the associated costs and storage considerations:
https://docs.snowflake.com/en/user-guide/admin-usage-billing.html#understanding-your-cost
https://docs.snowflake.com/en/user-guide/cost-overview.html#overview-of-managing-cost
https://docs.snowflake.com/en/user-guide/tables-storage-considerations.html#data-storage-considerations

https://docs.snowflake.com/en/user-guide/warehouses-overview.html
| Warehouse Size | Credits / Hour | Credits / Second | Notes |
|---|---|---|---|
| X-Small | 1 | 0.0003 | Default size for warehouses created using CREATE WAREHOUSE. |
| Small | 2 | 0.0006 | |
| Medium | 4 | 0.0011 | |
| Large | 8 | 0.0022 | |
| X-Large | 16 | 0.0044 | Default for warehouses created in the web interface. |
| 2X-Large | 32 | 0.0089 | |
| 3X-Large | 64 | 0.0178 | |
| 4X-Large | 128 | 0.0356 | |
| 5X-Large | 256 | 0.0711 | Preview feature. |
| 6X-Large | 512 | 0.1422 | Preview feature. |
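Compute is sized independently of storage, so there is no fixed "instance" size to outgrow; you pick a warehouse size and can change it at any time. A minimal sketch (the warehouse name my_wh is hypothetical; the size keywords are standard Snowflake values):

```sql
-- Hypothetical warehouse; compute size is independent of stored data volume.
CREATE WAREHOUSE IF NOT EXISTS my_wh
  WAREHOUSE_SIZE = 'X-LARGE'  -- 16 credits/hour per the table above
  AUTO_SUSPEND = 60           -- suspend after 60 idle seconds to stop burning credits
  AUTO_RESUME = TRUE;

-- Resize at any time, e.g. scale up for a heavy batch window, then back down.
ALTER WAREHOUSE my_wh SET WAREHOUSE_SIZE = '4X-LARGE';
ALTER WAREHOUSE my_wh SET WAREHOUSE_SIZE = 'X-SMALL';
```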
There is also the Query Acceleration Service to help with heavy scan loads.
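A minimal sketch of enabling it on the hypothetical warehouse above (both properties are documented warehouse parameters):

```sql
-- Enable Query Acceleration on the hypothetical warehouse; the scale factor
-- caps how much extra serverless compute the service may lease.
ALTER WAREHOUSE my_wh SET
  ENABLE_QUERY_ACCELERATION = TRUE
  QUERY_ACCELERATION_MAX_SCALE_FACTOR = 8;
```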
Storage is about $46/TB/month; however, this is subject to your contract with Snowflake and may change.
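As a rough worked example at that list price: one petabyte is about 1,024 TB, so 1,024 × $46 ≈ $47,000 per month for storage alone, before any compute credits.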

- Or... if you do put down a credit card, you may want to read up on Snowflake's resource monitors so you don't spend more than you want to. – Mike Walton Aug 11 '22 at 01:37
- I removed the personal note because, without it, this is a valuable answer. The problem the note points at (reading the manual and using tools correctly) is tricky: to use tools correctly, one needs to know they should read the manual and take its warnings seriously. As a side note, I have Snowflaked wrong in the past and wasted 15K, so I get where Adrian is coming from, but like this question, it does not belong here. If you really are thinking of doing petabyte loads, you can afford a couple of hours of reading docs to get questions like this out of the way. – Simeon Pilgrim Aug 11 '22 at 02:16
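Following up on the resource-monitor comment above, a minimal sketch (the monitor name, quota, and warehouse are all hypothetical; creating monitors requires the ACCOUNTADMIN role):

```sql
-- Hypothetical monitor: cap spend at 100 credits per month,
-- warn at 90% and suspend the warehouse at 100%.
CREATE RESOURCE MONITOR poc_monitor
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS
    ON 90 PERCENT DO NOTIFY
    ON 100 PERCENT DO SUSPEND;

ALTER WAREHOUSE my_wh SET RESOURCE_MONITOR = poc_monitor;
```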