
I have huge files, ranging from 450 to 750 million rows, that I need to load into our Oracle environment, and the DBA asked me for an estimate of how much tablespace I would need.

How can I estimate the required tablespace before loading the data into a table?

Abhinav Dhiman

1 Answer


Load a sample set (e.g. 100k rows) into the table, then check the storage size with:

SELECT ( SUM(BYTES) / 100e3 * 750e6 ) / 1024 / 1024 / 1024 AS ESTIMATED_GIB
FROM USER_SEGMENTS
WHERE SEGMENT_NAME IN ('<your table>', '<index name 1>', '<index name 2>', ...)
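The arithmetic behind that query can be sketched in a few lines; the 128 MiB sample size below is an assumed figure for illustration, not from the question:

```python
# Scale a sample load's segment size up to the full row count.
# Assumed numbers: a 100k-row sample occupied 128 MiB (table + indexes),
# taken as SUM(BYTES) from USER_SEGMENTS.
SAMPLE_ROWS = 100_000
SAMPLE_BYTES = 128 * 1024**2       # assumed sample segment size
TARGET_ROWS = 750_000_000          # worst-case row count from the question

bytes_per_row = SAMPLE_BYTES / SAMPLE_ROWS
estimated_gib = bytes_per_row * TARGET_ROWS / 1024**3

print(f"{estimated_gib:.1f} GiB")  # 937.5 GiB for these assumed numbers
```

Run the same calculation for the 450-million-row case to bracket the estimate for the DBA.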
Wernfried Domscheit
  • This is a good idea, but you may need to use more than 100K rows to get an accurate sample. Assuming your tablespace uses extent autoallocation (the default), Oracle will only add segment space in chunks of either 64KB, 1MB, 8MB, or 64MB. The segment overhead for small amounts of data may be much larger than for large amounts of data, and you don't want to multiply that error to get an overly large estimate. – Jon Heller May 07 '21 at 02:56
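The extent-rounding error Jon Heller describes can be quantified with assumed numbers; here a sample whose real data is 4.5 MiB sits in a segment rounded up to one 8 MiB extent (one of the autoallocate chunk sizes the comment lists):

```python
# Illustration (assumed numbers) of why a tiny sample overstates the
# per-row cost: allocated space is rounded up to whole extents.
EXTENT = 8 * 1024**2                 # 8 MiB, one of the autoallocate sizes
true_data = int(4.5 * 1024**2)       # assumed actual data in the sample

allocated = -(-true_data // EXTENT) * EXTENT   # round up to a whole extent
overestimate = allocated / true_data - 1

print(f"{overestimate:.0%}")         # 78% too high at this sample size
```

With a larger sample spanning many extents, the final partly-filled extent becomes a negligible fraction of the total, so the scaled estimate converges on the true per-row cost.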