
I have a table with large amounts of BLOB data in a column. I am writing a utility to dump the data to the file system. But before dumping, I need to check whether enough disk space is available to export all the BLOB fields in the table.

Please suggest an efficient approach to get size of all the blob fields in the table.
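One way to wire this into the dump utility is to run the size query first and compare the result with the free space on the target filesystem. Below is a minimal Python sketch; the table and column names in the query string are placeholders, and a small safety margin is assumed to absorb filesystem block rounding and metadata overhead.

```python
import shutil

# Run this against MySQL first; it sums the payload bytes of every BLOB
# in the column. Table/column names are placeholders for your schema.
SIZE_QUERY = "SELECT COALESCE(SUM(OCTET_LENGTH(blob_column)), 0) FROM your_table"

def enough_space(total_blob_bytes, export_dir=".", safety_margin=0.05):
    """Return True if export_dir's filesystem can hold the dump,
    keeping a margin for block rounding and per-file metadata."""
    free = shutil.disk_usage(export_dir).free
    return total_blob_bytes * (1 + safety_margin) <= free
```

`shutil.disk_usage` reports the free bytes for the filesystem containing the given path, so the check should be made against the actual export directory, not the current one.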

jatinpreet

4 Answers


You can use the MySQL function OCTET_LENGTH(your_column_name); see the MySQL string-function documentation for details.

Bud Damyanov
select sum(length(blob_column)) as total_size 
from your_table
juergen d
    Thanks for the reply. The size indicated by the above query is lower than the total size of the dumped files on disk. Any idea what the reason could be? – jatinpreet May 18 '12 at 10:24
    @jatinpreet: My guess is that this is because of file clustering in the file system. Depending on your file system, files are saved in blocks of, for instance, 4 KB. That might round things up. But that is just a guess. – juergen d May 18 '12 at 10:29
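That guess is easy to check with a quick sketch: each dumped BLOB becomes its own file, and the filesystem rounds every non-empty file up to a whole number of allocation blocks, so the on-disk total exceeds the summed payload lengths. The 4096-byte block size below is an assumption (typical for ext4/NTFS); verify yours before relying on it.

```python
BLOCK_SIZE = 4096  # assumed allocation unit; typical for ext4/NTFS

def on_disk_size(payload_bytes, block_size=BLOCK_SIZE):
    """Smallest multiple of block_size that can hold payload_bytes.
    A zero-byte file is assumed to occupy no data blocks."""
    if payload_bytes == 0:
        return 0
    return -(-payload_bytes // block_size) * block_size  # ceiling division

# e.g. a 5000-byte BLOB dumped to its own file occupies 8192 bytes on disk
```

Summing `on_disk_size` over the per-row BLOB lengths gives a much closer estimate of the dump's footprint than `SUM(LENGTH(...))` alone.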
select sum(length(blob_column_name)) from desired_tablename;
Olofu Mark

Sadly, this is DB-specific at best.

To get the total size of a table with BLOBs in Oracle, I use the following: https://blog.voina.org/?p=374

Sadly, this does not work in DB2; I still have to find an alternative there.

The simple

select sum(length(blob_column)) as total_size 
from your_table

is not a correct query, because it does not estimate the BLOB size correctly: the column may store only a reference (locator) to the BLOB. You have to get the actual space allocated on disk for the BLOBs from the LOB storage itself.

voinageo