I only get my compressed data when I call flush(); compress() itself returns nothing (I have tried with files 2GB+ — still nothing).
There's a trick to working with compressors.
I'll bet that your 2GB+ file was not very random. Random data doesn't compress well; orderly data compresses to a very small size, so the compressor can swallow a lot of input before it has enough output to hand back.
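You can see the difference with a quick one-shot comparison (a minimal Python 3 sketch; `os.urandom` stands in for "really random" bytes, and the 16-byte pattern is just an arbitrary example of orderly input):

```python
import bz2
import os

size = 100 * 1024  # 100 kB of each kind of input

orderly = b"0123456789abcdef" * (size // 16)  # highly patterned
random_data = os.urandom(size)                # essentially incompressible

print(len(bz2.compress(orderly)))      # tiny: the pattern collapses
print(len(bz2.compress(random_data)))  # about as big as the input, or bigger
```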
For example:
>>> import bz2
>>> c=bz2.BZ2Compressor()
>>> import string
>>> data = string.printable*1024
>>> len(data)
102400
>>> c.compress(data)
''
>>> result= c.flush()
>>> len(result)
361
The data being supplied had a pattern, which made it compress well.
You need random data.
>>> import random
>>> c=bz2.BZ2Compressor()
>>> size= 0
>>> result= ''
>>> while result == '':
...     data = ''.join(random.choice(string.printable) for i in xrange(1024*8))
...     size += len(data)
...     result = c.compress(data)
...
>>> len(result)
754809
>>> size
901120
I get chunks when I use really random data — the incompressible input fills the compressor's internal block (compresslevel * 100 kB, so 900 kB at the default level 9) and forces compress() to start returning data; note that size above just crossed 900 kB.
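The same experiment in Python 3, for anyone following along there (the bz2 API takes bytes, xrange is gone, and I'm using os.urandom in place of the random-printable generator):

```python
import bz2
import os

c = bz2.BZ2Compressor()
size = 0
result = b""
while result == b"":
    size += 8 * 1024
    # incompressible input, fed 8 kB at a time
    result = c.compress(os.urandom(8 * 1024))

# compress() stayed silent until roughly a full bz2 block of input had buffered
print(size, len(result))
```

flush() still returns whatever is left buffered at the end of the stream.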