I have a Java application that works intensively with 2D float arrays (float[][]), which hold images on a black background. Both dimensions are equal (the arrays are square) and are powers of 2 (mostly 256, 512, or 1024), so the areas close to the borders are zeroes in most cases.
Making the sizes powers of 2 was done to improve performance (there is some FFT involved) and to reduce the complexity of operations over those arrays, like rotation, etc. Recently I started running out of heap for this application on my machine with 6 GB of RAM. By my calculations, memory consumption should be up to 2-3 GB, while it actually reaches 4-5 GB (according to the Windows Task Manager). I used the YourKit profiler, and it shows that those float arrays do take most of the memory; however, the total raw size of these float arrays should be around 1.3 GB (I know it's up to the JVM to decide how to store data, but I was not expecting a 2-3x difference in memory consumption).
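For reference, here is a quick estimate of the shallow footprint of one float[n][n] image. It assumes a 64-bit JVM with compressed oops, a ~16-byte array header, and 4-byte row references; actual header sizes are JVM-dependent, so treat the numbers as a ballpark, not exact figures:

```java
// Rough shallow footprint of a float[n][n] on a 64-bit JVM.
// Assumptions (JVM-dependent): ~16-byte array headers, 4-byte
// row references (compressed oops), 4 bytes per float.
public class ArrayFootprint {
    static long estimateBytes(int n) {
        long header = 16;                          // assumed array object header
        long outer  = header + 4L * n;             // outer array: header + n row refs
        long rows   = (long) n * (header + 4L * n); // each row: header + n floats
        return outer + rows;
    }

    public static void main(String[] args) {
        for (int n : new int[] {256, 512, 1024}) {
            System.out.printf("%d x %d -> %.1f MiB%n", n, n,
                    estimateBytes(n) / (1024.0 * 1024.0));
        }
    }
}
```

Under these assumptions a 1024x1024 image costs only about 4 MiB, i.e. the per-row object overhead is small, so by itself the float[][] layout should not explain a 2-3x gap between raw data size and observed consumption.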
I tried compressing/decompressing the data with the Snappy compressor on the fly (memory consumption drops to 3.5 GB), but performance drops several times, which is not really acceptable. I also tested replacing those float[][] arrays with BufferedImage, but performance was very poor.
So there are 2 ways left that would work for me to decrease memory consumption: 1) write a wrapper for float[][] in order to save on the "zero" elements (there are a lot of "empty" rows and columns); 2) move away from "power of 2" sizes.
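To make option 1 concrete, here is a minimal sketch of the kind of wrapper I have in mind: store only the bounding box of non-zero values and answer 0f for everything outside it. The class and method names (CroppedImage, valueAt) are made up for illustration:

```java
// Sketch of option 1: keep only the non-zero bounding box of an image
// and return 0f outside it. Names are hypothetical, for illustration.
public class CroppedImage {
    private final int size;           // original n x n size
    private final int rowOff, colOff; // top-left corner of the stored region
    private final float[][] data;     // trimmed non-zero region

    public CroppedImage(float[][] full) {
        this.size = full.length;
        int minR = size, maxR = -1, minC = size, maxC = -1;
        for (int r = 0; r < size; r++)
            for (int c = 0; c < size; c++)
                if (full[r][c] != 0f) {
                    if (r < minR) minR = r;
                    if (r > maxR) maxR = r;
                    if (c < minC) minC = c;
                    if (c > maxC) maxC = c;
                }
        if (maxR < 0) {               // entirely empty image
            rowOff = 0;
            colOff = 0;
            data = new float[0][0];
            return;
        }
        rowOff = minR;
        colOff = minC;
        data = new float[maxR - minR + 1][maxC - minC + 1];
        for (int r = 0; r < data.length; r++)
            System.arraycopy(full[minR + r], minC, data[r], 0, data[r].length);
    }

    /** Value at (r, c) in original coordinates; 0f outside the stored box. */
    public float valueAt(int r, int c) {
        int rr = r - rowOff, cc = c - colOff;
        if (rr < 0 || cc < 0 || rr >= data.length || cc >= data[0].length)
            return 0f;
        return data[rr][cc];
    }

    public int size() { return size; }
}
```

The trade-off is an extra bounds check on every access, and operations like FFT or rotation would presumably need the full array materialized again, which is part of why I am hesitating about the refactoring cost.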
Both ways require quite a lot of coding/refactoring, so while I am pondering "to be or not to be", maybe you have a better idea for this issue, guys?
Thanks!