
I have some fairly simple code which reads in some signal data, decimates it, and appends the result of each iteration to a cell array. The data files are typically on the order of 20-100 GB, so I can't read the whole thing into memory.

An example of my code, for a given sample rate sR, is:

fid = fopen(filename,'r');

output = cell(1,10);
for jj = 1:10
    data = fread(fid,120*sR,'float32');
    wave = complex(data(1:2:end),data(2:2:end)); % data is interleaved I/Q

    % decimate by 2 ten times, i.e. by 2^10 overall
    for k = 1:10
        wave = decimate(wave,2,100,'fir');
    end
    output{jj} = wave;
end
fclose(fid);

So this code reads in 10 chunks of 60 seconds of data each (120*sR float32 values become 60*sR complex samples after the I/Q de-interleave). Ideally I'd want to process the whole file, since decimating down by 2^10 is very aggressive, but it just takes such a long time.
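For reference, this is roughly what I mean by running over the whole file rather than a fixed 10 chunks, just looping until end-of-file with the same chunk size (so it only gets slower):

fid = fopen(filename,'r');
output = {};

while ~feof(fid)
    data = fread(fid,120*sR,'float32');
    if isempty(data)
        break;
    end
    wave = complex(data(1:2:end),data(2:2:end)); % assumes each chunk holds an even number of values

    for k = 1:10
        wave = decimate(wave,2,100,'fir'); % decimate by 2 ten times, 2^10 overall
    end
    output{end+1} = wave; %#ok<AGROW>
end
fclose(fid);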

I wonder whether converting my wave variable to complex is a particularly heavy operation.
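My rough plan for checking that is to time the pieces separately, something like this sketch (tComplex and tDecimate are just illustrative names):

fid = fopen(filename,'r');
data = fread(fid,120*sR,'float32');
fclose(fid);

tic;
wave = complex(data(1:2:end),data(2:2:end));
tComplex = toc;        % time spent on the complex conversion alone

tic;
for k = 1:10
    wave = decimate(wave,2,100,'fir');
end
tDecimate = toc;       % time spent on the ten decimation stages

fprintf('complex: %.3f s, decimate: %.3f s\n', tComplex, tDecimate);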

Is there any advantage to reading in large chunks of data vs small chunks of data? Could I potentially parallelize this code in some way?
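The only way I can see to parallelise it myself is to have each worker open the file and fseek to its own chunk, since a single fid can't be shared across workers. A sketch of what I have in mind (assumes the Parallel Computing Toolbox; nChunks and chunkLen are just illustrative names):

nChunks  = 10;
chunkLen = 120*sR;                 % float32 values per chunk
output   = cell(1,nChunks);

parfor jj = 1:nChunks
    fidW = fopen(filename,'r');
    fseek(fidW,(jj-1)*chunkLen*4,'bof');   % 4 bytes per float32 value
    data = fread(fidW,chunkLen,'float32');
    fclose(fidW);

    wave = complex(data(1:2:end),data(2:2:end));
    for k = 1:10
        wave = decimate(wave,2,100,'fir');
    end
    output{jj} = wave;
end

I'm not sure whether the disk would end up being the bottleneck with several workers reading the same file at once, though.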

Thank you
