
I am trying to process a time signal that is split up into (many) smaller segments, and each segment ("chunk") is analyzed independently. I want to save the output of all chunks in a single combined file.

My current solution works:

configFileName = 'config.mat';
config = matfile(configFileName,'Writable',true);
% some irrelevant stuff saved in config
config.chnk(1,N) = struct('var1',[],'var2',[],'var3',[]); % pre-allocating
clearvars config;

parfor i = 1:N
    config = matfile(configFileName,'Writable',true);
    chunk = process(data(:,i)); % data has previously been sliced
    config.chnk(1,i) = chunk;
end

Note: the output of process(data(:,i)) is a struct of the same type that config.chnk is pre-allocated with. Now, while this seems to work (update: it does not), I get a warning from MATLAB:

Warning: Unable to read some of the variables due to unknown MAT-file error.

In matlab.io.MatFile/genericWho (line 209)
In matlab.io.MatFile/whos (line 309)
In matlab.io.MatFile (line 422)
In matfile (line 75)
In parallel_function>make_general_channel/channel_general (line 929)
In remoteParallelFunction (line 38)

In my case, I do not read anything inside the parfor loop, so I could simply suppress the warning by wrapping the matfile call in evalc, but does anyone know why this warning appears, and how to avoid it properly?

Update: It does not work, actually. The variable I am writing to ends up invalid, and whenever I try to read from it, I get an error.

When the parallel process was finished, I got the following error:

Error using matlab.io.MatFile/whos (line 311)
Could not open /Users/casparjespersen/ardetector/matlab/data/EMD/A0007_4/EEG.mat as a valid MAT-file.

Error in matlab.io.MatFile (line 422)
varInfo = whos(obj);

Error in matfile (line 75)
mf = matlab.io.MatFile(varargin{:});

Error in HHSA_BD_gen_emd (line 104)
parfor windowIdx = 1:size(windowChunks,2)

casparjespersen
  • If all worker threads save to the same file at the same time, it would probably look like this. – user3528438 Apr 08 '16 at 19:33
  • They do. Perhaps I should save the output of the parfor operation in memory, and then after every Nth iteration, leave parfor, save, and return to the parfor loop for the remaining iterations. – casparjespersen Apr 08 '16 at 19:35
  • After further thought, the computers this will be run on have enough memory to hold all the chunks at once. I'll just save to memory and write to the file afterwards, so this entire problem is not necessary :-) But thanks. – casparjespersen Apr 08 '16 at 19:40

1 Answer


As @user3528438 pointed out, this is due to the workers writing to the file at the same time. I chose to re-design my script to avoid writing from within parfor.
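For reference, a minimal sketch of that redesign (the variable names, the process function, and the pre-sliced data are taken from the question; the collection step itself is an assumption about how it could look). Each iteration fills only its own slot of an in-memory struct array, and the MAT-file is written once by the client after the loop, so no two workers ever touch the file at the same time:

chunks(1,N) = struct('var1',[],'var2',[],'var3',[]); % pre-allocate the results in memory

parfor i = 1:N
    chunks(1,i) = process(data(:,i)); % each worker fills only its own slice
end

% Only the client writes to the MAT-file, after all workers have finished,
% so there are no concurrent writes and the file stays valid.
config = matfile(configFileName,'Writable',true);
config.chnk = chunks;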

casparjespersen