I am having trouble with how long it takes to read my ".csv" file into Matlab. Using the following function from this forum, I found that the file has roughly 12 million rows and 7 columns.
function count = countLines(fname)
    % Count the number of lines by counting newline characters
    fh = fopen(fname, 'rt');
    assert(fh ~= -1, 'Could not read: %s', fname);
    x = onCleanup(@() fclose(fh));   % close the file automatically when done
    count = 0;
    while ~feof(fh)
        % read the file in 16 kB chunks and count the line feeds (char(10))
        count = count + sum(fread(fh, 16384, 'char') == char(10));
    end
end
I tried the dlmread command to read it into Matlab, which takes about 120 seconds:
tic
fname = 'test1_csv0.csv';
N1 = countLines(fname) - 1;   % index of the last data row (first row is the header)
N2 = 6;                       % index of the last column (7 columns, zero-based)
blockmodel = dlmread(fname, ',', [1 0 N1 N2]);
toc
I found this post on fast Matlab reading, but to be honest, as I am not a programmer, I do not know how to adapt that code to my problem or which option would be the fastest. I would like to keep the solution in Matlab, as this is the only programming language I slightly understand. I would appreciate any help in getting Matlab to read my matrix faster. Do you have any experience with whether "textscan" works better than "dlmread"?
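For what it's worth, here is a minimal sketch of how I think textscan could be called on this file, assuming all 7 columns are numeric and the first line is a header (the filename matches the snippet above); I have not verified that this is the fastest way:

tic
fid = fopen('test1_csv0.csv', 'rt');
assert(fid ~= -1, 'Could not read file');
% skip the header line, then read 7 comma-separated numeric columns in one pass
C = textscan(fid, '%f%f%f%f%f%f%f', 'Delimiter', ',', 'HeaderLines', 1, 'CollectOutput', true);
fclose(fid);
blockmodel = C{1};   % with CollectOutput, the 7 numeric columns form one N-by-7 matrix
toc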