I'm trying to read a .dat file containing tens of thousands of rows, where each of them looks something like:
1.9681968 0 0 19.996 0 61 100 1.94E-07 6.62E-07
2.330233 0 0 19.996 0 61 100 1.94E-07 6.62E-07
2.6512651 0 0 19.997 0 61 100 1.94E-07 6.62E-07
3.5923592 0 0 19.998 0 61 100 1.96E-07 6.62E-07
Now, for example, I'm trying to read it with
Data = textscan(fid, '%.9f%*f%*f%.9f%*f%*f%*f%.9f');
where the format string depends on which columns I want to read.
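For reference, the surrounding code looks roughly like this (just a sketch; the file name is a placeholder):

% Sketch of the read; 'mydata.dat' stands in for the real file name.
fid = fopen('mydata.dat', 'r');
% The %*f specifiers are meant to skip the columns I don't need.
Data = textscan(fid, '%.9f%*f%*f%.9f%*f%*f%*f%.9f');
fclose(fid);
col1 = Data{1};   % first requested column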
When reading large files, the first column of the cell array Data comes out as
1.96819680000000
0
2.33023300000000
2.65126510000000
0
3.59235920000000
0
and the rest of the columns show NaNs where those zeros appear. The number of these extra rows is almost equal to the number of rows in the data file, so the arrays end up almost a factor of 2 larger than they should be.
I guess this has something to do with how the doubles are parsed, since the problem does not occur if I read the file as strings.
If possible, though, I would rather not read everything as strings and then have to convert it all to doubles.
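For comparison, a workaround along these lines does give correct numbers, but it goes through strings first (again just a sketch; nine %s fields to match the nine columns, and the column indices are only an example):

fid = fopen('mydata.dat', 'r');
% Read every field as text; the spurious rows do not appear this way.
Raw = textscan(fid, '%s%s%s%s%s%s%s%s%s');
fclose(fid);
% Convert the columns I actually need back to doubles.
col1 = str2double(Raw{1});
col4 = str2double(Raw{4});
col8 = str2double(Raw{8});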
Any ideas?