I am trying to read a large .csv file with textscan and then write the result to a .dat file. The file contains 124,861 rows (including a header row) and 130 columns. The data is mixed: strings, doubles, missing values, etc. The .csv data looks like this:
I use the following code:
fid = fopen('example.csv');
fmt = ['%s%d%s', repmat('%f', 1, 12), '%f%f%f', repmat('%f', 1, 103), ...
       '%s%s%d%s%s%s%d%d%f'];
result = textscan(fid, fmt, 'HeaderLines', 1, 'Delimiter', ',');
fclose(fid);
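One quick check on the output of the call above: if textscan hits a field that does not match its conversion specifier (for example, text in a %f column), it can resynchronize mid-line, which leaves the output columns with different lengths. This sketch (using the `result` variable from the code above) makes that visible:

```matlab
% Diagnostic sketch: compare the number of rows parsed into each column.
% If textscan parsed the file cleanly, every column has the same length.
rowCounts = cellfun(@numel, result);
disp(unique(rowCounts));   % more than one value => parsing went off the rails
```

A single value equal to 124,860 (rows minus the header) would mean the format string consumed the file as intended; mismatched counts point to a format/data mismatch rather than MATLAB inventing rows.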
The resulting .dat file has about 205,000 rows, not 124,861. It seems that MATLAB is arbitrarily adding rows. The strange thing is that these extra rows are populated with data that I cannot find anywhere in my original .csv file. Does anyone have any ideas why this is happening?
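A common cause of this symptom is a field whose text contains the delimiter itself (e.g. a quoted string with an embedded comma), which splits one logical row into extra fields and shifts everything after it. A check independent of textscan is to count delimiters per raw line; the filename and the expected count of 129 commas (130 columns) are assumptions taken from the description above:

```matlab
% Sketch: scan the raw file and flag lines whose comma count differs
% from the 129 delimiters expected for 130 columns.
fid = fopen('example.csv');
nLines = 0;
badLines = [];
while ~feof(fid)
    tline = fgetl(fid);
    nLines = nLines + 1;
    if numel(strfind(tline, ',')) ~= 129
        badLines(end+1) = nLines;  %#ok<AGROW>
    end
end
fclose(fid);
fprintf('%d lines total, %d with an unexpected comma count\n', ...
        nLines, numel(badLines));
```

If `badLines` is non-empty, those line numbers are where the extra "phantom" rows are being manufactured; textscan does not honor quoted fields unless you handle them yourself, so readtable (which does) may be a simpler route for data like this.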