
I have a problem opening a CSV file. Most other questions about EXC_BAD_ACCESS here on Stack Overflow seem to relate to something deep in the code.

For example here: Thread 1: EXC_BAD_ACCESS (code=1, address=0xf00000c)

The Debug Navigator dump is shown in the screenshot below.

[Screenshot: Debug Navigator]

My error seems to have something to do with the libdyld.dylib start frame shown in the Debug Navigator.

[Screenshot: close-up of the error]

I have managed to isolate the problem to a short mini program:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#define RO                       4000000
#define CO                         10
#define BUFSZE                  1024
#define SEP                     ","

int main(int argc, const char * argv[]) {


    float mtx1[RO][CO];
    memset(mtx1, 0, sizeof(mtx1));

    char *name;
    name = "bigfile.txt";
    char buffer[BUFSZE];
    int row, column;
    char *token;

    FILE *csvFile;
    csvFile = fopen(name, "r");
    if (csvFile) {
        for (row = -1; row < RO && fgets(buffer, sizeof(buffer), csvFile); ++row) {
            token = strtok(buffer, SEP);
            column = 0;
            while (token != NULL && column < (CO + 1) ) {
                if (row >= 0) {                                                             // Skip first row
                    if (column > 0) {                                                       // Skip first col
                        mtx1[row][column-1] = (float) strtof(token, NULL);
                    }
                    column++;
                    token = strtok(NULL, SEP);
                } else {
                    strtof(token, NULL);
                    column++;
                    token = strtok(NULL, SEP);
                }
            }
        }
    } else {
        printf("Could not open csvFile\n");
    }
    fclose(csvFile);
    return 0;

}

Oddly enough, everything works perfectly as long as #define RO is 50000 or less. I get this error as soon as I try to parse the entire bigfile.txt, or any subset down to around 50k rows.

It can't be that I am out of RAM, since the matrix I am trying to create should only be around 4M x 10 x 4 bytes ≈ 160 MB.

The actual CSV file (bigfile.txt) is smaller than that: 4M x 6.
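For reference, the arithmetic can be checked with a minimal standalone snippet like the one below (RO and CO copied from the program above); it only prints the size of that float array type:

#include <stdio.h>

#define RO 4000000
#define CO 10

int main(void) {
    /* Same dimensions as mtx1 in the mini program above.
       sizeof is evaluated on the type, so nothing is actually allocated here. */
    size_t bytes = sizeof(float[RO][CO]);
    printf("float[%d][%d] needs %zu bytes (~%zu MB)\n",
           RO, CO, bytes, bytes / (1024u * 1024u));
    return 0;
}

Taking sizeof of the array type, rather than declaring the array, avoids creating the ~160 MB object just to measure it.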

Am I parsing the CSV wrongly? If so, why does it work for a smaller subset? (I have, in a vain attempt, tried to increase BUFSZE, of course to no avail.)
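Since increasing BUFSZE made no difference, here is a small sketch (assuming the same bigfile.txt) that only reads the file with fgets and reports how many lines, if any, are longer than the buffer:

#include <stdio.h>
#include <string.h>

#define BUFSZE 1024

int main(void) {
    char buffer[BUFSZE];
    long lines = 0, truncated = 0;

    FILE *csvFile = fopen("bigfile.txt", "r");
    if (!csvFile) {
        printf("Could not open csvFile\n");
        return 1;
    }
    while (fgets(buffer, sizeof(buffer), csvFile)) {
        ++lines;
        /* No newline and not at end of file: fgets stopped because the
           line did not fit into BUFSZE-1 characters. */
        if (strchr(buffer, '\n') == NULL && !feof(csvFile))
            ++truncated;
    }
    fclose(csvFile);

    printf("%ld lines read, %ld truncated by the buffer size\n", lines, truncated);
    return 0;
}

If it reports zero truncated lines, BUFSZE is not the limiting factor for this file.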

It is an isolated problem, so the solution should be obvious, but I am afraid I just can't find it.
