I have a function that receives the name of a file as an argument.
The idea is to read each word in the given file and save each one in a linked list (each node is a struct with a value and a pointer to the next node).
I got this working for small files, but when I pass a big .txt file I get a segmentation fault.
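For reference, the node type is essentially this (the field names here are just for illustration, since I left my actual linked-list code out of the snippet below):

struct node {
    char* value;        // the word read from the file
    struct node* next;  // next node in the list, NULL at the end
};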
Using gdb, I figured out that the crash happens at the while(fscanf(fi, "%s", value) != EOF){ line.
For some reason, fscanf() segfaults when the file is bigger.
Since I already have the linked-list part working, I pasted just enough code below to compile and show the problem.
So my question is: why does fscanf() segfault with big .txt files (thousands of words) but not with small ones (ten words)?
Also, is there a better way to check for the end of the file? (I put a sketch of what I was considering after my code below.)
Thanks in advance.
#include <stdio.h>
#include <stdlib.h>
#include <stdbool.h>

bool read(const char* file){
    // open the file; "file" holds the name of the file to be opened
    FILE* fi = fopen(file, "r");
    if (fi == NULL)
    {
        return false;
    }
    // malloc for value
    char* value = malloc(sizeof(int));
    // fscanf() until the end of the file
    while(fscanf(fi, "%s", value) != EOF){ // HERE IS MY PROBLEM
        // some code for the linked list,
        // where the value will be saved
    }
    // free space
    free(value);
    // close the file
    fclose(fi);
    return true;
}
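From what I have read, checking fscanf()'s return value against the number of conversions might be more reliable than comparing it with EOF, and giving %s a width limit should keep it from writing past the end of the buffer. Is a sketch like this the right direction? (The 45-character cap and the name read_bounded are arbitrary choices on my part.)

#include <stdio.h>
#include <stdbool.h>

bool read_bounded(const char* file){
    FILE* fi = fopen(file, "r");
    if (fi == NULL)
    {
        return false;
    }
    char value[46];                         // room for 45 characters plus '\0'
    // fscanf() returns the number of items converted, so == 1 means
    // "one word was read"; the loop stops on EOF or a matching failure
    while(fscanf(fi, "%45s", value) == 1){
        // linked-list code would go here
    }
    fclose(fi);
    return true;
}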