I've run into some odd behavior on macOS that I started investigating because of an accidental race condition: I was telling the OS to open an RTF file I'd just written, but hadn't yet closed or explicitly flushed.
If the file handler (Word, TextEdit, whatever) was not already running, then calling system("open test.rtf") would open the file just fine and the file would be complete. However, if the file handler was already running, then the same call would result in an error message saying the file was corrupted or truncated (because the buffers weren't completely flushed).
The obvious fix is to fflush() and/or fclose() my file before opening it. However, I am more interested in the underlying interaction between my program's runtime and macOS. My question is this: how and why does the file handler's running/not-running state affect whether my buffer is flushed?
(It isn't simply a matter of the time it takes to open the program -- I added a sleep delay in the version that doesn't explicitly flush the buffers and it makes no difference.)
Unflushed version (works only if the file handler is not already running):
#include <stdio.h>
#include <stdlib.h>
#include <assert.h>
#include <unistd.h>
int main(void) {
    FILE *fin = fopen("src.rtf", "rb");
    FILE *fout = fopen("test.rtf", "wb");
    int c;

    assert(fin && fout);
    while ((c = fgetc(fin)) != EOF) fputc(c, fout);

    sleep(3);                 /* extra delay; makes no difference */
    system("open test.rtf");  /* fout's stdio buffer has not been flushed yet */

    fclose(fin);
    fclose(fout);
    return 0;
}
Explicitly flushed version (works all the time):
#include <stdio.h>
#include <stdlib.h>
#include <assert.h>
int main(void) {
    FILE *fin = fopen("src.rtf", "rb");
    FILE *fout = fopen("test.rtf", "wb");
    int c;

    assert(fin && fout);
    while ((c = fgetc(fin)) != EOF) fputc(c, fout);

    fflush(fout);             /* force the buffered output to disk before handing the file off */
    system("open test.rtf");

    fclose(fin);
    fclose(fout);
    return 0;
}
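For completeness, here's roughly what the fclose() variant of the fix mentioned above would look like (a minimal sketch using the same file names; closing the stream flushes its buffer before open ever runs):

#include <stdio.h>
#include <stdlib.h>
#include <assert.h>

int main(void) {
    FILE *fin = fopen("src.rtf", "rb");
    FILE *fout = fopen("test.rtf", "wb");
    int c;

    assert(fin && fout);
    while ((c = fgetc(fin)) != EOF) fputc(c, fout);

    /* fclose() flushes the stdio buffer and closes the underlying descriptor,
       so test.rtf is complete on disk before "open" sees it. */
    fclose(fout);
    fclose(fin);

    system("open test.rtf");
    return 0;
}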
The sample RTF file I'm using is here: https://pastebin.com/mXLk85G1