
I'm using Perl 5.16.1 from Strawberry Perl in a Windows environment. I have a Perl script reading very large text files; the smallest is 30 MB. When reading a file that does not end with a line feed on its very last line, I get very peculiar results. It may not happen every time, but when it does, it's as though the script is reading cached data from the I/O system for another file I previously opened. If I manually edit the file and add a trailing line feed, it's fine. I added a line counter and some inline diagnostics to display what happens near the end of the file, to make sure I wasn't going nuts. To try to fix it, I added this to my script:

```perl
open(SS_LOG, ">>", $SSFile) or die "Can't open $SSFile\r\n $!\r\n";
print SS_LOG "\r\n";
close SS_LOG;
```

but it does nothing. The file stays the same size. I'm also storing data in large arrays.
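
For reference, a minimal sketch of the same append done on a raw-mode handle, assuming the same `$SSFile` variable, so that exactly the bytes in the string are written (Windows text-mode handles translate each `"\n"` to `"\r\n"` on output):

```perl
# Sketch: same append, but on a binmode handle so the bytes
# written are exactly the bytes in the string.
open my $log, ">>", $SSFile or die "Can't open $SSFile: $!";
binmode $log;          # raw mode: no "\n" -> "\r\n" translation
print {$log} "\n";     # writes exactly one byte
close $log or die "Can't close $SSFile: $!";
```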

Has anyone else seen anything like this?

  • Maybe `$SSFile` isn't what you think it is? Wrap your fix with `printf "The size of %s is %d\n", $SSFile, -s $SSFile;` statements. (I predict the file will grow by 3 bytes.) – mob Oct 30 '13 at 20:52
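
That check could look like this (a sketch reusing the question's `$SSFile` and `SS_LOG` names):

```perl
printf "Before: the size of %s is %d\n", $SSFile, -s $SSFile;

open SS_LOG, ">>", $SSFile or die "Can't open $SSFile: $!";
print SS_LOG "\r\n";
close SS_LOG;

printf "After:  the size of %s is %d\n", $SSFile, -s $SSFile;
```

If the two sizes match, the script is appending to a different file than the one being inspected.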

1 Answer


Try unbuffering your output:

```perl
SS_LOG->autoflush(1);
```
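
For context, a sketch of where that call would sit, using the filehandle from the question (`use IO::Handle` makes the dependency explicit; recent perls can also load it on demand when a method is called on a filehandle):

```perl
use IO::Handle;   # provides autoflush() as a filehandle method

open SS_LOG, ">>", $SSFile or die "Can't open $SSFile: $!";
SS_LOG->autoflush(1);   # flush after every print on this handle
print SS_LOG "\r\n";
close SS_LOG;
```

The older idiom is `select SS_LOG; $| = 1;`, which enables autoflush on the currently selected handle.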
– Mark Setchell