
I have some code that appends to several files inside nested for loops. After exiting the loops, I want to append .end to all the files.

foreach my $file (@SPICE_FILES)
{
    open(FILE1, ">>$file") or die "[ERROR $0] cannot append to file : $file\n";
    print FILE1 "\n.end\n";
    close FILE1; 
}

I noticed that in some strange cases the ".end" is appended in the middle of the files!

How do I resolve this?

Gordon
  • Are you sure you're not writing to the files in another process after the ".end" part is added? As your code's written, what you describe should absolutely not happen. – Wooble May 04 '11 at 12:37
  • This is working perfectly fine for me. are the contents of @SPICE_FILES unique and are you writing anything other than ".end" ? – Arunmu May 04 '11 at 12:45
  • What platform are you working on? – crazyscot May 04 '11 at 12:57
  • Considering the great answers below, I am wondering if it would not perhaps help to make sure any previous opened filehandles are explicitly closed before trying to print the end stamp. – TLP May 04 '11 at 14:56

3 Answers


Since I do not yet have the comment privilege, I'll have to write this as an 'answer'.

Do you use any dodgy modules?

I have run into issues where broken Perl modules have interfered with output buffering. For me, placing

$| = 1;

in the code has helped. The above statement enables autoflush on the currently selected filehandle (usually STDOUT), so each print is written out immediately instead of sitting in Perl's output buffer. It might have other effects too, but I have not seen anything negative come of it.

Christopher

I guess you've got data buffered in some previously opened file descriptors. Try closing them before re-opening:

open my $fd, ">>", $file or die "Can't open $file: $!"; 
print $fd $data;  # note: no comma after the filehandle
close $fd or die "Can't close: $!";

Better yet, you can collect those filehandles in an array or hash and write to them all during cleanup:

push @handles, $fd;
# later
print $_ "\n.end\n" for @handles; 

Here's a case to reproduce the "impossible" append in the middle:

 #!/usr/bin/perl -w
 use strict;

 my $file = "file";

 open my $fd, ">>", $file;
 print $fd "begin"; # no \n -- write buffered

 open my $fd2, ">>", $file;
 print $fd2 "\nend\n";
 close $fd2; # file flushed on close

 # program ends here -- $fd finally closed
 # you're left with "end\nbegin"
Dallaylaen
  • As @Christopher said, you can easily find out if the problem is really with buffering by turning `$|` on and off. – Dallaylaen May 04 '11 at 12:55
  • 2
    Never trust auto-close, even on read-only handles: it’s broken because it doesn’t test and report errors. Always explicitly test wtih `close($fh) || die "Can't close: $!"`. – tchrist May 04 '11 at 13:00

It’s not possible to append something to the middle of the file. The O_APPEND flag guarantees that each write(2) syscall will place its contents at the old EOF and update the st_size field by incrementing it by however many bytes you just wrote.

Therefore if you find that your own data is not showing up at the end when you go to look at it, then another agent has written more data to it afterwards.
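The O_APPEND guarantee is easy to observe from the shell (a small sketch, not part of the original answer; assumes a POSIX shell and writable /tmp): a handle opened with `>>` repositions to EOF on every write, so even after another writer grows the file, the next write still lands at the end.

```shell
# Each write(2) on an O_APPEND descriptor lands at the current EOF,
# even when another writer has grown the file in between.
f=/tmp/append_demo.$$
: > "$f"
exec 3>>"$f"                 # fd 3 opened with O_APPEND
printf 'first\n' >&3
printf 'outside\n' >> "$f"   # a different writer appends in between
printf 'second\n' >&3        # still lands at the new EOF, not offset 6
exec 3>&-
out=$(cat "$f")
printf '%s\n' "$out"
rm -f "$f"
```

If "second" could overwrite "outside", append mode would be broken; the kernel's atomic seek-to-EOF-then-write is what rules out mid-file writes, which is why a stray ".end" in the middle points to buffering or another writer, not to append mode.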

tchrist