3

We are using File.WriteAllBytes to write data to the disk, but if a reboot happens right around the time we close the file, Windows fills the file with null bytes. This seems to be happening on Windows 7: when we come back to the file afterwards, it contains nulls. Is there a way to prevent this? Is Windows closing its internal handle after a certain time, and can this be forced to happen immediately?

Prashant
  • You mean null bytes such as 0x00? –  Jul 12 '11 at 17:00
  • Is it a gentle reboot, so to speak, as in a normal 'Shutdown' or 'Restart'? Or is it a sudden loss of power? In the latter case, there may be no solution. – Detmar Jul 12 '11 at 17:08
  • @Detmar - Unfortunately it is a sudden loss of power. If so, what are my alternatives? – Prashant Jul 12 '11 at 17:33
  • Prashant: Use Transactional NTFS to ensure that either all data is written or none at all. Provided you can live with no data written in case of failure, of course. – Joey Jul 12 '11 at 17:46
  • I've never tried it but you also might be able to use a totally unbuffered stream. http://stackoverflow.com/questions/122362/how-to-empty-flush-windows-read-disk-cache-in-c/128523#128523 – Chris Haas Jul 12 '11 at 18:01
  • @Joey - 'Transactional NTFS' looks very interesting. It's new to me as I've been working with XPEmbedded. It seems to be the right answer for this problem, if you know you are using Vista or Win7. – Detmar Jul 12 '11 at 22:27
  • Chris: I guess the OS and the file system driver are still going to do some buffering. – Joey Jul 12 '11 at 22:30

4 Answers

5

Depending on what behavior you want, you can put the machine on a UPS as 0A0D suggested, but in addition you can use the Transactional NTFS functionality available in Windows Vista and later. This allows you to write to the file system atomically, so in your case nothing would be written rather than improper data. It isn't directly part of the .NET Framework yet, but there are plenty of managed wrappers to be found online.

Sometimes no data is better than wrong data. When your application starts up again, it can see that the file is missing and "continue" from where it left off, depending on what your application does.
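As a rough illustration (mine, not from the original answer), here is a minimal sketch of such a wrapper using the raw Kernel Transaction Manager APIs via P/Invoke. The `TransactedFile` class name is hypothetical, and the constants and error handling are pared down:

```csharp
using System;
using System.ComponentModel;
using System.IO;
using System.Runtime.InteropServices;
using Microsoft.Win32.SafeHandles;

static class TransactedFile
{
    [DllImport("KtmW32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
    static extern IntPtr CreateTransaction(IntPtr securityAttributes, IntPtr uow,
        uint createOptions, uint isolationLevel, uint isolationFlags,
        uint timeout, string description);

    [DllImport("KtmW32.dll", SetLastError = true)]
    static extern bool CommitTransaction(IntPtr transaction);

    [DllImport("Kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
    static extern SafeFileHandle CreateFileTransacted(string fileName, uint desiredAccess,
        uint shareMode, IntPtr securityAttributes, uint creationDisposition,
        uint flagsAndAttributes, IntPtr templateFile, IntPtr transaction,
        IntPtr miniVersion, IntPtr extendedParameter);

    [DllImport("Kernel32.dll", SetLastError = true)]
    static extern bool CloseHandle(IntPtr handle);

    const uint GENERIC_WRITE = 0x40000000;
    const uint CREATE_ALWAYS = 2;

    public static void WriteAllBytes(string path, byte[] data)
    {
        // CreateTransaction returns INVALID_HANDLE_VALUE (-1) on failure.
        IntPtr transaction = CreateTransaction(IntPtr.Zero, IntPtr.Zero, 0, 0, 0, 0, "file write");
        if (transaction == new IntPtr(-1))
            throw new Win32Exception(); // wraps the last Win32 error

        try
        {
            using (SafeFileHandle handle = CreateFileTransacted(path, GENERIC_WRITE, 0,
                IntPtr.Zero, CREATE_ALWAYS, 0, IntPtr.Zero, transaction, IntPtr.Zero, IntPtr.Zero))
            {
                if (handle.IsInvalid)
                    throw new Win32Exception();
                using (var stream = new FileStream(handle, FileAccess.Write))
                    stream.Write(data, 0, data.Length);
            }

            // Until this commit succeeds, readers never see a partial file.
            if (!CommitTransaction(transaction))
                throw new Win32Exception();
        }
        finally
        {
            CloseHandle(transaction);
        }
    }
}
```

If power is lost before CommitTransaction returns, NTFS rolls the entire write back on the next mount, which is exactly the "nothing rather than improper data" behavior described above.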

vcsjones
1

Based on your comments, there are no guarantees when writing a file, especially if you lose power during the write. Your best bet is to put the PC on an Uninterruptible Power Supply. If you can also build an auto-restore mechanism, as Microsoft Office products do, you can prevent complete loss of data, but it won't recover the data that was in flight at the moment of power loss.
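One way to approximate such a safeguard in .NET (my sketch, not part of the original answer; the ".tmp"/".bak" naming is an assumption) is never to overwrite the live file directly: write to a temporary file, flush it, then swap it in with File.Replace so the previous version survives as a backup:

```csharp
using System.IO;

static class SaferFile
{
    public static void WriteAllBytes(string path, byte[] data)
    {
        string temp = path + ".tmp"; // hypothetical naming convention
        using (var stream = new FileStream(temp, FileMode.Create, FileAccess.Write))
        {
            stream.Write(data, 0, data.Length);
            stream.Flush(true); // .NET 4+: also flush OS buffers to the disk
        }

        if (File.Exists(path))
            File.Replace(temp, path, path + ".bak"); // swap in, keep the old copy
        else
            File.Move(temp, path);
    }
}
```

If power is lost mid-write, only the .tmp file is damaged; the last good version is still at the original path (or in the .bak file).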

0

I would consider this a fatal exception (a sudden loss of power). There isn't anything you can do about it, and generally, trying to handle such failures in code only makes matters worse.

Bryan Crosby
0

I have had to deal with something similar: essentially an embedded system running on Windows, where the expectation is that the power might be shut off at any time.

In practice, I work on the assumption that any file written to disk less than 10 seconds before a loss of power may be corrupted (I use 30 seconds in my code to play it safe).

I am not aware of any way to guarantee from code that a file has been fully closed, flushed to disk, and that the disk hardware has finalized its writes, other than knowing that 10 (or 30) seconds have elapsed. It's not a very satisfying situation, but there it is.
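The closest I can get in .NET (my sketch, and it is only best effort, not a guarantee) is to open the file with write-through semantics and flush to disk, which bypasses the Windows write cache but still cannot force the drive's own cache:

```csharp
using System.IO;

static class WriteThroughFile
{
    public static void WriteAllBytes(string path, byte[] data)
    {
        // FileOptions.WriteThrough maps to FILE_FLAG_WRITE_THROUGH, telling
        // Windows to skip its write cache; the drive's internal cache may
        // still hold data when the power is cut.
        using (var stream = new FileStream(path, FileMode.Create, FileAccess.Write,
            FileShare.None, 4096, FileOptions.WriteThrough))
        {
            stream.Write(data, 0, data.Length);
            stream.Flush(true); // .NET 4+: flush intermediate buffers as well
        }
    }
}
```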

Here are some pointers I have used in a real-life embedded project...

  • Use a system of checksums and backup files (a sketch follows this list).
  • Checksums: at the end of any file you write, include a checksum (if it's a custom XML file, perhaps a <checksum .../> tag of some sort). Upon reading, if the checksum tag is missing or doesn't match the data, reject the file as corrupt.
  • Backups: every time you write a file, also save a copy to one of two backups, say A and B. If A exists on disk but is less than 30 seconds old, copy to B instead. Upon reading, read the original file first; if it is corrupt, read A, and if A is corrupt, read B.
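A rough sketch of the checksum-and-fallback idea (my code, assuming a simple binary format of payload plus a trailing 16-byte MD5 digest, and hypothetical ".a"/".b" backup names):

```csharp
using System;
using System.IO;
using System.Security.Cryptography;

static class ChecksummedFile
{
    public static void Write(string path, byte[] payload)
    {
        byte[] hash;
        using (var md5 = MD5.Create())
            hash = md5.ComputeHash(payload);

        // Append the digest to the payload so the reader can verify it.
        byte[] output = new byte[payload.Length + hash.Length];
        Buffer.BlockCopy(payload, 0, output, 0, payload.Length);
        Buffer.BlockCopy(hash, 0, output, payload.Length, hash.Length);
        File.WriteAllBytes(path, output);
    }

    // Returns null if the file is missing, truncated, or fails its checksum.
    public static byte[] TryRead(string path)
    {
        if (!File.Exists(path)) return null;
        byte[] raw = File.ReadAllBytes(path);
        if (raw.Length < 16) return null;

        byte[] payload = new byte[raw.Length - 16];
        Buffer.BlockCopy(raw, 0, payload, 0, payload.Length);

        byte[] expected;
        using (var md5 = MD5.Create())
            expected = md5.ComputeHash(payload);

        for (int i = 0; i < 16; i++)
            if (raw[payload.Length + i] != expected[i]) return null;
        return payload;
    }

    // Fall back through the two backups, as described above.
    public static byte[] TryReadWithBackups(string path)
    {
        return TryRead(path) ?? TryRead(path + ".a") ?? TryRead(path + ".b");
    }
}
```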

Also

  • If it is an embedded system, you should run "chkdsk /F" on the drive you write to upon every boot, because if you are getting corrupted files, you are likely also getting a corrupted file system.
  • NTFS file systems are meant to be more robust against errors than FAT32, but I believe NTFS can also take longer to fully flush its data. I use FAT32 when I can.

Final thought: if you are really building an embedded system under Windows, you would do well to learn more about Windows Embedded and its Enhanced Write Filter.

Detmar