
As a follow-up to this question, I've been able to do further testing, and here is what I've found. I'm also working with Dell support on this; below is the email I sent them. This is not actual file system corruption, but something else. While I await a response from Dell, I wanted to see what you think.

"Here are the steps I've taken:

  1. Booted the server with both drives attached; the mirror set is intact.
  2. Shut down the server, removed one drive, and booted; the array shows as degraded, but the errors in Windows persist. Plugged the drive back in, allowed a full re-sync, and shut down the server.
  3. Unplugged the other drive and rebooted; the results were the same as in step 2.

These steps tell me that the drives do not appear to be bad, since each exhibits the same symptoms when booted into the OS. I am 100% sure this is not simply corruption that was mirrored to both drives.

One thing to be aware of that makes me think this is hardware related: I took a snapshot of the OS and successfully applied it to BOTH a Hyper-V VM on a separate server and, as a direct restore, to another completely different server. Both times, the system booted, ran chkdsk only once, and the errors were gone. So, three servers, and only one gives the errors at the OS level."
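If it helps anyone retracing this, the boot-time chkdsk can be taken out of the equation by checking the volume's dirty bit and running a read-only scan by hand. This is just a generic sketch (C: is an example volume), run from an elevated command prompt:

    rem Query the NTFS dirty flag that triggers autochk at the next boot (C: is an example volume)
    fsutil dirty query C:

    rem Without /f, chkdsk only reports problems and makes no changes to the disk
    chkdsk C: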

The 32-bit Dell diagnostics I ran from a bootable USB came up clean, so could this be firmware-related on the controller? What am I chasing here?

BTW, it's a SAS array with two WD 250 GB SATA drives in a PowerEdge T300.

UPDATE: Last night I updated the firmware on the SAS 6/iR controller and on the drives as well; no go. I went out and purchased an off-the-shelf Western Digital 500 GB drive and restored the image to it. The last common link I can think of is something in the RAID table on the drives.
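As a sanity check, the firmware revision Windows reports for each disk can be pulled from the command line. A generic sketch; note that behind a hardware RAID controller this typically shows the virtual disk rather than the member drives, so the controller BIOS or OpenManage is the authoritative place to look:

    rem List the model, firmware revision, and serial number of every disk Windows can see
    wmic diskdrive get Model,FirmwareRevision,SerialNumber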

DanBig

2 Answers


Your disk firmware, controller firmware, or controller driver is munging data somewhere, making Windows "think" that there's filesystem corruption. Deep down, that's the root cause: Windows is "seeing" something that makes it think the filesystem is screwed up.
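To see exactly what Windows is objecting to, it's worth pulling the recent NTFS entries from the System event log. Something along these lines should work, assuming a Windows version recent enough to have wevtutil and the 'Ntfs' provider name:

    rem Dump the 20 most recent NTFS events from the System log, newest first
    wevtutil qe System /q:"*[System[Provider[@Name='Ntfs']]]" /c:20 /f:text /rd:true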

You could try different disks to rule out the disk firmware. Trying a different controller might be difficult, but it's certainly worth a shot if you can swing it.

Evan Anderson
  • I had a feeling in the beginning that it wasn't a corrupt file system, but that there was something deeper fighting the OS. My next step is definitely updating the firmware. – DanBig Jan 27 '10 at 15:51
  • Updating drive and controller firmware, as well as drivers, did nothing. – DanBig Feb 01 '10 at 13:04

Turns out it was the RAID information on the drives. Since I had a backup of the data, I completely wiped both drives, reconfigured the RAID 1, and restored; all is well. I'm just glad it wasn't a burn/rebuild/restore.
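For reference, one way to fully wipe a member drive (including its on-disk RAID metadata) outside the controller is to attach it to a plain SATA port on another machine and run a diskpart clean. A sketch of such a script, where disk 1 is an example and the command destroys everything on that drive:

    rem wipe.txt -- run with: diskpart /s wipe.txt
    rem Select the member drive by number (1 is an example) and zero the entire disk,
    rem which also removes any RAID metadata stored at the start or end of the drive
    select disk 1
    clean all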

DanBig