I have an ESXi host that recently had some drives fail. While backing up the data on the host so I could recover it all onto a new array, I attached a 5TB Seagate drive, blanked it, and created a new datastore on it. It was all happy and working in ESXi 6.5 and let me copy all the VMDK files off the dead array to it using the datastore browser.
The problem I have now is that ESXi 6.5 has since been rebuilt on the same machine, the new RAID array is set up and working, and all I am trying to do is the reverse: attach the drive using its existing VMFS signature and copy the VMDKs across to the array. However, it just will not play ball.
ESXi will see the drive but won't let me mount it. I keep getting a scary message saying "This configuration will delete the current disk layout. All file systems and data will be permanently lost."
Obviously this is something I definitely DO NOT want to happen.
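For context, this is roughly the CLI route I was expecting to be able to use from the ESXi shell instead of the host client UI, assuming the volume gets detected as an unresolved/snapshot VMFS copy (the UUID below is just a placeholder for whatever the list command would report):

    # List any VMFS volumes ESXi has detected as snapshot/unresolved copies
    esxcli storage vmfs snapshot list

    # Mount one persistently while keeping its existing signature
    # (replace the UUID with the one reported by the list command)
    esxcli storage vmfs snapshot mount -u 5a1b2c3d-12345678-abcd-001122334455

In my case, though, the snapshot list comes back empty and the UI only offers the destructive option.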
A few points to note:
- This is the same machine, just a rebuild of ESXi, with the same controller and the same drivers.
- I attached some other 8TB drives to this machine a few days ago and they all worked fine.
- I have tried the same drive on a different HBA in the same machine, with the same result.
- I have tried attaching it under both UEFI and Legacy BIOS to see if that made a difference.
For some weird reason, instead of one big GPT partition the drive is showing 2 x legacy MBR primary partitions: 1 x 2GB and 1 x 4.55TB.
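I assume I can confirm what the host actually thinks is on the disk from the ESXi shell with something like the following (the naa.* name is just a placeholder for whatever identifier the 5TB Seagate shows up under in /vmfs/devices/disks/):

    # Find the device identifier for the 5TB Seagate
    ls /vmfs/devices/disks/

    # Dump the partition table label (gpt vs msdos) and its entries
    partedUtil getptbl /vmfs/devices/disks/naa.5000c500xxxxxxxx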
Any ideas? I have attached the screenshot below: https://i.stack.imgur.com/VCjZZ.jpg