I wonder if anyone can help. I have two RAID arrays on my XP-based rig. The first is a SATA RAID with two Maxtors in a striped (RAID 0, for performance) configuration; this is the boot drive with XP, program files, etc. All working fine. Then, because I had a prior bad experience, I have two IBM 40 GB drives in RAID 1 (mirrored, for security) which contain all my precious data. These are attached to a SIL 0680 Medley PCI card.

All was well until today. I booted up and the 0680 card (the mirrored RAID 1) reported an 'invalid raid'. It does 'see' the drives in the boot-up sequence but evidently reckons that the RAID set is not good. I don't know why this happened. Normally, if a drive went down, it would report this and default to the single good drive, and I don't think both drives would fail at precisely the same time either. On top of that, Windows doesn't recognise either of the drives, so I can't check them.
Now, what do I do? I don't want to mess too much with the drives as they contain my data. Can I delete the current RAID set and create another whilst still retaining the data?
I would really appreciate any help.
