I was wondering if there's any benchmark-type software that can detect where in a storage chain data corruption is occurring.
I'm getting lots of errors when I create .rar archives from one drive to another within my external SAS rack. Testing the archive reports about 5% failed parts right after creation, and if I use the recovery record to repair those parts and test again, different parts start reporting corruption.
I've pretty much ruled out RAM and the CPU as the cause: the corruption is limited in scope, memtest passes cleanly, and one time I simply unplugged all the SAS connections, rebooted, and the system stayed clean for weeks afterward.
If there were a utility that automatically transferred data and ran checksum comparisons, it would be much faster to narrow the problem down to one bank of hard drives, one connection, or even the expander itself.
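In case nothing off the shelf turns up, here's a minimal sketch of that kind of probe in Python: it writes a file of random data on one drive, copies it to another, and compares SHA-256 hashes, so you can point it at different drive pairs, cables, or expander ports and see where mismatches appear. The function names and the 64 MB default size are my own choices, not from any existing tool. One caveat: the OS page cache can satisfy the read-back from RAM rather than the disk, so to genuinely exercise the hardware path you'd want probe files larger than your RAM, or to drop caches between write and verify.

```python
import hashlib
import os
import shutil

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 so large files don't load into RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def round_trip_ok(src_dir, dst_dir, size_bytes=64 * 1024 * 1024):
    """Write random data under src_dir, copy it to dst_dir, compare hashes.

    Returns True when the copy's checksum matches the original's.
    Run repeatedly against each drive pair / cable / expander port
    to localize where the corruption first appears.
    """
    src = os.path.join(src_dir, "probe.bin")
    dst = os.path.join(dst_dir, "probe.bin")
    with open(src, "wb") as f:
        remaining = size_bytes
        while remaining > 0:
            n = min(remaining, 1 << 20)
            f.write(os.urandom(n))  # random data defeats compression/dedup tricks
            remaining -= n
    shutil.copyfile(src, dst)
    return sha256_of(src) == sha256_of(dst)
```

Looping this, say, a few dozen times per path and logging which source/destination pair produced a mismatch would give the same kind of isolation you'd get from physically swapping cables, without the downtime.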