
RAID w/ Different SSDs

llee

Golden Member
I currently have one OCZ Vertex 120GB SSD in my shoebox pc. I'm wondering if I would be okay if I added a second one and set up RAID 0 for my system. Do I need the exact model or am I okay with getting the 120GB Vertex 2 models? Another question is whether different SSD controllers (e.g. Indilinx, SandForce) will be compatible under software RAID.

Thanks
 
Just as with mechanical drives, it depends on how picky the RAID controller is if you're using one. If you're using software RAID it should work just fine, you'll just lose the space difference which will probably be measured in the K if they both say 120G. But if you're talking about Windows software RAID you're in for some pain since it sucks so bad. If you're using another OS like Linux you'll probably be fine.
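To put rough numbers on the "lose the space difference" point, here's a quick sketch. A RAID 0 stripe can only use as much space per member as the smallest drive provides; the exact byte counts below are made up, both nominally 120GB:

```python
# Sketch: usable capacity of a RAID 0 stripe across mismatched drives.
# Each member contributes only as much space as the smallest drive,
# so the extra space on the bigger drive is wasted.
def raid0_capacity(drive_bytes):
    """Return usable bytes for a RAID 0 array of the given member sizes."""
    return len(drive_bytes) * min(drive_bytes)

# Two "120 GB" SSDs whose exact sizes differ slightly (hypothetical values):
vertex1 = 120_034_123_776
vertex2 = 120_033_041_920
usable = raid0_capacity([vertex1, vertex2])
wasted = vertex1 + vertex2 - usable
print(usable, wasted)   # wasted space here is on the order of a megabyte
```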
 
Just as with mechanical drives, it depends on how picky the RAID controller is if you're using one. If you're using software RAID it should work just fine, you'll just lose the space difference which will probably be measured in the K if they both say 120G. But if you're talking about Windows software RAID you're in for some pain since it sucks so bad. If you're using another OS like Linux you'll probably be fine.

Not really sure about the above as this isn't a Windows v. Linux thing and actually using Windows Dynamic disks for RAID 0 is super easy with different drives/ striping across controllers.

I think the action of posting misinformation that only serves to spread propaganda is called "trolling."

Most RAID controllers won't care that the drives are different either. Big issue with Windows dynamic disks is you can't boot from them but if you are going from single drive to full-drive RAID 0 you need to re-install your OS no matter what.

It will be fine to add another drive. Just be aware that the performance of the two drives is *very* different. Indilinx has great sequential read/write speeds (actually faster than Sandforce drives for non-compressible data) but 4K speeds are nowhere near as good. The big "issue" with the setup is that the different SSD controller characteristics in RAID 0 will slow your non-compressible sequential read/write speed to Sandforce speeds, and limit your 4K performance to Indilinx speeds. RAID 0 basically is limited by the lowest performing component in the array.
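To illustrate the "limited by the lowest performing component" point, a rough model: per workload, a stripe runs at roughly N times the *slowest* member's speed for that workload. The MB/s figures below are illustrative assumptions, not benchmarks:

```python
# Sketch: RAID 0 throughput per workload ~= N x the slowest member's
# speed for that workload. Speeds are assumed round numbers (MB/s),
# not measurements of real Indilinx / SandForce drives.
def raid0_speed(per_drive_speeds):
    return len(per_drive_speeds) * min(per_drive_speeds)

indilinx  = {"seq_incompressible": 230, "rand_4k": 35}   # assumed
sandforce = {"seq_incompressible": 130, "rand_4k": 50}   # assumed

for workload in ("seq_incompressible", "rand_4k"):
    speed = raid0_speed([indilinx[workload], sandforce[workload]])
    print(workload, speed)
# Sequential incompressible throughput is capped by the SandForce drive,
# 4K random by the Indilinx drive.
```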

Personally, I would use the Sandforce drive as a boot drive and just use the Vertex for other things. No real need to RAID 0 as you won't notice a difference in speeds.

One thing to make sure of though is that you are running off of a southbridge OR at least a PCIe x4 slot. You will be able to saturate a PCIe 2.0 x1 link with those two drives in sequential read scenarios.
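Rough arithmetic behind the x1 saturation claim (the per-drive speeds are assumed figures for SATA II SSDs of that era):

```python
# Back-of-the-envelope: can two SATA II SSDs saturate a PCIe 2.0 x1 link?
# PCIe 2.0 signals at 5 GT/s per lane with 8b/10b encoding, which works
# out to 500 MB/s per lane before protocol overhead.
PCIE2_LANE_MB_S = 5_000 * 8 // 10 // 8   # = 500 MB/s per lane

def link_saturated(lanes, drive_speeds_mb_s):
    return sum(drive_speeds_mb_s) > lanes * PCIE2_LANE_MB_S

two_ssds = [270, 270]                 # assumed sequential-read speeds
print(link_saturated(1, two_ssds))    # x1: True, the link is the bottleneck
print(link_saturated(4, two_ssds))    # x4: False, plenty of headroom
```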
 
This: Personally, I would use the Sandforce drive as a boot drive and just use the Vertex for other things. No real need to RAID 0 as you won't notice a difference in speeds.
 
Not really sure about the above as this isn't a Windows v. Linux thing and actually using Windows Dynamic disks for RAID 0 is super easy with different drives/ striping across controllers.

I think the action of posting misinformation that only serves to spread propaganda is called "trolling."

It's not misinformation, Windows software RAID is painful and a lot less capable compared to software RAID in Linux.

Most RAID controllers won't care that the drives are different either. Big issue with Windows dynamic disks is you can't boot from them but if you are going from single drive to full-drive RAID 0 you need to re-install your OS no matter what.

And you have numbers to back that up? Some RAID controllers are more picky than others, that's a fact. Even if what you say is true, and I'd be surprised if it was unless you include every revision of every POS onboard controller in existence, what I said is still true. It depends on the controller he's planning on using. And not being able to boot from a Windows RAID0 volume is just more proof that it sucks so bad.

Personally, I would use the Sandforce drive as a boot drive and just use the Vertex for other things. No real need to RAID 0 as you won't notice a difference in speeds.

This is probably a better idea overall. Regardless, though, using RAID 0 is just asking for data loss.
 
It's not misinformation, Windows software RAID is painful and a lot less capable compared to software RAID in Linux.

I have never had Windows software RAID cause me *pain*. Please provide details on how this occurs.

And you have numbers to back that up? Some RAID controllers are more picky than others, that's a fact. Even if what you say is true, and I'd be surprised if it was unless you include every revision of every POS onboard controller in existence, what I said is still true. It depends on the controller he's planning on using.

Granted, it does, but a really high portion of folks have ICH9R/ ICH10R onboard RAID these days, which works really well for two SATA II SSDs due to low latency. Every PCIe Adaptec, Areca, and LSI controller I have used recently has also been without issue on running two drives in RAID 0 even if mismatched. I even threw mismatched (capacity / brand/ spindle speed) SATA II drives on a LSI 2008 controller in IT mode and was able to get just shy of 1GB/s through one SFF 8087 port using Windows dynamic disks. Pretty much same speed as when I removed the two "green" drives from the array.

And not being able to boot from a Windows RAID0 volume is just more proof that it sucks so bad.

Intel/ AMD/ NVIDIA fake-RAID boots into Windows though so Windows RAID 0 is more for spanning controllers once you are in the OS. Granted, a limitation but I would venture to guess never something that you could get yourself into without finding the limitation before putting an OS on there and trying to boot.

This is probably a better idea overall. Regardless, though, using RAID 0 is just asking for data loss.

Agreed. Plus you get TRIM on those drives if they aren't in RAID 0 (although Indilinx background GC works GREAT).
 
I have never had Windows software RAID cause me *pain*. Please provide details on how this occurs.

Obviously I didn't literally mean physical pain. Just the mental anguish and frustration caused by Windows dynamic disks themselves like the fact that almost no 3rd party disk tools support them.

Granted, it does, but a really high portion of folks have ICH9R/ ICH10R onboard RAID these days, which works really well for two SATA II SSDs due to low latency. Every PCIe Adaptec, Areca, and LSI controller I have used recently has also been without issue on running two drives in RAID 0 even if mismatched. I even threw mismatched (capacity / brand/ spindle speed) SATA II drives on a LSI 2008 controller in IT mode and was able to get just shy of 1GB/s through one SFF 8087 port using Windows dynamic disks. Pretty much same speed as when I removed the two "green" drives from the array.

It's good to know that a lot of controllers have become more tolerant of mismatched drives.

Intel/ AMD/ NVIDIA fake-RAID boots into Windows though so Windows RAID 0 is more for spanning controllers once you are in the OS. Granted, a limitation but I would venture to guess never something that you could get yourself into without finding the limitation before putting an OS on there and trying to boot.

But then you're tied to that fake-RAID model, brand, etc. Software RAID being independent from the controller is a huge advantage IMO.

Agreed. Plus you get TRIM on those drives if they aren't in RAID 0 (although Indilinx background GC works GREAT).

That would be another Windows limitation. TRIM support shouldn't be dependent on whether or not the drives are used in RAID, the block layer drivers can send those commands just as well in either configuration if written properly.
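A toy sketch of that idea: a software-RAID striping layer can translate a discard (TRIM) on the array into discards on each member drive, so TRIM support need not stop at the RAID boundary. The chunk size and striping math below are simplified illustrations, not a real driver:

```python
# Sketch: map a discard on a RAID 0 volume to per-member discards.
# Chunk size and layout are simplified assumptions for illustration.
CHUNK = 64 * 1024   # assumed 64 KiB stripe chunk

def split_discard(offset, length, n_drives):
    """Map a discard on the array to (drive, member_offset, length) pieces."""
    pieces = []
    end = offset + length
    while offset < end:
        chunk_index = offset // CHUNK
        drive = chunk_index % n_drives
        member_offset = (chunk_index // n_drives) * CHUNK + offset % CHUNK
        run = min(CHUNK - offset % CHUNK, end - offset)
        pieces.append((drive, member_offset, run))
        offset += run
    return pieces

# A 128 KiB discard at array offset 0 on a 2-drive stripe hits each drive once:
print(split_discard(0, 128 * 1024, 2))
```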
 
Thanks for the input guys. I think the path that I'll follow is cloning my current system to a Sandforce drive and simply using that as the boot drive instead of RAID.
 