Performance Hit.

hennessy1

Golden Member
Mar 18, 2007
1,901
5
91
What kind of performance hit will I take with my HDD on the JMicron ports in AHCI mode, as opposed to the Intel ports in AHCI? Specifically with a VelociRaptor.
 

sub.mesa

Senior member
Feb 16, 2010
611
0
0
As long as you do not use RAID, you shouldn't see much difference. Your JMicron controller is most likely a JMB363, connected over PCI Express x1, which gives 250MB/s in each direction.

HDDs are still under 150MB/s sustained even in the best circumstances, so that link is not a bottleneck. If you RAID them, however, the JMicron drivers are so bad that Intel would win.

HDDs don't gain much from AHCI; SSDs can get up to about 10 times faster, because AHCI enables NCQ. NCQ lets an SSD work on multiple requests at once, up to roughly 10 at a time, since most SSDs have 10 parallel flash channels.
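
To see what NCQ buys you, here is a rough Python sketch of my own (not from any benchmark suite; the test file path is made up) that times 4 KiB random reads at queue depth 1 versus queue depth 10. On an SSD with working NCQ the deeper queue should score far higher; at queue depth 1, NCQ cannot help at all.

    # Rough sketch: random-read IOPS at queue depth 1 vs 10.
    # Assumes a large test file; use one bigger than your RAM (or drop
    # caches first), otherwise the page cache will skew the numbers.
    import os
    import random
    import time
    from concurrent.futures import ThreadPoolExecutor

    TEST_FILE = "/path/to/large_test_file"  # made-up path, point at a real file
    BLOCK = 4096                            # 4 KiB reads
    READS = 2048

    def random_read(fd, size):
        # One block from a random, block-aligned offset.
        offset = random.randrange(0, size - BLOCK, BLOCK)
        os.pread(fd, BLOCK, offset)

    def bench(queue_depth):
        fd = os.open(TEST_FILE, os.O_RDONLY)
        size = os.fstat(fd).st_size
        start = time.time()
        # The worker count stands in for the queue depth the controller sees.
        with ThreadPoolExecutor(max_workers=queue_depth) as pool:
            for f in [pool.submit(random_read, fd, size) for _ in range(READS)]:
                f.result()
        elapsed = time.time() - start
        os.close(fd)
        print(f"QD{queue_depth}: {READS / elapsed:.0f} IOPS")

    bench(1)   # queue depth 1: NCQ gives no benefit
    bench(10)  # queue depth 10: NCQ can keep ~10 flash channels busy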
 

hennessy1

Golden Member
Mar 18, 2007
1,901
5
91
Ok, I should be good then; it will just be a single-drive config on the JMicron controller. Thank you.
 

sub.mesa

Senior member
Feb 16, 2010
611
0
0
Yes, but make sure you do not install the JMicron drivers! Use only the default Microsoft AHCI driver. Then you shouldn't see any performance difference between the Intel and JMicron controllers.
 

n7

Elite Member
Jan 4, 2004
21,281
4
81
Probably not too noticeable in normal usage, but in my experience it ruins benchmark numbers.

For my SSDs, I've generally found my non-Intel ports to provide far worse numbers.

I'd highly suggest benching to see.
 

sub.mesa

Senior member
Feb 16, 2010
611
0
0
n7, interesting! Do you happen to know which controller you used? Modern motherboards put their additional controllers on a PCI Express x1 link, but if yours was on the PCI bus, that would explain it being a lot slower.

When you compare the RAID drivers from JMicron/Promise/Silicon Image/AMD/nVidia, they all crumble next to the Intel drivers, the only reasonably decent software RAID that exists for Windows. But if you do not use their drivers, you should be fine.

I used JMicron JMB363 controllers in my FreeBSD server (3x PCIe x1 cards, each with 2 ports) and got very good speeds and quite low latency from them, though still a little higher than the chipset-powered controller, which should always have the lowest latency since it's closest to your CPU. Note that if you connect a controller to a PCI Express x16 port, it may be wired directly to the northbridge, which may give the same latency as the onboard ports.

This latency is important when dealing with SSD random IOps, but not so much for sequential transfers.
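
If you want to see that latency directly, here is a simple sketch along the same lines as the one above (again, the file path is made up): time many single 4 KiB random reads and look at the average. Controller placement shows up as a shift in that average.

    # Rough sketch: average latency of single 4 KiB random reads.
    # Same caveat as before: use a test file larger than RAM so the
    # page cache doesn't answer the reads for you.
    import os
    import random
    import time

    TEST_FILE = "/path/to/large_test_file"  # made-up path
    BLOCK = 4096
    SAMPLES = 1000

    fd = os.open(TEST_FILE, os.O_RDONLY)
    size = os.fstat(fd).st_size
    total = 0.0
    for _ in range(SAMPLES):
        offset = random.randrange(0, size - BLOCK, BLOCK)
        t0 = time.perf_counter()
        os.pread(fd, BLOCK, offset)      # one synchronous read, queue depth 1
        total += time.perf_counter() - t0
    os.close(fd)
    print(f"average read latency: {total / SAMPLES * 1e6:.0f} us")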
 

hennessy1

Golden Member
Mar 18, 2007
1,901
5
91
So even with JMicron's latest drivers, they provide worse performance than Win7's built-in drivers?
 

sub.mesa

Senior member
Feb 16, 2010
611
0
0
Hennessy1: there might be an easy way to test this on your own system with your own controller.

You can download the Ubuntu Linux live CD, which does not need to be installed anywhere; it runs straight from the CD. Boot it and choose 'Try Ubuntu without changing my computer'. Then click System->Administration->Disk Utility, find your disk and click the Benchmark button. Start the read-only benchmark; do NOT perform the write test!

Once you're done, you can shut down, connect the drive to the other controller, then test again with Ubuntu and see whether anything changes or the scores are very similar.

It basically looks like this:

[screenshot of the Disk Utility benchmark graph: iSCSI-on-ZFS-SSD.png]
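
If you'd rather not boot the live CD, a minimal read-only sketch along the same lines would be the following (the device path is an example; adjust it for your disk, and run it as root):

    # Minimal read-only sequential benchmark sketch: read 1 GiB from the
    # raw device and report MB/s. It never writes. The page cache can
    # inflate the figure, so raise TOTAL_MIB above your RAM size for an
    # honest number.
    import os
    import time

    DEVICE = "/dev/sda"   # example device node, adjust for your disk
    CHUNK = 1024 * 1024   # 1 MiB per read
    TOTAL_MIB = 1024      # read 1 GiB in total

    fd = os.open(DEVICE, os.O_RDONLY)
    start = time.time()
    for _ in range(TOTAL_MIB):
        if not os.read(fd, CHUNK):
            break         # reached end of device
    elapsed = time.time() - start
    os.close(fd)
    print(f"sequential read: {TOTAL_MIB * CHUNK / elapsed / 1e6:.0f} MB/s")

Like Disk Utility's read test, this only reads, so it's safe to run against the raw disk.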