If those cards were running in DirectX 10, there would probably be less of a difference than there is now for cards running DirectX 9.
Originally posted by: myocardia
No, not unless you're talking about a G80 or an R600. For today's cards, there'd be no difference.
Originally posted by: otispunkmeyer
None, really.
As far as I'm concerned, all PCI-E has given us is SLI and Crossfire.
Originally posted by: Cheex
From 16x to 4x, you lose about 5-10 frames per second in some FPS games, but from 16x to 8x the difference is hardly even noticeable, probably less than 5 frames.
Originally posted by: PhoenixOrion
Okay.
I've come to the point where I want to put a waterblock on the DFI Ultra-D chipset, but the video card is in the way. I was thinking of putting the motherboard in SLI mode, which activates the second (bottom) PCIe slot, but it will run at x8.
I wasn't sure if it would be too much of a trade-off: a good system overclock, but with x8 PCIe at roughly 4 GB/s of bandwidth instead of x16's 8 GB/s, if I'm right.
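For reference, the theoretical lane bandwidth is easy to work out. Here's a minimal sketch, assuming first-generation PCIe at 250 MB/s per lane per direction; the constant and function names are just illustrative:

```python
# Rough theoretical PCIe 1.x bandwidth per slot width.
# Assumption: 250 MB/s per lane, per direction, before protocol overhead.
PCIE1_MB_PER_LANE_PER_DIR = 250

def pcie_bandwidth_gb(lanes, bidirectional=True):
    """Approximate bandwidth in GB/s for a given lane count."""
    directions = 2 if bidirectional else 1
    return lanes * PCIE1_MB_PER_LANE_PER_DIR * directions / 1000

for lanes in (4, 8, 16):
    print(f"x{lanes}: {pcie_bandwidth_gb(lanes, False):.1f} GB/s each way, "
          f"{pcie_bandwidth_gb(lanes):.1f} GB/s total")
# x4:  1.0 GB/s each way, 2.0 GB/s total
# x8:  2.0 GB/s each way, 4.0 GB/s total
# x16: 4.0 GB/s each way, 8.0 GB/s total
```

So an x8 slot offers half the bandwidth of x16, which lines up with the small frame-rate differences reported above: cards of this era rarely saturate even an x8 link.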
Originally posted by: beggerking
Originally posted by: Cheex
From 16x to 4x, you lose about 5-10 frames per second in some FPS games, but from 16x to 8x the difference is hardly even noticeable, probably less than 5 frames.
I thought it was a 2-3 frame difference from 8x to 4x? But that was back in the AGP era.
So are they finally fully using that 4x bandwidth now?