PCIe 2.0 vs 3.0 when running QUAD SLI 680s.

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
[Image: SANY0003-12.jpg]

[Image: PCI-ETests.jpg]


http://www.overclock.net/t/1220962/...d-computer-edition-2012/300_100#post_16915399
 

Kevmanw430

Senior member
Mar 11, 2011
279
0
76
Eh, it might make sense that the GTX680 at those resolutions can saturate a PCIe 2.0 8x link, and that the doubled bandwidth of PCIe 3.0 would then increase performance.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Well, would 16x/8x/8x/8x at 3.0 really only perform like 8x/4x/4x/4x when running 2.0?
 
Last edited:

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
something is definitely wrong with those results.
I have to agree. SLI/CF isn't that bus bandwidth intensive. If PCIe 3.0 was that beneficial, you'd see some big differences even in single card tests.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Eh, it might make sense that the GTX680 at those resolutions can saturate a PCIe 2.0 8x link, and that the doubled bandwidth of PCIe 3.0 would then increase performance.

It works the opposite way: the higher the resolution, the less PCIe bandwidth is needed, so the biggest performance differences between differing PCIe bandwidths show up at low resolutions.

UPDATE: Actually, the amount of PCIe bandwidth needed is the same regardless of resolution, but because framerates go down at higher resolutions, so do the performance differences between differing PCIe specs.
 
Last edited:

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
I have to agree. SLI/CF isn't that bus bandwidth intensive. If PCIe 3.0 was that beneficial, you'd see some big differences even in single card tests.

Not necessarily. There is only X amount of bandwidth available between the processor and PCIe 2.0 cards (and potentially 2X on PCIe 3.0). If a single 680 can utilize .5X, you're not going to see any difference going from 2.0 to 3.0, but add in 4, and suddenly you want 2X where 2.0 can only provide X.
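The shared-pool argument above is easy to sanity-check with rough numbers. The sketch below is a minimal illustration, assuming the commonly cited approximate per-lane figures (~500 MB/s per lane for PCIe 2.0, ~985 MB/s for PCIe 3.0, one direction); the function name and lane splits are made up for the example, not taken from any benchmark in this thread.

```python
# Approximate usable per-lane throughput, one direction:
# PCIe 2.0: 5 GT/s with 8b/10b encoding   -> ~500 MB/s per lane
# PCIe 3.0: 8 GT/s with 128b/130b coding  -> ~985 MB/s per lane
PER_LANE_MBPS = {"2.0": 500, "3.0": 985}

def per_card_bandwidth(gen, lanes_per_card, num_cards, total_lanes=None):
    """Per-card bandwidth in MB/s. If the CPU/chipset only exposes
    total_lanes, the cards share that pool instead of each getting
    a full-width link."""
    per_card = lanes_per_card * PER_LANE_MBPS[gen]
    if total_lanes is not None:
        shared = total_lanes * PER_LANE_MBPS[gen] / num_cards
        per_card = min(per_card, shared)
    return per_card

# Single card on a full x16 link: PCIe 2.0 already offers plenty.
print(per_card_bandwidth("2.0", 16, 1))                  # 8000
# Four cards forced to share 16 lanes: each card's share shrinks.
print(per_card_bandwidth("2.0", 16, 4, total_lanes=16))  # 2000.0
print(per_card_bandwidth("3.0", 16, 4, total_lanes=16))  # 3940.0
```

This is the "X vs 2X" point in numbers: one card never comes close to the ceiling, but four cards splitting the same lane pool can, and doubling the per-lane rate doubles each card's share.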
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Not necessarily. There is only X amount of bandwidth available between the processor and PCIe 2.0 cards (and potentially 2X on PCIe 3.0). If a single 680 can utilize .5X, you're not going to see any difference going from 2.0 to 3.0, but add in 4, and suddenly you want 2X where 2.0 can only provide X.

I drive 4 GPUs with a total of 16 PCIe 2.0 lanes just fine; the performance penalty compared to X79 is less than 5%, and the GTX680 is not that hugely faster.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
It works the opposite way: the higher the resolution, the less PCIe bandwidth is needed, so the biggest performance differences between differing PCIe bandwidths show up at low resolutions.

UPDATE: Actually, the amount of PCIe bandwidth needed is the same regardless of resolution, but because framerates go down at higher resolutions, so do the performance differences between differing PCIe specs.
you sure about that?

"However, in this week’s evaluation testing at x8/x8 and x16/x16, we see that having both video cards at x8 does somewhat impact performance, but only at the higher 5760x1200 Eyefinity/NV Surround resolution."

http://hardocp.com/article/2010/08/23/gtx_480_sli_pcie_bandwidth_perf_x16x16_vs_x8x8/3
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
you sure about that?

"However, in this week’s evaluation testing at x8/x8 and x16/x16, we see that having both video cards at x8 does somewhat impact performance, but only at the higher 5760x1200 Eyefinity/NV Surround resolution."

http://hardocp.com/article/2010/08/23/gtx_480_sli_pcie_bandwidth_perf_x16x16_vs_x8x8/3

Yes.

anandtech said:
For any given game the amount of data sent per frame is largely constant regardless of resolution, so we’ve opted to test everything at 1680x1050. At the higher framerates this resolution offers on our 7970, this should generate more PCIe traffic than higher, more GPU limited resolutions, and make the impact of different amounts of PCIe bandwidth more obvious.

http://www.anandtech.com/show/5458/the-radeon-hd-7970-reprise-pcie-bandwidth-overclocking-and-msaa


[H] claims many things and never researches whether what they claim is true.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Watching the videos, GPU load is half on the PCIe 2.0 run versus the 3.0 run.

Lends credence to Toyota's question.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
He has two videos up: one shows 2.0 with 4 cards getting around 70 fps; then he enables 3.0 via a registry edit, reboots, and comes back getting 130 fps at the same location.

http://www.youtube.com/watch?v=S0-xcxAvu54

Would be nice to know what fps he gets with a single card, but it looks like 4-card scaling at PCIe 2.0 is utter crap; with that kind of scaling, even my cards are faster. It looks like an issue with SLI, maybe not enough bridge bandwidth, but that would mean it couldn't be solved with a driver update.
 

KompuKare

Golden Member
Jul 28, 2009
1,216
1,566
136
Would be nice to know what fps he gets with a single card, but it looks like 4-card scaling at PCIe 2.0 is utter crap; with that kind of scaling, even my cards are faster. It looks like an issue with SLI, maybe not enough bridge bandwidth, but that would mean it couldn't be solved with a driver update.

Not sure how SLI/CF works (no interest really), but I wonder if they cut back the bridge bandwidth with the 680 (or didn't upgrade it compared to the 560) and therefore have to rely more on moving/syncing data via PCIe. Certainly no previous PCIe 1.0/2.0/3.0 scaling benches I've seen ever showed anything like this.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I know with 2 cards there is practically zero difference between PCIe 2.0 and 3.0. I haven't tried 4-way anything, but something seems off.

Also, his surround setup makes me jealous.
 

Elganja

Platinum Member
May 21, 2007
2,143
24
81
Well, would 16x/8x/8x/8x at 3.0 really only perform like 8x/4x/4x/4x when running 2.0?

3.0 is double the speed of 2.0, but the way you worded it is odd.

I would look at it more like this:
-when running a GPU in a 3.0 x8 slot, it will perform the same as if it were in a 2.0 x16 slot

Another thing to mention: the 680 won't saturate 16x (2.0) / 8x (3.0) / 16x (3.0), from what I gather from the EVGA forums.
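The "3.0 x8 equals 2.0 x16" rule of thumb holds because the generations differ in line coding as well as signaling rate. A minimal sketch of that arithmetic, assuming the published per-generation rates and encodings (the function name is just for illustration):

```python
# PCIe generations differ in signaling rate AND line-coding overhead:
# 1.0/2.0 use 8b/10b (20% overhead), 3.0 uses 128b/130b (~1.5% overhead).
def link_bandwidth_gbps(gen, lanes):
    """Approximate usable link bandwidth in Gbit/s, one direction."""
    rate_gtps = {"1.0": 2.5, "2.0": 5.0, "3.0": 8.0}[gen]
    efficiency = {"1.0": 8 / 10, "2.0": 8 / 10, "3.0": 128 / 130}[gen]
    return lanes * rate_gtps * efficiency

print(link_bandwidth_gbps("2.0", 16))  # 64.0
print(link_bandwidth_gbps("3.0", 8))   # ~63.0 -- effectively the same link
```

So a 3.0 x8 slot delivers almost exactly the usable bandwidth of a 2.0 x16 slot, which is why the equivalence above works out.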
 

kevinsbane

Senior member
Jun 16, 2010
694
0
71
Check later in the HardOCP thread. Another user tested the same thing, with surround resolutions and 1080p, using 2x SLI GTX 680. He finds the same thing: at 3x 1080p there's almost a 50% difference between the two, while at 1080p it's a 5-10% difference between x16 PCIe 2.0 and x16 PCIe 3.0.
 

omeds

Senior member
Dec 14, 2011
646
13
81
IMO, PCIe 2.0 x8 is a bit limiting for the GTX680 in SLI configs, as the 580 was already on the edge and slightly limited at times. I think PCIe 3.0 x8 is enough, though.