GTX680 bottleneck?? help me explain these benchmarks!

Borealis7

Platinum Member
Oct 19, 2006
2,901
205
106
Hi All!
XBitLabs have recently published an article on Piledriver CPUs:

http://www.xbitlabs.com/articles/cpu/display/fx-8350-8320-6300-4300_6.html#sect0

But I found it interesting from a different point of view. Their test bench uses a single GTX680 as the GPU, a powerful card to say the least, but all their gaming benchmark graphs show a massive GPU bottleneck! How can this be??

(I refer to the UHQ parts of each graph, not the 1280x800 benches.)

Examine graph 1: Batman: AC, a TWIMTBP game. The flagship single GPU can't handle 8xAA @ 1080p? CPU power keeps rising, but performance stays roughly the same.
Moar coars? Batman don't care. But that's the game's fault, not the GPU's.

Moving on to graph 2: Borderlands 2. A Pentium G2120 scores the same as an FX-4300 Piledriver, and within 10% of the top Piledriver, the FX-8350?? Really?? And that's even without AA, just FXAA, and probably PhysX on maximum.

Graph number 3: Crysis 2, another TWIMTBP game. Seriously??? 1920x1080 and 1280x800 score the same across all CPUs?? Is the game artificially capped to 60 FPS?

Graph 4: DiRT Showdown. 8xAA again. The CPUs keep getting bigger, but the FPS stay the same.

Graph 5: Far Cry 2. Same deal as Crysis 2, but at least it shows logical results in terms of processing power.

Graph 6: Metro 2033. I honestly don't know what to say at this point...

I'm completely stumped. Can anyone provide an explanation for these results?
I generally hold XBitLabs to be a reputable site. Could it be that they completely messed up?
 
Feb 19, 2009
10,457
10
76
What do you mean? Most of those games show performance increases moving to Intel CPUs, so they're obviously responding to faster CPUs. Only a few are totally GPU limited, as expected at high IQ and AA settings.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Well, Crysis 2's framerate did not go up when they lowered the resolution and reduced AA, so something is screwy there.
 

Borealis7

Platinum Member
Oct 19, 2006
2,901
205
106
The GTX680, which is supposed to allow smooth gameplay across 2 or 3 monitors, should not be completely saturated by a 3GHz quad-core. It just doesn't seem right.
The increase in performance is far below what it should be when you compare these CPUs against each other, which points to a bottleneck in the GPU. The author says so as well. But some of these games were made to run on consoles with 6-year-old hardware; how can they stress a GTX680 so much?
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
Crysis 2 is GPU limited. The others are AMD CPU limited.
Um, look at the chart again. Crysis 2 was getting the same framerate at 1920 as it was at 1280. If it were GPU limited, it would have a higher framerate at 1280. Only the Pentium CPU lags behind the other CPUs at both resolutions, so it's certainly not GPU limited based on that chart.
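The test being applied above can be written down as a simple rule of thumb. This is a minimal sketch with hypothetical numbers (not the article's actual data), assuming you have average fps for the same CPU/GPU at two resolutions:

```python
# Rule of thumb: if dropping the resolution barely changes the framerate,
# the GPU was not the limiter at the higher resolution.

def classify_bottleneck(fps_high_res: float, fps_low_res: float,
                        tolerance: float = 0.05) -> str:
    """Compare average fps at two resolutions on the same CPU and GPU."""
    if fps_high_res <= 0:
        raise ValueError("fps must be positive")
    gain = (fps_low_res - fps_high_res) / fps_high_res
    # A large gain at the lower resolution means the GPU was the limit.
    return "GPU-limited" if gain > tolerance else "CPU-limited (or frame-capped)"

# Crysis 2 in the article: roughly the same fps at 1920x1080 and 1280x800
print(classify_bottleneck(62.0, 63.0))   # CPU-limited (or frame-capped)
# A genuinely GPU-limited game speeds up a lot at the lower resolution
print(classify_bottleneck(45.0, 80.0))   # GPU-limited
```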
 
Last edited:

aaksheytalwar

Diamond Member
Feb 17, 2012
3,389
0
76
I only saw the chart where the Pentium was lagging. Beyond 60 fps we need a different, better CPU architecture for the Crytek engine.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
Maybe whichever graphical feature is limiting the framerate isn't affected by resolution.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Again, there is something wrong with their results. Lowering the res in Crysis 2 should have made the framerates go way up, and that is a fact. Even on my wimpy GTX 560 SE, I can get about 80-85 fps if I lower my res really low, so no, it should not be limited like that at 1280 with a GTX680 and any of those high-end CPUs.


Maybe whichever graphical feature is limiting the framerate isn't affected by resolution.
Nope, as I have already tested this myself on max settings at a much lower res, and I certainly don't stop in the low 60s; I can go right over 90 fps. Yes, my CPU is at 4.4GHz, but no way could the game be so CPU limited that it's stuck in the low 60s with all of the high-end CPUs.
 
Last edited:

Borealis7

Platinum Member
Oct 19, 2006
2,901
205
106
Techspot have a graph from Far Cry 3 with an overclocked 3770K and a 7970, stepping the CPU frequency in 500MHz intervals, and it shows performance rising at each step, so the 7970 is being held back somewhat by the CPU when it's not overclocked. I find it hard to believe the GTX680 is different in that respect, so my only logical conclusion is to dismiss this article and reconsider my entire attitude towards XBitLabs video card reviews...
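Reading a clock-scaling graph like Techspot's comes down to looking at the fps gained per step. A minimal sketch with hypothetical numbers (not Techspot's actual data): if the gain per step is still clearly positive at the top clock, the CPU is still the limiter and the GPU has headroom.

```python
# Hypothetical fps readings at successive CPU clock steps (e.g. 500 MHz apart).
# Positive gains at every step mean the GPU is not yet saturated.

def fps_gain_per_step(fps: list[float]) -> list[float]:
    """Framerate gained at each successive CPU clock step."""
    return [round(later - earlier, 1) for earlier, later in zip(fps, fps[1:])]

steps = fps_gain_per_step([70.0, 78.0, 84.0])
print(steps)  # [8.0, 6.0] -- still gaining, so the CPU is still the limiter
```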
 

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
Guys, those Crysis 2 results are clearly operating under a frame cap.

No reason to cast out XBitLabs' reviews forever. This was an oversight, but the general findings stand.

And to the OP, I'm not sure why you're so concerned about "GPU bottlenecks." That's exactly what we should expect and hope for in new games at high settings. Would you prefer that Batman's performance be entirely decided by your CPU?
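The frame-cap diagnosis above has a simple signature in benchmark data: wildly different CPUs all landing at the same ceiling. A hedged sketch with hypothetical numbers (not the article's actual figures):

```python
# A frame cap shows up as results from very different CPUs all clustering
# within a small slack of the same ceiling (commonly 60 fps with vsync).

def looks_frame_capped(results: list[float], cap: float = 60.0,
                       slack: float = 2.0) -> bool:
    """True if every benchmark result sits within `slack` fps of the cap."""
    return all(abs(fps - cap) <= slack for fps in results)

# Six different CPUs, all landing at ~60 fps: suspicious
print(looks_frame_capped([59.8, 60.1, 60.0, 59.5, 60.2, 59.9]))  # True
# Results spreading out with CPU power: no cap in play
print(looks_frame_capped([41.0, 48.0, 55.0, 60.0, 66.0]))        # False
```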
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
TH is running their system builder articles. The $1000 build this quarter uses an AMD FX-8350, and they compare it to last quarter's $1000 build that used a 3570K; both builds use a GTX 670. Things look 'as expected' at the various resolutions. More data, though not many games tested.
[Attached charts: BF3-low.png, BF3-high.png]

 

Borealis7

Platinum Member
Oct 19, 2006
2,901
205
106
It's bad because people buy high-end parts expecting very good performance, while in reality their expensive hardware is being wasted. If you had known you could buy an FX-4300 for your gaming rig and get the same results as with a 3770K, would you still buy the Intel CPU?
 
Last edited:

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
It's bad because people buy high-end parts expecting very good performance, while in reality their expensive hardware is being wasted. If you had known you could buy an FX-4300 for your gaming rig and get the same results as with a 3770K, would you still buy the Intel CPU?

I think you're confused about the results of this article.

(1) It shows that a 3770K is MUCH faster than an FX-4300. In fact, in Far Cry 2, it is nearly twice as fast at low resolutions and 30% faster at high resolutions.

(2) It does not purport to test the GTX680 against other cards. For that, you would want an article such as this one: http://www.hardocp.com/article/2012/11/12/fall_2012_gpu_driver_comparison_roundup/. It clearly shows that a 680 is faster in every game than a less expensive card like a 660Ti.

With high-end PC components, you don't get performance increases that scale linearly with cost, but you do get more performance as the price goes up.