
Upgraded from 1 GTX 580 to 2 GTX 680's SLI and strange performance anomaly

Dave3000

Golden Member
Today I upgraded from one GTX 580 to two GTX 680s in SLI. One unusual performance issue I've noticed with the GTX 680 is that in the elephants scene in 3DMark 03 I'm now getting lower performance with one GTX 680 than with one GTX 580: where my GTX 580 got around 1600 fps, my GTX 680 gets around 1100 fps, and GTX 680 SLI gets around 2000 fps. Also, the single-texture fillrate test was 380 fps on my GTX 580 and is now 330 fps on my GTX 680 (around 660 fps in SLI). However, the multi-texture fillrate test is much higher on my GTX 680 than my GTX 580: 1400 fps vs 816 fps. The overall 3DMark 03 score is only a little higher, going from 99400 to 100800 with one GTX 680.

Is it just that the GTX 600 series drivers aren't optimized for DirectX 8? In 3DMark 11 Performance mode I got around 9400 with a single GTX 680 and 14600 in SLI, and in the Heaven benchmark I get normal performance with my cards. I tested both cards separately and they both show this strange performance anomaly in 3DMark 03. I'm using the drivers that were recently released. Is it just a driver issue?

My system is configured with an i7 3820, 16GB DDR3-1600, Asus P9X79 Pro.
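One plausible explanation for the single-texture fillrate drop, sketched here as back-of-envelope math from public spec sheets (not something stated in the thread): that test is typically ROP-bound, and the GTX 680 actually has fewer ROPs than the GTX 580 (32 vs 48), which its higher clock doesn't fully make up for.

```python
# Back-of-envelope pixel fillrate, assuming the single-texture test is ROP-bound.
# Spec-sheet numbers: GTX 580 has 48 ROPs at a 772 MHz core clock;
# GTX 680 has 32 ROPs at a 1006 MHz base clock.
def pixel_fillrate_gpix(rops, core_mhz):
    """Theoretical pixel fillrate in Gpixels/s."""
    return rops * core_mhz / 1000.0

gtx580 = pixel_fillrate_gpix(48, 772)   # ~37.1 Gpix/s
gtx680 = pixel_fillrate_gpix(32, 1006)  # ~32.2 Gpix/s

# The theoretical ratio closely tracks the observed 330/380 fps drop.
print(round(gtx680 / gtx580, 2))  # ~0.87, vs 330/380 ≈ 0.87 observed
```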
 
Dave, not to be rude, but 3dmark03? Why do you care? Compare performance on your $1000 worth of graphics cards on something modern.

Anyway, did you clean out drivers before you installed new ones? Which driver set are you using? Same CPU speed? 3DM03 is basically just a CPU test now, with modern cards.
 
3DMark 03 is dead. What is it, DX8.0?

That's like loading up Quake 4 and saying you went from 400 fps to 350 fps at 32x CSAA with transparency AA on, lol.
 
Not too surprising.

The GTX 670/680 do worse the older the game is. The oldest game in most reviews is Crysis, and even there it only beats the 580 by a little bit.
 
Well, the GTX 580 has a 384-bit memory interface and is a full-sized, mature Fermi revision with a 2x ALU (hot clock) domain around 1500 MHz, which helps in old benches, while the GTX 680 is 256-bit, half-sized, first-run Kepler silicon with a single, slower clock domain of 1000-1150 MHz but 3x the logic units. It's a tradeoff of specs.
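That hot-clock-vs-width tradeoff can be put into rough numbers. This is a back-of-envelope sketch from public spec sheets (FMA counted as two ops); real throughput depends heavily on scheduling efficiency, which is exactly where Kepler can lag in old workloads.

```python
# Rough theoretical shader throughput in GFLOPS (FMA counted as 2 ops).
# Spec-sheet numbers: GTX 580 has 512 shaders at a ~1544 MHz hot clock;
# GTX 680 has 1536 shaders at a ~1006 MHz base clock (no hot clock).
def gflops(shaders, shader_mhz):
    """Peak single-precision throughput in GFLOPS."""
    return shaders * 2 * shader_mhz / 1000.0

print(gflops(512, 1544))   # GTX 580: ~1581 GFLOPS
print(gflops(1536, 1006))  # GTX 680: ~3090 GFLOPS, roughly 2x on paper
```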
 
Yeah, what he said. 😉

The 680 plays games very well. Its streamlining for that purpose is going to show up as weaknesses elsewhere.

For example, here it is doing raytracing.
[attached chart: ray-tracing benchmark results]
 
Well, the GTX 580 has a 384-bit memory interface and is a full-sized, mature Fermi revision with a 2x ALU (hot clock) domain around 1500 MHz, which helps in old benches, while the GTX 680 is 256-bit, half-sized, first-run Kepler silicon with a single, slower clock domain of 1000-1150 MHz but 3x the logic units. It's a tradeoff of specs.
Not sure why that matters. A 384-bit bus means nothing by itself, because it's the actual bandwidth that matters in the end, and both cards are identical in that respect. Having 2x the shader speed means nothing for the GTX 580 either, since the GTX 680 has 3x the shaders and is only clocked about 300-350 MHz lower, depending on boost. So in the end the GTX 680 matches the GTX 580 on bandwidth and actually has twice the shading power. Basically, the GTX 580 should never beat the GTX 680 in anything.
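The bandwidth point is easy to check arithmetically. Using the public effective memory data rates (4008 MT/s for the 580, 6008 MT/s for the 680, both spec-sheet figures, not numbers from this thread), the two cards land within a fraction of a percent of each other:

```python
# Memory bandwidth = (bus width in bits / 8 bits per byte) * effective data rate.
# GTX 580: 384-bit bus, 4008 MT/s effective GDDR5.
# GTX 680: 256-bit bus, 6008 MT/s effective GDDR5.
def bandwidth_gbs(bus_bits, effective_mts):
    """Theoretical memory bandwidth in GB/s."""
    return bus_bits / 8 * effective_mts / 1000.0

print(round(bandwidth_gbs(384, 4008), 1))  # 192.4 GB/s
print(round(bandwidth_gbs(256, 6008), 1))  # 192.3 GB/s
```

The narrower bus is almost exactly offset by the faster memory, which is why "384-bit vs 256-bit" alone tells you nothing.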
 
Except in dp performance 😎
 
Not sure why that matters. A 384-bit bus means nothing by itself, because it's the actual bandwidth that matters in the end, and both cards are identical in that respect. Having 2x the shader speed means nothing for the GTX 580 either, since the GTX 680 has 3x the shaders and is only clocked about 300-350 MHz lower, depending on boost. So in the end the GTX 680 matches the GTX 580 on bandwidth and actually has twice the shading power. Basically, the GTX 580 should never beat the GTX 680 in anything.
Exactly. People are always bashing on Bulldozer, but the simple truth is it has 8 cores at 3.9Ghz so its basically a 31.2Ghz processor. A 3770k is only equal to a 14Ghz processor.

The numbers don't lie.
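For what it's worth, the sarcasm here has a textbook form: multiplying cores by clock ignores the serial fraction of a workload. A minimal Amdahl's-law sketch, with purely made-up numbers (the 0.7 per-core speed and 25% serial fraction are illustrative assumptions, not measurements of any real CPU):

```python
# Why "cores x clock" isn't a throughput number: with any serial fraction,
# the speedup from extra cores saturates (Amdahl's law), so 8 slower cores
# can easily lose to 4 faster ones.
def amdahl_speedup(cores, serial_fraction):
    """Parallel speedup over one core for a given non-parallelizable fraction."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

serial = 0.25            # assume 25% of the workload can't be parallelized
quad = 1.0 * amdahl_speedup(4, serial)   # 4 cores at relative speed 1.0
octo = 0.7 * amdahl_speedup(8, serial)   # 8 cores at relative speed 0.7
print(quad > octo)  # True: the "14 GHz" chip beats the "31.2 GHz" one here
```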
 
Oh NO! Here we go with the 8 true vs 8 semi cores. Ben90, why did you have to open this can of worms up? D: 🙁
 
How did you manage to find 2 GTX 680s? You need to play the lottery more often with that luck!



I've been tempted to buy one like 10 times before realizing it's a bad investment, and every time I thought of it I've found one in stock.

Just a quick Google search found them in stock at 2 popular stores in Canada right now (both Zotac).

They are scarce, but you can find one if you really want one.
 
"Bad investment"? Sure, but my wife feels my computer hobby as a whole is a "bad investment". Now if I could only convince her that shoes and clothes were a bad investment. 😀
 
Exactly. People are always bashing on Bulldozer, but the simple truth is it has 8 cores at 3.9Ghz so its basically a 31.2Ghz processor. A 3770k is only equal to a 14Ghz processor.

The numbers don't lie.

Nobody cares bro, Bulldozer still sucks at EVERYTHING. Oh, except for 7zip. Btw, CPU forum is >>> that way.

Well, I'm off to bench my 680s in Oregon Trail. Need to make sure these cards were worth it.
 
Exactly. People are always bashing on Bulldozer, but the simple truth is it has 8 cores at 3.9Ghz so its basically a 31.2Ghz processor. A 3770k is only equal to a 14Ghz processor.

The numbers don't lie.

THAT'S going in my sig, and yes, Bulldozer sucks... horribly.
 
Exactly. People are always bashing on Bulldozer, but the simple truth is it has 8 cores at 3.9Ghz so its basically a 31.2Ghz processor. A 3770k is only equal to a 14Ghz processor.

The numbers don't lie.

Holy hell, Batman. That makes one of my 680s a... wait for it... 2.15 TERAHERTZ processor! 1.4 GHz clock speed x 1536 cores.

Ha, the joke's on you, Nvidia. You should've DEFINITELY charged more. And yes, it plays Crysis... on very high 😎
 
Exactly. AMD really lost a lot of Ghz when moving from VLIW5->GCN. Must be some shady insider trading going on.
 
Exactly. People are always bashing on Bulldozer, but the simple truth is it has 8 cores at 3.9Ghz so its basically a 31.2Ghz processor. A 3770k is only equal to a 14Ghz processor.

The numbers don't lie.

I saw this quote in Don Karnage's sig... thanks for the LOL of the day 😀👍
 