Upgraded from 1 GTX 580 to 2 GTX 680's SLI and strange performance anomaly

Dave3000

Golden Member
Jan 10, 2011
1,537
114
106
Today I upgraded from one GTX 580 to two GTX 680s in SLI, and I've hit an unusual performance issue: in the elephants scene in 3DMark 03 I'm now getting lower performance with one GTX 680 than with the GTX 580. Where the GTX 580 got around 1600 fps, one GTX 680 gets around 1100 fps; in SLI the 680s get around 2000 fps. The single-texture fillrate test also dropped, from 380 fps on the GTX 580 to 330 fps on one GTX 680 (around 660 fps in SLI). The multi-texture fillrate test, on the other hand, is much higher on the GTX 680: 1400 fps vs 816 fps on the GTX 580. My overall 3DMark 03 score is only a little higher, going from 99400 to 100800 with one GTX 680. Is it just that the GTX 600 series drivers aren't optimized for DirectX 8?

In 3DMark 11 Performance mode, though, I got around 9400 with a single GTX 680 and 14600 in SLI, and I get normal performance in the Heaven benchmark. I tested both cards separately and both show this anomaly in 3DMark 03. I'm using the recently released drivers. Is it just a driver issue?

My system is configured with an i7 3820, 16GB DDR3-1600, Asus P9X79 Pro.
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
Dave, not to be rude, but 3dmark03? Why do you care? Compare performance on your $1000 worth of graphics cards on something modern.

Anyway, did you clean out drivers before you installed new ones? Which driver set are you using? Same CPU speed? 3DM03 is basically just a CPU test now, with modern cards.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
3DMark 03 is dead. What is it, DX 8.0?

That's like loading up Quake 4 and saying you went from 400 fps to 350 fps at 32x CSAA with transparency AA on, lol.
 

tigersty1e

Golden Member
Dec 13, 2004
1,963
0
76
Not too surprising.

The GTX 670/680 do worse the older the game is. The oldest game in most reviews is Crysis, and the 680 only beats the 580 by a little bit.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Today I upgraded from one GTX 580 to two GTX 680s in SLI, and I've hit an unusual performance issue: in the elephants scene in 3DMark 03 I'm now getting lower performance with one GTX 680 than with the GTX 580. Where the GTX 580 got around 1600 fps, one GTX 680 gets around 1100 fps; in SLI the 680s get around 2000 fps. The single-texture fillrate test also dropped, from 380 fps on the GTX 580 to 330 fps on one GTX 680 (around 660 fps in SLI). The multi-texture fillrate test, on the other hand, is much higher on the GTX 680: 1400 fps vs 816 fps on the GTX 580. My overall 3DMark 03 score is only a little higher, going from 99400 to 100800 with one GTX 680. Is it just that the GTX 600 series drivers aren't optimized for DirectX 8?

In 3DMark 11 Performance mode, though, I got around 9400 with a single GTX 680 and 14600 in SLI, and I get normal performance in the Heaven benchmark. I tested both cards separately and both show this anomaly in 3DMark 03. I'm using the recently released drivers. Is it just a driver issue?

My system is configured with an i7 3820, 16GB DDR3-1600, Asus P9X79 Pro.


Well, the GTX 580 has a 384-bit memory interface and is a full-size Fermi revision with a 2x ALU clock domain around 1500 MHz, which is useful in old benches, while the GTX 680 is a 256-bit, half-size Kepler debut chip with a single, slower clock domain of 1000-1150 MHz but 3x the logic units. It's a tradeoff of specs.
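To put rough numbers on that tradeoff, here's a minimal Python sketch using the reference-card ROP/TMU counts and clocks (these figures are assumptions pulled from the spec sheets, not measurements). Notably, the 580's higher theoretical pixel (ROP) throughput would line up with it winning the single-texture fillrate test, while the 680's much higher texture rate lines up with the multi-texture result:

```python
# Back-of-envelope theoretical fill rates from the reference specs
# (ROP/TMU counts and clocks are assumptions from spec sheets, not measurements).

def pixel_fill_gps(rops, clock_mhz):
    """Peak pixel fill rate in Gpixels/s: ROPs x core clock."""
    return rops * clock_mhz / 1000

def texel_fill_gts(tmus, clock_mhz):
    """Peak texture fill rate in Gtexels/s: TMUs x core clock."""
    return tmus * clock_mhz / 1000

# GTX 580: 48 ROPs, 64 TMUs, 772 MHz core
# GTX 680: 32 ROPs, 128 TMUs, 1006 MHz base
print(pixel_fill_gps(48, 772), pixel_fill_gps(32, 1006))   # ~37.1 vs ~32.2 Gpixels/s
print(texel_fill_gts(64, 772), texel_fill_gts(128, 1006))  # ~49.4 vs ~128.8 Gtexels/s
```

These are theoretical peaks, of course; actual results also depend on drivers and on whether the test is bandwidth-bound rather than ROP-bound.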
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Hmm... this can't be serious.

1600 fps to 1100? :D

Also, there is no elephant scene... unless you're talking about Troll's Lair.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Well, the GTX 580 has a 384-bit memory interface and is a full-size Fermi revision with a 2x ALU clock domain around 1500 MHz, which is useful in old benches, while the GTX 680 is a 256-bit, half-size Kepler debut chip with a single, slower clock domain of 1000-1150 MHz but 3x the logic units. It's a tradeoff of specs.

Yeah, what he said. ;)

The 680 plays games very well. Its streamlining for that purpose is going to show up as weaknesses elsewhere.

For example, here it is doing raytracing.
[raytracing benchmark chart]
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Well GTX580 has 384bit memory interface and is a full sized Fermi revision with a 2x alu domain frequency of around 1500mhz which is useful for old benches, and GTX680 is a 256bit half-sized Kepler debut silicon with a slower singular clock domain of 1000-1150mhz, but with 3x the logic units. It's a tradeoff of specs.
Not sure why that matters. A 384-bit bus means nothing by itself; it's the actual bandwidth that matters in the end, and both cards are identical in that respect. Having 2x the shader speed means nothing for the GTX 580 either, as the GTX 680 has 3x the shaders and is only clocked about 300-350 MHz lower depending on boost. So in the end the GTX 680 matches the GTX 580 on bandwidth and actually has twice the shading power. Basically the GTX 580 should never beat the GTX 680 in anything.
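A minimal Python sketch of that comparison, using the reference specs as assumptions (bus widths, effective memory transfer rates, and clocks from the spec sheets; peak FLOPS counted as cores x clock x 2 for an FMA):

```python
# Rough check of the bandwidth/shader-power argument using published
# reference-card specs (assumptions, not measurements).

def bandwidth_gbs(bus_bits, effective_mtps):
    """Memory bandwidth in GB/s: bus width in bytes x effective transfer rate."""
    return bus_bits / 8 * effective_mtps / 1000

def shader_gflops(cores, clock_mhz):
    """Peak single-precision GFLOPS: cores x shader clock x 2 ops (FMA)."""
    return cores * clock_mhz * 2 / 1000

gtx580 = {"bw": bandwidth_gbs(384, 4008),   # 384-bit bus @ 4008 MT/s
          "fp": shader_gflops(512, 1544)}   # 512 cores @ 1544 MHz hot clock
gtx680 = {"bw": bandwidth_gbs(256, 6008),   # 256-bit bus @ 6008 MT/s
          "fp": shader_gflops(1536, 1006)}  # 1536 cores @ 1006 MHz base (boost is higher)

print(gtx580)  # ~192 GB/s, ~1581 GFLOPS
print(gtx680)  # ~192 GB/s, ~3090 GFLOPS: same bandwidth, ~2x shader power
```

So the two cards really do land within a fraction of a GB/s of each other on bandwidth, with the 680 at roughly double the theoretical shading throughput even before boost.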
 
Last edited:

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Not sure why that matters. A 384-bit bus means nothing by itself; it's the actual bandwidth that matters in the end, and both cards are identical in that respect. Having 2x the shader speed means nothing for the GTX 580 either, as the GTX 680 has 3x the shaders and is only clocked about 300-350 MHz lower depending on boost. So in the end the GTX 680 matches the GTX 580 on bandwidth and actually has twice the shading power. Basically the GTX 580 should never beat the GTX 680 in anything.

Except in dp performance :cool:
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
Not sure why that matters. A 384-bit bus means nothing by itself; it's the actual bandwidth that matters in the end, and both cards are identical in that respect. Having 2x the shader speed means nothing for the GTX 580 either, as the GTX 680 has 3x the shaders and is only clocked about 300-350 MHz lower depending on boost. So in the end the GTX 680 matches the GTX 580 on bandwidth and actually has twice the shading power. Basically the GTX 580 should never beat the GTX 680 in anything.
Exactly. People are always bashing on Bulldozer, but the simple truth is it has 8 cores at 3.9 GHz, so it's basically a 31.2 GHz processor. A 3770K is only equal to a 14 GHz processor.

The numbers don't lie.
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
Exactly. People are always bashing on Bulldozer, but the simple truth is it has 8 cores at 3.9 GHz, so it's basically a 31.2 GHz processor. A 3770K is only equal to a 14 GHz processor.

The numbers don't lie.
Oh NO! Here we go with the 8 true vs. 8 semi cores. Ben90, why did you have to open this can of worms up? :D :(
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
How did you manage to find 2 GTX 680s? You need to play the lottery more often with that luck!
 

jacktesterson

Diamond Member
Sep 28, 2001
5,493
3
81
How did you manage to find 2 GTX 680s? You need to play the lottery more often with that luck!



I've been tempted to buy one like 10 times before realizing it's a bad investment, and every time I thought of it I've found one in stock.

Just a quick Google search found them in stock at 2 popular stores in Canada right now (both Zotac).

They are scarce, but you can find one if you really want one.
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
"Bad investment"? Sure, but my wife feels my computer hobby as a whole is a "bad investment." Now if I could only convince her that shoes and clothes were a bad investment. :D :sneaky: :\
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Exactly. People are always bashing on Bulldozer, but the simple truth is it has 8 cores at 3.9 GHz, so it's basically a 31.2 GHz processor. A 3770K is only equal to a 14 GHz processor.

The numbers don't lie.

wat
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
Exactly. People are always bashing on Bulldozer, but the simple truth is it has 8 cores at 3.9 GHz, so it's basically a 31.2 GHz processor. A 3770K is only equal to a 14 GHz processor.

The numbers don't lie.

Nobody cares bro, Bulldozer still sucks at EVERYTHING. Oh, except for 7zip. Btw, CPU forum is >>> that way.

Well, I'm off to bench my 680s in Oregon Trail. Need to make sure these cards were worth it.
 
Feb 6, 2007
16,432
1
81
Nobody cares bro, Bulldozer still sucks at EVERYTHING. Oh, except for 7zip. Btw, CPU forum is >>> that way.

Well, I'm off to bench my 680s in Oregon Trail. Need to make sure these cards were worth it.
You'll be dying of dysentery way faster than on a 580.
 

Don Karnage

Platinum Member
Oct 11, 2011
2,865
0
0
Exactly. People are always bashing on Bulldozer, but the simple truth is it has 8 cores at 3.9 GHz, so it's basically a 31.2 GHz processor. A 3770K is only equal to a 14 GHz processor.

The numbers don't lie.

THAT'S going in my sig, and yes, Bulldozer sucks... horribly.
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
Exactly. People are always bashing on Bulldozer, but the simple truth is it has 8 cores at 3.9 GHz, so it's basically a 31.2 GHz processor. A 3770K is only equal to a 14 GHz processor.

The numbers don't lie.

Holy hell, Batman. That makes one of my 680s a... wait for it... 2.15 TERAHERTZ processor! 1.4 GHz clock speed x 1536 cores.

Ha, the joke's on you, Nvidia. You should've DEFINITELY charged more. And yes, it plays Crysis... on Very High :cool:
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
Exactly. AMD really lost a lot of GHz when moving from VLIW5 -> GCN. Must be some shady insider trading going on.
 

Destiny

Platinum Member
Jul 6, 2010
2,270
1
0
Exactly. People are always bashing on Bulldozer, but the simple truth is it has 8 cores at 3.9 GHz, so it's basically a 31.2 GHz processor. A 3770K is only equal to a 14 GHz processor.

The numbers don't lie.

I saw this quote in Don Karnage's sig... thanks for the LOL of the day :D :thumbsup: