Sanity Check Needed! 285SLI vs 260SLI, Please Help!

deusued

Junior Member
Apr 7, 2009
2
0
0
I need a sanity check, or help, or probably even both.

First: I'm not a performance enthusiast; I'm a hobbyist at best. This is my first foray into performance tuning since my late-'90s days of overclocking Celerons in tandem with TNTs and Voodoos (marriage and kids intervened).

That said, I'm a career technologist and go pretty deep on this stuff -- so I'm no newbie, but I'm most likely very green compared to the gurus on this board.

So.. my issue.

I've run through a few sets of hardware components trying to get the most out of my machine. My "quick" benchmark is Crysis v1.2.1, island, GPU benchmark -- my "not so quick" is 3DMark06.

My testing is -not- that scientific, so please be gentle.

Onto the details:

Motherboard: EVGA 780i FTW
\ http://www.newegg.com/Product/...7&Tpk=evga%20780%20ftw
Memory : GSkill 1200
\ http://www.newegg.com/Product/...1201&Tpk=gskill%201200
Video Cards: 2 @ EVGA GTX 260: 896-P3-1260-AR
\ 2 @ EVGA GTX 285 SSC: 01G-P3-1287-AR
Resolution: 1680x1050

Crysis v1.2.1, 1680x1050, 0xAA, Very High (fps are averages over 4 runs)
--------------------------------------------------------------------------
E8600@3.33GHz, RAM@1067-6-6-6-18, 260/SLI (182.06)/576/1242/999: 41.7 fps
E8600@3.33GHz, RAM@1067-6-6-6-18, 260/SLI (182.06)/576/1242/1202: 40.5 fps
E8600@3.33GHz, RAM@1067-5-5-5-15, 260/SLI (182.06)/648/1397/1202: 42.3 fps

E8600@4.0 GHz, RAM@960-6-6-6-18, 260/SLI (182.06)/576/1242/999: 42.6 fps
E8600@4.0 GHz, RAM@960-6-6-6-18, 260/SLI (182.06)/600/1293/999: 43.6 fps
E8600@4.0 GHz, RAM@960-6-6-6-18, 260/SLI (182.06)/648/1397/1202: 45.0 fps
E8600@4.0 GHz, RAM@960-5-5-5-15, 260/SLI (182.06)/648/1397/1202: 45.4 fps
E8600@4.0 GHz, RAM@960-5-5-5-15, 260/SLI (182.06)/674/1453/1202: 46.2 fps
E8600@4.0 GHz, RAM@960-5-5-5-15, 260/SLI (182.06)/674/1453/1248: 46.35 fps

E8600@4.0 GHz, RAM@960-6-6-6-18, 285/SLI (182.06)/702/1584/1323: 47.7 fps
E8600@4.0 GHz, RAM@960-5-5-5-15, 285/SLI (182.06)/702/1584/1323: 48.6 fps
E8600@4.0 GHz, RAM@960-5-5-5-15, 285/SLI (182.50)/702/1584/1323: 46.9 fps
E8600@4.0 GHz, RAM@960-5-5-5-15, 285/SLI (185.66)/702/1584/1323: 46.0 fps

Q9650@3.0 GHz, RAM@800-5-5-5-15, 285/SLI (182.06)/702/1584/1323: 40.3 fps

Q9650@3.6 GHz, RAM@960-6-6-6-18, 285/SLI (185.66)/702/1584/1323: 42.1 fps
Q9650@3.6 GHz, RAM@960-5-5-5-15, 285/SLI (185.66)/702/1584/1323: 42.79 fps
Q9650@3.6 GHz, RAM@960-5-5-5-15, 285/SLI (182.06)/702/1584/1323: 45.7 fps

3DMark06 (limited runs, b/c I only run this at the end, once I'm satisfied with the above):
---------
E8600@4.0 GHz, RAM@960-5-5-5-15, 285/SLI (182.06)/702/1584/1323: 19458
\ http://service.futuremark.com/compare?3dm06=10521253
E8600@4.0 GHz, RAM@960-5-5-5-15, 260/SLI (182.06)/674/1453/1248: 18877
\ http://service.futuremark.com/compare?3dm06=10485753

Notes:
------
a) my 260's are decently overclockable: 17% on the core & shader and 25% on the ram. my gtx 285's are sadly not overclockable at all.
b) i overclock the processors with the built-in "dummy oc". i have had no issues going 20% on either of the two proc's above.
c) i purchased the q9650 with the crazy notion that my performance was somehow being limited by my e8600. i was wrong. in an expensive way. ouch.
d) the 185.66 beta drivers are not ready for prime time. obviously, they are beta, but still.
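As a sanity check on note (a), the headroom figures fall straight out of the clocks in my table; here's the quick throwaway Python sketch I used (numbers copied from above):

```python
# Overclock headroom on my GTX 260s, from the clocks in the table above.
stock = {"core": 576, "shader": 1242, "ram": 999}   # 260 SLI at stock
maxed = {"core": 674, "shader": 1453, "ram": 1248}  # best stable OC I hit

for domain in stock:
    gain_pct = (maxed[domain] / stock[domain] - 1) * 100
    print(f"{domain}: +{gain_pct:.0f}%")
```

Which lines up with the ~17% core/shader and ~25% RAM I quoted.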

Issues/Questions:
1) wtf. 285/sli barely ekes out performance increases over 260/sli: ~10% over stock 260's and near-negligible over overclocked 260's. serious wtf.
2) should i return the 285's? there's no other way to say it: i really feel as though i got duped into thinking the 285's would perform meaningfully better than 260's. seriously, look at the "improvements" above.
3) regarding (b) above: is there a way to read out the voltage/etc. settings from the mobo? i may be missing something obvious, but i can't get windows tools (even the nvidia toolset) to correctly identify -all- of the changes that "dummy oc" is making.
4) i'm hoping someone from EVGA pipes in and either tells me i've got what i can get or that there's something wrong with the cards, but i seriously doubt that it'll be that black and white.
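For what it's worth, here's how I got the percentages in question (1) -- another throwaway Python sketch using averages from my Crysis table (the exact figure shifts a bit depending on which rows you compare):

```python
# fps averages pulled from my Crysis table above (E8600 @ 4.0 GHz rows).
stock_260_sli = 42.6   # 260 SLI at stock clocks (576/1242/999), RAM 6-6-6-18
oc_260_sli    = 46.35  # 260 SLI overclocked to 674/1453/1248
oc_285_sli    = 48.6   # 285 SLI at 702/1584/1323

def gain_pct(new, old):
    return (new / old - 1) * 100

print(f"285 SLI over stock 260 SLI: {gain_pct(oc_285_sli, stock_260_sli):.1f}%")
print(f"285 SLI over OC'd  260 SLI: {gain_pct(oc_285_sli, oc_260_sli):.1f}%")
```

Call it somewhere in the 10-14% range over stock and under 5% over the overclocked 260's, which is exactly why I'm asking.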

Help! I'm going crazy!

Thanks,
deusued/Will
 

masteryoda34

Golden Member
Dec 17, 2007
1,399
3
81
An OC'ed 260 is usually as good as a stock 280. At higher resolutions you would see more separation between 260 and 280's. I think these results are normal.

A lot of people might disagree with me, but at 1680x1050 w/SLI 260 or SLI 280 I think Crysis will be CPU limited even on a Q9650.
 

alcoholbob

Diamond Member
May 24, 2005
6,271
323
126
Well, Crysis sort of requires an exponentially increasing amount of power, so I actually find the increase fairly impressive.

Crysis is also fairly yesteryear in terms of CPU utilization--all the multiplatform console games that are available/coming out are going to take good advantage of quads (or tris). It's a 2007 game, and frankly it's poorly optimized, especially at the texture level, where you get the occasional stutter no matter how much RAM you have--the game was just designed when hardware was more limited, and a large amount of additional resources is wasted.

I liken Crysis to the Aurora engine. Have you seen the Witcher/NWN2 benchmarks? CrossFire X 4870 vs a single 8800 with a super-clocked CPU, and in both cases your fps is bottlenecked at 18. Which means... not optimized for future hardware. At this date the engine is just complete sh1t in terms of optimization.

I imagine CryEngine 3 will have 95% of the visuals with 2/3 of the power requirements. I'm sure they've learned quite a bit since then.
 

deusued

Junior Member
Apr 7, 2009
2
0
0
Originally posted by: masteryoda34
An OC'ed 260 is usually as good as a stock 280. At higher resolutions you would see more separation between 260 and 280's. I think these results are normal.

A lot of people might disagree with me, but at 1680x1050 w/SLI 260 or SLI 280 I think Crysis will be CPU limited even on a Q9650.

do you think the same would be true (not much improvement) for the standard 3dmark06 bench as well, where i only saw a sub-1000-point increase?
 

masteryoda34

Golden Member
Dec 17, 2007
1,399
3
81
Originally posted by: deusued
Originally posted by: masteryoda34
An OC'ed 260 is usually as good as a stock 280. At higher resolutions you would see more separation between 260 and 280's. I think these results are normal.

A lot of people might disagree with me, but at 1680x1050 w/SLI 260 or SLI 280 I think Crysis will be CPU limited even on a Q9650.

do you think the same would be true (not much improvement) for the standard 3dmark06 bench as well, where i only saw a sub-1000-point increase?

Probably. 3DMark Vantage would be better for the newer video cards. When I went from an 8800GTX to a GTX 260 my 3DMark06 score only went up like 1000 pts. My performance in games was a lot better, though. Also, I'm still using an AMD 5200 @ 3GHz.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I personally find that 3dmark05 is a much better benchmarking test. That and pong.


Sorry, couldn't resist my smartass nature.


What the hell are you doing? If you're at 16x10 (which is where I am, btw) you are killing your budget for 1 game!!! keep the gtx 260's if you must, but by all means ditch the 285's. You would be fine on every other game in existence with just 1 gtx 260, especially with the kind of OC you have. Both camps are going to come out with dx11 gpus sometime between june and november; that will be the time to upgrade your gpu. Look for a deal on a 24" monitor while you're at it...
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
At 1680x1050, a single GTX 285 is really all you can take advantage of. If you really want to go dual, go with dual GTX 260s. GTX 285 SLI is simply overkill for your needs, and any 'future proofing' you get from that setup is not worth the additional cost.
 

Viper GTS

Lifer
Oct 13, 1999
38,107
433
136
Also if you have the budget to look at GTX 285 SLI you really need to look at 30" monitors.

1680x1050 is a ridiculous waste of that much GPU power.

Viper GTS
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
return the 285s and the 9650 if you want your money back, since you've noticed the difference is negligible. If you still have the e8600, OC the e8600 to 4.5ish and the 260's to around ~740/1500/1240.

The quad-core 9650 at 4GHz+ is really going to help you out in Far Cry 2, GTA IV, Supreme Commander, Flight Simulator X, Unreal Tournament 3, UE games, WiC, and a few other games when compared to the 4.5GHz dual-core.

The 260's at 740 core are about the same as the 285's. The original 260s with 192 shaders and the first pcb design are the ones composed of the most durable components, and consequently have the most overclocking headroom (even higher with a software voltmod like EVGA Voltage Tuner).

the 4GHz dual to 4GHz quad is not a big jump except in 3dmarks and the above-mentioned games.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
I believe you are cpu limited at those resolutions. The i7 @ 3.8/4.0 does a much better job with high-end sli, from the benches I've read.
 

gummi467

Junior Member
Aug 30, 2004
22
0
0
Most of the above people are correct in that your gpu solution is overkill for your needs at those resolutions. 1 285 or 2x 260's will be more than adequate, if not overkill, for your resolution. That being said, you weren't 'duped' but rather just the victim of diminishing returns (performance/cost) and software that can't put the hardware at your disposal to full use.

as a note, something you may not have noticed: the recent 185 drivers seem to diminish performance at lower resolutions in exchange for an increase at extreme ones.
source: http://anandtech.com/video/showdoc.aspx?i=3539&p=2

like was said, that game is not optimized for a quad core, so the 9650 isn't going to come out ahead unless evenly clocked with the 8600, and even then the difference would be slight, if any. Honestly, your best bet in monetary terms is to try to return the 9650 and 285s for a full refund. Barring that, upgrade to a 1920x1200 monitor. I doubt you would see a framerate drop from what you currently have, and they aren't that expensive. skip the 30-inchers unless you have over a grand to blow just for a monitor. you won't gain much and the price is just retarded. if you do decide to blow that kind of money I will gladly accept hand-me-downs.

in all seriousness, overclocking the 8600 further (4.2+ should be doable on air) and going back to the 182 drivers is going to be your best bet for maximizing your framerates. you are definitely cpu-bound, and for your resolution the 185 drivers blow. if it ain't broke, don't fix it, and all that jazz. best of luck.
 

TantrumusMaximus

Senior member
Dec 27, 2004
515
0
0
Originally posted by: gummi467
Most of the above people are correct in that your gpu solution is overkill for your needs at those resolutions. 1 285 or 2x 260's will be more than adequate, if not overkill, for your resolution. That being said, you weren't 'duped' but rather just the victim of diminishing returns (performance/cost) and software that can't put the hardware at your disposal to full use.

as a note, something you may not have noticed: the recent 185 drivers seem to diminish performance at lower resolutions in exchange for an increase at extreme ones.
source: http://anandtech.com/video/showdoc.aspx?i=3539&p=2

like was said, that game is not optimized for a quad core, so the 9650 isn't going to come out ahead unless evenly clocked with the 8600, and even then the difference would be slight, if any. Honestly, your best bet in monetary terms is to try to return the 9650 and 285s for a full refund. Barring that, upgrade to a 1920x1200 monitor. I doubt you would see a framerate drop from what you currently have, and they aren't that expensive. skip the 30-inchers unless you have over a grand to blow just for a monitor. you won't gain much and the price is just retarded. if you do decide to blow that kind of money I will gladly accept hand-me-downs.

in all seriousness, overclocking the 8600 further (4.2+ should be doable on air) and going back to the 182 drivers is going to be your best bet for maximizing your framerates. you are definitely cpu-bound, and for your resolution the 185 drivers blow. if it ain't broke, don't fix it, and all that jazz. best of luck.

What he said.

Also, as another person above said, I was in the same boat... 2 GTX 260's with an OC'd E8400. After going i7 it was markedly better.. then I finally settled on the GTX295. Of course I have a 24" LCD to go with it. If you're not gaming at 1920x1200 at a minimum, you are wasting your money on that hardware.

 

palladium

Senior member
Dec 24, 2007
538
2
81
Originally posted by: jaredpace
return the 285s and the 9650 if you want your money back, since you've noticed the difference is negligible. If you still have the e8600, OC the e8600 to 4.5ish and the 260's to around ~740/1500/1240.

The quad-core 9650 at 4GHz+ is really going to help you out in Far Cry 2, GTA IV, Supreme Commander, Flight Simulator X, Unreal Tournament 3, UE games, WiC, and a few other games when compared to the 4.5GHz dual-core.

The 260's at 740 core are about the same as the 285's. The original 260s with 192 shaders and the first pcb design are the ones composed of the most durable components, and consequently have the most overclocking headroom (even higher with a software voltmod like EVGA Voltage Tuner).

the 4GHz dual to 4GHz quad is not a big jump except in 3dmarks and the above-mentioned games.

Not every card has that kind of OC potential. Don't know about the 192SP GTX260s, but my GTX280 (I believe it should have at least the same durability, if not more, 'coz of the huge price premium) can only manage 1400 shader before I get errors in ATITool (I did manage to get it stable at ~750 core, though).

But yeah, you don't need an SLI config at that res; I'd get a GTX285 (or even a 275) and call it a day.