Galaxy GTS 250 SLI issues

sighlentz

Member
Nov 30, 2009
The Galaxy GTS 250 cards come with "Xtreme Tuner" software for adjusting fan speed and for overclocking.

When in SLI, it shows specs for the master GTS 250 and the slave GTS 250.

The master shows default clocks of 1836 MHz shader, 738 MHz GPU, and 1100 MHz memory (just like the specs printed on the card), but the slave shows 600 MHz shader, 300 MHz GPU, and 100 MHz memory.

Why are they different? Even NVIDIA's System Monitor software shows the clocks on GPU2 to be lower than the card's stated defaults. I've SLI'd other cards in different systems and never saw this happen. What is wrong?

Any help would be greatly appreciated.
 

sighlentz

Member
Nov 30, 2009
My system specs:

Q6600 DO at stock, 2GB of 800MHz RAM, 680i LT SLI, 700W PSU, WD 500GB and 250GB drives,

and now two Galaxy GTS 250s
 

tviceman

Diamond Member
Mar 25, 2008
It could be that when you are not actively using both GPUs, the tuner software significantly throttles the slave GPU down to save power.
 

sighlentz

Member
Nov 30, 2009
Sounds like a viable theory, but the documentation for the software doesn't mention that.

I spoke with someone who was supposedly a tech rep at Galaxy (he wasn't too familiar with Xtreme Tuner and had to get back to me), who stated that the second GPU only supports the master GPU and that that is its default speed.

I wasn't buying it, so I asked for the issue to be escalated to a more experienced tech rep. They both tried to convince me that the lower default clocks were a function of being in SLI.

I told them that two identical cards in SLI should approach 2x the performance, not 1 1/3x the performance; otherwise people would just buy a better single card.

NVIDIA's System Monitor software runs in the background on the desktop, and every other SLI system I have shows both GPUs running at the same clocks in System Monitor.
 

qliveur

Diamond Member
Mar 25, 2007
Sounds like they're bullshitting you, all right.

Uninstall Xtreme Tuner. Problem solved.

The only "tuning" software that I even bother with is RivaTuner. All of the others that I've messed with in the past have ended up proving themselves to be more of a pain in the ass than they're worth.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
Actually I do believe the slave GPU in multi-GPU setups does indeed throttle way down when idling. The clockspeeds should increase when going into 3D mode, though.

Also, the scaling can in fact be only 33% better. It can also be up to 100% better, but that's rare. Typically you can expect a 50-65% performance boost from adding a second card, with common boosts falling in the 40-80% range.
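
To put those scaling figures in concrete terms, here is a minimal Python sketch; the frame rates in it are made-up placeholder numbers, not measurements from these cards.

```python
# Rough SLI scaling calculator: compares a dual-card result against a
# single-card baseline. The sample numbers are hypothetical placeholders.

def sli_scaling(single_fps: float, dual_fps: float) -> float:
    """Return the percentage improvement of the dual-card result."""
    return (dual_fps / single_fps - 1.0) * 100.0

if __name__ == "__main__":
    baseline = 60.0    # hypothetical single GTS 250 average FPS
    sli_result = 95.0  # hypothetical SLI average FPS
    print(f"SLI scaling: +{sli_scaling(baseline, sli_result):.0f}%")
    # With these placeholder numbers the output is "SLI scaling: +58%",
    # which falls in the typical 50-65% range mentioned above.
```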
 

Schmide

Diamond Member
Mar 7, 2002
I did the two-Galaxy upgrade for my nephew: went from 2x 9600 GSO to 2x GTS 250 with a Q8200 @ 2.8GHz on a 780i. 3DMark06 scores went from around 13k to around 14k.
 

sighlentz

Member
Nov 30, 2009
I did uninstall Xtreme Tuner, and it made no difference in the slave GPU's lower default clocks at idle.

But I did monitor the performance of both cards under load, and the slave GPU does indeed scale up to the same clocks as the master GPU and stays right with it until the load is gone, then it returns to idling at the lower clocks.

I guess I'm satisfied with the results. The two GPUs perform well together during gameplay, just like they're supposed to.

I'm glad I picked up these two cards at such a great price.
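
For anyone who wants to double-check that idle/load clock behavior without the vendor's tuning software, below is a minimal monitoring sketch using NVML through the pynvml Python bindings. Whether NVML can see these particular cards is an assumption on my part: NVML arrived after this generation, so pre-Fermi GPUs may not report clocks through it, in which case GPU-Z or RivaTuner's hardware monitor is the fallback.

```python
# Minimal sketch: poll per-GPU core and memory clocks with NVML via pynvml.
# Assumes pynvml is installed (pip install nvidia-ml-py) and the driver's
# NVML library supports the installed cards.
import time
import pynvml

pynvml.nvmlInit()
try:
    count = pynvml.nvmlDeviceGetCount()
    for _ in range(5):  # take a few samples; run a 3D load partway through
        for i in range(count):
            handle = pynvml.nvmlDeviceGetHandleByIndex(i)
            name = pynvml.nvmlDeviceGetName(handle)
            if isinstance(name, bytes):  # older pynvml returns bytes
                name = name.decode()
            core = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
            mem = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
            print(f"GPU{i} {name}: core {core} MHz, memory {mem} MHz")
        time.sleep(2)  # idle vs. load clocks show up across the samples
finally:
    pynvml.nvmlShutdown()
```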
 

sighlentz

Member
Nov 30, 2009
The GTS 250 is basically a 55nm G92b-based 9800 GTX+ GPU on a new P361 PCB; internally NVIDIA calls it D10P2. The differences are mainly in the power design: the core and memory speeds are identical to the 9800 GTX+, but power consumption has been lowered. However, not all GTS 250s have these improvements; some of the earlier GTS 250s were merely rebranded 9800 GTX+ video cards. All of the GTX 200 series cards and the 40nm GeForce 210, GeForce GT 220, and GeForce GT 240 support OpenGL 3.2.
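
If you want to see which OpenGL version a driver actually exposes on a given card, a quick query like the sketch below works. It assumes PyOpenGL plus a GLUT implementation (e.g. freeglut) is installed; any other context-creation library would do just as well.

```python
# Create a minimal GL context and print the renderer and version strings.
# Assumes PyOpenGL and freeglut are installed (pip install PyOpenGL).
from OpenGL.GL import glGetString, GL_RENDERER, GL_VERSION
from OpenGL.GLUT import glutInit, glutInitDisplayMode, glutCreateWindow, GLUT_RGBA

glutInit()
glutInitDisplayMode(GLUT_RGBA)
glutCreateWindow(b"gl-version-check")  # glGetString needs a current context

print("Renderer:      ", glGetString(GL_RENDERER).decode())
print("OpenGL version:", glGetString(GL_VERSION).decode())
```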
 

SP33Demon

Lifer
Jun 22, 2001
I also just got mine a couple days back and will install them when I build my new rig :)