9600GT SLi review

Page 4 - AnandTech community discussion
Nov 26, 2005
15,194
403
126
ATI Radeon HD 3870
ATI Radeon HD 3850 256MB
ATI Radeon X1950 XTX
NVIDIA GeForce 8800 GT 256MB
NVIDIA GeForce 9600 GT 512MB
NVIDIA GeForce 8600 GTS

Comparing the 3850s in CrossFire against the 9600GTs in SLI seems misleading to me, and the same goes for the 8800GTs. I'm no fanboy, but let's at least compare apples to apples and oranges to oranges (256MB vs. 512MB).
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: SniperDaws
No, I know. I just wanted to know why the 9600GT performed as well as the 8800GT with half the hardware. It could be a driver optimisation, as some people are saying, and if it is, I'll be taking the card back and getting the 8800GT, because the 8800GT will also get a boost once it starts using that driver.

no matter what it is, we probably won't know for quite a while

nvidia isn't likely to let loose with the info now, as it would give AMD info they would like to use. It *appears* to me that PART of it is core speed and shader speed and PART of it is optimizations. MOST of the time, a very similar architecture can be "tweaked" to run [up to] 25% faster on a more refined process.

That is why there appears to be so much *guessing* from the HW sites ... and they are probably mostly correct for the part of the picture they are looking at. ;)

guys .. mellow out on it .. we will know soon enough :p
-with the GTX release, obviously .. until then we are disputing SPECULATIONS.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
okay, so if there's a 64SP version of the G80 and a 128SP version of the G80 (the 8800GTX), could we just do a math-problem analogy here to guess at the performance of a 9800GTX?

If a 9600GT has 64SP and a 9800GTX has 128SP, what is the ratio difference for the 8800 G80 series? If a G80 performs 40% slower with the SP count cut from 128 to 64, would it be safe to assume the same thing for the G92? Since we already know the 9600GT is 64SP, what would the performance of 128SP be? 100fps to 140fps?

Originally posted by: apoppin
Originally posted by: SniperDaws
No, I know. I just wanted to know why the 9600GT performed as well as the 8800GT with half the hardware. It could be a driver optimisation, as some people are saying, and if it is, I'll be taking the card back and getting the 8800GT, because the 8800GT will also get a boost once it starts using that driver.

no matter what it is, we probably won't know for quite a while

nvidia isn't likely to let loose with the info now, as it would give AMD info they would like to use. It *appears* to me that PART of it is core speed and shader speed and PART of it is optimizations. MOST of the time, a very similar architecture can be "tweaked" to run [up to] 25% faster on a more refined process.

That is why there appears to be so much *guessing* from the HW sites ... and they are probably mostly correct for the part of the picture they are looking at. ;)

guys .. mellow out on it .. we will know soon enough :p
-with the GTX release, obviously .. until then we are disputing SPECULATIONS.

the 9600GT 512MB is not as fast as an 8800GT 512MB

check out this comparison at 1920x1200 with no AA in BioShock.

link:
http://www.overclockersclub.co...iews/xfx_9600_gt/8.htm
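For what it's worth, Tempered81's linear-scaling question can be turned into a quick back-of-envelope sketch. This is only a toy model built on the post's own assumptions (a 40% gain observed when G80 doubled 64SP to 128SP, and a hypothetical 100fps baseline); real scaling also depends on clocks, bandwidth, and ROPs:

```python
# Back-of-envelope SP scaling estimate based on the numbers floated above.
# Assumes (naively) that frame rate scales with shader count by the same
# fraction of the ideal speedup observed on G80 -- a big assumption, since
# memory bandwidth, ROPs, and clocks also matter.

def scaled_fps(base_fps, base_sp, target_sp, observed_scaling=0.40):
    """Estimate fps after growing the shader count, given the fraction of
    the ideal linear speedup actually observed (40% gain for 64SP -> 128SP
    on G80, per the post above)."""
    ideal_factor = target_sp / base_sp            # 2.0 for 64 -> 128
    # apply only the observed fraction of the extra throughput
    return base_fps * (1 + observed_scaling * (ideal_factor - 1))

# A hypothetical 9600GT at 100 fps, speculating about a 128SP part:
print(scaled_fps(100, 64, 128))  # -> 140.0
```

That reproduces the "100fps to 140fps" guess in the post, but it is speculation stacked on speculation, not a prediction.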
 

Rusin

Senior member
Jun 25, 2007
573
0
0
If you want to speculate about 9800 GTX performance, you should first notice that both the 8800 GTS 512 and the 9800 GTX use the G92 chip, and both use its A2 revision (8800 GTS: G92-400-A2; 9800 GTX: G92-420-A2). Basically, that means a 128SP 9800 GTX should perform almost exactly like a 128SP 8800 GTS 512 at similar clock speeds.
 

Cheex

Diamond Member
Jul 18, 2006
3,123
0
0
Forgetting all overclocking...everything at stock.

The performance of the 9600GT falls directly between the 8800GTS (G80) 320MB/640MB and the 8800GT (G92) 512MB.


For a mid-range card, that is very very good.

The new generation's mid-range is delivering performance at the lower tier of the last generation's high-end, albeit on beta drivers.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com

Originally posted by: apoppin
Originally posted by: SniperDaws
No, I know. I just wanted to know why the 9600GT performed as well as the 8800GT with half the hardware. It could be a driver optimisation, as some people are saying, and if it is, I'll be taking the card back and getting the 8800GT, because the 8800GT will also get a boost once it starts using that driver.

no matter what it is, we probably won't know for quite a while

nvidia isn't likely to let loose with the info now, as it would give AMD info they would like to use. It *appears* to me that PART of it is core speed and shader speed and PART of it is optimizations. MOST of the time, a very similar architecture can be "tweaked" to run [up to] 25% faster on a more refined process.

That is why there appears to be so much *guessing* from the HW sites ... and they are probably mostly correct for the part of the picture they are looking at. ;)

guys .. mellow out on it .. we will know soon enough :p
-with the GTX release, obviously .. until then we are disputing SPECULATIONS.

the 9600GT 512MB is not as fast as an 8800GT 512MB

check out this comparison at 1920x1200 with no AA in BioShock.

link:
http://www.overclockersclub.co...iews/xfx_9600_gt/8.htm

no it isn't ... we already established that ... the speculation arises when we wonder what its big brother will be like ... based on it being a "cut down" version of the faster 9800 series
:Q
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
agree with cheex: mid-range card buyers will be lucky with the GeForce 9xxx series cards. Enthusiast buyers are going to get the short end. Blah. I hate dual-card/GPU solutions.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Originally posted by: apoppin

Originally posted by: apoppin
Originally posted by: SniperDaws
No, I know. I just wanted to know why the 9600GT performed as well as the 8800GT with half the hardware. It could be a driver optimisation, as some people are saying, and if it is, I'll be taking the card back and getting the 8800GT, because the 8800GT will also get a boost once it starts using that driver.

no matter what it is, we probably won't know for quite a while

nvidia isn't likely to let loose with the info now, as it would give AMD info they would like to use. It *appears* to me that PART of it is core speed and shader speed and PART of it is optimizations. MOST of the time, a very similar architecture can be "tweaked" to run [up to] 25% faster on a more refined process.

That is why there appears to be so much *guessing* from the HW sites ... and they are probably mostly correct for the part of the picture they are looking at. ;)

guys .. mellow out on it .. we will know soon enough :p
-with the GTX release, obviously .. until then we are disputing SPECULATIONS.

the 9600GT 512MB is not as fast as an 8800GT 512MB

check out this comparison at 1920x1200 with no AA in BioShock.

link:
http://www.overclockersclub.co...iews/xfx_9600_gt/8.htm

no it isn't ... we already established that ... the speculation arises when we wonder what its big brother will be like ... based on it being a "cut down" version of the faster 9800 series
:Q

sounds like he thinks they are equal
 

nonameo

Diamond Member
Mar 13, 2006
5,902
2
76
Sounds like it's just diminishing returns on shader processors. I think we saw this with the GT -> GTS as well, no? I think we all know that the weak spot of the 8600 series was the 128-bit memory bus.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: krnmastersgt
All this from the guy that thinks the 9800 GTX is going to be a factory overclocked 8800 GTS with nicer packaging.

But it is, according to DigiTimes and many other sources.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: nonameo
Sounds like it's just diminishing returns on shader processors. I think we saw this with the GT -> GTS as well, no? I think we all know that the weak spot of the 8600 series was the 128-bit memory bus.

Most definitely. The 8600 series had a texture fillrate closer to the G80 GTS's, but with only 32GB/s of memory bandwidth. If it had more bandwidth it would be closer to something like X1900 XTX performance, at least at low to mid resolutions.
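As a sanity check of that 32GB/s figure: memory bandwidth is just bus width times effective memory clock. The clocks below are the commonly cited reference specs (1000MHz GDDR3, 2000MT/s effective, on the 8600 GTS; 800MHz, 1600MT/s effective, on the G80 8800 GTS); treat them as assumptions:

```python
# Quick sanity check of the bandwidth figures mentioned above.
# bandwidth (GB/s) = (bus_width_bits / 8 bytes) * effective transfer rate

def bandwidth_gb_s(bus_bits, effective_mts):
    """Peak memory bandwidth in GB/s from bus width and effective MT/s."""
    return bus_bits / 8 * effective_mts * 1e6 / 1e9

# 8600 GTS: 128-bit bus, 2000 MT/s effective GDDR3
print(bandwidth_gb_s(128, 2000))   # 32.0 GB/s
# 8800 GTS (G80): 320-bit bus, 1600 MT/s effective
print(bandwidth_gb_s(320, 1600))   # 64.0 GB/s
```

So the G80 GTS had double the bandwidth, which is consistent with the "starved 8600" argument.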
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: nonameo
Sounds like it's just diminishing returns on shader processors. I think we saw this with the GT -> GTS as well, no? I think we all know that the weak spot of the 8600 series was the 128-bit memory bus.

I don't agree.

8600GT / 8800GTX

32SP / 128SP (4x more shaders)
128-bit / 384-bit (3x wider bus)

Clock for clock, the 8600GT has a better shader-count-to-bus-width ratio than the mighty 8800GTX. In other words, if the 8800GTX isn't starved for memory bandwidth, then the 8600GT is even less so. The real problem with the card was that every spec was castrated.
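ArchAngel777's ratio argument amounts to one line of arithmetic: shaders per bit of bus width. Fewer shaders per bus bit means relatively more bandwidth per shader (a rough proxy only; it ignores clocks, ROPs, and the G84's texturing changes):

```python
# Shaders per bit of memory bus width, using the specs quoted above.
# A lower number means relatively MORE bandwidth available per shader.

cards = {
    "8600GT":  {"sp": 32,  "bus_bits": 128},
    "8800GTX": {"sp": 128, "bus_bits": 384},
}

for name, c in cards.items():
    ratio = c["sp"] / c["bus_bits"]
    print(f"{name}: {ratio:.3f} SP per bus bit")
# 8600GT: 0.250 SP per bus bit, 8800GTX: 0.333 SP per bus bit
```

By this metric the 8600GT is actually less bandwidth-starved per shader than the 8800GTX, which is the post's point.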
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Originally posted by: Azn
Originally posted by: krnmastersgt
All this from the guy that thinks the 9800 GTX is going to be a factory overclocked 8800 GTS with nicer packaging.

But it is, according to DigiTimes and many other sources.

man, I will be so pissed if it is. But they could be spreading rumors. I'll believe it when I see some benches.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: ArchAngel777
Originally posted by: nonameo
Sounds like it's just diminishing returns on shader processors. I think we saw this with the GT -> GTS as well, no? I think we all know that the weak spot of the 8600 series was the 128-bit memory bus.

I don't agree.

8600GT / 8800GTX

32SP / 128SP (4x more shaders)
128-bit / 384-bit (3x wider bus)

Clock for clock, the 8600GT has a better shader-count-to-bus-width ratio than the mighty 8800GTX. In other words, if the 8800GTX isn't starved for memory bandwidth, then the 8600GT is even less so. The real problem with the card was that every spec was castrated.

The difference with the G84 is that it does 8 textures per clock, like the G92, vs. the G80's 4 textures per clock. That 128-bit memory bus just wasn't enough.
 

lopri

Elite Member
Jul 27, 2002
13,314
690
126
Yup. Indeed it's a VERY good card for a mid-range part. You guys are kinda spoiled by the recent price drop of the 8800 GT.. :p But remember that the 9600 GT is replacing the 8600 GTS in the same price segment. That POS was selling for $150+ until last month! The 9600 GT might become a long-lasting SKU like the venerable 6600 GT. If the price drops below $150 in a few weeks, there will indeed be folks attracted to the idea of SLI'ing two of these puppies.
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Originally posted by: nRollo
Originally posted by: v8envy
Except for one minor problem: SLI capable chipsets are only available from nvidia. And they're not very good, not to mention only available on obscenely priced boards.

If I had to have a multi-GPU solution (read: high res 30" display, need to crank settings) I'd have to go with the 3870x2 (or x4), simply because Skulltrail is just too out there. Until SLI is available on Intel chipsets the price of midrange/low end NV cards is immaterial. CF is the only game in town for multi-GPU unless you're a masochist.

(edit: oops, I was talking about 775-socket boards. I guess AMD fans can do SLI)

There's nothing wrong with SLi Socket 775 boards.

They get good reviews, and I have had literally zero problems with them myself.

What do you feel the advantages of non nForce chipsets are?

Besides "higher overclocking", anyway - because anybody with a mid-2GHz Intel CPU is just as well off as anyone else.

As far as Crossfire being the "only game in town" goes,
there's the small matter of it often being slower than a single card, and usually slower than the much lower-priced 9600GT SLI setup.

I don't know, you've said some things, but could you please link us to reviews that back up your claims?

Most of the problems were solved with the 8.2 driver.


Would you like to explain why Nvidia has loads of problems scaling across resolutions?

http://www.computerbase.de/art...clive_barker_s_jericho

When playing Clive Barker's Jericho at 1600x1200 4xAA/16xAF the Nvidia 8600GT SLI is winning, but as soon as you turn the AA up to 8x, the game is unplayable.

http://www.computerbase.de/art...i/21/#abschnitt_crysis

again, something like that happens in Crysis too!

Also, the 3870 is faster than 9600GT SLI, and SLI has the same amount of problems Crossfire has: it runs great in some games at some resolutions and completely fails in other games. SLI and Crossfire rely too heavily on driver tweaking, and if the developer didn't profile the game properly then you have to wait a month for a fix. Also, Nvidia has abandoned driver updates for older cards after two update cycles. Do you want to say sorry to all those 7950GX2 users?
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: Extelleron
The 9600GT's SLI performance and the 8800GT @ $200 really puts pressure on ATI to reduce the price of the HD 3870 X2. It's really hard to justify the HD 3870 X2 for $450 when you can get the same performance from a 9600GT SLI setup for $340.

IMO HD 3870 X2 should be priced at around ~$349.

it doesn't work that way, unfortunately. The top end is always going to command a premium over 2x (2nd-best cards). There are people who will pay the premium to have the "best", and in some instances it might even make sense: people who don't have an SLI board, for example, which is probably a fairly large majority of gamers. Also, specific to the 3870X2, there are Intel-based mobo owners who need the X2 to get the most out of their systems (once Quadfire is officially released into the wild next month, at least). Who wants to buy 4x 3870s and have to run them on a freakin' Phenom when you could just get an X48 DQ6 with 2x 3870X2?
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: keysplayr2003
Originally posted by: Extelleron
Originally posted by: nRollo
Originally posted by: v8envy
Except for one minor problem: SLI capable chipsets are only available from nvidia. And they're not very good, not to mention only available on obscenely priced boards.

If I had to have a multi-GPU solution (read: high res 30" display, need to crank settings) I'd have to go with the 3870x2 (or x4), simply because Skulltrail is just too out there. Until SLI is available on Intel chipsets the price of midrange/low end NV cards is immaterial. CF is the only game in town for multi-GPU unless you're a masochist.

(edit: oops, I was talking about 775-socket boards. I guess AMD fans can do SLI)

There's nothing wrong with SLi Socket 775 boards.

They get good reviews, and I have had literally zero problems with them myself.

What do you feel the advantages of non nForce chipsets are?

Besides "higher overclocking", anyway - because anybody with a mid-2GHz Intel CPU is just as well off as anyone else.

As far as Crossfire being the "only game in town" goes,
there's the small matter of it often being slower than a single card, and usually slower than the much lower-priced 9600GT SLI setup.

I don't know, you've said some things, but could you please link us to reviews that back up your claims?

Saying that nVidia's Intel chipsets are good is really not an accurate statement at this point.

Neither is saying that they aren't good. You mention a P35 board? Am I mistaken, or are the PCI-E slots limited to 1x 16x and 1x 4x? Or is that only the board apoppin has?

Check this out:

http://www.newegg.com/Product/...x?Item=N82E16813131142

ASUS P5N-E SLI (650i SLI) - $114.99.
You will have to flash the BIOS to revision 803 if you want it to work with the new 45nm line of Core 2 Duos.

This kind of blows your $250.00 mobo theory out the window.

I can flash my mobo that I bought last April to ver. 803 and drop a Wolfdale into it.

All it takes is a little research (about 3 minutes' worth) to save yourself from marketing.

$250+ suddenly reduced to $114.99. It's magical.

wow, did somebody hurt your feelings, keys? How is your board going to work with a Q9450?

 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I don't think he's being a fanboy at all; he was pointing out fallacies in others' statements. Also, for most hardcore gamers there is very little benefit to getting a Q9450 over an E8400 or Q6600. I do more SETI@home than gaming and have my computer in a small office with an Opteron 180 that also crunches 24/7, so the Q9450 is a HUGE benefit for me personally over the others, but how many others here can say that?
 

Cheex

Diamond Member
Jul 18, 2006
3,123
0
0
Originally posted by: nonameo
Sounds like it's just diminishing returns on shader processors. I think we saw this with the GT -> GTS as well, no? I think we all know that the weak spot of the 8600 series was the 128-bit memory bus.

I don't agree.
I don't think diminishing returns will occur just yet with current or soon-to-come hardware.

8800GT -> 8800GTS = 16 more shaders = 10% performance increase
9600GT -> 9800GTX = 64 more shaders = 40% performance increase **


** - Please note: This is still...speculation.
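One way to eyeball Cheex's numbers is to divide each guessed gain by the count of added shaders. Both steps happen to imply the same per-shader gain, so the speculation is at least internally consistent (every number here is the post's own estimate, not a benchmark):

```python
# Gain per added shader, using the post's own (speculative) estimates.

steps = [
    ("8800GT -> 8800GTS", 16, 0.10),   # 16 extra SPs, ~10% faster
    ("9600GT -> 9800GTX", 64, 0.40),   # 64 extra SPs, ~40% faster (guess)
]

for label, extra_sp, gain in steps:
    print(f"{label}: {gain / extra_sp * 100:.3f}% per extra shader")
# Both work out to 0.625% per extra shader.
```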
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Zstream
I think the true fanboyism is showing from the mod...

not for you to say :p
-at least according to the rules here ;)

... and not at all, imo ... he should not be asked to defend his choice of HW at all
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: apoppin

Originally posted by: apoppin
Originally posted by: SniperDaws
No, I know. I just wanted to know why the 9600GT performed as well as the 8800GT with half the hardware. It could be a driver optimisation, as some people are saying, and if it is, I'll be taking the card back and getting the 8800GT, because the 8800GT will also get a boost once it starts using that driver.

no matter what it is, we probably won't know for quite a while

nvidia isn't likely to let loose with the info now, as it would give AMD info they would like to use. It *appears* to me that PART of it is core speed and shader speed and PART of it is optimizations. MOST of the time, a very similar architecture can be "tweaked" to run [up to] 25% faster on a more refined process.

That is why there appears to be so much *guessing* from the HW sites ... and they are probably mostly correct for the part of the picture they are looking at. ;)

guys .. mellow out on it .. we will know soon enough :p
-with the GTX release, obviously .. until then we are disputing SPECULATIONS.

the 9600GT 512MB is not as fast as an 8800GT 512MB

check out this comparison at 1920x1200 with no AA in BioShock.

link:
http://www.overclockersclub.co...iews/xfx_9600_gt/8.htm

no it isn't ... we already established that ... the speculation arises when we wonder what its big brother will be like ... based on it being a "cut down" version of the faster 9800 series
:Q

I don't know. I've read a couple of these guys' articles and they're usually not consistent. They had the 8600GTS beating out the 3850 a few months back.

 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: Cheex
Originally posted by: nonameo
Sounds like it's just diminishing returns on shader processors. I think we saw this with the GT -> GTS as well, no? I think we all know that the weak spot of the 8600 series was the 128-bit memory bus.

I don't agree.
I don't think diminishing returns will occur just yet with current or soon-to-come hardware.

8800GT -> 8800GTS = 16 more shaders = 10% performance increase
9600GT -> 9800GTX = 64 more shaders = 40% performance increase **


** - Please note: This is still...speculation.

Let's not forget clock speed has something to do with that gain:

8800GT: 600/1500/1800
G92 GTS: 670/1625/1940

Take out the clock speed advantage and that 10% would shrink.
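AzN's point can be checked against the clocks quoted above: divide the observed ~10% gain by the shader-clock ratio and only about 1.5% is left to attribute to the 16 extra SPs. A very rough sketch, assuming performance tracks shader clock linearly:

```python
# How much of the G92 GTS's ~10% lead over the 8800GT is just clocks?
# Rough sketch: assume performance scales linearly with shader clock.

gt_shader = 1500       # 8800GT shader clock (MHz), per the post above
gts_shader = 1625      # G92 8800GTS shader clock (MHz)
observed_gain = 1.10   # GTS ~10% faster overall

clock_factor = gts_shader / gt_shader      # ~1.083 from clocks alone
residual = observed_gain / clock_factor    # gain left after removing clocks
print(f"clock factor: {clock_factor:.3f}, residual gain: {residual:.3f}")
# -> roughly 1.5% left over for the 16 extra shaders
```

That supports the claim: take out the clock advantage and the 10% mostly evaporates.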
 

mitchhamlin

Senior member
Feb 13, 2008
538
0
0
So cheap for such great-sounding cards! You could buy two and SLI them, and it'd be better than an 8800 Ultra, and cheaper as well!