9600GT review at TweakTown


AzN

Banned
Nov 26, 2001
What I'm trying to say is don't get suckered in by current performance compared to the 8800GT. The 8800GT is still the best deal to be had. Don't skimp on shaders for a measly $30 difference. Shaders will have more profound effects well into the future.
 

SniperDaws

Senior member
Aug 14, 2007
I know nothing; all I see is benchmarks of the 9600GT keeping up with the 8800GT while having only half the shader hardware of the 8800.


When these shader-intensive games come out, are we going to see the 9600GT crippled and the 8800GT fly, or is the 8800GT going to be crippled as well?

If both cards are going to be crippled, then I'm not sure I see your argument for the 8800GT being the better buy.

I mean, Crysis is the most shader-intensive game to date, and even the 8800GTS can't play it at its finest settings.

So if games are going to get more shader-intensive than Crysis, it's not only the 9600GT that will die, so why bother paying more for a card that's also going to struggle with newer games?

I'm buying the 9600GT tomorrow because:
1. It's cheap.
2. With the games out now, it performs about the same as the 8800GT and the 3870.
3. It's got PureVideo enhancements.
4. It's cooler and quieter.
5. It's a newer card.
6. It can only get better with driver updates.

I'm happy :) Thank you for your help, Azn.
 

Rusin

Senior member
Jun 25, 2007
Originally posted by: Azn

So you are saying the G94 has had an architectural change from the G92? How?

It's just a cut-back G92 chip: 32 TMUs and 64 shaders, 8 textures per clock per 16 SPs. It is evident there is no architectural change other than the cut-back. No new features have been announced.
Where is it evident that it's just a cut-back? The G94 also has an updated memory channel system, meaning the 9600 GT's effective memory bandwidth is higher than the 8800 GT's even when both have memory clocked at 1800MHz on a 256-bit bus; you know that 57.6GB/s is only the theoretical maximum.
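A quick back-of-the-envelope sketch of where that 57.6GB/s ceiling comes from, assuming stock memory on both cards (1800MHz effective GDDR3 on a 256-bit bus):

```python
# Peak memory bandwidth = bus width x effective memory clock.
# Assumes stock 8800 GT / 9600 GT memory: 256-bit bus, 1800 MHz effective GDDR3.

bus_width_bits = 256
effective_clock_mhz = 1800

bytes_per_transfer = bus_width_bits / 8                     # 32 bytes per transfer
bandwidth_gbs = bytes_per_transfer * effective_clock_mhz / 1000

print(f"Peak bandwidth: {bandwidth_gbs:.1f} GB/s")          # 57.6 GB/s
```

Both cards hit the same 57.6GB/s on paper; my point is that the G94's memory subsystem gets closer to that peak in practice.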

The 9600GT does well because most of the texture fillrate, if not all of it, is being used up in games. Shaders aren't the holy grail of performance, although video game programmers think they are, and they will be adding more effects as time progresses. A game has to have more shader effects to take advantage. The current crop of games isn't so shader-intensive that a 9600GT can't handle it with 64 SPs.
Do you remember the "The Way It's Meant to Be Played" campaign? It basically means that Nvidia and the game maker are working together. I don't think that would lead to more shader-heavy solutions now that Nvidia's strengths are elsewhere. The latest to join the Nvidia camp was Valve. Guess what happens when the shader-heavy Source engine is outlived?
 

raddreamer3kx

Member
Oct 2, 2006
Originally posted by: Azn
What I'm trying to say is don't get suckered in by current performance compared to the 8800GT. The 8800GT is still the best deal to be had. Don't skimp on shaders for a measly $30 difference. Shaders will have more profound effects well into the future.

It's still more money, and people don't want to spend more money; this card performs pretty much identically to the 8800GT.

I hope we're not getting some 8800GT owners who paid almost $300 for their card crying a bit.
 

SniperDaws

Senior member
Aug 14, 2007
Originally posted by: raddreamer3kx
Originally posted by: Azn
What I'm trying to say is don't get suckered in by current performance compared to the 8800GT. The 8800GT is still the best deal to be had. Don't skimp on shaders for a measly $30 difference. Shaders will have more profound effects well into the future.

It's still more money, and people don't want to spend more money; this card performs pretty much identically to the 8800GT.

I hope we're not getting some 8800GT owners who paid almost $300 for their card crying a bit.



FFS, don't get it kicking off... lol
 

AzN

Banned
Nov 26, 2001
Remember the GeForce 7 series? The GeForce 7 was a powerful beast as far as memory bandwidth, pixel fillrate, and texture fillrate go. It was a much more powerful card than the X1000 series in those respects, and it did quite well at the time, but as time progressed the X1000 series' superior shader performance let it spank the GeForce 7 lineup in later games. I see the same thing coming: more games are using more shader effects than ever. The next crop of games is where the 9600GT might choke while the 8800GT stays playable because of its extra shaders. Of course this may never happen, because I can't predict the future. But currently it's only a $30 difference between 64 SPs and 112. You do the math.
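Or, spelling the math out (a rough sketch; the ~$179/~$209 street prices are just an illustration of that $30 gap, not actual quotes):

```python
# Rough price-per-shader comparison. Prices are hypothetical street prices
# chosen to illustrate the ~$30 gap; SP counts are the official 64 vs 112.

cards = {
    "9600GT": {"price": 179, "shaders": 64},
    "8800GT": {"price": 209, "shaders": 112},
}

for name, c in cards.items():
    print(f"{name}: ${c['price']}, {c['shaders']} SPs, "
          f"${c['price'] / c['shaders']:.2f} per SP")

# 8800GT: 75% more shader units for roughly 17% more money.
print(f"Extra shaders: {112 / 64 - 1:.0%}, extra cost: {209 / 179 - 1:.0%}")
```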
 

taltamir

Lifer
Mar 21, 2004
Both will choke on next-gen gaming... heck, even my 8800GTS 512 will choke on next-gen gaming... the moment the 9800 series arrives, I'm selling it on eBay and buying one... the value of those cards will tank when newer games arrive.
 

AzN

Banned
Nov 26, 2001
Originally posted by: Rusin
Originally posted by: Azn

So you are saying the G94 has had an architectural change from the G92? How?

It's just a cut-back G92 chip: 32 TMUs and 64 shaders, 8 textures per clock per 16 SPs. It is evident there is no architectural change other than the cut-back. No new features have been announced.
Where is it evident that it's just a cut-back? The G94 also has an updated memory channel system, meaning the 9600 GT's effective memory bandwidth is higher than the 8800 GT's even when both have memory clocked at 1800MHz on a 256-bit bus; you know that 57.6GB/s is only the theoretical maximum.

You've got a link?


Do you remember the "The Way It's Meant to Be Played" campaign? It basically means that Nvidia and the game maker are working together. I don't think that would lead to more shader-heavy solutions now that Nvidia's strengths are elsewhere. The latest to join the Nvidia camp was Valve. Guess what happens when the shader-heavy Source engine is outlived?

It is nothing more than a campaign, like a TV commercial, to brainwash gamers into picking Nvidia cards. Nothing more.

They ran the same campaign before the GeForce 7, too. Look at the GeForce 7 now: it is being spanked by the X1600XT in the current crop of shader-intensive games.
 

SniperDaws

Senior member
Aug 14, 2007
Even better: the 9600GT has dropped in price again in the UK. I can pick one up for less than £120.

An hour ago they were £140 :)
 

lookouthere

Senior member
May 23, 2003
Originally posted by: Rusin
So Nvidia would just be lying when they say it has 64 SPs?

I don't think Nvidia would lie about the number of shaders. But I do think the G94 is an improvement over the G92. If it had 112 or 128 shaders, it would definitely be 1.5x to 2x better than the 3870X2. So take a guess... the 9800GX2 will probably be at least 2x better than the current 8800 Ultra.
 

Piuc2020

Golden Member
Nov 4, 2005
I highly doubt the G94 is "magically" better than the G92 without Nvidia knowing about it; otherwise they would just make a high-end G94 card. The G94 has some minor changes, but nothing revolutionary: it does well in games because it has good clock speeds, great memory bandwidth, and a few AA optimizations. A 9600GT with 112 shaders would be a little better than an 8800GT, but only because of the AA optimizations (and as such would only be better at AA settings); in any case it wouldn't be 1.5x better.
In some games like Crysis, the 8800GT beats the 9600GT with ease; those 48 extra shaders DO make a big difference.

I think the 9800GT and 9800GTX will be slightly tweaked G92s; by that I mean slightly higher clock speeds, a few optimizations here and there, and better AA performance (the AA bug) so that performance is rounded out and the old G80s become obsolete.

Most reviews pit highly overclocked 9600GTs (like 700MHz) against stock 8800GTs, and this has led people to believe the 9600GT is a revolutionary architecture that performs close to the 8800GT. In reviews where stock is compared to stock, the 8800GT is up to 20% faster; hell, in games like Crysis it's not even close, and we all know games are only going to move toward the Crysis benchmark, not away from it.

The 9600GT is a great card, but it's no 8800GT killer.
 

AzN

Banned
Nov 26, 2001
Originally posted by: lookouthere
Originally posted by: Rusin
So Nvidia would just be lying when they say it has 64 SPs?

I don't think Nvidia would lie about the number of shaders. But I do think the G94 is an improvement over the G92. If it had 112 or 128 shaders, it would definitely be 1.5x to 2x better than the 3870X2. So take a guess... the 9800GX2 will probably be at least 2x better than the current 8800 Ultra.

Where are you getting your information from? Fudzilla?
 

AzN

Banned
Nov 26, 2001
Originally posted by: Piuc2020
I highly doubt the G94 is "magically" better than the G92 without Nvidia knowing about it; otherwise they would just make a high-end G94 card. The G94 has some minor changes, but nothing revolutionary: it does well in games because it has good clock speeds, great memory bandwidth, and a few AA optimizations. A 9600GT with 112 shaders would be a little better than an 8800GT, but only because of the AA optimizations (and as such would only be better at AA settings); in any case it wouldn't be 1.5x better.
In some games like Crysis, the 8800GT beats the 9600GT with ease; those 48 extra shaders DO make a big difference.

I think the 9800GT and 9800GTX will be slightly tweaked G92s; by that I mean slightly higher clock speeds, a few optimizations here and there, and better AA performance (the AA bug) so that performance is rounded out and the old G80s become obsolete.

Most reviews pit highly overclocked 9600GTs (like 700MHz) against stock 8800GTs, and this has led people to believe the 9600GT is a revolutionary architecture that performs close to the 8800GT. In reviews where stock is compared to stock, the 8800GT is up to 20% faster; hell, in games like Crysis it's not even close, and we all know games are only going to move toward the Crysis benchmark, not away from it.

The 9600GT is a great card, but it's no 8800GT killer.

Oh my god, thank you for that breath of fresh air.
 

bryanW1995

Lifer
May 22, 2007
Originally posted by: Azn
Originally posted by: lookouthere
Originally posted by: Rusin
So Nvidia would just be lying when they say it has 64 SPs?

I don't think Nvidia would lie about the number of shaders. But I do think the G94 is an improvement over the G92. If it had 112 or 128 shaders, it would definitely be 1.5x to 2x better than the 3870X2. So take a guess... the 9800GX2 will probably be at least 2x better than the current 8800 Ultra.

Where are you getting your information from? Fudzilla?

ouch, that's gonna hurt for a while...
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: Azn
Originally posted by: lookouthere
Originally posted by: Rusin
So Nvidia would just be lying when they say it has 64 SPs?

I don't think Nvidia would lie about the number of shaders. But I do think the G94 is an improvement over the G92. If it had 112 or 128 shaders, it would definitely be 1.5x to 2x better than the 3870X2. So take a guess... the 9800GX2 will probably be at least 2x better than the current 8800 Ultra.

Where are you getting your information from? Fudzilla?

I don't get it, Azn. What's not to understand? Why do you feel you're being lied to?

Look at the honkin' specs. It is PLAINLY evident there have been some architectural improvements. It could be nothing more than an improved memory controller. It could be a couple of different minor things as well. But it's still a change.

So, can you explain how this G94 does so well for what it is when compared with G92s with loads more shaders?

If the G94 were nothing more than an 8800GT with 48 shaders cut off of it, don't you think it would perform considerably worse than it does? It would have to.

Look at the difference between an 8800GTS (G92) and an 8800GT. Only a 16 SP difference, and there is a good percentage difference in performance. Now remove an additional 48 SPs from the 8800GT. What do you think would happen to the performance? Equal to a 9600GT? Oh, but wait: a 9600GT is only slightly behind an 8800GT with its full 112 SPs.

And you can't see this because...? Please clear it up for me. I have an open mind. Change it for me. Tell me what I'm missing.
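For reference, here's the raw shader-throughput gap I'm talking about (a sketch assuming stock shader-domain clocks, 1625MHz for the 9600GT and 1500MHz for the 8800GT, with the official SP counts):

```python
# Theoretical shader throughput ~ SP count x shader-domain clock.
# Assumes stock shader clocks: 9600GT at 1625 MHz, 8800GT at 1500 MHz.
# Absolute units don't matter here; only the ratio does.

gt9600 = 64 * 1.625    # 104.0
gt8800 = 112 * 1.500   # 168.0

print(f"9600GT shader throughput vs 8800GT: {gt9600 / gt8800:.0%}")  # ~62%
```

If performance tracked shader throughput alone, the 9600GT should land around 62% of an 8800GT, not "only slightly behind" it.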
 

bryanW1995

Lifer
May 22, 2007
Originally posted by: keysplayr2003
Originally posted by: Azn
Originally posted by: lookouthere
Originally posted by: Rusin
So Nvidia would just be lying when they say it has 64 SPs?

I don't think Nvidia would lie about the number of shaders. But I do think the G94 is an improvement over the G92. If it had 112 or 128 shaders, it would definitely be 1.5x to 2x better than the 3870X2. So take a guess... the 9800GX2 will probably be at least 2x better than the current 8800 Ultra.

Where are you getting your information from? Fudzilla?

I don't get it, Azn. What's not to understand? Why do you feel you're being lied to?

Look at the honkin' specs. It is PLAINLY evident there have been some architectural improvements. It could be nothing more than an improved memory controller. It could be a couple of different minor things as well. But it's still a change.

So, can you explain how this G94 does so well for what it is when compared with G92s with loads more shaders?

If the G94 were nothing more than an 8800GT with 48 shaders cut off of it, don't you think it would perform considerably worse than it does? It would have to.

Look at the difference between an 8800GTS (G92) and an 8800GT. Only a 16 SP difference, and there is a good percentage difference in performance. Now remove an additional 48 SPs from the 8800GT. What do you think would happen to the performance? Equal to a 9600GT? Oh, but wait: a 9600GT is only slightly behind an 8800GT with its full 112 SPs.

And you can't see this because...? Please clear it up for me. I have an open mind. Change it for me. Tell me what I'm missing.

I don't presume to speak for Azn, but "definitely be 1.5x to 2x better than 3870X2" is a pretty strong claim, especially when you consider that the 9800GX2 is STILL in limbo. When will it arrive, April? May? Will it be relevant 3 months after introduction, or is it going to be another 7950GX2 and get eclipsed by the 9800GTX? Even assuming the 9800GX2 has a theoretical advantage of 20%+ over the 3870X2, will Nvidia have strong enough driver support to justify it as a medium-term gaming solution? It will almost certainly be the fastest single card for the few months between its introduction and the 9800GTX's, but after that all bets are off. If you're married to Nvidia, then either stick to single-card solutions or get two of those MSI OC cards for $200 each at Newegg and SLI them.
 

apoppin

Lifer
Mar 9, 2000
Originally posted by: keysplayr2003
So, can you explain how this G94 does so well for what it is when compared with G92s with loads more shaders?

well ... clock-speeds appear to be up ... a little :p
... perhaps nvidia is rubbing AMD's nose in the fact that shaders don't matter so much yet at lower resolutions.
--i would NEVER pick this G94 for high res
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: bryanW1995
Originally posted by: keysplayr2003
Originally posted by: Azn
Originally posted by: lookouthere
Originally posted by: Rusin
So Nvidia would just be lying when they say it has 64 SPs?

I don't think Nvidia would lie about the number of shaders. But I do think the G94 is an improvement over the G92. If it had 112 or 128 shaders, it would definitely be 1.5x to 2x better than the 3870X2. So take a guess... the 9800GX2 will probably be at least 2x better than the current 8800 Ultra.

Where are you getting your information from? Fudzilla?

I don't get it, Azn. What's not to understand? Why do you feel you're being lied to?

Look at the honkin' specs. It is PLAINLY evident there have been some architectural improvements. It could be nothing more than an improved memory controller. It could be a couple of different minor things as well. But it's still a change.

So, can you explain how this G94 does so well for what it is when compared with G92s with loads more shaders?

If the G94 were nothing more than an 8800GT with 48 shaders cut off of it, don't you think it would perform considerably worse than it does? It would have to.

Look at the difference between an 8800GTS (G92) and an 8800GT. Only a 16 SP difference, and there is a good percentage difference in performance. Now remove an additional 48 SPs from the 8800GT. What do you think would happen to the performance? Equal to a 9600GT? Oh, but wait: a 9600GT is only slightly behind an 8800GT with its full 112 SPs.

And you can't see this because...? Please clear it up for me. I have an open mind. Change it for me. Tell me what I'm missing.

I don't presume to speak for Azn, but "definitely be 1.5x to 2x better than 3870X2" is a pretty strong claim, especially when you consider that the 9800GX2 is STILL in limbo. When will it arrive, April? May? Will it be relevant 3 months after introduction, or is it going to be another 7950GX2 and get eclipsed by the 9800GTX? Even assuming the 9800GX2 has a theoretical advantage of 20%+ over the 3870X2, will Nvidia have strong enough driver support to justify it as a medium-term gaming solution? It will almost certainly be the fastest single card for the few months between its introduction and the 9800GTX's, but after that all bets are off. If you're married to Nvidia, then either stick to single-card solutions or get two of those MSI OC cards for $200 each at Newegg and SLI them.

I'm talking strictly about G94-to-G92 architecture differences (whatever they may be) here.
I made no mention of the 3870X2 or the 9800GX2 or 1.5x/2x better than whatever.
Closed minds annoy me. What can I say? My comments were pinpointed at Azn's "it's just a cut-back G92" statements. I asked him to show me the error of my ways. And that offer is always open.
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: apoppin
So, can you explain how this G94 does so well for what it is when compared with G92s with loads more shaders?

well ... clock-speeds appear to be up ... a little :p
... perhaps nvidia is rubbing AMD's nose in the fact that shaders don't matter so much yet at lower resolutions.
--i would NEVER pick this G94 for high res

And a 50 to 75MHz clock speed advantage is going to make up for a 48 SP deficit? Not thinking so. 16 shaders? Maybe that would do it. 48? Not going to happen.

My point is, something in the architecture has changed. The shaders "may" be exactly the same. The texture units "may" be exactly the same. As I said, it may be "simply" a memory controller improvement. But is this not considered a change to the architecture, at least in part? Something is making this thing crank. And you can't say it's drivers, else all other G92-based cards would see similar improvements.
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: apoppin
And that offer is always open.

look up ^^ two posts ... i proposed a theory to partially explain it although ultimately i do not agree with Azn

nvidia 'tweaked' the architecture ;)

well ... clock-speeds appear to be up ... a little
... perhaps nvidia is rubbing AMD's nose in the fact that shaders don't matter so much yet at lower resolutions.
--i would NEVER pick this G94 for high res

Ah, I see your edit up there. But wasn't this card tested at all resolutions, up to 25x16?

You would never pick one up, but would you pick two? :D

Anand's review shows this card playable in Crysis, without AA, at 19x12: 35.x fps. And 33.3 fps at 19x12 in STALKER?
 

apoppin

Lifer
Mar 9, 2000
Originally posted by: keysplayr2003
Originally posted by: apoppin
And that offer is always open.

look up ^^ two posts ... i proposed a theory to partially explain it although ultimately i do not agree with Azn

nvidia 'tweaked' the architecture ;)

well ... clock-speeds appear to be up ... a little
... perhaps nvidia is rubbing AMD's nose in the fact that shaders don't matter so much yet at lower resolutions.
--i would NEVER pick this G94 for high res

Ah, I see your edit up there. But wasn't this card tested at all resolutions, up to 25x16?

You would never pick one up, but would you pick two? :D

Anand's review shows this card playable in Crysis, without AA, at 19x12: 35.x fps. And 33.3 fps at 19x12 in STALKER?

two .. of course NOT ... it is 256-bit ... the very issue you chided me on in pairing my 2900XT with my 256-bit 2900 Pro. It *would* be a serious handicap if i gamed at 19x12 with all the details ... not at 16x10.

yes it was tested .. and i would much prefer the 2x GTS 512 especially for 25x16 ... NEVER 2x G94 256-bit for the SMALL price difference IF i was running such a nice and expensive display. ;)
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: apoppin


two .. of course NOT ... it is 256-bit ... the very issue you chided me on in pairing my 2900XT with my 256-bit 2900 Pro.

You were "chided" for your motherboard limitations, if nothing else.

It *would* be a serious handicap if i gamed at 19x12 with all the details ... not at 16x10.

Really? A handicap? I guess that would depend on the game, no? Besides, if you can play at 19x12 with a single G94, why don't you think there would be an improvement with two? Meaning more candy? Did you not say you had great success with your Crossfire setup? More than you could have hoped for? And that is with your 2x 2900 Pros (256-bit).

yes it was tested .. and i would much prefer the 2x GTS 512 especially for 25x16 ... NEVER 2x G94 256-bit for the SMALL price difference IF i was running such a nice and expensive display. ;)

I would agree with you, but the SLI scores for this card seemed to scale pretty well. I'd say that would leave room for some candy. And at $179 a pop ($358 total), not bad.

 

AzN

Banned
Nov 26, 2001
Originally posted by: keysplayr2003
Originally posted by: Azn
Originally posted by: lookouthere
Originally posted by: Rusin
So Nvidia would just be lying when they say it has 64 SPs?

I don't think Nvidia would lie about the number of shaders. But I do think the G94 is an improvement over the G92. If it had 112 or 128 shaders, it would definitely be 1.5x to 2x better than the 3870X2. So take a guess... the 9800GX2 will probably be at least 2x better than the current 8800 Ultra.

Where are you getting your information from? Fudzilla?

I don't get it, Azn. What's not to understand? Why do you feel you're being lied to?

Look at the honkin' specs. It is PLAINLY evident there have been some architectural improvements. It could be nothing more than an improved memory controller. It could be a couple of different minor things as well. But it's still a change.

So, can you explain how this G94 does so well for what it is when compared with G92s with loads more shaders?

If the G94 were nothing more than an 8800GT with 48 shaders cut off of it, don't you think it would perform considerably worse than it does? It would have to.

Look at the difference between an 8800GTS (G92) and an 8800GT. Only a 16 SP difference, and there is a good percentage difference in performance. Now remove an additional 48 SPs from the 8800GT. What do you think would happen to the performance? Equal to a 9600GT? Oh, but wait: a 9600GT is only slightly behind an 8800GT with its full 112 SPs.

And you can't see this because...? Please clear it up for me. I have an open mind. Change it for me. Tell me what I'm missing.

I don't feel that I'm being lied to. I just didn't get where he got the idea that the GX2 is 2x faster than the 8800 Ultra. The GX2 isn't even released yet, so I'm guessing he got some information from Fudzilla.

It's not clear what Nvidia did with the architecture. Unless I see a blueprint of what they added, I can't estimate how much faster it would be than the old core.

The difference between the G92 GTS and GT is mostly clock speed and memory bandwidth. A better comparison would be a GTS clocked to GT clocks, which would net you less than 5%.

The whole TMU debate I think I've done to death with BFG10K. It is called a Texture Memory Unit for a reason. The current memory bandwidth on the G92 can't move data fast enough to take advantage of all of the 8800GT's massive 33.6 GTexels/s fillrate, or the G92 GTS's for that matter. It would most likely take about 18.0-20.0 GTexels/s. I'm just making an estimate here, so don't quote me.

http://images.vnu.net/gb/inqui...-dx10-hit/fillrate.jpg

When you take those things into account, it becomes clear why the 9600GT performs the way it does, since the 9600GT's fillrate is 20.8 GTexels/s... Like I said before, SPs aren't the holy grail of performance. A game only chokes if it doesn't have enough shader power to perform its operations. As you increase the resolution you need more shader power. Same with fillrate.
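Quick numbers behind those fillrate figures (a sketch assuming stock core clocks, 650MHz for the 9600GT and 600MHz for the 8800GT, with the official TMU counts):

```python
# Theoretical texture fillrate = TMUs x core clock.
# Assumes stock core clocks (9600GT: 650 MHz, 8800GT: 600 MHz)
# and official TMU counts (32 vs 56).

def fillrate_gtexels(tmus, core_mhz):
    """Peak texture fillrate in GTexels/s."""
    return tmus * core_mhz / 1000

print(f"9600GT: {fillrate_gtexels(32, 650):.1f} GTexels/s")  # 20.8
print(f"8800GT: {fillrate_gtexels(56, 600):.1f} GTexels/s")  # 33.6
```

If bandwidth caps usable fillrate somewhere around 18-20 GTexels/s, the 9600GT is barely leaving anything on the table, while the 8800GT is paying for fillrate it can't feed.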
 

apoppin

Lifer
Mar 9, 2000
Originally posted by: keysplayr2003
Originally posted by: apoppin


two .. of course NOT ... it is 256-bit ... the very issue you chided me on in pairing my 2900XT with my 256-bit 2900 Pro.

You were "chided" for your motherboard limitations, if nothing else.

It *would* be a serious handicap if i gamed at 19x12 with all the details ... not at 16x10.

Really? A handicap? I guess that would depend on the game, no? Besides, if you can play at 19x12 with a single G94, why don't you think there would be an improvement with two? Meaning more candy? Did you not say you had great success with your Crossfire setup? More than you could have hoped for? And that is with your 2x 2900 Pros (256-bit).

yes it was tested .. and i would much prefer the 2x GTS 512 especially for 25x16 ... NEVER 2x G94 256-bit for the SMALL price difference IF i was running such a nice and expensive display. ;)

I would agree with you, but the SLI scores for this card seemed to scale pretty well. I'd say that would leave room for some candy. And at $179 a pop ($358 total), not bad.
i had great success because i did not change my resolution. 512-bit is overkill for 16x10, and 256-bit is not enough for my Pros to handle 19x12 - even together.

Keys, what you are missing is that no one who spends 3K on a display cares about saving $50 if they might lose out a bit. AND they will, with only 256-bit cards driving 25x16, compared to a slightly faster 512MB card, period.

i rest my case

the G94 is a very nice *budget* solution to take on the 3850/3870 ;)