9600GT review at tweaktown


Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: apoppin
Originally posted by: keysplayr2003
Originally posted by: apoppin


two .. of course NOT ... it is 256-bit ... the very issue you chided me on in pairing my 2900xt with my 256-bit 2900p.

You were "chided" for your motherboard limitations, if nothing else.

It *would* make a serious handicap if i gamed at 19x12 with all the details ... not at 16x10.

Really? A handicap? I guess that would depend on the game, no? Besides, if you can play at 19x12 with a single G94, why don't you think there would be an improvement with two? Meaning more candy? Did you not say you had great success with your Xfire setup? More than you could have hoped for? And that is with your 2x 2900 Pros (256-bit).

yes it was tested .. and i would much prefer the 2x GTS 512-bit especially for 25x16 ... NEVER even the 2x G94 256-bit for the SMALL price difference IF i was running such a nice and expensive display. ;)

I would agree with you, but the SLI scores for this card seem to scale pretty well. I'd say that would leave room for some candy. And at $179 a pop ($358 total), not bad.
i had great success because i did not change my resolution. 512-bit is overkill for 16x10 and 256-bit is not enough for my Pros to handle 19x12 - even together.

Keys, what you are missing is that no one who spends 3K for a display cares to save $50 if they might lose out a bit. AND they will with only 256-bit cards to drive 25x16 compared to a slightly faster 512MB card. Period.

i rest my case

g94 is a very nice *budget* solution to take on the 3850/3870 ;)

Who's talking about 3K for a display? A 24-incher could do 19x12, no? What is this regard for 3-thousand-dollar monitors? I mentioned nothing of this. Who would put a 3K monitor on a mid-range card anyway? Non-gamers excluded ;)
I play at 1680x1050 for the most part, my 22" native res. In Crysis, I had to back down to 14x9 to get comfy.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
i have been talking about 25x16 :p
--for which a g94 even in SLi would not be a logical solution

and if you really wanted to save a few bucks, g94 sli'd would be 'possible' at 19x12 ... but would *you* do it over the GTS pair?
 

superbooga

Senior member
Jun 16, 2001
333
0
0
If you look at the front page of nvnews, they have a few results of 9600 GT SLI at 25x16. There are substantial gains, enough to go from unplayable to playable.
 

SniperDaws

Senior member
Aug 14, 2007
762
0
0
Originally posted by: keysplayr2003
Originally posted by: Azn
Originally posted by: lookouthere
Originally posted by: Rusin
So Nvidia would be just lying when they say it has 64SP?

I don't think Nvidia would lie about the number of shaders. But I do think G94 is an improvement over G92. If this were to have 112 or 128 shaders, it would definitely be 1.5x to 2x better than the 3870X2. So take a guess....for the 9800GX2, it probably would be at least 2x better than the current 8800 Ultra

Where are you getting your information from? Fudzilla?

I don't get it Azn. What's not to understand? Why do you feel you're being lied to?

Look at the honkin' specs. It is PLAINLY evident there have been some architectural improvements. Could be nothing more than an improved memory controller. Could be a couple of different minor things as well. But it's still a change.

So, can you explain how this G94 does so well for what it is when compared with other G92's with loads more shaders?

If G94 was nothing more than an 8800GT with 48 shaders cut off of it, don't you think it would perform considerably worse than it does? It would have to be that way.

Look at the difference between an 8800GTS (G92) and an 8800GT. Only a 16 SP difference, and there is a good percentage difference in performance. Now remove an additional 48 SPs from the 8800GT. What do you think would happen to the performance? Equal to a 9600GT? Oh but wait, a 9600GT is only slightly behind an 8800GT with its full 112 SPs.

And you can't see this because????? Please clear it up for me? I have an open mind. Change it for me. Tell me what I'm missing?


OMG thank you, someone else that's noticed.

Anyway I ordered my Gigabyte 9600GT non-OC today for £120 and it should be here on Monday. I'm so excited to get rid of this awful 1800XL :)

The highest res I use is 1680x1050.

 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: SniperDaws
Originally posted by: keysplayr2003
Originally posted by: Azn
Originally posted by: lookouthere
Originally posted by: Rusin
So Nvidia would be just lying when they say it has 64SP?

I don't think Nvidia would lie about the number of shaders. But I do think G94 is an improvement over G92. If this were to have 112 or 128 shaders, it would definitely be 1.5x to 2x better than the 3870X2. So take a guess....for the 9800GX2, it probably would be at least 2x better than the current 8800 Ultra

Where are you getting your information from? Fudzilla?

I don't get it Azn. What's not to understand? Why do you feel you're being lied to?

Look at the honkin' specs. It is PLAINLY evident there have been some architectural improvements. Could be nothing more than an improved memory controller. Could be a couple of different minor things as well. But it's still a change.

So, can you explain how this G94 does so well for what it is when compared with other G92's with loads more shaders?

If G94 was nothing more than an 8800GT with 48 shaders cut off of it, don't you think it would perform considerably worse than it does? It would have to be that way.

Look at the difference between an 8800GTS (G92) and an 8800GT. Only a 16 SP difference, and there is a good percentage difference in performance. Now remove an additional 48 SPs from the 8800GT. What do you think would happen to the performance? Equal to a 9600GT? Oh but wait, a 9600GT is only slightly behind an 8800GT with its full 112 SPs.

And you can't see this because????? Please clear it up for me? I have an open mind. Change it for me. Tell me what I'm missing?


OMG thank you, someone else that's noticed.

Anyway I ordered my Gigabyte 9600GT non-OC today for £120 and it should be here on Monday. I'm so excited to get rid of this awful 1800XL :)

The highest res I use is 1680x1050.

Congratulations. The 9600GT is definitely a step above the XL, probably 2x the performance or a little more.

I already explained it to keysplayr2003, if you caught it above. I'm just waiting to see what he says about it, or if he has anything to say.





 

MegaWorks

Diamond Member
Jan 26, 2004
3,819
1
0
:laugh:

I agree with Azn, more shaders is the right direction for future DX10 games. I know I don't have proof to back this up, but what I mean is it's not really hard to figure out: just look at both AMD's and nVidia's current architectures, lots of shaders. ;)

To me this card is a rip-off. I don't want to be rude, but I just feel that nVidia is selling us a garbage version of the 8800GT.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Azn
Originally posted by: SniperDaws
Originally posted by: keysplayr2003
Originally posted by: Azn
Originally posted by: lookouthere
Originally posted by: Rusin
So Nvidia would be just lying when they say it has 64SP?

I don't think Nvidia would lie about the number of shaders. But I do think G94 is an improvement over G92. If this were to have 112 or 128 shaders, it would definitely be 1.5x to 2x better than the 3870X2. So take a guess....for the 9800GX2, it probably would be at least 2x better than the current 8800 Ultra

Where are you getting your information from? Fudzilla?

I don't get it Azn. What's not to understand? Why do you feel you're being lied to?

Look at the honkin' specs. It is PLAINLY evident there have been some architectural improvements. Could be nothing more than an improved memory controller. Could be a couple of different minor things as well. But it's still a change.

So, can you explain how this G94 does so well for what it is when compared with other G92's with loads more shaders?

If G94 was nothing more than an 8800GT with 48 shaders cut off of it, don't you think it would perform considerably worse than it does? It would have to be that way.

Look at the difference between an 8800GTS (G92) and an 8800GT. Only a 16 SP difference, and there is a good percentage difference in performance. Now remove an additional 48 SPs from the 8800GT. What do you think would happen to the performance? Equal to a 9600GT? Oh but wait, a 9600GT is only slightly behind an 8800GT with its full 112 SPs.

And you can't see this because????? Please clear it up for me? I have an open mind. Change it for me. Tell me what I'm missing?


OMG thank you, someone else that's noticed.

Anyway I ordered my Gigabyte 9600GT non-OC today for £120 and it should be here on Monday. I'm so excited to get rid of this awful 1800XL :)

The highest res I use is 1680x1050.

Congratulations. The 9600GT is definitely a step above the XL, probably 2x the performance or a little more.

I already explained it to keysplayr2003, if you caught it above. I'm just waiting to see what he says about it, or if he has anything to say.

You seem to be saying that an 8800GT's memory bandwidth is stifling its fillrate, and that it would make little difference if we hacked off 48 SPs from it. Basically performing equal with the 9600GT, all other things being equal like bit-width, memory, shader and core speed.

I'd ask you to look at the FS 9600GT article just posted. If you want accurate numbers for bandwidth and fillrate, check it out.

 

SniperDaws

Senior member
Aug 14, 2007
762
0
0
Originally posted by: MegaWorks
:laugh:

I agree with Azn, more shaders is the right direction for future DX10 games. I know I don't have proof to back this up, but what I mean is it's not really hard to figure out: just look at both AMD's and nVidia's current architectures, lots of shaders. ;)

To me this card is a rip-off. I don't want to be rude, but I just feel that nVidia is selling us a garbage version of the 8800GT.

well it's priced a good £30-40 ($50-60) cheaper than the 8800GT, the 8800GT is only slightly faster, and the 9600GT may end up being faster with driver updates.


 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Azn
Originally posted by: keysplayr2003
Originally posted by: Azn
Originally posted by: lookouthere
Originally posted by: Rusin
So Nvidia would be just lying when they say it has 64SP?

I don't think Nvidia would lie about the number of shaders. But I do think G94 is an improvement over G92. If this were to have 112 or 128 shaders, it would definitely be 1.5x to 2x better than the 3870X2. So take a guess....for the 9800GX2, it probably would be at least 2x better than the current 8800 Ultra

Where are you getting your information from? Fudzilla?

I don't get it Azn. What's not to understand? Why do you feel you're being lied to?

Look at the honkin' specs. It is PLAINLY evident there have been some architectural improvements. Could be nothing more than an improved memory controller. Could be a couple of different minor things as well. But it's still a change.

So, can you explain how this G94 does so well for what it is when compared with other G92's with loads more shaders?

If G94 was nothing more than an 8800GT with 48 shaders cut off of it, don't you think it would perform considerably worse than it does? It would have to be that way.

Look at the difference between an 8800GTS (G92) and an 8800GT. Only a 16 SP difference, and there is a good percentage difference in performance. Now remove an additional 48 SPs from the 8800GT. What do you think would happen to the performance? Equal to a 9600GT? Oh but wait, a 9600GT is only slightly behind an 8800GT with its full 112 SPs.

And you can't see this because????? Please clear it up for me? I have an open mind. Change it for me. Tell me what I'm missing?

I don't feel that I'm being lied to. I just didn't get how he got that the GX2 is 2x faster than the 8800 Ultra. It's not like the GX2 is released yet, so I'm guessing he got some information from Fudzilla.

It's not clear what Nvidia did with the architecture. Unless I see a blueprint of what they added, I can't estimate how much faster it would be than the old core.

The difference between the G92 GTS and the GT is mostly clock speed and memory bandwidth. A better comparison would be a GTS clocked to GT clocks, which would net you less than 5%.

The whole TMU debate, I think I've done it to death with BFG10K. It is called Texture Memory Unit for a reason. The current memory bandwidth on G92 can't move data fast enough to take advantage of all of the 8800GT's massive 33.6 GTexels/s fillrate, or the G92 GTS's for that matter. It would most likely feed about 18.0-20.0 GTexels/s. I'm just making an estimate here, so don't quote me on it.

http://images.vnu.net/gb/inqui...-dx10-hit/fillrate.jpg

When you take those things into account it becomes clear why the 9600GT performs the way it does, since the 9600GT's fillrate is 20.8 GTexels/s... Like I said before, SPs aren't the holy grail of performance. A game only chokes if it doesn't have enough shader power to perform the operation. As you increase the resolution you need more shader power. Same with fillrate..

TMU = Texture Mapping Unit. And in the other thread, AA being done through the TMUs??? Don't talk if you have no idea what you are talking about. Do some research before making outlandish claims.

I'm not sure what you mean by taking x amount of texel fillrate etc., but there are several reasons why the 9600GT (G94) can give the 8800GT (G92) a run for its money.

The 8800GT seems to take more of a hit when AA is enabled compared to the 9600GT. I'm not so sure what kind of new technique is involved in G94 since nVIDIA doesn't want to share it, but it's either a new compression method or new memory management. (This is what B3D is for :D)

Not all games are shader bound where the 112 SPs of the GT can flex their muscle against the 64 of the G94; it's rather a combination of things: bandwidth, framebuffer, fillrate, but it's too hard to isolate them. From what I can tell, the 8800GT is kind of an "imbalanced" product. The card is starved of bandwidth yet it has overwhelming shader performance for a mid/high range card. If it was paired with faster memory or a bigger bus, it would easily be faster than the G80-based cards in all fields.

The 9600GT is much more balanced in this sense. But I'm still curious to see how, even with such a gap in SP count between G94/G92, the G94 can keep up. Not to mention a G94 (half of G80 on 65nm) giving a full-fledged RV670 a run for its money. (Pretty much an R600 on 55nm.)

At the end of the day, it's a win-win situation. You can't go wrong buying either an 8800GT or a 9600GT. I'm not sure how the HD3870 fits into the picture as it's having a tough time with the new contender. The HD3850 is easily the slowest of the 4, and would only be worth it if it's priced much lower than $169.
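As a back-of-envelope check on the numbers being thrown around in this exchange, here is a minimal sketch; the unit counts and reference clocks below are the commonly published stock specs for these boards, assumed here rather than taken from the thread:

```python
# Rough theoretical throughput for the two cards under discussion.
# Assumed stock specs: both use a 256-bit bus with 1800 MT/s GDDR3.
cards = {
    #           TMUs  ROPs  SPs   core MHz  shader MHz
    "8800GT":   (56,  16,   112,  600,      1500),
    "9600GT":   (32,  16,   64,   650,      1625),
}

BUS_BYTES = 256 // 8          # 256-bit bus -> 32 bytes per transfer
MEM_MTPS  = 1800              # effective GDDR3 transfer rate, MT/s

for name, (tmus, rops, sps, core, shader) in cards.items():
    tex_fill  = tmus * core / 1000           # GTexels/s
    pix_fill  = rops * core / 1000           # GPixels/s (ROP fill)
    gflops    = sps * shader * 3 / 1000      # MAD+MUL marketing figure
    bandwidth = MEM_MTPS * BUS_BYTES / 1000  # GB/s
    print(f"{name}: {tex_fill:.1f} GTex/s, {pix_fill:.1f} GPix/s, "
          f"{gflops:.0f} GFLOPS, {bandwidth:.1f} GB/s")

# 8800GT: 33.6 GTex/s, 9.6 GPix/s, 504 GFLOPS, 57.6 GB/s
# 9600GT: 20.8 GTex/s, 10.4 GPix/s, 312 GFLOPS, 57.6 GB/s
# Identical bandwidth and ROP throughput, which is why the gap shrinks
# once AA and high resolutions make the memory subsystem the bottleneck.
```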
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: MegaWorks
:laugh:

I agree with Azn, more shaders is the right direction for future DX10 games. I know I don't have proof to back this up, but what I mean is it's not really hard to figure out: just look at both AMD's and nVidia's current architectures, lots of shaders. ;)

To me this card is a rip-off. I don't want to be rude, but I just feel that nVidia is selling us a garbage version of the 8800GT.

What about "better shaders" as opposed to more? Because "more" shaders do not seem to be helping the R6xx cores weighing in a 320sp's.

Pros:
smaller transistor count = smaller die size = less cost to make = less cost to consumer
= potentially higher overclockability = even more for your money.



Cons: Your turn
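To put rough numbers on the "smaller die = less cost" links in that chain, a minimal sketch; the die areas, wafer cost and yield below are assumed ballpark figures used purely for illustration:

```python
import math

WAFER_DIAMETER_MM = 300
WAFER_COST = 5000.0     # assumed cost per processed wafer, USD
YIELD = 0.7             # assumed fraction of good dies

def dies_per_wafer(die_area_mm2: float) -> int:
    """Classic dies-per-wafer approximation: area term minus edge loss."""
    d = WAFER_DIAMETER_MM
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

# ~324 mm^2 and ~240 mm^2 are assumed ballpark 65nm figures for a
# G92-class and a G94-class die, not quoted specs.
for name, area in [("big die (~G92)", 324), ("small die (~G94)", 240)]:
    good = dies_per_wafer(area) * YIELD
    print(f"{name}: ~{good:.0f} good dies/wafer, ~${WAFER_COST / good:.0f} each")

# The smaller die yields noticeably more chips per wafer, so each chip
# costs less to make: the first links in the pros chain above.
```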
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: Cookie Monster


TMU = Texture Mapping Unit. And in the other thread, AA being done through the TMUs??? Don't talk if you have no idea what you are talking about. Do some research before making outlandish claims.

Sorry, I abbreviated it wrong, but the whole DX9 generation and the GeForce 8 did AA using the pipes. More TMUs give you better performance with AA. If you know what you are talking about, please explain.


I'm not sure what you mean by taking x amount of texel fillrate etc., but there are several reasons why the 9600GT (G94) can give the 8800GT (G92) a run for its money.

The 8800GT seems to take more of a hit when AA is enabled compared to the 9600GT. I'm not so sure what kind of new technique is involved in G94 since nVIDIA doesn't want to share it, but it's either a new compression method or new memory management. (This is what B3D is for :D)

Not all games are shader bound where the 112 SPs of the GT can flex their muscle against the 64 of the G94; it's rather a combination of things: bandwidth, framebuffer, fillrate, but it's too hard to isolate them. From what I can tell, the 8800GT is kind of an "imbalanced" product. The card is starved of bandwidth yet it has overwhelming shader performance for a mid/high range card. If it was paired with faster memory or a bigger bus, it would easily be faster than the G80-based cards in all fields.

The 9600GT is much more balanced in this sense. But I'm still curious to see how, even with such a gap in SP count between G94/G92, the G94 can keep up. Not to mention a G94 (half of G80 on 65nm) giving a full-fledged RV670 a run for its money. (Pretty much an R600 on 55nm.)

At the end of the day, it's a win-win situation. You can't go wrong buying either an 8800GT or a 9600GT. I'm not sure how the HD3870 fits into the picture as it's having a tough time with the new contender. The HD3850 is easily the slowest of the 4, and would only be worth it if it's priced much lower than $169.

Sorry, my English skills aren't as good as yours. But you are saying the exact same thing as I was saying, just with different words.

The whole RV670 vs 9600GT thing: they are just different beasts. RV670 has only 16 TMUs at a higher clock speed, compared to 32 on the 9600GT.

As far as shaders go, I give it to the RV670 performance-wise. The RV670's 320 SPs are on more equal footing with Nvidia's 96 SPs.

Performance-wise, RV670 has the upper hand at higher resolutions with its high pixel fillrate. The 9600GT packs more punch with AA and at lower resolutions. Give or take, they take on each other with their strengths and weaknesses.
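The same kind of arithmetic puts rough numbers on the RV670 vs 9600GT comparison; a minimal sketch, with the RV670 figures assumed from the commonly published HD 3870 reference specs:

```python
# Theoretical peaks for the two "different beasts".  Assumed reference
# specs: RV670 (HD 3870) with 16 TMUs / 320 SPs at 775 MHz core and
# 2250 MT/s GDDR4 on a 256-bit bus; 9600GT as in the earlier sketch.
rv670_tex   = 16 * 775 / 1000            # 12.4 GTexels/s
g94_tex     = 32 * 650 / 1000            # 20.8 GTexels/s

# Shader peaks: RV670's 320 ALUs are counted at 2 flops/clock (MAD),
# G94's 64 SPs at 3 flops/clock (MAD+MUL), per the usual marketing math.
rv670_flops = 320 * 2 * 775 / 1000       # ~496 GFLOPS
g94_flops   = 64 * 3 * 1625 / 1000       # ~312 GFLOPS

rv670_bw    = 2250 * 32 / 1000           # 72.0 GB/s
g94_bw      = 1800 * 32 / 1000           # 57.6 GB/s

print(f"texture fill : RV670 {rv670_tex:.1f} vs G94 {g94_tex:.1f} GTex/s")
print(f"shader peak  : RV670 {rv670_flops:.0f} vs G94 {g94_flops:.0f} GFLOPS")
print(f"bandwidth    : RV670 {rv670_bw:.1f} vs G94 {g94_bw:.1f} GB/s")
# The G94 has roughly 1.7x the texture throughput; the RV670 has the
# higher paper shader and bandwidth numbers.  That lines up with the
# "strengths and weaknesses" trade described above rather than one card
# simply winning.
```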
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: keysplayr2003
Originally posted by: Azn
Originally posted by: SniperDaws
Originally posted by: keysplayr2003
Originally posted by: Azn
Originally posted by: lookouthere
Originally posted by: Rusin
So Nvidia would be just lying when they say it has 64SP?

I don't think Nvidia would lie about the number of shaders. But I do think G94 is an improvement over G92. If this were to have 112 or 128 shaders, it would definitely be 1.5x to 2x better than the 3870X2. So take a guess....for the 9800GX2, it probably would be at least 2x better than the current 8800 Ultra

Where are you getting your information from? Fudzilla?

I don't get it Azn. What's not to understand? Why do you feel you're being lied to?

Look at the honkin' specs. It is PLAINLY evident there have been some architectural improvements. Could be nothing more than an improved memory controller. Could be a couple of different minor things as well. But it's still a change.

So, can you explain how this G94 does so well for what it is when compared with other G92's with loads more shaders?

If G94 was nothing more than an 8800GT with 48 shaders cut off of it, don't you think it would perform considerably worse than it does? It would have to be that way.

Look at the difference between an 8800GTS (G92) and an 8800GT. Only a 16 SP difference, and there is a good percentage difference in performance. Now remove an additional 48 SPs from the 8800GT. What do you think would happen to the performance? Equal to a 9600GT? Oh but wait, a 9600GT is only slightly behind an 8800GT with its full 112 SPs.

And you can't see this because????? Please clear it up for me? I have an open mind. Change it for me. Tell me what I'm missing?


OMG thank you, someone else that's noticed.

Anyway I ordered my Gigabyte 9600GT non-OC today for £120 and it should be here on Monday. I'm so excited to get rid of this awful 1800XL :)

The highest res I use is 1680x1050.

Congratulations. The 9600GT is definitely a step above the XL, probably 2x the performance or a little more.

I already explained it to keysplayr2003, if you caught it above. I'm just waiting to see what he says about it, or if he has anything to say.

You seem to be saying that an 8800GT's memory bandwidth is stifling its fillrate, and that it would make little difference if we hacked off 48 SPs from it. Basically performing equal with the 9600GT, all other things being equal like bit-width, memory, shader and core speed.

I'd ask you to look at the FS 9600GT article just posted. If you want accurate numbers for bandwidth and fillrate, check it out.

I already looked at the article. Did they tell us what wasn't already known?


 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Azn
Originally posted by: keysplayr2003
Originally posted by: Azn
Originally posted by: SniperDaws
Originally posted by: keysplayr2003
Originally posted by: Azn
Originally posted by: lookouthere
Originally posted by: Rusin
So Nvidia would be just lying when they say it has 64SP?

I don't think Nvidia would lie about the number of shaders. But I do think G94 is an improvement over G92. If this were to have 112 or 128 shaders, it would definitely be 1.5x to 2x better than the 3870X2. So take a guess....for the 9800GX2, it probably would be at least 2x better than the current 8800 Ultra

Where are you getting your information from? Fudzilla?

I don't get it Azn. What's not to understand? Why do you feel you're being lied to?

Look at the honkin' specs. It is PLAINLY evident there have been some architectural improvements. Could be nothing more than an improved memory controller. Could be a couple of different minor things as well. But it's still a change.

So, can you explain how this G94 does so well for what it is when compared with other G92's with loads more shaders?

If G94 was nothing more than an 8800GT with 48 shaders cut off of it, don't you think it would perform considerably worse than it does? It would have to be that way.

Look at the difference between an 8800GTS (G92) and an 8800GT. Only a 16 SP difference, and there is a good percentage difference in performance. Now remove an additional 48 SPs from the 8800GT. What do you think would happen to the performance? Equal to a 9600GT? Oh but wait, a 9600GT is only slightly behind an 8800GT with its full 112 SPs.

And you can't see this because????? Please clear it up for me? I have an open mind. Change it for me. Tell me what I'm missing?


OMG thank you, someone else that's noticed.

Anyway I ordered my Gigabyte 9600GT non-OC today for £120 and it should be here on Monday. I'm so excited to get rid of this awful 1800XL :)

The highest res I use is 1680x1050.

Congratulations. The 9600GT is definitely a step above the XL, probably 2x the performance or a little more.

I already explained it to keysplayr2003, if you caught it above. I'm just waiting to see what he says about it, or if he has anything to say.

You seem to be saying that an 8800GT's memory bandwidth is stifling its fillrate, and that it would make little difference if we hacked off 48 SPs from it. Basically performing equal with the 9600GT, all other things being equal like bit-width, memory, shader and core speed.

I'd ask you to look at the FS 9600GT article just posted. If you want accurate numbers for bandwidth and fillrate, check it out.

I already looked at the article. Did they tell us what wasn't already known?

:::sigh::: I guess it's ring around the rosie time. Ok, have fun. ;)
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: keysplayr2003

You seem to be saying that an 8800GT's memory bandwidth is stifling its fillrate, and that it would make little difference if we hacked off 48 SPs from it. Basically performing equal with the 9600GT, all other things being equal like bit-width, memory, shader and core speed.

I'd ask you to look at the FS 9600GT article just posted. If you want accurate numbers for bandwidth and fillrate, check it out.


:::sigh::: I guess it's ring around the rosie time. Ok, have fun. ;)

What's ring around the rosie? :confused:

edit: You are talking about the nursery rhyme about the black plague?

I'm not trying to play ring around the rosie, but you can easily calculate the exact #'s for these cards just by knowing the clock speeds.
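That point, that the numbers fall straight out of the clock speeds and unit counts, is easy to show with the standard formulas; a minimal sketch, where the function and the stock figures fed to it are illustrative assumptions:

```python
def theoretical_peaks(tmus, rops, sps, core_mhz, shader_mhz,
                      mem_mtps, bus_bits, flops_per_sp=3):
    """Return (GTexels/s, GPixels/s, GFLOPS, GB/s) from unit counts and clocks."""
    return (tmus * core_mhz / 1000,
            rops * core_mhz / 1000,
            sps * shader_mhz * flops_per_sp / 1000,
            mem_mtps * (bus_bits // 8) / 1000)

# Stock reference figures, assumed for illustration:
print(theoretical_peaks(56, 16, 112, 600, 1500, 1800, 256))  # 8800GT
print(theoretical_peaks(32, 16,  64, 650, 1625, 1800, 256))  # 9600GT
# -> (33.6, 9.6, 504.0, 57.6) and (20.8, 10.4, 312.0, 57.6)
```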


 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: keysplayr2003
Originally posted by: MegaWorks
:laugh:

I agree with Azn, more shaders is the right direction for future DX10 games. I know I don't have proof to back this up, but what I mean is it's not really hard to figure out: just look at both AMD's and nVidia's current architectures, lots of shaders. ;)

To me this card is a rip-off. I don't want to be rude, but I just feel that nVidia is selling us a garbage version of the 8800GT.

What about "better shaders" as opposed to more? Because "more" shaders do not seem to be helping the R6xx cores weighing in a 320sp's.

Pros:
smaller transistor count = smaller die size = less cost to make = less cost to consumer
= potentially higher overclockability = even more for your money.



Cons: Your turn

Not all SPs are created equal.
 

SniperDaws

Senior member
Aug 14, 2007
762
0
0
ffs Azn, if a game comes out that is shader-intensive enough to kill a 9600GT, do you think an 8800GT will fly through it?

The 8800GT will die along with the 9600GT, but it won't have cost me an extra £40 for the privilege.

I've had enough, good night, god bless.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
NO, but it would do a lot better than a 9600GT in those situations. IF you are trying to milk your card as much as possible, an 8800GT is the better card for $30 more.

Most of these reviews bench mostly the same games, which are mostly FPS titles. I would appreciate it if they benched some other games.

NFS ProStreet seems to love shaders. So does Colin McRae: DiRT. Why they test nothing but Crysis, UT, and COD4, I do not know.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
http://www.pcper.com/article.php?aid=522&type=expert


Other than those feature changes, NVIDIA was eager to promote some "new" features that the GeForce 9600 GT offers. I say "new" like that only because these features ALREADY existed on the G92 cores of the 8800 GT and GTS, they just weren't advertised as heavily. Take this compression technology that allows more efficient transfer of data from memory to the GPU - NVIDIA is comparing it to the G80 in the graph above, not G92.


Is this what everyone is talking about? The compression thing that everyone supposedly believes is only available on the 9600GT and not on G92.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: keysplayr2003
Originally posted by: Azn
Originally posted by: keysplayr2003
Originally posted by: Azn
Originally posted by: SniperDaws
Originally posted by: keysplayr2003
Originally posted by: Azn
Originally posted by: lookouthere
Originally posted by: Rusin
So Nvidia would be just lying when they say it has 64SP?

I don't think Nvidia would lie about the number of shaders. But I do think G94 is an improvement over G92. If this were to have 112 or 128 shaders, it would definitely be 1.5x to 2x better than the 3870X2. So take a guess....for the 9800GX2, it probably would be at least 2x better than the current 8800 Ultra

Where are you getting your information from? Fudzilla?

I don't get it Azn. What's not to understand? Why do you feel you're being lied to?

Look at the honkin' specs. It is PLAINLY evident there have been some architectural improvements. Could be nothing more than an improved memory controller. Could be a couple of different minor things as well. But it's still a change.

So, can you explain how this G94 does so well for what it is when compared with other G92's with loads more shaders?

If G94 was nothing more than an 8800GT with 48 shaders cut off of it, don't you think it would perform considerably worse than it does? It would have to be that way.

Look at the difference between an 8800GTS (G92) and an 8800GT. Only a 16 SP difference, and there is a good percentage difference in performance. Now remove an additional 48 SPs from the 8800GT. What do you think would happen to the performance? Equal to a 9600GT? Oh but wait, a 9600GT is only slightly behind an 8800GT with its full 112 SPs.

And you can't see this because????? Please clear it up for me? I have an open mind. Change it for me. Tell me what I'm missing?


OMG thank you, someone else that's noticed.

Anyway I ordered my Gigabyte 9600GT non-OC today for £120 and it should be here on Monday. I'm so excited to get rid of this awful 1800XL :)

The highest res I use is 1680x1050.

Congratulations. The 9600GT is definitely a step above the XL, probably 2x the performance or a little more.

I already explained it to keysplayr2003, if you caught it above. I'm just waiting to see what he says about it, or if he has anything to say.

You seem to be saying that an 8800GT's memory bandwidth is stifling its fillrate, and that it would make little difference if we hacked off 48 SPs from it. Basically performing equal with the 9600GT, all other things being equal like bit-width, memory, shader and core speed.

I'd ask you to look at the FS 9600GT article just posted. If you want accurate numbers for bandwidth and fillrate, check it out.

I already looked at the article. Did they tell us what wasn't already known?

:::sigh::: I guess it's ring around the rosie time. Ok, have fun. ;)

"fun" ? ... did i hear my name?

i'll let FS's conclusion speak to you:

http://www.firingsquad.com/har...performance/page16.asp

Quite honestly, with just 64 stream processors we weren't expecting much from NVIDIA's GeForce 9600 GT. We knew it would be a strong competitor to the Radeon HD 3850, but we had no clue it would deliver performance that rivals the GeForce 8800 GT in some cases!

So how did NVIDIA manage to pull it off? Obviously the 9600 GT doesn't have the pure shading horsepower of the 8800 GT, sporting just 64 stream processors compared to the 8800 GT's 112, but other than the shading unit deficit the two GPUs are quite similar architecturally. Texture filtering and addressing capabilities are the same, as are the number of ROPs. In addition, they both offer the same peak memory bandwidth and the same z and color compression enhancements. This is important as we're testing these games with the eye candy cranked up and with AA. In these types of situations, the GPU is often bound by its memory subsystem. Also keep in mind that the 9600 GT sports higher graphics core and stream processor clocks than the 8800 GT.

When you combine this with the OC'ed clocks found on the cards we tested, these 9600 GT cards actually offer more ROP fill and memory bandwidth than a bone stock GeForce 8800 GT board.

well they confirmed my guess :p
-Not for high resolution with lots of AA/AF

but for $169 a real *deal*

EVGA 9600GT SuperClocked for only $169
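FiringSquad's closing point about the overclocked boards is easy to sanity-check with the same formulas; a minimal sketch, where the 700 MHz core / 2000 MT/s memory figures are hypothetical factory-OC clocks used only for illustration, not the tested cards' actual clocks:

```python
# ROP fill and memory bandwidth: stock 8800GT vs a hypothetical
# factory-overclocked 9600GT (OC clocks below are assumptions).
def rop_fill_gpix(rops, core_mhz):
    return rops * core_mhz / 1000          # GPixels/s

def bandwidth_gbs(mem_mtps, bus_bits):
    return mem_mtps * (bus_bits // 8) / 1000  # GB/s

stock_8800gt = (rop_fill_gpix(16, 600), bandwidth_gbs(1800, 256))  # (9.6, 57.6)
oc_9600gt    = (rop_fill_gpix(16, 700), bandwidth_gbs(2000, 256))  # (11.2, 64.0)

print("8800GT stock :", stock_8800gt)
print("9600GT OC'ed :", oc_9600gt)
# With the same ROP count and bus width, higher core and memory clocks
# alone push an OC'ed 9600GT past a stock 8800GT in ROP fill and
# bandwidth, which is the scenario FS describes for the AA-heavy,
# memory-bound settings they tested.
```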
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
For $340 *today* this seems like a great card to SLI. It may even be enough to bust out 1920x1200-powering performance. Too bad that's not possible on Intel chipsets.


 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Azn
Originally posted by: keysplayr2003
Originally posted by: MegaWorks
:laugh:

I agree with Azn, more shaders is the right direction for future DX10 games. I know I don't have proof to back this up, but what I mean is it's not really hard to figure out: just look at both AMD's and nVidia's current architectures, lots of shaders. ;)

To me this card is a rip-off. I don't want to be rude, but I just feel that nVidia is selling us a garbage version of the 8800GT.

What about "better shaders" as opposed to more? Because "more" shaders do not seem to be helping the R6xx cores weighing in a 320sp's.

Pros:
smaller transistor count = smaller die size = less cost to make = less cost to consumer
= potentially higher overclockability = even more for your money.



Cons: Your turn

Not all SPs are created equal.

You echo my sentiment. Better shaders would be more beneficial to all than more shaders, from an all-around perspective: heat, power, cost. Make it better, not bigger.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Azn
NO, but it would do a lot better than a 9600GT in those situations. IF you are trying to milk your card as much as possible, an 8800GT is the better card for $30 more.

Most of these reviews bench mostly the same games, which are mostly FPS titles. I would appreciate it if they benched some other games.

NFS ProStreet seems to love shaders. So does Colin McRae: DiRT. Why they test nothing but Crysis, UT, and COD4, I do not know.

You mean the newest, hottest games out there right now? Bestsellers? If you travel around to all the review sites, you will find a wide variety of games tested. A game that one site did not test, another site may have. Take your pick.

Azn, do you own an 8800GT?
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: keysplayr2003
Originally posted by: Azn
NO, but it would do a lot better than a 9600GT in those situations. IF you are trying to milk your card as much as possible, an 8800GT is the better card for $30 more.

Most of these reviews bench mostly the same games, which are mostly FPS titles. I would appreciate it if they benched some other games.

NFS ProStreet seems to love shaders. So does Colin McRae: DiRT. Why they test nothing but Crysis, UT, and COD4, I do not know.

You mean the newest, hottest games out there right now? Bestsellers? If you travel around to all the review sites, you will find a wide variety of games tested. A game that one site did not test, another site may have. Take your pick.

Azn, do you own an 8800GT?

No, I don't own either an 8800GT or a 9600GT. If I were to upgrade, it would probably be the 8800GT though.

I've seen other reviews and most of the reviewers benchmark the same games. :(