Where is it evident that it's just a cutback? The G94 also has an updated memory channel system, meaning that the 9600 GT's effective memory bandwidth is higher than the 8800 GT's even when both have memory clocked at 1800MHz on a 256-bit bus; that 57.6GB/s figure is only the theoretical maximum.
Originally posted by: Azn
So you are saying the G94 has had an architectural change from the G92? How?
It's just a cut-back G92 chip: 32 TMUs and 64 shaders, 8 textures per clock for every 16 SPs. It is evident there is no architectural change other than the cut-back. No new features have been announced.
Do you remember the "The Way It's Meant to Be Played" campaign? It basically means that Nvidia and the game makers are working together. I don't think this would lead to more shader-heavy solutions now that Nvidia's strengths are elsewhere. The latest to join the Nvidia camp was Valve. Guess what happens when the shader-heavy Source engine is outlived?
The 9600 GT does well because most of its texture fillrate, if not all of it, is being used up in games. Shaders aren't the holy grail of performance, although game programmers think they are, and they will be adding more effects as time progresses. A game has to have more shader effects to take advantage. The current crop of games isn't so shader-intensive that a 9600 GT can't handle it with 64 SPs.
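For reference, the 57.6GB/s figure and the fillrate argument above both fall out of simple spec arithmetic. A minimal back-of-the-envelope sketch, assuming the commonly quoted stock clocks (650MHz core for the 9600 GT, 600MHz core for the 8800 GT, 1800MHz effective GDDR3 on a 256-bit bus for both) and one bilinear-filtered texel per TMU per clock:

```python
# Theoretical peaks from published specs; clocks below are assumed stock values.

def mem_bandwidth_gbs(bus_width_bits, effective_mem_mhz):
    # bytes per transfer * transfers per second, expressed in GB/s
    return (bus_width_bits / 8) * effective_mem_mhz * 1e6 / 1e9

def texture_fillrate_gtps(tmus, core_mhz):
    # one bilinear-filtered texel per TMU per clock, expressed in GTexels/s
    return tmus * core_mhz * 1e6 / 1e9

# Both cards: 256-bit bus, 1800MHz effective memory
print(mem_bandwidth_gbs(256, 1800))      # 57.6 GB/s peak for either card

# 9600 GT: 32 TMUs @ 650MHz (assumed) vs 8800 GT: 56 TMUs @ 600MHz (assumed)
print(texture_fillrate_gtps(32, 650))    # ~20.8 GTexels/s
print(texture_fillrate_gtps(56, 600))    # ~33.6 GTexels/s
```

The identical 57.6GB/s ceiling is why the "updated memory channel system" claim is about efficiency (how much of that peak is actually achieved), not about a higher raw number.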
Originally posted by: Azn
What I'm trying to say is don't get suckered in by current performance compared to the 8800 GT. The 8800 GT is still the best deal to be had. Don't skimp on shaders for a measly $30 difference. Shaders will have a more profound effect well into the future.
Originally posted by: raddreamer3kx
Originally posted by: Azn
What I'm trying to say is don't get suckered in by current performance compared to the 8800 GT. The 8800 GT is still the best deal to be had. Don't skimp on shaders for a measly $30 difference. Shaders will have a more profound effect well into the future.
It's still more money, and people don't want to spend more money; this card performs pretty much identically to the 8800 GT.
I hope we are not getting some 8800 GT owners who paid almost $300 for their card crying a bit.
Originally posted by: Rusin
Where is it evident that it's just a cutback? The G94 also has an updated memory channel system, meaning that the 9600 GT's effective memory bandwidth is higher than the 8800 GT's even when both have memory clocked at 1800MHz on a 256-bit bus; that 57.6GB/s figure is only the theoretical maximum.
Originally posted by: Azn
So you are saying the G94 has had an architectural change from the G92? How?
It's just a cut-back G92 chip: 32 TMUs and 64 shaders, 8 textures per clock for every 16 SPs. It is evident there is no architectural change other than the cut-back. No new features have been announced.
Do you remember the "The Way It's Meant to Be Played" campaign? It basically means that Nvidia and the game makers are working together. I don't think this would lead to more shader-heavy solutions now that Nvidia's strengths are elsewhere. The latest to join the Nvidia camp was Valve. Guess what happens when the shader-heavy Source engine is outlived?
Originally posted by: Rusin
So Nvidia would just be lying when they say it has 64 SPs?
Originally posted by: lookouthere
Originally posted by: Rusin
So Nvidia would just be lying when they say it has 64 SPs?
I don't think Nvidia would lie about the number of shaders. But I do think the G94 is an improvement over the G92. If it had 112 or 128 shaders, it would definitely be 1.5x to 2x better than the 3870 X2. So take a guess... the 9800 GX2 would probably be at least 2x better than the current 8800 Ultra.
Originally posted by: Piuc2020
I highly doubt the G94 is somehow "magically" better than the G92 without Nvidia knowing it; otherwise they would just make a high-end G94 card. The G94 has some minor changes, but nothing revolutionary; it does well in games because it has a good clock speed, great memory bandwidth, and a few AA optimizations. A 9600 GT with 112 shaders would be a little better than an 8800 GT, but only because of the AA optimizations (and as such would only be better at AA settings); in any case it wouldn't be 1.5x better.
In some games like Crysis, the 8800 GT beats the 9600 GT with ease; those 48 extra shaders DO make a big difference.
I think the 9800 GT and 9800 GTX will be slightly tweaked G92s; by that I mean slightly higher clock speeds, a few optimizations here and there, and better AA performance (the AA bug), so that the performance is rounded up and the old G80s become obsolete.
Most reviews pit highly overclocked 9600 GTs (like 700MHz) against a stock 8800 GT, and this has led people to believe the 9600 GT is a revolutionary architecture that performs close to the 8800 GT. In the reviews where stock is compared to stock, the 8800 GT is up to 20% faster; hell, in games like Crysis it's not even close, and we all know games are only going to move closer to the Crysis benchmark, not farther away from it.
The 9600GT is a great card, but it's no 8800GT killer.
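As a rough illustration of the stock-versus-factory-OC point above, here is a minimal sketch, assuming 650MHz is the stock 9600 GT core clock and that the theoretical texture fillrate scales linearly with core clock:

```python
# How much a 700MHz factory-overclocked 9600 GT gains over an assumed 650MHz stock card.
stock_mhz, oc_mhz = 650, 700
print(f"core clock advantage: {oc_mhz / stock_mhz - 1:.1%}")   # ~7.7%

# Theoretical fillrate with the G94's 32 TMUs scales by the same factor
print(32 * stock_mhz / 1000, "GTexels/s at stock")             # 20.8
print(32 * oc_mhz / 1000, "GTexels/s at 700MHz")               # 22.4
```

An extra ~8% of core clock can close a noticeable chunk of a 10-20% gap, which is why stock-versus-stock numbers are the fairer comparison.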
Originally posted by: Azn
Originally posted by: lookouthere
Originally posted by: Rusin
So Nvidia would just be lying when they say it has 64 SPs?
I don't think Nvidia would lie about the number of shaders. But I do think the G94 is an improvement over the G92. If it had 112 or 128 shaders, it would definitely be 1.5x to 2x better than the 3870 X2. So take a guess... the 9800 GX2 would probably be at least 2x better than the current 8800 Ultra.
Where are you getting your information from? Fudzilla?
Originally posted by: Azn
Originally posted by: lookouthere
Originally posted by: Rusin
So Nvidia would just be lying when they say it has 64 SPs?
I don't think Nvidia would lie about the number of shaders. But I do think the G94 is an improvement over the G92. If it had 112 or 128 shaders, it would definitely be 1.5x to 2x better than the 3870 X2. So take a guess... the 9800 GX2 would probably be at least 2x better than the current 8800 Ultra.
Where are you getting your information from? Fudzilla?
Originally posted by: keysplayr2003
Originally posted by: Azn
Originally posted by: lookouthere
Originally posted by: Rusin
So Nvidia would just be lying when they say it has 64 SPs?
I don't think Nvidia would lie about the number of shaders. But I do think the G94 is an improvement over the G92. If it had 112 or 128 shaders, it would definitely be 1.5x to 2x better than the 3870 X2. So take a guess... the 9800 GX2 would probably be at least 2x better than the current 8800 Ultra.
Where are you getting your information from? Fudzilla?
I don't get it Azn. What's not to understand? Why do you feel you're being lied to?
Look at the honkin' specs. It is PLAINLY evident there have been some architectural improvements. It could be nothing more than an improved memory controller. It could be a couple of different minor things as well. But it's still a change.
So, can you explain how this G94 does so well for what it is when compared with G92 cards that have loads more shaders?
If the G94 were nothing more than an 8800 GT with 48 shaders cut off of it, don't you think it would perform considerably worse than it does? It would have to.
Look at the difference between an 8800 GTS (G92) and an 8800 GT. Only a 16 SP difference, and there is a good percentage difference in performance. Now remove an additional 48 SPs from the 8800 GT. What do you think would happen to the performance? Equal to a 9600 GT? Oh, but wait: a 9600 GT is only slightly behind an 8800 GT with its full 112 SPs.
And you can't see this because...? Please clear it up for me. I have an open mind. Change it for me. Tell me what I'm missing.
So, can you explain how this G94 does so well for what it is when compared with G92 cards that have loads more shaders?
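As a rough sanity check on the "remove 48 SPs" argument above, the theoretical shader throughput can be compared the same way. A minimal sketch, assuming the commonly quoted stock shader clocks (1625MHz for the 9600 GT and 8800 GTS 512, 1500MHz for the 8800 GT) and counting MADD+MUL as 3 FLOPs per SP per clock:

```python
# Theoretical shader throughput in GFLOPS; shader clocks below are assumed stock values.
def shader_gflops(sps, shader_clock_mhz, flops_per_sp_per_clock=3):
    return sps * shader_clock_mhz * 1e6 * flops_per_sp_per_clock / 1e9

print(shader_gflops(64, 1625))    # 9600 GT       ~312 GFLOPS
print(shader_gflops(112, 1500))   # 8800 GT       ~504 GFLOPS
print(shader_gflops(128, 1625))   # 8800 GTS 512  ~624 GFLOPS
```

On paper the 8800 GT has roughly 60% more shader throughput than the 9600 GT, so a much smaller real-world gap is what you would expect if many current games are limited by texturing, bandwidth, or ROPs rather than by the shader array.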
Originally posted by: bryanW1995
Originally posted by: keysplayr2003
Originally posted by: Azn
Originally posted by: lookouthere
Originally posted by: Rusin
So Nvidia would just be lying when they say it has 64 SPs?
I don't think Nvidia would lie about the number of shaders. But I do think the G94 is an improvement over the G92. If it had 112 or 128 shaders, it would definitely be 1.5x to 2x better than the 3870 X2. So take a guess... the 9800 GX2 would probably be at least 2x better than the current 8800 Ultra.
Where are you getting your information from? Fudzilla?
I don't get it Azn. What's not to understand? Why do you feel you're being lied to?
Look at the honkin' specs. It is PLAINLY evident there have been some architectural improvements. It could be nothing more than an improved memory controller. It could be a couple of different minor things as well. But it's still a change.
So, can you explain how this G94 does so well for what it is when compared with G92 cards that have loads more shaders?
If the G94 were nothing more than an 8800 GT with 48 shaders cut off of it, don't you think it would perform considerably worse than it does? It would have to.
Look at the difference between an 8800 GTS (G92) and an 8800 GT. Only a 16 SP difference, and there is a good percentage difference in performance. Now remove an additional 48 SPs from the 8800 GT. What do you think would happen to the performance? Equal to a 9600 GT? Oh, but wait: a 9600 GT is only slightly behind an 8800 GT with its full 112 SPs.
And you can't see this because...? Please clear it up for me. I have an open mind. Change it for me. Tell me what I'm missing.
I don't presume to speak for Azn, but "definitely be 1.5x to 2x better than the 3870 X2" is pretty strong wording. Especially when you consider that the 9800 GX2 is STILL in limbo. When will it arrive, April? May? Will it be relevant 3 months after introduction, or is it going to be another 7950 GX2 and get eclipsed by the 9800 GTX? Even assuming that the 9800 GX2 will have a theoretical advantage of 20%+ over the 3870 X2, will Nvidia have strong enough driver support to justify it as a medium-term gaming solution? It will almost definitely be the fastest single card for the few months between its introduction and the 9800 GTX's, but after that all bets are off. If you're married to Nvidia, then either stick to single-card solutions or get two of those MSI OC cards for $200 each at Newegg and SLI them.
And that offer is always open.
Originally posted by: apoppin
So, can you explain how this G94 does so well for what it is when compared with G92 cards that have loads more shaders?
well ... clock-speeds appear to be up ... a little
... perhaps nvidia is rubbing AMD's nose in the fact that shaders don't matter so much yet at lower resolutions.
--i would NEVER pick this G94 for high res
Originally posted by: apoppin
And that offer is always open.
look up ^^ two posts ... i proposed a theory to partially explain it although ultimately i do not agree with Azn
nvidia 'tweaked' the architecture
well ... clock-speeds appear to be up ... a little
... perhaps nvidia is rubbing AMD's nose in the fact that shaders don't matter so much yet at lower resolutions.
--i would NEVER pick this G94 for high res
Originally posted by: keysplayr2003
Originally posted by: apoppin
And that offer is always open.
look up ^^ two posts ... i proposed a theory to partially explain it although ultimately i do not agree with Azn
nvidia 'tweaked' the architecture
well ... clock-speeds appear to be up ... a little
... perhaps nvidia is rubbing AMD's nose in the fact that shaders don't matter so much yet at lower resolutions.
--i would NEVER pick this G94 for high res
Ah, I see your edit up there. But wasn't this card tested at all resolutions, up to 25x16?
You would never pick one up, but would you pick two?
Anand's review shows this card playable in Crysis, without AA, at 19x12: 35.x fps. And 33.3 fps at 19x12 in STALKER?
Originally posted by: apoppin
two .. of course NOT ... it is 256-bit ... the very issue you chided me on in pairing my 2900xt with my 256-bit 2900p.
You were "chided" for your motherboard limitations, if nothing else.
It *would* be a serious handicap if i gamed at 19x12 with all the details ... not at 16x10.
Really? A handicap? I guess that would depend on the game, no? Besides, if you can play at 19x12 with a single G94, why don't you think there would be an improvement with two? Meaning more candy? Did you not say you had great success with your Xfire setup? More than you could have hoped for? And that is with your 2x 2900 Pros (256-bit).
yes it was tested ... and i would much prefer 2x GTS 512, especially for 25x16 ... NEVER 2x G94 (256-bit) for the SMALL price difference IF i was running such a nice and expensive display.
I would agree with you, but the SLI scores for this card seemed to scale pretty well. I'd say that would leave room for some candy. And at $179 a pop ($358 total), not bad.
Originally posted by: keysplayr2003
Originally posted by: Azn
Originally posted by: lookouthere
Originally posted by: Rusin
So Nvidia would just be lying when they say it has 64 SPs?
I don't think Nvidia would lie about the number of shaders. But I do think the G94 is an improvement over the G92. If it had 112 or 128 shaders, it would definitely be 1.5x to 2x better than the 3870 X2. So take a guess... the 9800 GX2 would probably be at least 2x better than the current 8800 Ultra.
Where are you getting your information from? Fudzilla?
I don't get it Azn. What's not to understand? Why do you feel you're being lied to?
Look at the honkin' specs. It is PLAINLY evident there have been some architectural improvements. It could be nothing more than an improved memory controller. It could be a couple of different minor things as well. But it's still a change.
So, can you explain how this G94 does so well for what it is when compared with G92 cards that have loads more shaders?
If the G94 were nothing more than an 8800 GT with 48 shaders cut off of it, don't you think it would perform considerably worse than it does? It would have to.
Look at the difference between an 8800 GTS (G92) and an 8800 GT. Only a 16 SP difference, and there is a good percentage difference in performance. Now remove an additional 48 SPs from the 8800 GT. What do you think would happen to the performance? Equal to a 9600 GT? Oh, but wait: a 9600 GT is only slightly behind an 8800 GT with its full 112 SPs.
And you can't see this because...? Please clear it up for me. I have an open mind. Change it for me. Tell me what I'm missing.
i had great success because i did not change my resolution. 512-bit is overkill for 16x10 and 256-bit is not enough for my Pros to handle 19x12 - even together.
Originally posted by: keysplayr2003
Originally posted by: apoppin
two .. of course NOT ... it is 256-bit ... the very issue you chided me on in pairing my 2900xt with my 256-bit 2900p.
You were "chided" for your motherboard limitations, if nothing else.
It *would* be a serious handicap if i gamed at 19x12 with all the details ... not at 16x10.
Really? A handicap? I guess that would depend on the game, no? Besides, if you can play at 19x12 with a single G94, why don't you think there would be an improvement with two? Meaning more candy? Did you not say you had great success with your Xfire setup? More than you could have hoped for? And that is with your 2x 2900 Pros (256-bit).
yes it was tested ... and i would much prefer 2x GTS 512, especially for 25x16 ... NEVER 2x G94 (256-bit) for the SMALL price difference IF i was running such a nice and expensive display.
I would agree with you, but the SLI scores for this card seemed to scale pretty well. I'd say that would leave room for some candy. And at $179 a pop ($358 total), not bad.