
nVIDIA november assault


imported_Shaq

Senior member
Sep 24, 2004
731
0
0
Originally posted by: manowar821
There are threads like this everywhere now; people are upset at the lack of a high end before Christmas.

I hope you're following these threads, Nvidia, give the people what they want! :)

That's what I'm saying! Put a dual-slot cooler on the 8800GT, up the voltage, put 1GB of memory on it, sell it for $499, and it would beat an Ultra.
 

postmortemIA

Diamond Member
Jul 11, 2006
7,721
40
91
Originally posted by: Astrallite
Hah, this thread should be renamed "nVidia november assault of the 20fps Crysis beasts".


I can make a 3D program that is gonna run at 2fps :D

Methinks Crytek is to blame, making a game nobody can play... who are you gonna sell it to?
 

alcoholbob

Diamond Member
May 24, 2005
6,389
468
126
Hah, well Far Cry was pretty badly programmed too, but Crysis is definitely the best looking game out there right now, and it basically manages the same performance as Oblivion, which is pretty impressive because it looks a hell of a lot better than Oblivion.
 

Synomenon

Lifer
Dec 25, 2004
10,547
6
81
Anything yet on whether it does the full HD decoding / CPU offloading that the 8600GTS does?
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Shaq
Will this new GTS be 65nm and single slot also? I'm hoping it will be 65nm and dual slot, which should OC well enough to pass a GTX and maybe an Ultra. Current 90nm GPUs go 620 regularly, so I wouldn't be surprised if it went well over 700 at 65nm. That would definitely be worth an upgrade.

G80 A3 ones can hit ~700MHz on 90nm with good cooling (750MHz is possible, but those are the cherry-picked ones).

The "new" GTS is simply the old G80: defective cores that can have 112 SPs enabled.

Also, CJ is hinting at G92 having more than 64/96 SPs. I'm not sure if this is true or not, but if the 64 SPs were dual precision, then they would be equivalent to 128 SPs running at single precision. Not to mention that they could make the shaders dual-MADD to increase shader performance further, just like NV40 to G70.

The most popular guess at Beyond3D is 192 SPs, but I'm not so sure. nVIDIA might be pulling another "G80" smoke screen, so the whole GX2 rumour could be false. We got fooled big time when G80 hit, and the same thing could happen now.

I personally want a single-GPU high-end solution that is simply based on the successful G80 core, reworked for better efficiency, thermal output and other things such as PureVideo, instead of a dual-GPU solution.
 

imported_Shaq

Senior member
Sep 24, 2004
731
0
0
Originally posted by: Cookie Monster
Originally posted by: Shaq
Will this new GTS be 65nm and single slot also? I'm hoping it will be 65nm and dual slot, which should OC well enough to pass a GTX and maybe an Ultra. Current 90nm GPUs go 620 regularly, so I wouldn't be surprised if it went well over 700 at 65nm. That would definitely be worth an upgrade.

G80 A3 ones can hit ~700MHz on 90nm with good cooling (750MHz is possible, but those are the cherry-picked ones).

The "new" GTS is simply the old G80: defective cores that can have 112 SPs enabled.

Also, CJ is hinting at G92 having more than 64/96 SPs. I'm not sure if this is true or not, but if the 64 SPs were dual precision, then they would be equivalent to 128 SPs running at single precision. Not to mention that they could make the shaders dual-MADD to increase shader performance further, just like NV40 to G70.

Are the new GTSs going to be A3s, or a mix of A2s and A3s that Nvidia has been collecting since last year? I really wish partners would label the revision on their sites, as it makes a pretty big difference. For instance, Clubit advertises a guaranteed G0, which is a good bit better than a B3.

A guaranteed A3 should be 10-15% faster than an A2. They may charge a little more for it, but they will still sell extremely well. I guess they haven't fully exploited the high-end market and maximized their profits yet. Just think of an A3-rev. 112-shader 640 that will do 700+ with cherry-picked 2200+ GDDR3. Mmmm... I'd have to order that GTX killer.

On a side note, are all the A3 128SP cores going to the Ultras, with the vanilla GTXs getting all the A2s?
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
I think all G80 cores being shipped are A3 now (we could get some confirmation from recent G80 buyers taking the HSF off and checking what it says on the IHS of the G80). I'm not too sure about the new GTS being A3s. They could be leftover G80 stock (a combination of A2s/A3s, or even A1s) that just barely passed certification as a GTX (which leads me to believe that the yields aren't too bad for G80).
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: Cookie Monster
Originally posted by: Shaq
Will this new GTS be 65nm and single slot also? I'm hoping it will be 65nm and dual slot, which should OC well enough to pass a GTX and maybe an Ultra. Current 90nm GPUs go 620 regularly, so I wouldn't be surprised if it went well over 700 at 65nm. That would definitely be worth an upgrade.

G80 A3 ones can hit ~700MHz on 90nm with good cooling (750MHz is possible, but those are the cherry-picked ones).

The "new" GTS is simply the old G80: defective cores that can have 112 SPs enabled.

Also, CJ is hinting at G92 having more than 64/96 SPs. I'm not sure if this is true or not, but if the 64 SPs were dual precision, then they would be equivalent to 128 SPs running at single precision. Not to mention that they could make the shaders dual-MADD to increase shader performance further, just like NV40 to G70.

The most popular guess at Beyond3D is 192 SPs, but I'm not so sure. nVIDIA might be pulling another "G80" smoke screen, so the whole GX2 rumour could be false. We got fooled big time when G80 hit, and the same thing could happen now.

I personally want a single-GPU high-end solution that is simply based on the successful G80 core, reworked for better efficiency, thermal output and other things such as PureVideo, instead of a dual-GPU solution.

It is possible. The Inquirer is throwing out a rumored die size of around 280mm², and using the 65nm process as a baseline, that is way too large for a native die with only 64 SPs or even 96 SPs. But we will have to see what happens, or whether that die size is even correct.

Even if it were 192 SPs, the current iteration being sold wouldn't have them all enabled, with a 3DMark06 score of only 10.7K, not to mention the rumored price points this part is coming in at.

I don't think there will be any "major" changes this time around; new architectures just don't fall out of the sky from Nvidia. There should be at least one generation that is a shrink and speedup with some "minor" tweaks. It is doubtful the changes to G92 from G80 are anything as drastic as the G71-to-G80 transition.

 

imported_Shaq

Senior member
Sep 24, 2004
731
0
0
Originally posted by: Cookie Monster
I think all G80 cores being shipped are A3 now (we could get some confirmation from recent G80 buyers taking the HSF off and checking what it says on the IHS of the G80). I'm not too sure about the new GTS being A3s. They could be leftover G80 stock (a combination of A2s/A3s, or even A1s) that just barely passed certification as a GTX (which leads me to believe that the yields aren't too bad for G80).

Ouch... A1s even? If you get an A1 rev., a 600MHz core will be a stretch, I would imagine. I believe all GPUs at launch were A2, AFAIK; at least my GTS 640 is, and I purchased it on launch day, but who knows for sure. You should be able to use GPU-Z to get the revision. The 5 build showed up as A3 on my 320 Fatal1ty. RivaTuner's diagnostic also has it if someone wants to check. That's how I knew my 640 was A2.

I thought I read in one of these rumor stories that it required a new manufacturing process. I'll see if I can find it again. Since this launch is silent, does that mean that outlets can sell it as soon as they get stock, instead of waiting for a magical launch date?
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Updated again; this time the specifications look something along the lines of:

G92 - 8800GT
65nm
Core clock: 600MHz
Memory clock: 1800MHz
112 SPs
256-bit memory interface
GDDR3
512MB ~ $249
256MB ~ $199
TMUs/ROPs: ?

Now it gets pretty interesting.
 

novasatori

Diamond Member
Feb 27, 2003
3,851
1
0
damn can't wait to see this out
hopefully it will really retail ~$200 for 256mb
I need two cards bad
 

s44

Diamond Member
Oct 13, 2006
9,427
16
81
If the GT is 112, what room does that leave for even the revamped GTS?
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
There is a missing link here that we can't seem to get our hands on. The current 8800GTS is supposed to get an upgrade of 16 more scalar shaders, for a total of 112, with all else remaining the same. The 8800GT, if it is said to encroach on a 96-shader 8800GTS, very likely has more than 64 shaders, especially with a cut-down bus (256-bit).
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: Cookie Monster
Updated again; this time the specifications look something along the lines of:

G92 - 8800GT
65nm
Core clock: 600MHz
Memory clock: 1800MHz
112 SPs
256-bit memory interface
GDDR3
512MB ~ $249
256MB ~ $199
TMUs/ROPs: ?

Now it gets pretty interesting.

That just can't be right; if it had that much shader power it would overpower the 8800 GTS ver. 2, let alone the old 8800 GTSs currently in production now.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: keysplayr2003
There is a missing link here that we can't seem to get our hands on. The current 8800GTS is supposed to get an upgrade of 16 more scalar shaders, for a total of 112, with all else remaining the same. The 8800GT, if it is said to encroach on a 96-shader 8800GTS, very likely has more than 64 shaders, especially with a cut-down bus (256-bit).

Yeah, I am thinking more along the lines of 80 shader units; the new rumor of 112 on the 8800 GT with a 600MHz core makes no sense, and seems much too powerful for its price point.
 

imported_Shaq

Senior member
Sep 24, 2004
731
0
0
Originally posted by: coldpower27
Originally posted by: keysplayr2003
There is a missing link here that we can't seem to get our hands on. The current 8800GTS is supposed to get an upgrade of 16 more scalar shaders, for a total of 112, with all else remaining the same. The 8800GT, if it is said to encroach on a 96-shader 8800GTS, very likely has more than 64 shaders, especially with a cut-down bus (256-bit).

Yeah, I am thinking more along the lines of 80 shader units; the new rumor of 112 on the 8800 GT with a 600MHz core makes no sense, and seems much too powerful for its price point.

Yeah... 80x600=48000, 96x500=48000. They would be equal. I know that calculation isn't 100% accurate, but it should be a good rough estimate. However, memory is still significantly slower on a 256-bit bus @ 1800 vs. 320-bit @ 1600. The new GTS is 112x500=56000, or roughly 16.66% faster than the GT.
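The back-of-the-envelope shader math above can be sketched as a quick script; note that the 80SP and 112SP counts are the thread's guesses at rumored parts, not confirmed specs.

```python
# Crude relative shader throughput: stream processors x core clock (MHz).
# As the poster notes, this ignores memory bandwidth, TMUs/ROPs and the
# separate shader-domain clock -- it is only a rough first-order estimate.

def shader_throughput(sps, core_mhz):
    """Shader count times core clock, as used in the thread's estimate."""
    return sps * core_mhz

gt_80sp = shader_throughput(80, 600)    # hypothetical 80SP 8800GT
old_gts = shader_throughput(96, 500)    # current 96SP 8800GTS
new_gts = shader_throughput(112, 500)   # rumored 112SP GTS

print(gt_80sp, old_gts)       # 48000 48000 -- dead even
print(new_gts / gt_80sp - 1)  # ~0.1667, the ~16.66% gap quoted above
```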
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: Shaq
Originally posted by: coldpower27
Originally posted by: keysplayr2003
There is a missing link here that we can't seem to get our hands on. The current 8800GTS is supposed to get an upgrade of 16 more scalar shaders, for a total of 112, with all else remaining the same. The 8800GT, if it is said to encroach on a 96-shader 8800GTS, very likely has more than 64 shaders, especially with a cut-down bus (256-bit).

Yeah, I am thinking more along the lines of 80 shader units; the new rumor of 112 on the 8800 GT with a 600MHz core makes no sense, and seems much too powerful for its price point.

Yeah... 80x600=48000, 96x500=48000. They would be equal. I know that calculation isn't 100% accurate, but it should be a good rough estimate. However, memory is still significantly slower on a 256-bit bus @ 1800 vs. 320-bit @ 1600. The new GTS is 112x500=56000, or roughly 16.66% faster than the GT.

Significantly slower? The overall bandwidth of the 8800 GT is 90% of the 8800 GTS (new or old); I hardly call that significant by any means. You've also got to keep in mind that memory bandwidth is one of the weaker factors determining performance, unless you go into insane-resolution territory of course.

Yeah, I made that calculation in my head after I saw the post about 112 stream processors on the 8800 GTS a while back, hence why I picked the 80 value as a good balance.
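The 90% figure above falls straight out of bus width times effective memory clock. A minimal check, using the rumored/stock clocks quoted earlier in the thread:

```python
# Peak memory bandwidth in GB/s: (bus width in bits / 8) bytes per transfer
# times the DDR-effective clock in MHz, divided by 1000.

def bandwidth_gbs(bus_bits, effective_mhz):
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_bits / 8 * effective_mhz / 1000

gt = bandwidth_gbs(256, 1800)    # rumored 8800GT: 57.6 GB/s
gts = bandwidth_gbs(320, 1600)   # 8800GTS (old and new): 64.0 GB/s

print(gt, gts, gt / gts)         # the GT lands at exactly 90% of the GTS
```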
 

imported_Shaq

Senior member
Sep 24, 2004
731
0
0
Well, both memories at 2000 would be a 25% advantage to the GTS, which should be significant in games. I imagine they are rated the same, but the GT is clocked higher to compete with the GTS and will have less headroom.

If you don't OC, then yeah, it won't be very significant (10%), and you will have a card that is equal (80x600) or 10-15% slower (64x600) than the current GTS, which fills a needed gap between the 8600s and the 8800 GTSs. But I would rather purchase a 320MB GTS than an 8800GT 512 (if 64 shaders) at the same price point, for the 20% faster OC'd core. At 64 shaders the card may not be powerful enough to use all that frame buffer. At 80 shaders, absolutely get the 8800GT 512 over the 8800GTS 320.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: Shaq
Well, both memories at 2000 would be a 25% advantage to the GTS, which should be significant in games. I imagine they are rated the same, but the GT is clocked higher to compete with the GTS and will have less headroom.

If you don't OC, then yeah, it won't be very significant (10%), and you will have a card that is equal (80x600) or 10-15% slower (64x600) than the current GTS, which fills a needed gap between the 8600s and the 8800 GTSs. But I would rather purchase a 320MB GTS than an 8800GT 512 (if 64 shaders) at the same price point, for the 20% faster OC'd core. At 64 shaders the card may not be powerful enough to use all that frame buffer. At 80 shaders, absolutely get the 8800GT 512 over the 8800GTS 320.

Why would you compare both memories at the same clockspeed? That is more an academic exercise than of real-world value; the stock frequencies let the 8800 GT compensate somewhat on memory bandwidth via its higher clockspeed. The GT is rumored to use 1.0ns GDDR3 chips, while the GTS uses 1.2ns, so even with overclocking I expect the GT to still compensate somewhat on overall memory bandwidth.

This is like comparing the 7600 GT vs. the 7800 GS: the latter had plenty more bandwidth while being clocked lower, but needed to be overclocked to show its true potential, which was only realized if you did OC. Stock-wise they were about even.

The Radeon X1950 XT to X1950 XTX was a 33% increase in memory bandwidth and didn't show much, if any, significant advantage in games. Memory bandwidth past a certain level just isn't that important, and like I said, it only matters at insane resolutions with performance-taxing features applied.
 

Ares202

Senior member
Jun 3, 2007
331
0
71
When is the 2950 due? Because in initial benchmarks it's beating the 8800 Ultra comprehensively.

I'm waiting till November to buy my rig, because the jury is still out on whether Nvidia will release a G92 with unlocked shaders, dual slot and higher voltage; they just might be keeping it under wraps to confuse ATI.
 

lopri

Elite Member
Jul 27, 2002
13,314
690
126
Originally posted by: Ares202
When is the 2950 due? Because in initial benchmarks it's beating the 8800 Ultra comprehensively.
Can we see those initial benchmarks?