NVIDIA November assault


ConstipatedVigilante

Diamond Member
Feb 22, 2006
7,670
1
0
64 SPs seems a bit low for a high-end part (I mean, half the GTX?) unless the 65nm clocks are quite high indeed. I'd guess something more in the 70s or 80s. Hopefully, at least.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
80 shaders seems more likely, because with 64 shaders I don't see how the card would threaten an 8800 GTS in performance. Plus, with the 320MB GTS selling for under $300 and the negative reviews the 8600 GTS received for its underwhelming performance, I don't think Nvidia wants to drop the ball again on this one.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Shaq
Originally posted by: munky
The new GTS is still the same 90nm G80 core, only this time with 112 shaders enabled. But even so, at 650MHz the new GTS would be a real competitor to a stock GTX, unless the extra memory and bandwidth on the GTX make a significant difference.

That's unfortunate. It wouldn't be worth upgrading for current 640 owners. Why can't Nvidia take the same 8800 GT GPU, make it dual-slot, up the voltage, put 1GB of GDDR4 memory on it, and sell it for $499? It seems it would be no problem for them. It would be faster than a GTX, so they could discontinue the GTX and save money thanks to the smaller process. They would sell extremely well since it's Christmas time and Crysis, UT3, Gears, and several other AAA games are coming out.

Why do they want to drop the ball and give DAAMIT time to catch up? For the last 2-3 years they've been good at staying one step ahead of ATI.

According to this link from FiringSquad, the current 640 is about 10% slower overall when OC'd compared to a stock GTX, so it should be almost the same with the extra shader processors: http://www.firingsquad.com/har..._gtx_gts_overclocking/

Of course for current GTS owners it's not worth it. But for me, seeing how I'm still using an X1900 XT, it would be exactly what I've been waiting for. $350 right now seems too much to pay for a card that debuted at $400 a year ago.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: munky
80 shaders seems more likely, because with 64 shaders I don't see how the card would threaten an 8800 GTS in performance. Plus, with the 320MB GTS selling for under $300 and the negative reviews the 8600 GTS received for its underwhelming performance, I don't think Nvidia wants to drop the ball again on this one.

It's a threat nonetheless when the 8800 GT gets 90% of the performance and costs $150 less.
 

aiya24

Senior member
Aug 24, 2005
540
0
76
Originally posted by: munky
Of course for current GTS owners it's not worth it. But for me, seeing how I'm still using an X1900 XT, it would be exactly what I've been waiting for. $350 right now seems too much to pay for a card that debuted at $400 a year ago.

I feel the same way; $350 is way too much for me to spend on a video card. I did purchase a GTS 640 too, but sold it since I couldn't afford to keep it. This 8800 GT and AMD/ATI's RV670 seem to be good high-midrange cards and will be welcome additions to the <$250 market. I actually have something to look forward to this October/November :D.
 

CP5670

Diamond Member
Jun 24, 2004
5,667
766
126
Originally posted by: lopri
Sad. In the past we would have seen this part as a 9900 GT, and there would be a 9900 GTX/Ultra performing at more or less 150% of the 8800 GTX. How long will the 8800 GTX stay at the top? Two years?

Don't get me wrong. This part is indeed like a hypothetical 9900 GT, and I do like the return of a single-slot card. If this card has 8600-like (or even better) video processing capability for HD content, it's a near-ideal solution for the mid to mid-high market. I just lament the lack (or delay) of ultra-high-end updates.

My thoughts too. :( I was hoping those earlier rumors of G92 being a next-gen high-end card were correct. I have been holding off on getting an 8800 GTX (or GTS, for that matter) for several months now, first because there weren't enough games that made it worthwhile and later due to those rumors, but it looks like I shouldn't have waited so long. I don't want to buy one at this point given the current prices and the fact that it struggles in several DX10 games.
 

alcoholbob

Diamond Member
May 24, 2005
6,389
468
126
Hah, yep, those of us with DX9 cards will need to wait at least another half year to a year before there's something worth dropping big cash on. At this point, jumping on the bandwagon for a card other people have had for a year, and paying the same price they did (or more, since stock is low), is just consumer suicide.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: chizow
Originally posted by: coldpower27
My guess is that the new card has 80 stream processors and 20 TMUs; at 600MHz that works out to 48K shader cycles, equal to the old 8800 GTS. Hence the new 8800 GTS 640 with 112 stream processors, for 57.6K shader cycles, and 28 TMUs. That would put some distance between them. Remember, if 112 (7/8 of the full part) is possible, there's no reason 5/8 is impossible.

I am going to assume the 10.7K 3DMark06 score, without any other available information, was derived using a Core 2 Extreme QX6850. Anyone have a guess at what a reference 8800 GTS 640 gets with that processor?

10K 3DMark06 is about the same as a 640MB GTS with a C2D, so yeah, the 8800 GT is very close in performance.

I also think the number of shaders on the GT would have to be closer to the GTS's 96, based on clock speeds, in order to be competitive, so 80 is definitely a possibility. HOWEVER, I don't think they're simply disabling shader quads (or, more accurately, octets) on the GT like they did with the GTS. The GT is on a 65nm process and geared for the mainstream. Disabling shaders wouldn't make sense, as there's no higher-end part on this process yet, and it would eat much of the cost-saving benefit of moving to a smaller, cooler part. This will definitely be a part to keep an eye on for OC'ers if it has closer to 96 shaders on a smaller process, as much higher clock speeds should be possible with cooler temps and less power draw. I wouldn't worry too much about the 256-bit memory interface either, as I think the 320-bit and 384-bit buses on the GTS/GTX are still overkill at current clock speeds.

Oh sorry, I was just saying that a number like 80 shaders is possible, given the 112 shaders on the new G80 SKU of the 8800 GTS 640. I didn't imply that G92 will have disabled shader "octets"; I think it's going to be something like the 7600 GT and natively have only 5 blocks of 16, for 80 shader units.

All I was saying is that Nvidia isn't restricted to numbers like 32/64/96/128, etc., since the LCD is 16 in this case.
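
For anyone following along, here's a rough back-of-envelope version of that shader-cycle math. The clock values are assumptions just to reproduce the figures quoted above (the ~514MHz is simply what the 57.6K number implies), not confirmed specs:

# Rough illustration of the "shader cycles" figure: stream processors x core clock (MHz).
# Clocks are assumptions for illustration only, not confirmed specs.
configs = {
    "old 8800 GTS (96 SP @ 500MHz)": (96, 500),
    "rumored 8800 GT (80 SP @ 600MHz)": (80, 600),
    "new 8800 GTS 640 (112 SP @ ~514MHz)": (112, 514),
}
for name, (sps, clock_mhz) in configs.items():
    print(f"{name}: {sps * clock_mhz / 1000:.1f}K shader cycles")
# Prints 48.0K for the first two and 57.6K for the new GTS 640.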

 

n7

Elite Member
Jan 4, 2004
21,281
4
81
Nothing better than the 8800 Ultra anytime soon then it seems :frown:

My 2560x1600 needs feeding...c'mon nV/AMD...one of you step up...
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
The new GTS looks like a nice gain for the same money! Competition is good. :beer:
 

pcslookout

Lifer
Mar 18, 2007
11,959
157
106
Originally posted by: n7
Nothing better than the 8800 Ultra anytime soon then it seems :frown:

My 2560x1600 needs feeding...c'mon nV/AMD...one of you step up...

There is always SLI GeForce 8800 GTX, if you must have 2560x1600 in all games like Crysis. I knew people were going to regret getting such a high native resolution on a monitor. SLI can barely drive 2560x1600, but it works well enough. Bare minimum.

 

n7

Elite Member
Jan 4, 2004
21,281
4
81
Originally posted by: pcslookout
Originally posted by: n7
Nothing better than the 8800 Ultra anytime soon then it seems :frown:

My 2560x1600 needs feeding...c'mon nV/AMD...one of you step up...

There is always SLI GeForce 8800 GTX, if you must have 2560x1600 in all games like Crysis. I knew people were going to regret getting such a high native resolution on a monitor. SLI can barely drive 2560x1600, but it works well enough. Bare minimum.

Hah, no thanx.

Since the introduction of SLI/CF, there hasn't been a chance in the world I'd try to run the abomination of issues & drawbacks that is SLI/CF.


 

lopri

Elite Member
Jul 27, 2002
13,314
690
126
Get ready for a rude awakening. NV is getting all worked up to swing the SLI flag in full force once again. They just couldn't do it before because their drivers were just... not ready. :laugh: But there are many forward-looking signs (for SLI) in current and upcoming NV hardware. NV has already experimented with separating I/O from processing (NVIO), and I expect their upcoming high-end platform will have this video I/O chip built in. With the die shrink of G80, the introduction of PCIe 2.0, and the integration of video I/O processing into the platform, they will attempt to rectify the well-known shortcomings of SLI.

If I dare to predict, this attempt will drag many early adopters into seemingly eternal misery for a good year, at least, if we go by their past. I'm personally determined not to be their beta tester ever again, but admittedly there was a period of time when I enjoyed SLI. Thankfully, at that time first-generation SLI was as mature as it could be (7900 GTX SLI in mid-2006), and I enjoyed quite a few games at the quite-high-back-then resolution of 1920x1200 for about 6 months.

So yeah, if you visit NV's website, you will see an inordinate amount of SLI propaganda. A taste of things to come. :D
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
It sure does look like the plan 'til the next architecture hits us sometime next year. The only reason NVIO was left out of the G80 die was a design choice. NVIO itself is pretty small, and although the chip looks big, most of it is filled with just "filler" silicon. G92 sure does incorporate this into the main GPU die, but I'm interested in what other tweaks and changes have been made to the GPU itself (especially the ALU:TEX ratio, plus whatever changes could have taken place in the SPs).

I wouldn't mind a good die shot of it. :)

 

Canterwood

Golden Member
May 25, 2003
1,138
0
0
Hmmm, still nothing that's going to be able to perform well in DX10.

Looks like I'll be missing this round of cards as well, and won't be upgrading until well into 2008.

And I can see retailers price gouging customers who want to make sure they get the 112-stream-processor GTS, a bit like the G0-stepping Q6600 debacle.

The video card market is in a terrible state for consumers atm.
 

pcslookout

Lifer
Mar 18, 2007
11,959
157
106
Originally posted by: Canterwood
Hmmm, still nothing that's going to be able to perform well in DX10.

Looks like I'll be missing this round of cards as well, and won't be upgrading until well into 2008.

And I can see retailers price gouging customers who want to make sure they get the 112-stream-processor GTS, a bit like the G0-stepping Q6600 debacle.

The video card market is in a terrible state for consumers atm.

Look on the bright side! All the folks who bought Geforce 8800 series video cards now can rest easy knowing their purchase was well worth it! Even the ones who bought them when they first came out!
 

Canterwood

Golden Member
May 25, 2003
1,138
0
0
Originally posted by: pcslookout
Look on the bright side! All the folks who bought Geforce 8800 series video cards now can rest easy knowing their purchase was well worth it! Even the ones who bought them when they first came out!

What's that got to do with it?

Even if the 8800 had been replaced with better technology, those who had them wouldn't suddenly have a bad card and feel ripped off.

Dragging the 8800 series on just shows what a lack of innovation there is in the video card market atm.
Like I said already, it's a sorry state of affairs, and I won't part with my money until things get better than this.
 

Canterwood

Golden Member
May 25, 2003
1,138
0
0
Originally posted by: Azn
innovation? These corps just want to milk you.

Yeah, and ATI's lack of competition hasn't helped.

Hopefully, though, that's about to change.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: lopri
Get ready for a rude awakening. NV is getting all worked up to swing the SLI flag in full force once again. They just couldn't do it before because their drivers were just... not ready. :laugh: But there are many forward-looking signs (for SLI) in current and upcoming NV hardware. NV has already experimented with separating I/O from processing (NVIO), and I expect their upcoming high-end platform will have this video I/O chip built in. With the die shrink of G80, the introduction of PCIe 2.0, and the integration of video I/O processing into the platform, they will attempt to rectify the well-known shortcomings of SLI.

If I dare to predict, this attempt will drag many early adopters into seemingly eternal misery for a good year, at least, if we go by their past. I'm personally determined not to be their beta tester ever again, but admittedly there was a period of time when I enjoyed SLI. Thankfully, at that time first-generation SLI was as mature as it could be (7900 GTX SLI in mid-2006), and I enjoyed quite a few games at the quite-high-back-then resolution of 1920x1200 for about 6 months.

So yeah, if you visit NV's website, you will see an inordinate amount of SLI propaganda. A taste of things to come. :D

Hmmm, weren't there grumblings a while ago about dual-core SLI and higher on a single package/PCB, with early reports about SLI 2.0? If I had to guess, this is the direction NV may be going with SLI, and it makes sense given the success of C2D/C2Q. I know there are some issues with integrating multiple GPUs compared to multiple CPUs, but making the SLI bridges in hardware on the same package, using the same memory interfaces, would be a good start toward more seamless SLI integration.

This would also make sense if the 8800 GT is the first GPU on the smaller 65nm process but weighs in somewhere around 80 shaders. That would give NV a capable core on its own, while scaling to 2, 4, etc. cores would give NV the room to create their high-end monsters. It would also be much more cost-efficient to simply use more cores and a more complex PCB than to continue their recent trend of neutering high-end parts to fill lower performance segments.

Originally posted by: coldpower27
Oh sorry, I was just saying that a number like 80 shaders is possible, given the 112 shaders on the new G80 SKU of the 8800 GTS 640. I didn't imply that G92 will have disabled shader "octets"; I think it's going to be something like the 7600 GT and natively have only 5 blocks of 16, for 80 shader units.

All I was saying is that Nvidia isn't restricted to numbers like 32/64/96/128, etc., since the LCD is 16 in this case.
Yeah, I agree; I just wanted to clarify for others, but the 16 LCD was a good point worth emphasizing. I also wanted to emphasize that whatever number of shaders G92 comes with will most likely be native, in the absence of a high-end part on the same process.
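
To spell the LCD point out with a toy example (purely illustrative, this just enumerates multiples of 16 and isn't a claim about which configurations NV will actually ship):

# G8x-style SP counts come in clusters of 16, so any multiple of 16 is a
# possible total, not just 32/64/96/128. Illustrative only.
CLUSTER_SIZE = 16
possible_counts = [CLUSTER_SIZE * n for n in range(1, 9)]  # 1 to 8 clusters
print(possible_counts)  # [16, 32, 48, 64, 80, 96, 112, 128] -- 80 and 112 both fit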
 

imported_Shaq

Senior member
Sep 24, 2004
731
0
0
I'm surprised nobody has mentioned this yet, but what about the texture eviction bug on the 256MB versions of the G92? Supposedly it was fixed, but I still see threads about a few games that still have it. Since the card has less memory than the 320MB version, I figured it would be worse. Or was there a problem with the hardware that they've fixed along with the die shrink? Because I heard the 320s will be discontinued once the G92s drop.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: Canterwood
Originally posted by: Azn
innovation? These corps just want to milk you.

Yeah, and ATI's lack of competition hasn't helped.

Hopefully, though, that's about to change.

We all knew ATI wasn't going to lower prices on their newly released 2900 XT any time soon. Nvidia had their card out 6 months prior. Nvidia could have lowered the price on their high-end part some if they wanted. Yup, Nvidia milked PC gamers for a good 6 months for lack of competition.

Things are about to change, though, in the mid to high-end range.
 

alcoholbob

Diamond Member
May 24, 2005
6,389
468
126
Hah, this thread should be renamed "NVIDIA November assault of the 20fps Crysis beasts".
 

imported_Shaq

Senior member
Sep 24, 2004
731
0
0
Originally posted by: Azn
Originally posted by: Canterwood
Originally posted by: Azn
innovation? These corps just want to milk you.

Yeah, and ATI's lack of competition hasn't helped.

Hopefully, though, that's about to change.

We all knew ATI wasn't going to lower prices on their newly released 2900 XT any time soon. Nvidia had their card out 6 months prior. Nvidia could have lowered the price on their high-end part some if they wanted. Yup, Nvidia milked PC gamers for a good 6 months for lack of competition.

Things are about to change, though, in the mid to high-end range.


At release the 8800 GTS was $500 and the GTX was $650, so they have come down pretty significantly.

 

manowar821

Diamond Member
Mar 1, 2007
6,063
0
0
There are threads like this everywhere now; people are upset at the lack of a new high-end part before Christmas.

I hope you're following these threads, Nvidia. Give the people what they want! :)