8600GTS to have 32 Stream Shaders


Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Yea, current speculation is that the 8600GTS is 1/4 of G80, or more specifically, the 8600GTS is 1/4 of the 8800GTX.
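(For reference, a rough sketch of the arithmetic behind that 1/4 figure, using the 8800GTX's shipping unit counts against the rumoured 8600GTS numbers — 32 stream shaders per the thread title, 8 ROPs as mentioned further down; none of the 8600GTS figures are confirmed:)

```python
# Back-of-the-envelope for the "1/4 of 8800GTX" claim.
# 8800GTX counts are shipping specs; 8600GTS counts are the rumours
# discussed in this thread, not confirmed.
g80_sp, g80_rop = 128, 24  # 8800GTX stream processors, ROPs
g84_sp, g84_rop = 32, 8    # rumoured 8600GTS stream processors, ROPs

print(g84_sp / g80_sp)     # 0.25  -> the "1/4" on shader count
print(g84_rop / g80_rop)   # ~0.33 on ROPs
```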
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Cookie Monster
Yea, current speculation is that the 8600GTS is 1/4 of G80, or more specifically, the 8600GTS is 1/4 of the 8800GTX.

Of course, when the X1650XT was labelled as junk for being 1/4 of the X1900XTX, no one seemed to disagree. Suddenly a $200 1/4-of-8800GTX card seems acceptable.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: RussianSensation
Originally posted by: coldpower27

ATI did release 256Bit "mainstream" cards in past generations; they were just beaten by the 128Bit contenders nevertheless. X800 GT and X1800 GTO come to mind.

X800GT was "old" generation and was never meant to compete with GeForce 7 series. But I am pretty sure it closely matched 6600GT (Elite Bastards review:
"ATI have brought the Radeon X800GT to market to compete with NVIDIA's GeForce 6600GT, and from our results you can undoubtedly see that it has succeeded, offering a similar level of performance for a similar price.")

I also don't recall 7600GT outperforming X1800GTO in BF2. It also lost in Call of Duty 2 and Oblivion, which were some grade A titles.

I didn't specify which generation the 256Bit cards were competing against, and I assumed you already knew what they were, but if you didn't: the X800 GT was meant for the 6600 GT and the X1800 GTO was meant for the 7600 GT.

X800 GT vs 6600 GT was about even despite the 256Bit bus, but rather expensive to make.

My mistake on the X1800 GTO; it was indeed a bit faster, and anything ATI had an advantage in was extended. The issue becomes: winning, but at what cost?

It's the same with the X1900s vs the 7900 GTX: the former was quicker but much more expensive to fabricate. Winning at all costs isn't viable for mass-market cards in the mainstream segment.

256Bit PCBs will remain at performance-mainstream offerings, intended for price points higher than $199 for the most part, and typically on disabled or only slightly scaled-down high-end cores.
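(For anyone wondering why the wider bus is worth the extra PCB cost in the first place: peak memory bandwidth scales linearly with bus width, so a 128Bit card has to make the difference up with memory clocks. A quick sketch with illustrative numbers — the same memory clock on both buses, not any specific card's spec:)

```python
# Peak memory bandwidth = (bus width in bytes) x (effective data rate).
def bandwidth_gb_s(bus_bits, effective_mt_s):
    return bus_bits / 8 * effective_mt_s / 1000  # MB/s -> GB/s

# Illustrative: 1000 MHz GDDR3 (2000 MT/s effective) on both bus widths.
print(bandwidth_gb_s(128, 2000))  # 32.0 GB/s
print(bandwidth_gb_s(256, 2000))  # 64.0 GB/s -- double, on a pricier PCB
```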
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: RussianSensation
Originally posted by: Cookie Monster
Yea, current speculation is that the 8600GTS is 1/4 of G80, or more specifically, the 8600GTS is 1/4 of the 8800GTX.

Of course, when the X1650XT was labelled as junk for being 1/4 of the X1900XTX, no one seemed to disagree. Suddenly a $200 1/4-of-8800GTX card seems acceptable.

X1650 XT is 1/2 of the X1900 and X1950 XTX line; the X1600 Pro/XT were the ones that were bitched about at 1/4 of the R580. They also represented a very weak jump over the competition, which was the 6600 GT at the time.

ATI could take advantage here as long as their RV630 is more than 1/4 of the R600. :D You also got to keep in mind the clock rates: the 8600 GTS is perhaps 3/10 of the 8800 GTX.
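(Folding clock rates into the unit counts is roughly where that 3/10 comes from. A back-of-the-envelope, assuming the ~1.45 GHz shader clock rumoured for the 8600 GTS — an unconfirmed figure — against the 8800 GTX's 1.35 GHz:)

```python
# Shader throughput scales roughly with SP count x shader clock.
gtx_sp, gtx_mhz = 128, 1350  # 8800 GTX shipping spec
gts_sp, gts_mhz = 32, 1450   # rumoured 8600 GTS figures (assumption)

print((gts_sp * gts_mhz) / (gtx_sp * gtx_mhz))  # ~0.27, i.e. near "3/10"
```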
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: RussianSensation
Originally posted by: Cookie Monster
Yea, current speculation is that the 8600GTS is 1/4 of G80, or more specifically, the 8600GTS is 1/4 of the 8800GTX.

Of course, when the X1650XT was labelled as junk for being 1/4 of the X1900XTX, no one seemed to disagree. Suddenly a $200 1/4-of-8800GTX card seems acceptable.

the x1650xt was half of an x1900xtx. it had 8 rops and 24 pixel shaders, compared to 16 rops and 48 shaders on the x1900xtx.

maybe you're thinking of the x1600xt? now that card was 1/4th of an x1900xtx. it was labelled as junk because it sucked :) (at least compared to the 7600s)
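(put in numbers — x1900xtx and x1650xt counts as above; the x1600xt counts are from memory, so treat them as approximate:)

```python
# the rop/shader ratios behind the "half" vs "quarter" labels
x1900xtx = (16, 48)  # rops, pixel shaders
x1650xt  = (8, 24)   # half on both counts
x1600xt  = (4, 12)   # a quarter on both counts (specs from memory)

for name, (rops, ps) in [("x1650xt", x1650xt), ("x1600xt", x1600xt)]:
    print(name, rops / x1900xtx[0], ps / x1900xtx[1])
```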

We haven't seen benchmarks from ATI's new midrange cards, so the 8600s may get labelled as junk if they get killed by ATI's cards.
 

A554SS1N

Senior member
May 17, 2005
804
0
0
Hmm, 32 stream shaders and 8 ROPs explain the crappy performance. I guess the premium is probably just temporary since it has the DX10 features. At current estimates it is way overpriced, but that'll probably change big time when ATi/AMD release their parts. I have to say, it's the first mid-range NVidia card I haven't been excited about in a long while. The 6600GT and 7600GT were brilliant, but this 8600GTS seems like a half-hearted effort.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
coldpower27 and schneiderguy, thanks for correcting me on the X1650XT/X1600XT specs (the point is that almost everyone was disappointed with ATI's current-gen mid-range products until the X1950Pro came out).

I just think Nvidia has had the luxury of selling their products at high prices due to the lack of competing products from ATI. We'll have to check back in a month to see how this turns out. The Inquirer rumoured that ATI plans to undercut Nvidia by "offering 8800GTX performance at 8800GTS prices." We'll see if that's true and how it affects the mid-range.
 

dreddfunk

Senior member
Jun 30, 2005
358
0
0
Seems funny that it would have only 32 shader units after so many references to it having 64. However, I'm not sure what to make of the so-so performance.

Cookie - you could be right, but I don't really understand how the balance of ROPs to Shaders affects performance well enough to agree or disagree.

Matt2 - I do agree with you that people shouldn't expect too much from a true midrange card. If you want high-end performance, you need to buy higher-end parts. Modified high-end cards that come down to the upper mid-range are a different beast.

Then again, I think the reason I was hoping for more out of the 8600 was *because* of its purported price range. The $200-230 market is pretty heady territory, and it doesn't really look like it will be a good performer at that price. Once it drops into the $150-175 range, I think it becomes a different story. Although I'd say I was still hoping for a bit better performance, something along the lines of an x1900xt instead of a 7900gt.
 

PingSpike

Lifer
Feb 25, 2004
21,756
600
126
Originally posted by: RussianSensation
And to respond to Hans, I don't necessarily think it has to do with the level of complexity. Clearly Nvidia can implement a 384-bit bus. I personally think it's about margins. If Nvidia can sell 128-bit cards and save on production costs, why not? I am just saying the minute ATI releases a 256-bit mid-range card, Nvidia will realize their mistake. Now ATI just has to execute.

Mistake? At the rate ATI is dragging its ass, nvidia has plenty of time to load the cannons with a better midrange card or slash the prices on the 8600gts. ATI blew it and now nvidia is just capitalizing on that. I can't really blame nvidia for their moves here; they mostly make good business sense.

They'll react. But they don't need to yet.

And hey, I miss the days of buying a cheapo part and unlocking its pipes too... but I can't blame nvidia for shutting the door on that one. That probably didn't just cost them revenue from high-end sales... they probably had a lot of jerks buying a 6800nu and then returning it when it didn't unlock fully.