Cookie Monster
Yea, current speculation is that the 8600GTS is 1/4 of G80 or, more specifically, that the 8600GTS is 1/4 of the 8800GTX.
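For what it's worth, here's a minimal sketch of the arithmetic behind that "1/4" claim, using the specs the two cards ultimately shipped with. The thread's speculation predates release, so treat the figures as illustrative:

```python
# Rough "fraction of an 8800GTX" arithmetic for the 8600GTS (G84 vs G80).
# Shipping specs: 8800GTX has 128 stream processors, a 384-bit bus, and
# 86.4 GB/s of bandwidth; 8600GTS has 32 SPs, a 128-bit bus, and 32 GB/s.

GTX_8800 = {"stream_processors": 128, "bus_bits": 384, "mem_gb_s": 86.4}
GTS_8600 = {"stream_processors": 32,  "bus_bits": 128, "mem_gb_s": 32.0}

print("SP ratio:       ", GTS_8600["stream_processors"] / GTX_8800["stream_processors"])  # 0.25 -> the "1/4"
print("bus-width ratio:", GTS_8600["bus_bits"] / GTX_8800["bus_bits"])                    # ~0.33 -> closer to 1/3
print("bandwidth ratio:", GTS_8600["mem_gb_s"] / GTX_8800["mem_gb_s"])                    # ~0.37
```

So the "1/4" holds for shader count, while the memory subsystem is nearer a third of the flagship's.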
Originally posted by: RussianSensation
Originally posted by: coldpower27
ATI did release 256-bit "mainstream" cards in past generations; they were nevertheless beaten by the 128-bit contenders. The X800 GT and X1800 GTO come to mind.
The X800GT was "old" generation and was never meant to compete with the GeForce 7 series. But I am pretty sure it closely matched the 6600GT (Elite Bastards review:
"ATI have brought the Radeon X800GT to market to compete with NVIDIA's GeForce 6600GT, and from our results you can undoubtedly see that it has succeeded, offering a similar level of performance for a similar price.")
I also don't recall the 7600GT outperforming the X1800GTO in BF2. It also lost in Call of Duty 2 and Oblivion, which were grade-A titles.
Originally posted by: RussianSensation
Originally posted by: Cookie Monster
Yea, current speculation is that the 8600GTS is 1/4 of G80 or, more specifically, that the 8600GTS is 1/4 of the 8800GTX.
Of course, when the X1650XT was labelled junk for being 1/4 of an X1900XTX, no one seemed to disagree. Suddenly a $200 card that's 1/4 of an 8800GTX seems acceptable.
Originally posted by: RussianSensation
And to respond to Hans, I don't necessarily think it has to do with the level of complexity. Clearly Nvidia can implement a 384-bit bus. I personally think it's about margins. If Nvidia can sell 128-bit cards and save on production costs, why not? I am just saying that the minute ATI releases a 256-bit mid-range card, Nvidia will realize their mistake. Now ATI just has to execute.
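To make the bus-width argument concrete, here is a minimal sketch of the standard peak-bandwidth formula (bus width in bytes times effective memory clock). The 2 GHz GDDR3 clock is an illustrative assumption, not the spec of any particular announced card:

```python
def mem_bandwidth_gb_s(bus_bits: int, effective_clock_ghz: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times transfer rate."""
    return (bus_bits / 8) * effective_clock_ghz

# Same assumed 2 GHz effective memory clock, two bus widths:
print(mem_bandwidth_gb_s(128, 2.0))  # 32.0 GB/s -- typical 128-bit mid-range
print(mem_bandwidth_gb_s(256, 2.0))  # 64.0 GB/s -- doubling the bus doubles peak bandwidth
```

At equal memory clocks, a 256-bit mid-range part would have twice the peak bandwidth of a 128-bit one, which is the edge being argued for here; the trade-off is a more expensive PCB and GPU package, which is the margins point above.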