Originally posted by: Azn
Originally posted by: golem
Pixel fillrate and bandwidth. But mostly bandwidth if you look at the 8800GTX vs the 8800GTS 512MB. The G92 chip can push more raw frames, but once bandwidth becomes the limit that's where the 8800GTS 512 chokes and the 8800GTX pulls ahead. The 9600GT looks good compared to an 8800GT because in most situations the 8800GT is bandwidth-limited anyway, especially when you are running with AA. Its raw shader performance is only comparable to a 3850 right now, but it might look worse once games use more complex shaders, for instance "Fallout" which is coming this Fall. I'm just explaining to sniperdaws why the 3870's AA scores are low, not saying the card sucks. Right now, doing AA the conventional way, in the ROPs, is faster than doing it through the shaders. You can't really dog the 3870 for that, because when Nvidia releases a DX10.1 card of its own it might do the same or worse.
So the 9600GT is better for most current games since you can add AA, while the 3870 might be better for future games since you might have to turn AA off anyway and it has more shaders? I know that's really simplified, but is that what you mean?
might? Like Smith says in the matrix.... "It's inevitable"
😛
Up until the Geforce 8 was released, the shaders were tied to the ROPs, and high-end cards didn't have noticeably faster shaders than the midrange products. Back then the lineup was separated by core clocks, TMUs, and memory clocks. Now Nvidia uses the SP count to separate budget cards from high-end cards by price range. Just look at the 8600GT for instance. It's an 8 ROP card on a 128-bit memory bus, and yet it's still as good as the 7900GT, a 16 ROP, 256-bit bus card, in modern games. It doesn't have the fillrate or the bandwidth, but it has stronger shaders, and that makes a dramatic difference in any of the modern games today.
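Since rough numbers make this easier to see, here's a quick back-of-envelope sketch (Python, but it's really just arithmetic) of pixel fillrate, memory bandwidth, and shader unit counts for the cards mentioned in this thread. The specs are the usual reference clocks quoted from memory, so treat the figures as approximate rather than gospel.

```python
# Back-of-envelope comparison of the cards discussed above.
# Reference specs from memory -- approximate, stock clocks only.
# pixel fillrate = ROPs * core clock; bandwidth = (bus width / 8) * effective memory clock

cards = {
    # name:         (ROPs, core MHz, bus bits, effective mem MHz, shader units)
    "8800 GTX":     (24, 575, 384, 1800, 128),
    "8800 GTS 512": (16, 650, 256, 1940, 128),
    "8800 GT":      (16, 600, 256, 1800, 112),
    "9600 GT":      (16, 650, 256, 1800, 64),
    "HD 3870":      (16, 775, 256, 2250, 320),  # 320 VLIW ALUs, not directly comparable to Nvidia SPs
    "7900 GT":      (16, 450, 256, 1320, 0),    # pre-unified-shader part, no SPs to count
    "8600 GT":      (8, 540, 128, 1400, 32),
}

for name, (rops, core, bus, mem, sps) in cards.items():
    fillrate = rops * core / 1000        # Gpixels/s
    bandwidth = (bus / 8) * mem / 1000   # GB/s
    print(f"{name:13}  {fillrate:5.1f} Gpix/s  {bandwidth:5.1f} GB/s  {sps:3} shader units")
```

Reading it off: the 8800GTS 512 gives up nearly 30% of the GTX's bandwidth, which is why it chokes once AA piles on; the 9600GT matches the 8800GT on fillrate and bandwidth while carrying barely more than half the SPs; and the 8600GT trails the 7900GT on both raw metrics yet wins on shader power in modern games, which is exactly the point above.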