Originally posted by: jiffylube1024
Originally posted by: Dethfrumbelo
3DMark may not tell us too much at this point. I'm much more interested in real games running high AA/AF and HDR. But if the performance in games is only on par with/slightly faster than an X1950XT/7950GX2, then I'll just go with the X1900XT, which should be even cheaper in 3 weeks. 700 million+ transistors (nearly double the X1950XT) and only a 20% gain? Hmmm...
Exactly, nobody is putting these results in context. According to these rumours the 8800GTX is 30% faster than the X1950XTX in 3DMark06 or 05. That would presumably be at stock settings: 1280x1024 with, I believe, 2x AA / 8x AF (someone correct me on these settings if I'm wrong). In the other G80 thread, by the way, the advantage is more like 60%.
That's basically a test of pure shader power at low/medium res.
----------
That benchmark does not take into account the G80's other significant advantage: memory bandwidth.
Once the resolution starts going up (1600x1200 and above) and the AA gets cranked up, I'd expect the G80 to pull away. I'd predict a pretty sizeable (~50%) gap between the 8800GTX and the X1950XTX at 1920x1200 with 4x AA.
Not to mention that, aside from the X1950XTX (which currently has more memory bandwidth than any other card on the market), I'd expect the G80 to really kick butt at high res against every other card: the X1900XTX and the 7900GTX.
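For a rough sense of scale, here's a quick back-of-envelope (peak bandwidth is just effective memory clock x bus width / 8). The 8800GTX numbers below are the rumored specs (384-bit bus, 1800 MHz effective GDDR3), so treat them as assumptions, not confirmed figures; the other cards are shipping specs.

# Rough back-of-envelope memory bandwidth comparison.
# The 8800GTX figures are rumored specs, not confirmed; the rest are the
# published numbers for shipping cards.
def bandwidth_gb_s(effective_mhz, bus_width_bits):
    # Peak bandwidth in GB/s: effective clock (MHz) * bus width (bits) / 8 bits-per-byte / 1000
    return effective_mhz * bus_width_bits / 8 / 1000

cards = {
    "7900GTX  (1600 MHz eff., 256-bit)": bandwidth_gb_s(1600, 256),          # ~51.2 GB/s
    "X1900XTX (1550 MHz eff., 256-bit)": bandwidth_gb_s(1550, 256),          # ~49.6 GB/s
    "X1950XTX (2000 MHz eff., 256-bit)": bandwidth_gb_s(2000, 256),          # ~64.0 GB/s
    "8800GTX  (1800 MHz eff., 384-bit, rumored)": bandwidth_gb_s(1800, 384), # ~86.4 GB/s
}
for name, bw in cards.items():
    print(f"{name}: {bw:.1f} GB/s")

If those rumored numbers hold, the G80 would have roughly 35% more raw bandwidth than the X1950XTX and nearly 70% more than the 7900GTX, which is exactly the kind of headroom that shows up at 1920x1200 with AA rather than at 1280x1024.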
1024x768/1280x1024 isn't exactly going to allow this architecture to stretch its legs and show what it's capable of...
On top of that, new and faster drivers will invariably come out, seeing as the G80 is a brand-new architecture. Nvidia milked good rewards out of the 6800 series with driver revisions (less so with the 7800s, since they were pretty much just beefed-up 6800 cards), while ATI tweaked the X1xxx series constantly, mainly thanks to the new memory ring bus.