Originally posted by: jiffylube1024
Azn, you prove time and time again that you're talking out of your arse, and BFG's technical "mumble jumble" (I assume you mean mumbo jumbo?) is due to the fact that he knows what he's talking about. I'll just take one example here of how spectacularly wrong you are:
Originally posted by: Azn
Oblivion was the first game did geforce 7 started to show it's weakness over x1900 series. Geforce 7 had particularly weaker shaders. About 1/2 the performance of it's rival.
You do know that each relatively companies tweak their cards like adding 512bit memory ring bus not to mention x1900xtx had faster clocks and faster memory over 1800xt.
Take a look at the X1900XT/XTX review on this very site to see how shader power made a difference even back in January of '06, nearly two years ago. First, check out the specs of the X1900 series versus the X1800 series:
X1900 specs vs X1800 specs
The X1900XT has the exact same clockspeed as the X1800XT, 50 MHz slower memory (1.45 GHz vs 1.55 GHz), the exact same 512-bit ring bus, the same number of texture units, and so on. The big difference is 48 pixel shader processors on the X1900 vs 16 on the X1800.
If these games weren't shader limited, there should be no difference between the cards (if anything, the X1800 series should win thanks to its 50 MHz faster memory and the extra bandwidth that brings), yet the X1900XT beats the X1800XT in EVERY SINGLE BENCHMARK.
The X1900XTX was 20-30% faster than the X1800XT, as seen here. And this was before driver updates further improved the X1900 series.
The gap has only widened since then: current engines like the Crysis engine and Unreal Engine 3 use WAY more pixel shading power than games from early 2006 did.
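To put rough numbers on the argument: with the core clock held equal, tripling the shader unit count triples the theoretical peak shader rate, so any shader-bound workload should favor the X1900 heavily. A minimal sketch of that arithmetic (the 625 MHz core clock is the commonly cited X1800XT/X1900XT figure and is my assumption, not taken from the review text):

```python
# Back-of-the-envelope shader throughput comparison, using the specs
# quoted above: identical core clock, 48 vs 16 pixel shader units.
# 625 MHz is an assumed value for both cards.

CORE_CLOCK_MHZ = 625

def shader_throughput(units, clock_mhz):
    """Relative peak pixel-shader op rate: units * clock (arbitrary units)."""
    return units * clock_mhz

x1800 = shader_throughput(16, CORE_CLOCK_MHZ)
x1900 = shader_throughput(48, CORE_CLOCK_MHZ)

print(f"X1900 / X1800 theoretical shader ratio: {x1900 / x1800:.1f}x")  # 3.0x
```

Of course real games aren't 100% shader bound, which is why the measured gap was 20-30% rather than 3x, but the direction of the result is exactly what the spec difference predicts.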
-------
Now take a look at the 8800GT review and study the specs of the cards.
The 8800GTS has more memory bandwidth than the 8800GT, while the 8800GT has a faster core and shader clock, more texture addressing/filtering units, and more stream processors. The 8800GT is faster in everything.
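The same kind of peak-rate arithmetic shows why: the GTS's bandwidth edge is small, while the GT's shader advantage is large. A quick sketch using the commonly cited launch specs (these exact numbers are my assumption here; verify them against the linked review):

```python
# Rough peak figures for the 8800GT vs 8800GTS (640MB), using commonly
# cited launch specs -- assumed values, not quoted from the review.

def mem_bandwidth_gbs(bus_bits, eff_clock_mhz):
    """Peak memory bandwidth in GB/s: bus width (bytes) * effective data rate."""
    return bus_bits / 8 * eff_clock_mhz * 1e6 / 1e9

def shader_rate(sp_count, shader_clock_mhz):
    """Relative shader throughput: stream processors * shader clock."""
    return sp_count * shader_clock_mhz

gt_bw  = mem_bandwidth_gbs(256, 1800)   # 8800GT: 57.6 GB/s
gts_bw = mem_bandwidth_gbs(320, 1600)   # 8800GTS 640MB: 64.0 GB/s

gt_sh  = shader_rate(112, 1500)         # 112 SPs @ 1.5 GHz
gts_sh = shader_rate(96, 1200)          # 96 SPs @ 1.2 GHz

print(f"GTS has {gts_bw / gt_bw:.2f}x the GT's bandwidth")   # ~1.11x
print(f"GT has {gt_sh / gts_sh:.2f}x the GTS's shader rate") # ~1.46x
```

An ~11% bandwidth deficit is easily outweighed by an ~46% shader throughput advantage in modern shader-heavy games, which is why the GT wins across the board.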