Originally posted by: BFG10K
Oblivion was the first game where the GeForce 7 started to show its weakness against the X1900 series.
Again more dodging and total goal-post shifting on your part. Nobody was discussing the GF7 series because it's a completely different architecture.
The GeForce 7 had notably weaker shaders, about 1/2 the performance of its rival.
Which is exactly why the filtering issue is so important, given the GF7's shaders were tied to its texturing units, unlike the decoupled design on the Radeon.
But this was never under discussion, you've simply chopped and changed the issue just to obfuscate the fact that you're wrong.
You do know that companies tweak their respective cards, like adding a 512-bit memory ring bus, not to mention the X1900 XTX had faster core and memory clocks than the X1800 XT.
What the hell are you talking about? There was no change to the memory ring bus from the X1800 to the X1900, and its clocks were pretty much identical, which is exactly why it's a perfect test-bed to demonstrate shader differences. There are plenty of games in that list that pre-date Oblivion.
That's why it doesn't beat it in any of the other resolutions.
"Other resolutions"? You mean like 1280x1024, which is CPU-limited and hence influenced more by the CPU/platform than by shaders?
When it was SP-limited, did it ever lose to the 8800 GTS with its faster SP clocks? Only in extreme situations, by half an fps. And that's just one benchmark at one resolution.
You were the one harping on about using that review despite my protestations. To quote you:
what do you think that was? Fake review?
So don't start crying about it when we start using it because you forgot to actually check whether it backs your claims.
I assure you the 8800 GTX will beat the G92 8800 GTS in a slew of games, and you'll be in your basement trying to find articles to make a comeback like usual.
Fortunately for us your assurances mean nothing.
Yes, I would like to see a test where a GT is overclocked only on memory clock speed; it would show a huge improvement over raising the core, which would have some effect, but not as much as raising memory clock speeds.
Again, you need to provide evidence of your claims, and thus far you have nothing except a bunch of meaningless and theoretical 3DMark results.
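For what it's worth, the raw bandwidth delta from a memory overclock is trivial to estimate on paper, which is exactly why a theoretical number alone proves nothing about in-game gains. A quick sketch (assuming stock 8800 GT figures: 256-bit bus, 900 MHz GDDR3, i.e. 1800 MHz effective; the 1000 MHz overclock is a hypothetical example):

```python
# Back-of-envelope GDDR bandwidth: bus width (bits) x effective data rate.
# Assumed stock 8800 GT: 256-bit bus, 900 MHz GDDR3 (1800 MHz effective).

def bandwidth_gbs(bus_bits: int, effective_mhz: float) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

stock = bandwidth_gbs(256, 1800)        # 57.6 GB/s at stock
overclocked = bandwidth_gbs(256, 2000)  # hypothetical 1000 MHz memory OC
print(f"stock: {stock:.1f} GB/s, OC: {overclocked:.1f} GB/s "
      f"(+{(overclocked / stock - 1) * 100:.0f}%)")
```

That's an ~11% paper gain; whether any game actually scales with it is precisely what would need benchmarking.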
I'm still waiting for a retraction of your original claim:
you can see the biggest jump when textures are saturated by its bandwidth combined with the current SP. There are still many games that rely on texture prowess over shaders. Actually, it's about 95% of PC games out today.
The two graphs showed us that you pulled that out of your orifice, and when they did you ignored the graphs and the results and started rambling on about Oblivion and the GF7 series.
You need to retract your claims and stop trolling.