Originally posted by: munky
That goes for many TWIMTBP games: FEAR, TR:Legend, Farcry, BF2, ... did I forget any?
Originally posted by: orangat
Originally posted by: munky
That goes for many TWIMTBP games: FEAR, TR:Legend, Farcry, BF2, ... did I forget any?
The differences are slight and not to the extent of Oblivion.
For example, in Oblivion the x850xt is faster than the 7800gt in the outdoor-bloom benchmark, which is remarkable since the x850 is a generation behind and has 4 fewer pipes.
Originally posted by: munky
Originally posted by: orangat
Originally posted by: munky
That goes for many TWIMTBP games: FEAR, TR:Legend, Farcry, BF2, ... did I forget any?
The differences are slight and not to the extent of Oblivion.
For example, in Oblivion the x850xt is faster than the 7800gt in the outdoor-bloom benchmark, which is remarkable since the x850 is a generation behind and has 4 fewer pipes.
Yeah, I know the differences are usually not so big, but it still makes me wonder what exactly the TWIMTBP money does.
Originally posted by: munky
That goes for many TWIMTBP games: FEAR, TR:Legend, Farcry, BF2, ... did I forget any?
Originally posted by: Cookie Monster
Oblivion favors ATi cards outdoors but NV cards indoors. Overall, though, ATi cards perform well in Oblivion thanks to their 48 pixel shaders.
Originally posted by: coldpower27
According to the Oblivion benchmarks here at Anandtech, the 7800 GT is putting up numbers about even with the X800 XL... which is pretty absurd, IMO. To me it's very obvious which company this game favors.
Originally posted by: munky
By default, SM3 shaders are disabled in the game, although you can supposedly enable them in the ini file. I enabled it on mine, but I didn't notice any changes in IQ or performance.
Originally posted by: Drayvn
Originally posted by: munky
By default, SM3 shaders are disabled in the game, although you can supposedly enable them in the ini file. I enabled it on mine, but I didn't notice any changes in IQ or performance.
Yeah, same here. Some users have actually done it and had graphical problems.
But other than being mentioned in the config files, there are no SM3 shader paths. Supposedly it has something to do with them never getting it working...
Which is pretty weird, as other lesser-known companies have been able to get SM3 working on much older engines *cough cough* Far Cry, Splinter Cell (Unreal 2 Engine!)
Originally posted by: Barkotron
Originally posted by: Drayvn
Originally posted by: munky
By default, SM3 shaders are disabled in the game, although you can supposedly enable them in the ini file. I enabled it on mine, but I didn't notice any changes in IQ or performance.
Yeah, same here. Some users have actually done it and had graphical problems.
But other than being mentioned in the config files, there are no SM3 shader paths. Supposedly it has something to do with them never getting it working...
Which is pretty weird, as other lesser-known companies have been able to get SM3 working on much older engines *cough cough* Far Cry, Splinter Cell (Unreal 2 Engine!)
The SM3 thing is weird. What are they using for HDR? If it's not SM3, why don't they let X800-series ATI cards do HDR? Does anyone know?
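(For anyone who wants to try the ini tweak mentioned above, the setting usually cited is bAllow30Shaders in the [Display] section of Oblivion.ini under My Documents\My Games\Oblivion. This is a sketch from memory, not a verified fix; the key name and its effect may vary by game version, so back up the file before editing.)

; Oblivion.ini -- assumed location and key name, unverified
[Display]
bAllow30Shaders=1    ; 0 = default (SM3 shader path off), 1 = supposedly enables the SM3 path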
Originally posted by: gobucks
As far as Oblivion goes, it seems to me that it's ATi's memory architecture that is really helping it out. That 512-bit ring bus apparently helps maximize bandwidth, which in turn minimizes the performance impact of AA/AF.