Originally posted by: MarcVenice
I already saw that review. Once you go up to 2560x1600 you almost start losing FPS. I don't think UT3 is such a graphically intensive game, the review doesn't state what settings were used, and no AA was forced as far as I can tell. Of course using idle shaders isn't useless, but it does become useless when there are no idle shaders left to use. And once again, how many games can use this? This is one single level of UT3. Like I said, this might prove to be nice on a GTX2*0, since it has roughly twice the shader power of a 9800GTX, but those shaders won't be sitting idle for long either when new games come out.
The only scenario where I could see this being useful is when my 8800GTS 320MB becomes useless graphics-wise and I can stick it in my PCIe x1 or x4 slot, have it run next to my HD4850 or whatever watered-down version of the GTX260/280 comes out, and have it act as a dedicated PPU. Then I'll give credit where it's due: my video card really would become one hell of a bang-for-buck card.
Well, I'm pretty sure not everyone plays at 25x16. I'd bet the most popular res is 16x10, with 19x12 less common still. 25x16 would be great for the extreme high end, but at that point you'll most likely have an SLI setup anyway.
Marc, the shaders don't have to be idle ones, do they? PhysX will use whatever it's told to use, hence the performance hit compared to not using PhysX at all. Again, it's like enabling AA or not enabling AA: you'll get better speed without it, but it won't be as pretty, or in the case of PhysX, it won't be as "cool".
Do you remember a short while back when the 9600GT came out and its performance surprised everyone because it often came so close to an 8800GT while only having 64 shaders?
We did some testing in CoD4, disabling shaders on my 8800GTS 640. I went from 96 shaders to 64 with no performance hit whatsoever at 1680x1050. It was only when I disabled further, down to 48, that I started noticing a performance hit. 32 shaders was almost abysmal. No, scratch that, it was utterly abysmal. This is only one game of course, but it does show that shaders can sit idle depending on the game. Now, I wouldn't expect any shaders to be resting in a game like Crysis, for example. Just food for thought for ya.
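If it helps to picture why going from 96 shaders to 64 cost nothing, here's a toy Python sketch of the bottleneck idea. Every number in it is invented just to mirror what I saw in CoD4 (it's not benchmark code): the frame rate is capped by whichever pipeline stage is slowest, so shader count only matters once the shader stage becomes the bottleneck.

# Toy bottleneck model -- all constants are made up for illustration.
def estimated_fps(shaders, shader_cost=640.0, other_stage_cap=96.0):
    shader_fps = shaders / shader_cost * 1000  # hypothetical shader-bound fps
    return min(shader_fps, other_stage_cap)    # slowest stage wins

for n in (96, 64, 48, 32):
    print(n, "shaders ->", round(estimated_fps(n)), "fps")
# prints: 96 -> 96, 64 -> 96 (no hit), 48 -> 75, 32 -> 50

The min() is exactly what the test showed: cut shaders and nothing happens, until suddenly they're the slow stage and everything happens at once.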
Anyway, it's like any IQ setting, draw distance, or percentage of grass being drawn: the more you use, the lower your performance. But damned if it won't look pretty. And the 9800GTX shows a 66% performance gain at 1680x1050 in a game that would otherwise use the CPU for PhysX processing. I'm willing to bet most enthusiast gamers have anywhere from a 20" to a 24" widescreen (14x9 up to 19x12). Then you have the extreme folks with the 30" Dells or 37" Westies, but then again, they have the GPU power to push them.
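Just to put that 66% in concrete fps terms (the baseline below is a made-up example; only the 66% figure comes from the review):

cpu_physx_fps = 30.0                  # hypothetical CPU-PhysX baseline
gpu_physx_fps = cpu_physx_fps * 1.66  # the 66% uplift from GPU PhysX
print(f"{cpu_physx_fps:.0f} fps -> {gpu_physx_fps:.0f} fps")  # 30 fps -> 50 fps

Going from barely playable to smooth is nothing to sneeze at.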
You mentioned using your 8800GTS 320 as a dedicated PhysX card. That would be a great idea: you wouldn't have to sell it or stick it on a shelf somewhere, and you'd be putting your money to good use, essentially extending the usefulness of the hardware.
I'm actually looking forward to trying this type of setup. I have an 8800GTS 640 here I could possibly use, though we'll have to wait for new drivers to support the 8 series. I could also run one of the 9800GTXs here alongside the GTX280 (after testing the GTX280 by itself, that is). Should be some interesting findings.