G80 Physx ETA

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
I've managed to run Folding@Home on my G80 card and I'm getting 1100 iter/sec vs. 50 or so when I just use my CPU.

That said, I can't get Physx games like UT3 working properly. The lighthouse map is a complete slideshow, along with the other Physx maps.

Does anyone here know when we can expect physx support for G80? Rollo and Keysplayr, are you guys privy to that sort of info, and are you allowed to share it with us?

Thanks. :)

:beer:
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: taltamir
as soon as you install the hacked drivers on it :p.
I've got the hacked graphics drivers on it. The thing is, the NV Physx driver itself has not been hacked yet (and it's apparently 'unhackable'). G92 and G200 cards work with the latest one, but G80 cards don't. :(
 

lopri

Elite Member
Jul 27, 2002
13,312
687
126
I asked this in the other thread, but didn't get any answer so I'll ask again. If anyone knows, please enlighten me.

How does PhysX work, other than in 3DMark? Does it run on some otherwise unused part of the GPU silicon? Does it sacrifice some of the GPU's main purpose (i.e. rendering) to run simultaneously? Or does it require a dedicated (separate) GPU?

 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
PhysX uses some of the shaders to do the physics calculations, and it increases the FPS by a lot. The 3DMark score is inflated because it runs the CPU test on its own (where the CPU + GPU are both working 100% on physics), then runs the GPU test (100% GPU + ~20% CPU on graphics), then plugs the results into an equation that expects the CPU to help the GPU, but not vice versa. It should really run a test that stresses both at once, but it doesn't. Better yet, it should run a REAL GAME (which is what their next version is supposed to do).
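Just to illustrate that scoring quirk with made-up numbers (this is a toy calculation, not Futuremark's actual Vantage formula; every weight and score below is invented): if the total weights in a "CPU" sub-score that the GPU was allowed to accelerate, the combined number balloons even though nothing about real-game rendering changed.

Code:
// Toy illustration only -- invented weights and scores, not the real
// 3DMark Vantage formula. The point: letting the GPU run the "CPU"
// physics test inflates that sub-score, which inflates the total.
#include <cmath>
#include <cstdio>

int main() {
    const double gpu_score = 10000.0;            // graphics test result (hypothetical)
    const double cpu_score_cpu_only = 2000.0;    // physics test on the CPU alone
    const double cpu_score_gpu_helped = 20000.0; // same test with PhysX on the GPU

    // Pretend the total is a weighted geometric mean, 75% GPU / 25% CPU.
    auto total = [](double gpu, double cpu) {
        return std::pow(gpu, 0.75) * std::pow(cpu, 0.25);
    };

    std::printf("total, CPU-only physics test:     %.0f\n", total(gpu_score, cpu_score_cpu_only));
    std::printf("total, GPU-assisted physics test: %.0f\n", total(gpu_score, cpu_score_gpu_helped));
    return 0;
}

With those made-up numbers the total jumps by roughly 75% just from the physics sub-score, which is why a combined benchmark score and real in-game FPS can tell very different stories.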

However, in real games you do see a nice boost to FPS due to the CPU just being wayyyyy too slow for physX effects.


You don't need to hack the PhysX drivers; you install them as-is. Whether they work or not depends on your video card drivers, which can be hacked. I got it working on an 8800GTS 512 (G92), and I hear some people got it working on G80.

You could also just wait a little bit and nvidia will release official support, eventually.
 

MyLeftNut

Senior member
Jul 22, 2007
393
0
0
Have there been any tangible reports of increased performance with these hacked PhysX drivers?
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: MyLeftNut
Have there been any tangible reports of increased performance with these hacked PhysX drivers?
Apparently in the Physx levels of UT3, you go from 5fps to around 25fps on an 8800GT.
 

plion

Senior member
Aug 7, 2005
326
0
71
Interesting, does it apply to any game or only PhysX-supported games? Can it work on a game like Company of Heroes or World of Warcraft?
 

lopri

Elite Member
Jul 27, 2002
13,312
687
126
Originally posted by: taltamir
You don't need to hack the PhysX drivers; you install them as-is. Whether they work or not depends on your video card drivers, which can be hacked. I got it working on an 8800GTS 512 (G92), and I hear some people got it working on G80.

You could also just wait a little bit and nvidia will release official support, eventually.
What I'm wondering is:

1. There is an unused part of a GPU while rendering 3D scenes, and that is what accelerates PhysX.
2. The processing unit is the same for 3D rendering and PhysX and there is a compromise on rendering, but the net result is a plus. (say lose 3 FPS on rendering but gain 5 FPS from PhysX? or something like that)

Which scenario is it? If it's #2, I'd think there'd be a driver nightmare.

I think an ideal scenario is having a dedicated GPU. This of course would not be SLI because the GPUs need not be the same. And you would be able to keep using your old video card for physX while your new card did the rendering.

But I am still unsure what the exact mechanism of GPU PhysX is.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: lopri
Originally posted by: taltamir
You don't need to hack the PhysX drivers; you install them as-is. Whether they work or not depends on your video card drivers, which can be hacked. I got it working on an 8800GTS 512 (G92), and I hear some people got it working on G80.

You could also just wait a little bit and nvidia will release official support, eventually.
What I'm wondering is:

1. There is an unused part of a GPU while rendering 3D scenes, and that is what accelerates PhysX.
2. The processing unit is the same for 3D rendering and PhysX and there is a compromise on rendering, but the net result is a plus. (say lose 3 FPS on rendering but gain 5 FPS from PhysX? or something like that)

Which scenario is it? If it's #2, I'd think there'd be a driver nightmare.

I think an ideal scenario is having a dedicated GPU. This of course would not be SLI because the GPUs need not be the same. And you would be able to keep using your old video card for physX while your new card did the rendering.

But I am still unsure what the exact mechanism of GPU PhysX is.

My guess is that they would give some of the shading processors to PhysX and leave the rest to run the game. For example, on a G80 8800GTS, they could give 64 SPs to the game and 32 to PhysX. They could even tune it for games that don't require a lot of shading processors (like COD4).

Interesting, does it apply to any game or only PhysX-supported games? Can it work on a game like Company of Heroes or World of Warcraft?

Only PhysX supported games (and so far, only UT3 and 3DMark Vantage AFAIK). BTW Vantage isn't a game. :D
:beer:
 

terentenet

Senior member
Nov 8, 2005
387
0
0
Geez! All these posts about Physx and Folding@Home working on G92 and no link to the hacked drivers? Where do I get them from?
Will these work on 9800GX2?
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Hey guys.

Nvidia Physx System Software 8.06.12 is available here.

The driver with Physx support (177.39) includes Nvidia system software v8.06.12. Posted on NV.com last week.

It should run on shaders not being utilized by any given game. So, don't quote me yet, but I would think that the more shaders a card has, the better. However, there may be a dedicated, pre-determined number of shaders at any given time, e.g. 8 or 16 shaders, or just a certain percentage of the shaders present on a given core.

Keep in mind that I have no actual numbers for you yet. Working on that though.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
Looks like something pretty useless to me, at least on anything G80-derived. If the 9800GTX doesn't show any real improvements, no other card will either. We'll just have to wait for someone to review a gtx2*0 with CUDA. That MIGHT improve the value of those cards a little, since they might be too powerful for most games at 1680x1050, but then again, as soon as they aren't powerful enough anymore CUDA will be useless again. And how many games actually support it?
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: MarcVenice
Looks like something pretty useless to me, at least on anything G80-derived. If the 9800GTX doesn't show any real improvements, no other card will either. We'll just have to wait for someone to review a gtx2*0 with CUDA. That MIGHT improve the value of those cards a little, since they might be too powerful for most games at 1680x1050, but then again, as soon as they aren't powerful enough anymore CUDA will be useless again. And how many games actually support it?

UT3 performance gain running Physx on GPU instead of CPU

Performance jumped from 31fps with physics run on the CPU to around 51fps at 1680x1050.
At this early stage, it doesn't sound too useless to me. You will lose a few fps when enabling PhysX in any given game, but that is akin to adding eye candy of any sort, just like increasing AA will give you a performance hit. This was on a 9800GTX.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
I already saw that review; once you go up to 2560x1600 you almost start losing FPS. I don't think UT3 is such a graphically intensive game, the review doesn't state what settings were used, and no AA was forced as far as I can determine. Of course using idle shaders isn't useless, but it does become useless when there are no idle shaders to use. And once again, how many games can use this? This is one single level of UT3. Like I said, this might prove to be nice on a gtx2*0, since it has roughly twice the shader power of a 9800GTX, but those shaders won't be sitting idle for long either when new games come out.

The only scenario where I could see this being useful is when my 8800GTS 320MB becomes useless graphics-wise and I can stick it in my PCI-E 1x or 4x slot, have it run next to my HD4850 or whatever watered-down version of the GTX260/280 comes out, and have it act as a dedicated PPU. Then I'll give credit where it's due: my video card would really become one hell of a bang-for-buck card.
 

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
It seems like CPU-based physics would be a great way for a game to take advantage of a multi-core CPU. Even if the CPU is less efficient than a GPU or a dedicated PPU, many gamers have quad-core systems, and with typical games being single-threaded, three of those cores are sitting idle. If the CPU-based physics could just take advantage of those other cores and run in multiple threads, I suspect it could run about as fast as it does on the GPU, while leaving the entire GPU available for graphics.
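As a rough sketch of that idea (hypothetical code, not taken from any actual engine, and it skips the collision handling and synchronization a real physics engine needs), splitting a per-object update across CPU cores might look something like this:

Code:
// Minimal sketch: spread a per-body integration step across CPU cores.
// Real physics engines also need collision detection and synchronization
// between bodies; this only shows the embarrassingly parallel part.
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

struct Body { float x, y, z, vx, vy, vz; };

static void integrate_range(std::vector<Body>& bodies, std::size_t begin,
                            std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        bodies[i].vy -= 9.81f * dt;        // apply gravity
        bodies[i].x  += bodies[i].vx * dt;
        bodies[i].y  += bodies[i].vy * dt;
        bodies[i].z  += bodies[i].vz * dt;
    }
}

static void physics_step(std::vector<Body>& bodies, float dt) {
    unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    std::size_t chunk = (bodies.size() + cores - 1) / cores;
    std::vector<std::thread> workers;
    for (unsigned t = 0; t < cores; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end = std::min(bodies.size(), begin + chunk);
        if (begin >= end) break;
        workers.emplace_back(integrate_range, std::ref(bodies), begin, end, dt);
    }
    for (auto& w : workers) w.join();      // wait for all cores before rendering
}

int main() {
    std::vector<Body> bodies(10000, Body{0.0f, 100.0f, 0.0f, 1.0f, 0.0f, 0.0f});
    physics_step(bodies, 1.0f / 60.0f);    // one 60 Hz tick
    std::printf("body[0].y after one step: %f\n", bodies[0].y);
    return 0;
}

Whether that actually keeps up with a GPU depends on how parallel the workload really is and how much the threads have to talk to each other, but it shows how the idle cores could at least be put to work.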
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: MarcVenice
I already saw that review; once you go up to 2560x1600 you almost start losing FPS. I don't think UT3 is such a graphically intensive game, the review doesn't state what settings were used, and no AA was forced as far as I can determine. Of course using idle shaders isn't useless, but it does become useless when there are no idle shaders to use. And once again, how many games can use this? This is one single level of UT3. Like I said, this might prove to be nice on a gtx2*0, since it has roughly twice the shader power of a 9800GTX, but those shaders won't be sitting idle for long either when new games come out.

The only scenario where I could see this being useful is when my 8800GTS 320MB becomes useless graphics-wise and I can stick it in my PCI-E 1x or 4x slot, have it run next to my HD4850 or whatever watered-down version of the GTX260/280 comes out, and have it act as a dedicated PPU. Then I'll give credit where it's due: my video card would really become one hell of a bang-for-buck card.

Well, I'm pretty sure not everyone plays at 25x16. I'd bet the most popular res is 16x10, with 19x12 less common. 25x16 would be great for the extreme high end, but at that point you will most likely have an SLI setup.
Marc, the shaders don't have to be idle ones, do they? PhysX will use what it is told to use, hence the performance hit as opposed to not using PhysX at all. Again, it's like enabling AA or not enabling AA: you'll get better speed without it, but it won't be as pretty, or in the case of PhysX, it won't be as "cool".

Do you remember a short while back when the 9600GT came out and its performance surprised everyone because it often came so close to an 8800GT, but only had 64 shaders?
We did some testing using CoD4, disabling shaders on my 8800GTS 640. I went from 96 shaders to 64 with no performance hit whatsoever at 1680x1050. It was only when I further disabled shaders down to 48 that I started noticing a performance hit; 32 shaders was almost abysmal. No, scratch that, it was utterly abysmal. This is only one game of course, but it does show that shaders can sometimes be idle depending on the game. I wouldn't expect any shaders to be resting in a game like Crysis, for example. Just food for thought for ya.
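That matches a simple "slowest stage wins" picture. Here is a toy model of it (every number below is invented, just to show the shape of the argument, not measured data): if something other than shading, say ROPs, bandwidth, or the CPU, caps the frame rate, trimming shaders costs nothing until shading itself becomes the limit.

Code:
// Toy bottleneck model with invented numbers, not benchmark data.
// Frame rate is capped by the slowest stage: if shading isn't the
// bottleneck, removing shader capacity doesn't show up in the fps.
#include <algorithm>
#include <cstdio>

int main() {
    const double other_limit_fps = 70.0;  // hypothetical ROP/bandwidth/CPU cap
    const double fps_per_shader  = 1.2;   // hypothetical shading-limited fps per active SP

    const int shader_counts[] = {96, 64, 48, 32};
    for (int shaders : shader_counts) {
        double shading_limit_fps = fps_per_shader * shaders;
        double fps = std::min(other_limit_fps, shading_limit_fps);
        std::printf("%2d shaders -> ~%.0f fps\n", shaders, fps);
    }
    return 0;
}

With those made-up numbers, 96 and 64 shaders land on the same cap, 48 starts to dip, and 32 falls off a cliff, roughly the pattern above. It also hints at why a card could have shader headroom for PhysX in one game and none in another.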

Anyway, like any IQ setting, draw distance, or percentage of grass being drawn, the more you use, the lower your performance will be. But damned if it won't look pretty. And the 9800GTX shows about a 65% performance gain at 1680x1050 in a game that would otherwise use the CPU for PhysX processing. I'm willing to bet most enthusiast gamers have anywhere from a 20" to 24" widescreen (anywhere from 14x9 to 19x12). Then you have the extreme folks with the 30" Dells or 37" Westies, but then again, they have the GPU power to push them.

You mentioned using your 8800GTS 320 as a dedicated PhysX card. That would be a great idea. You wouldn't have to sell it or put it on a shelf somewhere; you'd be putting your money to good use, essentially extending the usefulness of the hardware.

I am actually looking forward to trying this type of setup. I have an 8800GTS 640 here I can possibly use; we have to wait for new drivers to support the 8 series, however. I could use one of the 9800GTXs here alongside the GTX280 (after testing out the GTX280 by itself, that is). Should be some interesting findings.
 

JPB

Diamond Member
Jul 4, 2005
4,064
89
91
Originally posted by: keysplayr2003
Originally posted by: MarcVenice
Looks like something pretty useless to me, at least on anything G80-derived. If the 9800GTX doesn't show any real improvements, no other card will either. We'll just have to wait for someone to review a gtx2*0 with CUDA. That MIGHT improve the value of those cards a little, since they might be too powerful for most games at 1680x1050, but then again, as soon as they aren't powerful enough anymore CUDA will be useless again. And how many games actually support it?

UT3 performance gain running Physx on GPU instead of CPU

Performance jumped from 31fps with physics run on the CPU to around 51fps at 1680x1050.
At this early stage, it doesn't sound too useless to me. You will lose a few fps when enabling PhysX in any given game, but that is akin to adding eye candy of any sort, just like increasing AA will give you a performance hit. This was on a 9800GTX.

That is kind of interesting. So PhysX *does* run on the 9800GTX?
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
Still not convinced, Keys. How many games are like that one single UT3 map? I'd still rather enable AA than PhysX effects my CPU could run as well. And I still don't know what settings were used in that test. I'm not convinced by your 8800GTS 640MB story either. Two things: first, why did new-gen cards double or triple their shader counts if shaders sit idle in CoD4, one of the better-looking games out right now? ATI cards perform very well in CoD4, which might have something to do with the increase in shader power, or might not. Second, although I might be completely wrong here since I don't know too much about GPU architecture, what if 30-some shaders sit idle on ALL G80 8800GTSes, both the 320MB and 640MB cards, simply because they are memory-bandwidth starved or don't have enough ROPs or TMUs? In essence, an unbalanced card, whereas the 9600GT was a better-balanced card?
 

terentenet

Senior member
Nov 8, 2005
387
0
0
Got it. Physx up and running on 9800GX2 Quad-SLI. Now if I only had the games to put it to good use...
Besides UT3, what other games support Physx?
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: lopri
Originally posted by: taltamir
You don't need to hack the PhysX drivers; you install them as-is. Whether they work or not depends on your video card drivers, which can be hacked. I got it working on an 8800GTS 512 (G92), and I hear some people got it working on G80.

You could also just wait a little bit and nvidia will release official support, eventually.
What I'm wondering is:

1. There is an unused part of a GPU while rendering 3D scenes, and that is what accelerates PhysX.
2. The processing unit is the same for 3D rendering and PhysX and there is a compromise on rendering, but the net result is a plus. (say lose 3 FPS on rendering but gain 5 FPS from PhysX? or something like that)

Which scenario is it? If it's #2, I'd think there'd be a driver nightmare.

I think an ideal scenario is having a dedicated GPU. This of course would not be SLI because the GPUs need not be the same. And you would be able to keep using your old video card for physX while your new card did the rendering.

But I am still unsure what the exact mechanism of GPU PhysX is.

Physics runs on the same part of the card as graphics. They share power. This is based on the Techgage preview.

CPU limited situation = speed improvement.
GPU limited situation = no speed improvement.


That's basically the long and short of it.
It also means you could (hopefully) use a second card for physics (maybe a weaker one, e.g. if you've just upgraded to a GTX280 from an 8800GT and have an SLI motherboard).
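A crude way to put numbers on that (all the millisecond figures below are invented, purely to show where the win comes from): the CPU and GPU work largely in parallel, so frame time is roughly set by whichever one finishes last. Moving physics off a saturated CPU onto a GPU with headroom helps a lot; piling it onto a GPU that is already the bottleneck doesn't.

Code:
// Crude frame-time model with invented numbers: frame ~ max(cpu, gpu).
// Offloading physics helps when the CPU is the bottleneck, not when
// the GPU already is.
#include <algorithm>
#include <cstdio>

static void report(const char* label, double cpu_ms, double gpu_ms) {
    double frame_ms = std::max(cpu_ms, gpu_ms);
    std::printf("%-26s %5.1f ms  (~%.0f fps)\n", label, frame_ms, 1000.0 / frame_ms);
}

int main() {
    const double physics_on_cpu_ms = 20.0;  // physics cost on the CPU (hypothetical)
    const double physics_on_gpu_ms = 4.0;   // same work on spare shaders (hypothetical)

    // CPU-limited case (modest resolution): the GPU has headroom.
    report("CPU-limited, CPU physics", 12.0 + physics_on_cpu_ms, 14.0);
    report("CPU-limited, GPU physics", 12.0, 14.0 + physics_on_gpu_ms);

    // GPU-limited case (think 2560x1600): the GPU is already the slow side.
    report("GPU-limited, CPU physics", 12.0 + physics_on_cpu_ms, 33.0);
    report("GPU-limited, GPU physics", 12.0, 33.0 + physics_on_gpu_ms);
    return 0;
}

With those made-up numbers the CPU-limited case jumps from ~31 to ~56 fps, while the GPU-limited case actually loses a little, which lines up with Marc's point about high resolutions.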
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: MarcVenice
Still not convinced, Keys. How many games are like that one single UT3 map? I'd still rather enable AA than PhysX effects my CPU could run as well. And I still don't know what settings were used in that test. I'm not convinced by your 8800GTS 640MB story either. Two things: first, why did new-gen cards double or triple their shader counts if shaders sit idle in CoD4, one of the better-looking games out right now? ATI cards perform very well in CoD4, which might have something to do with the increase in shader power, or might not. Second, although I might be completely wrong here since I don't know too much about GPU architecture, what if 30-some shaders sit idle on ALL G80 8800GTSes, both the 320MB and 640MB cards, simply because they are memory-bandwidth starved or don't have enough ROPs or TMUs? In essence, an unbalanced card, whereas the 9600GT was a better-balanced card?
I was actually the first person here to notice the COD4 shader thing, as I was curious about why the 9600GT performed so well with fewer SPs. Keys is 100% correct here.

We disabled SPs using RivaTuner.

Other games like Crysis slowed way down when the SPs were disabled (someone tested a bunch of other games).

IMO the PhysX effects are far more noticeable than AA.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: terentenet
Got it. Physx up and running on 9800GX2 Quad-SLI. Now if I only had the games to put it to good use...
Besides UT3, what other games support Physx?
There are a bunch of games that use PhysX, but AFAIK only UT3 supports the NV GPU-accelerated PhysX at this point.

I'm pretty sure all the UT3-engine games support PhysX.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
Sickbeast, that conclusion still supports my point. Only in CoD4 were there shaders not being used, and in a 'bunch' (which ones?) of games disabling SPs immediately resulted in lower performance, meaning no spare SPs to do physics acceleration? Or am I wrong in assuming that only the shaders can run the physics? If someone happens to have a nice, understandable link where I can see what part of a video card does what, that would be nice :p