Nvidia's new PhysX drivers

SergeC

Senior member
May 7, 2005
484
0
71
I'd also LOVE to see a comparison with the same GPU not accelerating physics plus an actual PhysX card.

Hopefully some review site will pick that up. (cough, hint)
 

amenx

Diamond Member
Dec 17, 2004
4,195
2,498
136
Originally posted by: Jax Omen
yawn. Lemme know when they support the 8800GTS.
You have to modify the Nvidia INF (177.39 drivers) and insert your card's ID strings. They work with the G92 cards, including the GTS and GT. As I mentioned, I got an 1100pt gain in 3DMark Vantage, and many others at Guru3D report similar gains. If that's not some subtle clue that it works, then you better take another yawn and go back to sleep, lol.

 

OCGuy

Lifer
Jul 12, 2000
27,224
36
91
VERY interesting. Look what the ATI launch is doing... it has already caused a paper launch of a die shrink, and now this...

Here would be my question: the lukewarm rumors about the 280 were already circulating before its launch. If this increases performance in any way, why wouldn't NV have released these drivers to the reviewers so the cards would make a good first impression?
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Ocguy31
VERY interesting. Look what the ATI launch is doing... it has already caused a paper launch of a die shrink, and now this...

Here would be my question: the lukewarm rumors about the 280 were already circulating before its launch. If this increases performance in any way, why wouldn't NV have released these drivers to the reviewers so the cards would make a good first impression?

It's only going to increase the total Vantage score, which should be ignored anyway. The GPU score, which is what is used when comparing cards, is not affected.

I sort of like that this is offered and it shows the power that these cards can have, but on the other hand it is a pretty cheap move by nVidia. It shows their cards in a better light than would be realized in actual games. The CPU tests that are accelerated by the GPU in Vantage have no intensive graphics to render, so the GPU can concentrate on physics alone. In actual games, physics performance will not be nearly as good because the GPU has to concentrate on rendering.

I'm still really glad nVidia is getting PhysX to work with GPUs though. I always wanted to buy a PhysX card just to try it out, but now I can get the same effect through something I already bought.
 

amenx

Diamond Member
Dec 17, 2004
4,195
2,498
136
My GPU score went up about 300pts. My CPU score went from around 7,000 to 24,000! But overall, about an 1100pt gain in the final score. IMO this is scandalous for Futuremark. Very few games use PhysX acceleration, so this inflated Vantage score is very unrepresentative of typical GPU performance. I think Futuremark has to fix it or do something, otherwise it may hurt their cred. ATI should vigorously protest.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
I have yet to see GPU physics implemented in any game, and until that happens, this is just a tech demo and a marketing "checkbox" feature. I'm also waiting to get usable, flexible video encoding acceleration on the GPU; there has been talk of it, but nothing concrete has surfaced either.
 

amddude

Golden Member
Mar 9, 2006
1,711
1
81
FYI, to make other cards such as the 8800 work, you need to edit the INF in the setup:

From under [NVIDIA.Mfg], copy these two strings across. They are for the GT/GTS; you can tell from the DEV_ number, and the ; Localizable Strings section tells you which is which. We also know the GT/GTS are G92s.

%NVIDIA_G92.DEV_0600.1% = nv4_G9x, PCI\VEN_10DE&DEV_0600
%NVIDIA_G92.DEV_0611.1% = nv4_G9x, PCI\VEN_10DE&DEV_0611

Then copy across the cards you want from under ; Localizable Strings (they are all named, as you can see, so it's easy):

NVIDIA_G92.DEV_0600.1 = "NVIDIA GeForce 8800 GTS 512"
NVIDIA_G92.DEV_0611.1 = "NVIDIA GeForce 8800 GT"
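The recipe above can also be scripted so the two lines land under the right sections automatically. This is a minimal sketch: the `add_device` helper and the tiny sample INF are invented for illustration, while the section names, device IDs, and display names are the ones quoted in the post.

```python
# Illustrative sketch of the INF edit described above. add_device and
# the sample INF are invented for this example; the section names,
# device IDs, and display names come from the post.

def add_device(inf_text, dev_id, name, strings_key):
    """Insert one card's two lines into the INF text:
    a device-ID mapping under [NVIDIA.Mfg] and a display
    name under the '; Localizable Strings' comment."""
    mapping = f"%{strings_key}% = nv4_G9x, PCI\\VEN_10DE&DEV_{dev_id}"
    label = f'{strings_key} = "{name}"'
    out = []
    for line in inf_text.splitlines():
        out.append(line)
        if line.strip() == "[NVIDIA.Mfg]":
            out.append(mapping)
        elif line.strip() == "; Localizable Strings":
            out.append(label)
    return "\n".join(out)

# Tiny stand-in for the real INF, containing just the two anchor sections:
sample = "[NVIDIA.Mfg]\n; Localizable Strings"
patched = add_device(sample, "0600", "NVIDIA GeForce 8800 GTS 512",
                     "NVIDIA_G92.DEV_0600.1")
print(patched)
```

Run it once per card you want to add (DEV_0611 for the 8800 GT), and diff the result against the original INF before installing.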
 

quattro1

Member
Jan 13, 2005
111
0
0
Originally posted by: Extelleron

I sort of like that this is offered and it shows the power that these cards can have, but on the other hand it is a pretty cheap move by nVidia. It shows their cards in a better light than would be realized in actual games. The CPU tests that are accelerated by the GPU in Vantage have no intensive graphics to render, so the GPU can concentrate on physics alone. In actual games, physics performance will not be nearly as good because the GPU has to concentrate on rendering.

Originally posted by: Amenx
IMO this is scandalous for Futuremark. Very few games use Physx acceleration. This inflated Vantage score is very unrepresentative of typical GPU performance. So I think Futuremark have to fix it or do something otherwise it may hurt their cred. ATI should vigorously protest.

I don't get these comments. Why is it bad for one company to do this? Vantage has a physics test; you can either run it on the CPU, or on the GPU if your card supports it. All it really shows is that physics on an NV GPU is faster than physics on a CPU. It's a feature one card has that another does not. What's wrong with that?

ATI supports DX10.1; does that mean they are "cheating" or making a cheap move? Physics on the GPU is just like any other feature that one card has over another.

 

Viper GTS

Lifer
Oct 13, 1999
38,107
433
136
Originally posted by: quattro1

I don't get these comments. Why is it bad for one company to do this? Vantage has a physics test; you can either run it on the CPU, or on the GPU if your card supports it. All it really shows is that physics on an NV GPU is faster than physics on a CPU. It's a feature one card has that another does not. What's wrong with that?

ATI supports DX10.1; does that mean they are "cheating" or making a cheap move? Physics on the GPU is just like any other feature that one card has over another.

So how is ATI supposed to compete here? Is nvidia going to give them rights to develop physx support for their GPUs? And by give I do mean give, because selling ATI a license hardly levels the playing field when it comes to 3DMark.

A feature that is owned and controlled by nvidia doesn't belong in a multi-vendor benchmark tool.

FWIW I'm a long time nvidia owner. I have literally never owned an ATI video card, & I currently run 2 x G92 GTS in SLI. So if I'm biased at all it's in nvidia's favor.

[EDIT]Please forgive me if I've missed any recent news about ATI & physx... I haven't seen anything since nvidia bought out physx[/EDIT]

Viper GTS
 

OCGuy

Lifer
Jul 12, 2000
27,224
36
91
Originally posted by: Viper GTS
Originally posted by: quattro1

I don't get these comments. Why is it bad for one company to do this? Vantage has a physics test; you can either run it on the CPU, or on the GPU if your card supports it. All it really shows is that physics on an NV GPU is faster than physics on a CPU. It's a feature one card has that another does not. What's wrong with that?

ATI supports DX10.1; does that mean they are "cheating" or making a cheap move? Physics on the GPU is just like any other feature that one card has over another.

So how is ATI supposed to compete here? Is nvidia going to give them rights to develop physx support for their GPUs? And by give I do mean give, because selling ATI a license hardly levels the playing field when it comes to 3DMark.

A feature that is owned and controlled by nvidia doesn't belong in a multi-vendor benchmark tool.

FWIW I'm a long time nvidia owner. I have literally never owned an ATI video card, & I currently run 2 x G92 GTS in SLI. So if I'm biased at all it's in nvidia's favor.

[EDIT]Please forgive me if I've missed any recent news about ATI & physx... I haven't seen anything since nvidia bought out physx[/EDIT]

Viper GTS



Whoever said it has to be a "level playing field"? ATI has Havok, NV has PhysX. AMD won't win an anti-trust case over a benchmarking program.

Now if NV and game developers were locking out people who didn't have PhysX capability, you would have a much bigger case. But as it is optional and not required to run games, it is just a perk.
 

Viper GTS

Lifer
Oct 13, 1999
38,107
433
136
Originally posted by: Ocguy31

Whoever said it has to be a "level playing field"? ATI has Havok, NV has PhysX. AMD won't win an anti-trust case over a benchmarking program.

Now if NV and game developers were locking out people who didn't have PhysX capability, you would have a much bigger case. But as it is optional and not required to run games, it is just a perk.

I never said anything about anti-trust; this is obviously not a legal issue. You people get so worked up over this stuff. :roll:

If Futuremark wants Vantage to remain even remotely relevant, they're going to have to address this. Whether that means killing the PhysX support or adding support for Havok, something needs to change. Of course, some would argue (and I tend to agree) that they're not relevant anyway, so I suppose it doesn't really matter.

Viper GTS
 

slick2500

Junior Member
Sep 24, 2007
12
0
0
Originally posted by: amddude
FYI, to make other cards such as the 8800 work, you need to edit the INF in the setup:

From under [NVIDIA.Mfg], copy these two strings across. They are for the GT/GTS; you can tell from the DEV_ number, and the ; Localizable Strings section tells you which is which. We also know the GT/GTS are G92s.

%NVIDIA_G92.DEV_0600.1% = nv4_G9x, PCI\VEN_10DE&DEV_0600
%NVIDIA_G92.DEV_0611.1% = nv4_G9x, PCI\VEN_10DE&DEV_0611

Then copy across the cards you want from under ; Localizable Strings (they are all named, as you can see, so it's easy):

NVIDIA_G92.DEV_0600.1 = "NVIDIA GeForce 8800 GTS 512"
NVIDIA_G92.DEV_0611.1 = "NVIDIA GeForce 8800 GT"

Um English?
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Ocguy31


Whoever said it has to be a "level playing field"? ATI has Havok, NV has PhysX. AMD won't win an anti-trust case over a benchmarking program.

Actually, Intel has been working on Havok for the CPU.

So as long as your computer has an AMD or Intel CPU, Havok will be supported.

http://www.xbitlabs.com/news/m...hysics_Technology.html

Advanced Micro Devices and Havok, a wholly owned subsidiary of Intel Corp., said that they would collaborate to optimize the Havok physics engine for AMD's x86 microprocessors as well as investigate possibilities to optimize the engine for the company's ATI Radeon graphics processing units (GPUs).

"Investigate Possibilities" is a far cry from any sort of actual implementation.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: Viper GTS
So how is ATI supposed to compete here? Is nvidia going to give them rights to develop physx support for their GPUs?
Actually, yes. NVIDIA made PhysX an open standard; AMD can have all the documentation it wants and is free to implement a PhysX driver if it wishes.
 

Allio

Golden Member
Jul 9, 2002
1,904
28
91
Are there any benchmarks/experiences in ANYTHING that's not 3DMark or a PhysX tech demo? Because, you know, those aren't that much fun to play...

This sounds awesome on paper, but honestly, who gives a crap about 3DMark scores? I recognise about three of the games that support PhysX acceleration... and the only benchmarks NV published were in some physics test level. What about in the real game?

Edit: OK, of the games I recognise, the only one that's actually hardware accelerated is UT3. Whoopie :p
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: Ocguy31
Originally posted by: Viper GTS
Originally posted by: quattro1

I don't get these comments. Why is it bad for one company to do this? Vantage has a physics test; you can either run it on the CPU, or on the GPU if your card supports it. All it really shows is that physics on an NV GPU is faster than physics on a CPU. It's a feature one card has that another does not. What's wrong with that?

ATI supports DX10.1; does that mean they are "cheating" or making a cheap move? Physics on the GPU is just like any other feature that one card has over another.

So how is ATI supposed to compete here? Is nvidia going to give them rights to develop physx support for their GPUs? And by give I do mean give, because selling ATI a license hardly levels the playing field when it comes to 3DMark.

A feature that is owned and controlled by nvidia doesn't belong in a multi-vendor benchmark tool.

FWIW I'm a long time nvidia owner. I have literally never owned an ATI video card, & I currently run 2 x G92 GTS in SLI. So if I'm biased at all it's in nvidia's favor.

[EDIT]Please forgive me if I've missed any recent news about ATI & physx... I haven't seen anything since nvidia bought out physx[/EDIT]

Viper GTS



Whoever said it has to be a "level playing field"? ATI has Havok, NV has PhysX. AMD won't win an anti-trust case over a benchmarking program.

Now if NV and game developers were locking out people who didn't have PhysX capability, you would have a much bigger case. But as it is optional and not required to run games, it is just a perk.

Nvidia bought Ageia, ported the code to CUDA... and opened the spec...
So ATI is fully free to develop their own PhysX implementation... if they don't mind assigning a dedicated team of programmers to do so for a year or two...
 

schneiderguy

Lifer
Jun 26, 2006
10,801
89
91
Has anyone actually tested a real game yet? I'm curious to see how much of a performance hit there is (if any).
 

amenx

Diamond Member
Dec 17, 2004
4,195
2,498
136
Originally posted by: schneiderguy
Has anyone actually tested a real game yet? I'm curious to see how much of a performance hit there is (if any).
Only subjective impressions. GRAW2 seems faster and smoother, with fps in the 60s and 70s (settings fully maxed out). I never benched it before, so I don't have a reference to compare to.

 
Mar 11, 2004
23,424
5,825
146
Originally posted by: amenx
Originally posted by: schneiderguy
Has anyone actually tested a real game yet? I'm curious to see how much of a performance hit there is (if any).
Only subjective impressions. GRAW2 seems faster and smoother, with fps in the 60s and 70s (settings fully maxed out). I never benched it before, so I don't have a reference to compare to.

And you're certain the PhysX part is being implemented?

I would assume the reason they didn't make a point of having reviewers test out the PhysX stuff (did AT's article even mention CUDA or physics? I didn't notice it anywhere, but I only skimmed some parts) is that offloading physics processing to the card would decrease graphical performance, since it's not as if it would be using some spare part of the GPU that isn't already in use. Couple that with the fact that barely any games utilize it, let alone in a worthwhile manner (one that actually adds to the game), and they would be worried about it being a big negative mark in reviews.
 

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
PhysX is hardware limited, whereas Havok works regardless of what platform you are on (Windows, Linux, PS2, PS3, 360, Wii, GameCube). It's also free to use, which saves loads of time and money for developers.