WOOHOO!! Got ATI + Geforce PhysX working in Win7!!!


SunnyD

Belgian Waffler
Jan 2, 2001
32,675
146
106
www.neftastic.com
Originally posted by: thilan29
Originally posted by: zagood
For anyone looking at uber-cheap nV cards to pair up for physx, look at some benchmarks. I assume they'll be similar using an ATi card as your main one.

http://www.pcgameshardware.com...nchmark_review/?page=2

http://www.pcgameshardware.com...ative_to_Nvidia_Physx/

Hmm... the 8400 seems really slow... maybe an 8600GT or maybe even a 9600GT is the minimum? I'm surprised it's that slow though... I have my 8800GT underclocked to 400/1200/800, it made no difference from when it was at 650/1600/950, and it doesn't heat up at all while doing any PhysX, so I figured the PhysX calculations required very little horsepower.

Given that physics calculations are inherently parallel and well suited to GPUs... let's take a look at what's happening, shall we?

8400GS - 64-bit memory interface, 450MHz core clock, 400MHz memory clock, 900MHz shader clock, 16 shaders - 19fps

8600GT - 128-bit memory interface, 540MHz core clock, 700MHz memory clock, 1180MHz shader clock, 32 shaders - 51fps

9600GT - 256-bit memory interface, 650MHz core clock, 900MHz memory clock, 1625MHz shader clock, 64 shaders - 59fps

9800GTX+ - 256-bit memory interface, 738MHz core clock, 1100MHz memory clock, 1836MHz shader clock, 128 shaders - 64fps

It looks to me, judging by the PCGH review, that PhysX needs 32 shaders to run properly, and from that point on it's a function of shader speed with diminishing returns (probably due to bus traffic, etc.).

Again, if we look at the 8400GS versus the 8600GT, this appears to account for the roughly 63% gap in speed (19fps vs. 51fps) - about half comes from having the requisite number of shaders, and the remaining difference comes from the increased shader clock. Once you get past 32 shaders, it appears to become purely a function of shader clock speed.

Interesting to wonder - what would happen if NVIDIA were to truly allow an entire GPU to be used as a PPU, basically allowing multiple PhysX calculation groups to run in parallel? Technically a 9800GTX should be a bit more than four times faster than an 8600GT if that were the case. But odds are it wouldn't make much difference, given that data has to be pushed between the CPU, the PhysX GPU, and the video GPU. We already see this when PhysX is calculated on the same GPU doing the rendering - there's a performance hit because PhysX data is pushed from the CPU to the GPU for calculation, then back to the CPU, and once more back to the GPU for rendering.
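
To put some rough numbers on that, here's a quick back-of-the-envelope script. The "shaders x shader clock" model is purely my own assumption for illustration (not anything from NVIDIA or PCGH); the fps figures are the PCGH numbers quoted above.

# Naive throughput model: shaders x shader clock, scaled so the 8600GT
# matches its measured 51fps. Purely illustrative.
cards = {
    #            (shaders, shader MHz, measured fps from the PCGH review)
    "8400GS":    (16,  900,  19),
    "8600GT":    (32, 1180,  51),
    "9600GT":    (64, 1625,  59),
    "9800GTX+": (128, 1836,  64),
}

base_shaders, base_clock, base_fps = cards["8600GT"]

for name, (shaders, clock, fps) in cards.items():
    predicted = base_fps * (shaders * clock) / (base_shaders * base_clock)
    print(f"{name:9s} predicted {predicted:6.1f} fps  vs  measured {fps} fps")

# The 8400GS lands almost exactly where the naive model predicts (~19fps),
# while the 9600GT and 9800GTX+ fall far short of it - consistent with the
# "32 shaders, then diminishing returns" observation above.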

Originally posted by: evolucion8
The PhysX card proved to be even faster than the 9600GT, so I guess that the minimum should be a 8800GT or better. Probably the new single slot 9800GT can fit nicely there.

I would say the 9600GSO - probably even (or especially, due to the faster shader clock) the newer 512MB version - will fill the role for PhysX and be the best price/performance you can get: 48 shaders, 1625MHz shader clock, 256-bit memory bus.
 

2dt Drifter

Senior member
May 23, 2007
253
0
0
I'm new to the W7 scene (just got it up and running last night). Are the drivers that Windows recommends the only ones I can use with my card, i.e. the WDDM 1.1 drivers? Also, I don't have any current games that use the PhysX engine - is it best to leave it off? I don't think I plan on getting a mid-range card just for the PhysX feature... at least for a while, or until I get used to the OS... although from Sunny's post the GSO looks good to jump on when the time comes.
 

SunnyD

Belgian Waffler
Jan 2, 2001
32,675
146
106
www.neftastic.com
Originally posted by: 2dt Drifter
I'm new to the W7 scene (just got it up and running last night). Are the drivers that Windows recommends the only ones I can use with my card, i.e. the WDDM 1.1 drivers? Also, I don't have any current games that use the PhysX engine - is it best to leave it off? I don't think I plan on getting a mid-range card just for the PhysX feature... at least for a while, or until I get used to the OS... although from Sunny's post the GSO looks good to jump on when the time comes.

No... Windows 7 can use WDDM 1.0 drivers. And ironically, the multi-monitor limitation is a Vista limitation, not really a WDDM 1.0 limitation (since Win7 can use WDDM 1.0 drivers just fine). In fact, in order to get OpenGL support you will need to install Vista drivers, meaning WDDM 1.0 (since the Win7 beta drivers from both NVIDIA and ATI don't include OpenGL).

If the game doesn't support PhysX, then it won't use PhysX. There's no need to "turn anything off".
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Will you be able to use an ATI card + an NV card for PhysX if you use WDDM 1.0 drivers, though? I would think not.
 

thilanliyan

Lifer
Jun 21, 2005
12,085
2,281
126
Originally posted by: aka1nas
Will you be able to use an ATI card + an NV card for PhysX if you use WDDM 1.0 drivers, though? I would think not.

No. You can use a WDDM 1.0 ATI driver alongside a WDDM 1.1 nV driver and still have PhysX working, but you can't use a WDDM 1.0 nV driver and keep PhysX.
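
Just to summarize the combinations reported in this thread so far - the little helper below and its name are mine, purely for illustration; there's obviously no real API for this.

# Encodes the Win7 driver combinations reported in this thread.
# Hypothetical helper, for illustration only - not a real API.
def hybrid_physx_works(nv_wddm, ati_wddm):
    """True if an ATI (rendering) + NV (PhysX) combo reportedly keeps PhysX on Win7."""
    # The NV card needs its WDDM 1.1 (Win7) driver for PhysX to stay enabled;
    # the ATI card can run either a WDDM 1.0 (Vista) or 1.1 driver.
    return nv_wddm == "1.1" and ati_wddm in ("1.0", "1.1")

print(hybrid_physx_works(nv_wddm="1.1", ati_wddm="1.0"))  # True  (works)
print(hybrid_physx_works(nv_wddm="1.0", ati_wddm="1.0"))  # False (no PhysX)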
 

thilanliyan

Lifer
Jun 21, 2005
12,085
2,281
126
Originally posted by: 2dt Drifter
I'm new to the W7 scene (just got it up and running last night). Are the drivers that Windows recommends the only ones I can use with my card, i.e. the WDDM 1.1 drivers? Also, I don't have any current games that use the PhysX engine - is it best to leave it off? I don't think I plan on getting a mid-range card just for the PhysX feature... at least for a while, or until I get used to the OS... although from Sunny's post the GSO looks good to jump on when the time comes.

If you only have an nV card in your system then this thread doesn't apply to you. Your setup should work fine with PhysX without any issues... you don't even need a second card to run PhysX if you already have one nV card doing the graphics rendering.
 

Denithor

Diamond Member
Apr 11, 2004
6,298
23
81
But that's not a problem, Pelu - there's a free beta of Win7 you can download.
 

SunnyD

Belgian Waffler
Jan 2, 2001
32,675
146
106
www.neftastic.com
Originally posted by: Denithor
But that's not a problem, Pelu - there's a free beta of Win7 you can download.

Public downloads of the Win7 beta stopped last week. It's no longer possible to get a Win7 beta key unless you're a TechNet or MSDN subscriber.
 

Pelu

Golden Member
Mar 3, 2008
1,208
0
0
I was trying it on Vista, but it started complaining that one of the graphics adapters was incompatible or causing trouble... and even when it did work, the two cards in my computer sit too close together - the second card basically blocks the air intake of the Radeon's heatsink fan. Which kind of sucks, because in no time the Radeon got up to almost 100C at idle, lol!!! So I dropped the matter... and the other PCIe slot doesn't even detect the card.
 

Pelu

Golden Member
Mar 3, 2008
1,208
0
0
Anyway, what causes this to not work in Vista at all? The lack of an extended desktop option?
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
WDDM 1.0 on Vista cannot support more than 1 display driver being active at once.
 

Denithor

Diamond Member
Apr 11, 2004
6,298
23
81
Wouldn't count on it - how else is M$ going to drive people to upgrade to Win7?
 

Pelu

Golden Member
Mar 3, 2008
1,208
0
0
This whole thing of mixing ATI with NVIDIA is going down the TOILET!!! Why? Because NVIDIA didn't like the idea of people using old NVIDIA cards for PhysX. Why? Because that way they don't make money. That's why their new 190 driver locks out PhysX if it detects an ATI card in the computer - even if the ATI card is just a decoration, PhysX gets locked. And it's really dirty, because if you pair an NVIDIA card for PhysX with Intel integrated graphics (which can only do crappy graphics, because that's all those chips can do), it still works. The nerve of NVIDIA is awesome!!! Good to know you're greedy, mean bastards, NVIDIA... I hate you so much, but I guess I'll be getting one of your cards anyway because of PhysX - screw you for that. And SCREW ATI for not putting something good like PhysX on the table. Havok is crap... and why argue about which one is better when you can have both with an NVIDIA card? Havok runs on the processor - NVIDIA users get BOTH!!! I AM SO ANNOYED AND P***ED!!!
 
Scholzpdx

Apr 20, 2008
10,067
990
126
Originally posted by: SunnyD
Originally posted by: soccerballtux
Cool! This will mean we can buy one of those cheap $25 nvidia cards, maybe used or something on ebay, and get an ATI for the main graphics stuff.

The only thing that would be better is if NVIDIA starts making PCIe 1x cards with a decent chip on it so we don't have to waste our 16x slots. :)

I would hop on a cheap 9500gt in a PCIe 1x form just to test the gimmick with my ATI cards.

(note: this post was in no way sarcastic. i mean it.)
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: Scholzpdx
Originally posted by: SunnyD
Originally posted by: soccerballtux
Cool! This will mean we can buy one of those cheap $25 nvidia cards, maybe used or something on ebay, and get an ATI for the main graphics stuff.

The only thing that would be better is if NVIDIA starts making PCIe 1x cards with a decent chip on it so we don't have to waste our 16x slots. :)

I would hop on a cheap 9500gt in a PCIe 1x form just to test the gimmick with my ATI cards.

(note: this post was in no way sarcastic. i mean it.)

I believe a 4x PCI-e slot is the recommended bandwidth for dedicated PhysX cards.
And you would have to run older ForceWare drivers (pre-190s).
Ah, why would you want to bother with a gimmick? hehe.
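
For reference, here's the raw bandwidth arithmetic behind that x4 recommendation. The per-lane rates are the standard PCIe figures; whether x1 is actually enough for a dedicated PhysX card is exactly the judgment call being made above, not something this sketch can settle.

# Standard per-lane PCIe bandwidth, per direction (approximate, after encoding overhead).
PER_LANE_MB_S = {"PCIe 1.x": 250, "PCIe 2.0": 500}

for gen, per_lane in PER_LANE_MB_S.items():
    for lanes in (1, 4, 16):
        print(f"{gen} x{lanes:<2}: ~{per_lane * lanes} MB/s each way")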