Sandy Bridge Integrated Graphics to Handle PhysX

Status
Not open for further replies.

RootForce

Junior Member
Mar 4, 2011
6
0
0
With the Z68 chipsets becoming available soon, we will finally be able to have discrete and integrated graphics in our desktops.

I recall seeing the setting "CPU" when choosing a dedicated PhysX card in the NVIDIA control panel.
Are the processors smart enough to process PhysX with the IGP in tandem with discrete graphics, which handle the rest of the game?

I was going to use my loud, power-hungry, and hot GTX 260 as a PhysX card in my upcoming SB rig, but if this works it would change everything.

The Intel HD 3000 IGP in the SB K series should handle PhysX, right?
If not, I believe it can be OC'd up to 1350MHz.
Either way, it's on par with lower-end cards that can handle PhysX!
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Nope, NV hasn't coded PhysX to allow anything except an NV GPU running CUDA to process GPU PhysX effects.
Even if they coded it to run on OpenCL, it still wouldn't work, because the integrated Sandy Bridge GPU can't run OpenCL.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
"my loud, power hungry, and hot gtx 260"

Really? But Lonyo is correct. Your Sandy Bridge cannot run GPU PhysX. That is what GPUs are for.
 

Annisman*

Golden Member
Aug 20, 2010
1,931
95
91
Hmmm, and I'll be using my 'loud, power-hungry, and hot 9800GT green edition video card' to handle the PhysX that I want to use in select games.

Funny how that changes everything
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
How about the CPU doing the PhysX? What's better, that or an 8800 GT, which is going to create more heat in that area? thx

True hardware PhysX (or any other hardware physics API) runs several orders of magnitude faster on a GPU; in other words, a lowly GPU that could be passively cooled can run PhysX without breaking a sweat, while the fastest CPUs would churn out slideshow results.

That being said, your argument might be more valid suggesting people just turn PhysX off and not bother to invest in a dedicated PhysX card, as it could be convincingly argued that the PhysX effects just aren't worthwhile. I know I sure wouldn't bother with it if my SLI GTX 470s couldn't handle it without pause.
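The GPU-vs-CPU gap here comes down to data parallelism. As a toy illustration (not PhysX itself, just a sketch of the principle), the same physics update can be written as a serial per-particle loop or as one array-wide operation; the second form is what maps naturally onto a GPU's many cores:

```python
import numpy as np

def step_loop(pos, vel, dt, g=-9.81):
    """Euler integration one particle at a time (serial, CPU-loop style)."""
    out = pos.copy()
    for i in range(len(pos)):
        vel[i, 1] += g * dt            # gravity on the y component
        out[i] = pos[i] + vel[i] * dt
    return out

def step_vectorized(pos, vel, dt, g=-9.81):
    """The identical update expressed as whole-array operations --
    the data-parallel form a GPU executes across thousands of threads."""
    vel = vel.copy()
    vel[:, 1] += g * dt
    return pos + vel * dt

rng = np.random.default_rng(0)
pos = rng.random((10000, 3))
vel = rng.random((10000, 3))

a = step_loop(pos.copy(), vel.copy(), 0.01)
b = step_vectorized(pos.copy(), vel.copy(), 0.01)
assert np.allclose(a, b)  # same math, very different execution model
```

Both produce identical results; the difference is purely how the work is scheduled, which is why a modest GPU can keep up with thousands of debris particles that would bog down a serial CPU loop.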
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
True hardware PhysX (or any other hardware physics API) runs several orders of magnitude faster on a GPU; in other words, a lowly GPU that could be passively cooled can run PhysX without breaking a sweat, while the fastest CPUs would churn out slideshow results.

That being said, your argument might be more valid suggesting people just turn PhysX off and not bother to invest in a dedicated PhysX card, as it could be convincingly argued that the PhysX effects just aren't worthwhile. I know I sure wouldn't bother with it if my SLI GTX 470s couldn't handle it without pause.
That's a wee bit of an exaggeration, as many lower cards would be too slow to even do PhysX. Using an 8600 GT as a dedicated PhysX card is slower than letting my GTX 260 do both graphics and PhysX. So really, the faster your GPU is for graphics, the faster your dedicated PhysX card needs to be. With a GTX 580 you will want at least a GTX 260 or better to get the best experience.

Yeah, most games are almost a slideshow when doing hardware-level PhysX, but surprisingly, in some games a fast CPU can actually do PhysX okay. Heck, nobody even noticed that Sandy Bridge blows away other CPUs at handling PhysX in the review on xbitlabs. Unfortunately, xbit does not seem to be working today. Of course, the best solution is to use a dedicated card for hardware PhysX.
 
Last edited:

Moebious

Banned
Mar 3, 2011
14
0
0
The 9600GT is the minimum for playable High APEX PhysX effects. I used it as a dedicated PhysX card with Batman: AA, Cryostasis, Mirror's Edge, and even Mafia II, and never experienced a single slowdown. Considering that my 9600GT looks and feels like a Chinese knock-off, with the core running at 600MHz and 512MB of GDDR3 RAM at 1.2GHz, it resembles the Palit custom-cooler version.

Wonderful! Now I feel that I'm doing you a favor by giving you more time to play the games you enjoy. Permabanned... again. -Admin DrPizza
 
Last edited by a moderator:

toyota

Lifer
Apr 15, 2001
12,957
1
0
The 9600GT is the minimum for playable High APEX PhysX effects. I used it as a dedicated PhysX card with Batman: AA, Cryostasis, Mirror's Edge, and even Mafia II, and never experienced a single slowdown. Considering that my 9600GT looks and feels like a Chinese knock-off, with the core running at 600MHz and 512MB of GDDR3 RAM at 1.2GHz, it resembles the Palit custom-cooler version.
Sorry, but you still had slowdowns in Cryostasis, and a 9600GT does not even meet the minimum requirements for a dedicated PhysX card at APEX High settings in Mafia II. If you had something like a GTX 480 for graphics, then using a 9600GT for PhysX would actually be no better than letting the GTX 480 do both.
 
Last edited:

RootForce

Junior Member
Mar 4, 2011
6
0
0
That's what I expected. It's a shame, though.
I really wanted to put the integrated graphics to good use when it's not being used for Quick Sync.

You guys do realize I didn't mean the actual CPU should handle PhysX. That would be way too slow.
However, if the processor realized that it was being asked to handle graphics, I thought maybe it would make use of the IGP.

Could someone with an NVIDIA card and Sandy Bridge try selecting CPU as the dedicated PhysX processor?
If it runs smoothly, then either the IGP is being used or the setting fell back to discrete.

Finally, how does a GTX 560 Ti with a dedicated GTX 260 for PhysX sound?
A good future-proof match?
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
I'm not sure I'd call a 560 Ti future-proof... I own a 5870 and I wouldn't call it future-proof.

If you want future-proof, you should buy big... like a 590 or 6990.
I doubt any game in the near future will be unplayable with the performance of 580 SLI or 6970 CrossFire, so I would call that future-proof.

I think using a 260 for PhysX with a 560 will probably slow you down more than if you just use your 560 Ti for everything. This happens when the dedicated PhysX card is too slow compared to the faster main card; from what I remember of benchmarks where they tested it, I think the gap is too big between the 260 and the 560. You're better off just using the 560 and throwing the 260 away.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
At this time you’ll only get software PhysX if nVidia’s driver detects an active non-nVidia graphics card in the system. It’s possible you might be able to hack around it but YMMV.
 

RootForce

Junior Member
Mar 4, 2011
6
0
0
The 560 isn't future-proof, but I think it will last a few years in SLI. Also, it OCs like a beast.
If only it had more VRAM.
How long until I need more than 1GB for running at 1080p?
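A rough back-of-envelope puts the 1GB question in perspective. The buffer counts below are illustrative assumptions, not figures from any particular game or driver:

```python
# Back-of-envelope VRAM estimate at 1080p.
width, height = 1920, 1080
bytes_per_pixel = 4                        # 32-bit RGBA

one_buffer = width * height * bytes_per_pixel
# assume double-buffered color + depth/stencil + two extra render targets
render_buffers = one_buffer * 5

print(f"one 1080p buffer:  {one_buffer / 2**20:.1f} MiB")      # ~7.9 MiB
print(f"five such buffers: {render_buffers / 2**20:.1f} MiB")  # ~39.6 MiB
```

The render targets themselves are tiny; textures and geometry are what actually fill VRAM. So 1GB holds up at 1080p until game texture budgets grow well past a few hundred MiB, which is the real variable to watch.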

I don't see how a GTX 260 for PhysX would slow anything down.
Aren't the 8xxx-series cards generally recommended for PhysX?
If you're correct, I'd expect such a setup to come to a screeching halt.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
The 560 isn't future-proof, but I think it will last a few years in SLI. Also, it OCs like a beast.
If only it had more VRAM.
How long until I need more than 1GB for running at 1080p?

I don't see how a GTX 260 for PhysX would slow anything down.
Aren't the 8xxx-series cards generally recommended for PhysX?
If you're correct, I'd expect such a setup to come to a screeching halt.
Like I said earlier, what GPU you are using determines what level of card would be suitable as a dedicated PhysX card. A GTX 260 will be fine for quite a while.

EDIT: And going back to your OP: a GTX 260 is far from loud, hot, and power-hungry. silentpcreview.com even recommended the GTX 260, and they are picky as hell about noise. Heck, my GTX 260 is about the coolest- and quietest-running card I have owned, which is saying a lot because my other cards were low end. Power consumption is not bad either, especially considering mine is 65nm and overclocked to boot. My entire system rarely goes much over 250 watts.

Now, I personally would not want to use it as a PhysX card because it is quite large. If I were a higher-end enthusiast with something like a GTX 570 or GTX 580 and a larger case, then I probably would not mind a card like the GTX 260 for PhysX, though.
 
Last edited:

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
"my loud, power hungry, and hot gtx 260"

Really? But Lonyo is correct. Your Sandy Bridge cannot run GPU PhysX. That is what GPUs are for.

From what we've seen, PhysX would probably run extremely well on Sandy Bridge, from a relative standpoint at least.

I hope that OpenCL catches on, personally. It makes sense to use the CPU for physics to free up the GPU for other stuff.

What would be really cool is if Llano could do physics. With its 400+ shaders, it could probably do a half-decent job.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
8,117
9,371
136
I've got my fingers crossed that AMD's Fusion processors allow exactly this to happen, only with an OpenCL-based physics library. It'd be a great way for them to market their processors to enthusiasts, set themselves apart from Intel (and subsequently have Intel follow through with support), and it wouldn't lock anyone out of the market (NV/ATI cards can both handle OpenCL, and the absence of a Fusion processor just means turning those settings down or dealing with lower framerates, like everyone already does with every other game setting).

Of course, it's probably way too much to ask from an AMD that currently can't tell its left nut from its right.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
With the Z68 chipsets becoming available soon, we will finally be able to have discrete and integrated graphics in our desktops.

I recall seeing the setting "CPU" when choosing a dedicated PhysX card in the NVIDIA control panel.
Are the processors smart enough to process PhysX with the IGP in tandem with discrete graphics, which handle the rest of the game?

I was going to use my loud, power-hungry, and hot GTX 260 as a PhysX card in my upcoming SB rig, but if this works it would change everything.

The Intel HD 3000 IGP in the SB K series should handle PhysX, right?
If not, I believe it can be OC'd up to 1350MHz.
Either way, it's on par with lower-end cards that can handle PhysX!

Actually, the HD 3000 IGP has been successfully OC'd to 1.9GHz, but it required a 0.2V increase.
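For a sense of what that overclock costs, dynamic power scales roughly as f·V². The 1.05V stock voltage and 850MHz stock clock below are illustrative assumptions; the 1.9GHz at +0.2V figure is the one quoted above:

```python
# Rough dynamic-power scaling: P ~ f * V^2.
# Stock point (850 MHz, 1.05 V) is an assumed baseline for illustration;
# 1.9 GHz at +0.2 V is the overclock quoted in the thread.
f_stock, v_stock = 850e6, 1.05
f_oc, v_oc = 1.9e9, 1.05 + 0.2

power_ratio = (f_oc / f_stock) * (v_oc / v_stock) ** 2
print(f"~{power_ratio:.1f}x stock dynamic power")  # ~3.2x
```

Roughly tripling the IGP's dynamic power on a 95W package explains why a voltage bump that large isn't something you'd run daily.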
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Nope, NV hasn't coded PhysX to allow anything except an NV GPU running CUDA to process GPU PhysX effects.
Even if they coded it to run on OpenCL, it still wouldn't work, because the integrated Sandy Bridge GPU can't run OpenCL.


For instance, OpenCL can be used in Apple's iLife titles, such as iPhoto, for scene parsing and face recognition. OpenCL has been somewhat of a trump card for graphics chip supplier Nvidia, which already has support for the technology in most of its chips. Although Intel plans to support OpenCL natively in its processors and has released alpha drivers and a software development kit for OpenCL, that support, as stated publicly, is CPU-centric and still at a nascent stage of development.

But Intel is also working on OpenCL for the graphics part of Sandy Bridge, according to sources.

Intel declined to comment directly on Apple's plans, but regarding OpenCL it would only say "In terms of full product support, we continue to evaluate when and where OpenCL will intercept our various products."
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
True hardware PhysX (or any other hardware physics API) runs several orders of magnitude faster on a GPU; in other words, a lowly GPU that could be passively cooled can run PhysX without breaking a sweat, while the fastest CPUs would churn out slideshow results.

That being said, your argument might be more valid suggesting people just turn PhysX off and not bother to invest in a dedicated PhysX card, as it could be convincingly argued that the PhysX effects just aren't worthwhile. I know I sure wouldn't bother with it if my SLI GTX 470s couldn't handle it without pause.

Do we really know this as fact? What if AVX was used?
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
True hardware PhysX (or any other hardware physics API) runs several orders of magnitude faster on a GPU; in other words, a lowly GPU that could be passively cooled can run PhysX without breaking a sweat, while the fastest CPUs would churn out slideshow results.

That being said, your argument might be more valid suggesting people just turn PhysX off and not bother to invest in a dedicated PhysX card, as it could be convincingly argued that the PhysX effects just aren't worthwhile. I know I sure wouldn't bother with it if my SLI GTX 470s couldn't handle it without pause.

Do we really know this as fact? What if AVX was used?
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
What's up with the forums? There should not have been a double post, as I was just editing the word lnow to know.
 

wahdangun

Golden Member
Feb 3, 2011
1,007
148
106
That's why we need an open-platform physics API to take off.

BTW, it's a shame Intel doesn't support OpenCL yet; the IGP will just be a waste of silicon.
 

wuliheron

Diamond Member
Feb 8, 2011
3,536
0
0
I've got my fingers crossed that AMD's Fusion processors allow exactly this to happen, only with an OpenCL-based physics library. It'd be a great way for them to market their processors to enthusiasts, set themselves apart from Intel (and subsequently have Intel follow through with support), and it wouldn't lock anyone out of the market (NV/ATI cards can both handle OpenCL, and the absence of a Fusion processor just means turning those settings down or dealing with lower framerates, like everyone already does with every other game setting).

Of course, it's probably way too much to ask from an AMD that currently can't tell its left nut from its right.


Physics has always been the redheaded stepchild of game developers. It simply requires too many resources, doesn't impact gameplay that often, and has trouble running on current consoles. Simply combining a GPU and CPU doesn't solve these problems.

Of these, the most important factor is probably the limitations of consoles. The next-generation consoles should have quad-core processors and modern graphics cards, making them much better able to handle physics, but until then I don't see a lot of emphasis on improved physics in games. Even then, the games will utilize the quad-core CPU as much as the GPU for physics.

That leaves desktop enthusiasts right back where we started, requiring a more powerful separate CPU and GPU than a console because we still have to run an operating system. For physics and AI, an 8-core CPU is an ideal minimum due to the symmetry of the equations. Eventually AMD might combine an 8-core CPU and GPU on a Fusion chip, but I'm not holding my breath.
 

wahdangun

Golden Member
Feb 3, 2011
1,007
148
106
Physics has always been the redheaded stepchild of game developers. It simply requires too many resources, doesn't impact gameplay that often, and has trouble running on current consoles. Simply combining a GPU and CPU doesn't solve these problems.

Of these, the most important factor is probably the limitations of consoles. The next-generation consoles should have quad-core processors and modern graphics cards, making them much better able to handle physics, but until then I don't see a lot of emphasis on improved physics in games. Even then, the games will utilize the quad-core CPU as much as the GPU for physics.

That leaves desktop enthusiasts right back where we started, requiring a more powerful separate CPU and GPU than a console because we still have to run an operating system. For physics and AI, an 8-core CPU is an ideal minimum due to the symmetry of the equations. Eventually AMD might combine an 8-core CPU and GPU on a Fusion chip, but I'm not holding my breath.

I think that's the APU's purpose: to become a co-processor, like the FPU did back then.
 