PhysX worthless with ATI?


Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: evolucion8
Pff wow, the same features which are enabled on the GeForce 8 like PhysX, DX10.1 is simply more immersive in terms of graphics effects, I wonder how a GTX 280 will perform when doing real time global illumination, and I mean the real one, not some fake using cubemaps, and please, trolling? I'm not the one who sold his soul to nVidia to get free stuff ;) and uses some smokescreen to hide the truth.

:roll:
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: Wreckage
Originally posted by: evolucion8
Pff wow, the same features which are enabled on the GeForce 8 like PhysX, DX10.1 is simply more immersive in terms of graphics effects, I wonder how a GTX 280 will perform when doing real time global illumination, and I mean the real one, not some fake using cubemaps, and please, trolling? I'm not the one who sold his soul to nVidia to get free stuff ;) and uses some smokescreen to hide the truth.

:roll:

Everything is fine now, you are late to the party, enjoy the drinks left :wine:
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
Originally posted by: chizow
Like I said, I don't have a list and I sure as hell am not going to cross-reference and compile one. There are two reliable sources who have echoed 20-25 titles before X-mas. I'm sure not all will be GPU-accelerated but it's certainly a good indication PhysX is gaining traction.
Well I did take the time going to IGN, Nvidia, and other sources to look up the release dates, so your sources must not be very reliable. I would hope there's some extra workload for GPU PhysX in those titles.

If they enable GPU PhysX and show additional physics effects where there were none before, there certainly is a benefit.
Agreed, people should be able to go to Nvidia.com and clearly see titles that offer additional physics effects from having GPU PhysX. Gears of War and other similar titles shouldn't be listed.

And that's certainly your prerogative, but to say the differences aren't obvious is clearly untrue.
That's because the differences are very small, and some differences can be caused by the time of day. If it's so obvious, why was Very High left out of DX9?

But they are still adding value and functionality where there was none before. Heck, if they fix the WDDM driver issue, you could even have a 4870X2 + NV PhysX card or even ATI CF with a PhysX card in your 3rd x16/x4 electrical PCIE slot. What's not to like about that?
Agreed. I don't see Nvidia or ATI playing nice with that combination, but it would be nice if it works.

We need faster CPUs for current GPUs anyways, but that doesn't matter since GPU PhysX allows effects that are not possible otherwise. Sure it has a hit on overall performance, but it's still playable and runs circles around CPU-only performance. Again, this is no different than cranking up options or AA in any game you have today.
That's just it, we need faster CPUs for current GPUs and for PhysX. UT3 shows the biggest hit right now; FiringSquad showed it CPU bound at an avg of about 40fps with PhysX. Just a sign that PhysX is not going to see its true power until new CPUs come out. AA is GPU bound, so it's actually the opposite of PhysX.

Well it's obvious you're not going to be impressed, but as I said before, what we've seen so far is still better than anything we've seen recently in the GPU industry and more is yet to come.
The keyword is yet. Still it's great that Nvidia took the 1st step and it's free.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
The keyword is yet. Still it's great that Nvidia took the 1st step and it's free.

I almost changed my mind.

After seeing what they are doing at Nvision with PhysX and 3-D/CUDA ... I think Nvidia has a much better than 50-50 chance to make PhysX a standard in gaming



I will give my reasons later .. but I have to head to my regular work for a few hours
- then I have an Nvision08 wrap-up to write and a game investigation to post

Unlike Anand, color me impressed with Nvision .. and also with where Nvidia is going with PhysX/CUDA/3-D gaming; they are simply extending their TWIMTBP program :p

 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: SSChevy2001
Well I did take the time going to IGN, Nvidia, and other sources to look up the release dates, so your sources must not be very reliable. I would hope there's some extra workload for GPU PhysX in those titles.
You're missing the point. Games under development aren't going to give you reliable release dates and often aren't even reliably announced. They're certainly not going to give solid details about feature sets as that commits them to it. You might as well ask to look into a crystal ball and demand a list of specs for ATI and NV's next GPUs; it's just not going to happen. However, in this case, I gave direct links to quotes from people in the know, like Roy Taylor from NV.

Agreed, people should be able to go to Nvidia.com and clearly see titles that offer additional physics effects from having GPU PhysX. Gears of War and other similar titles shouldn't be listed.
Why shouldn't they be listed? They use the PhysX SDK for their software/CPU PhysX. If anything they should clearly denote whether they use hardware PhysX.

That's because the differences are very small, and some differences can be caused by the time of day. If it's so obvious, why was Very High left out of DX9?
They're not very small; look at any of the cliff faces on the 2nd page, or go in-game and look at images that aren't static 200x120 jpegs. There's no point in arguing this really; if there's no difference you might as well play on Medium settings and call it a day.

That's just it, we need faster CPUs for current GPUs and for PhysX. UT3 shows the biggest hit right now; FiringSquad showed it CPU bound at an avg of about 40fps with PhysX. Just a sign that PhysX is not going to see its true power until new CPUs come out. AA is GPU bound, so it's actually the opposite of PhysX.
No, faster PhysX parts will not distinguish themselves from slower parts, but that doesn't mean PhysX isn't showing an advantage over CPU-only solutions right now. Dependencies aside, the end result is the same, as you trade FPS for eye candy with features like AA or raising detail/effects fidelity.

The keyword is yet. Still it's great that Nvidia took the 1st step and it's free.
Of course PhysX is forward looking, just like any new feature set, but again, the difference is there is no "yet" for AMD/Intel, and once they finally push something out the door, that "yet" will be months/years behind anything NV is offering and what games are supporting in the way of hardware physics.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: apoppin
The keyword is yet. Still it's great that Nvidia took the 1st step and it's free.

I almost changed my mind.

After seeing what they are doing at Nvision with PhysX and 3-D/CUDA ... I think Nvidia has a much better than 50-50 chance to make PhysX a standard in gaming



I will give my reasons later .. but I have to head to my regular work for a few hours
- then I have an Nvision08 wrap-up to write and a game investigation to post

Unlike Anand, color me impressed with Nvision .. and also with where Nvidia is going with PhysX/CUDA/3-D gaming; they are simply extending their TWIMTBP program :p

Exactly. Except now they're actually offering tangible benefits. Glad you had fun, looks like it was a blast, especially GeForce LAN. Glad to see the NV perimeter detectors didn't xplode your Radeon! ;)
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
Taltamir, I read perfectly fine, thank you :p I know Crysis' first-order physics, on the scale of 3000 barrels, doesn't run very well, if at all hehe. But, erm, when are 3000 moving barrels going to add anything to my gaming experience :p You're using a completely ludicrous example. I'm talking oil barrels you can shoot at the top, and they will only empty down to where the hole is. I'm talking being able to mow down trees at exactly the point where you shoot them.

But I don't think you guys really get it, or perhaps I'm not getting it, who knows. I think Nvidia is doing great with what they are doing; ALL I'm saying is, PhysX is NOT a selling point as of yet, especially when you consider the ATI HD4850's better bang for buck, for example. That's it. No more, no less. I don't think anyone can convince me that, in a short period of time, there will be games out there that make an Nvidia card a must-have. And I don't think you can even prove it. I've been installing games for a long time now, and for a long time they've come with PhysX software, pretty damn often. Having PhysX doesn't mean it will be GPU-accelerated PhysX, nor that it will make a world of difference. This is in no way about DirectX 10.1, I don't care about all that. It's about saying, dude, you should buy an Nvidia card, because it has PhysX. That's where I disagree.

The performance hit you take when enabling PhysX is something I would accept if it really makes a difference, and I think physics can make a difference. Often it's pretty cool stuff, and if Nvidia can take it to the next level with GPU-accelerated PhysX, more power to them. If there are games for it, supporting it, then I will be the first to tell people to consider an Nvidia videocard because you get some awesome physics in lots of games, enhancing the gameplay experience. But before that, no can do. By the way, I was kind of planning on getting a HD4850, great value in my country, but I changed my mind. I'm going to wait a little longer, for multiple reasons, and the fact that an Nvidia videocard like a GTX260 might mean something extra for me as a reviewer only strengthened my position on waiting.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: MarcVenice

But I don't think you guys really get it, or perhaps I'm not getting it, who knows.

I think the video of Cell Factor is exactly what you are talking about. No?
http://www.youtube.com/watch?v...JiXaYs&feature=related

It's a great example of how physics directly affect game play and not just eye candy like bloom or HDR.

In fact I can't think of any other technology since 3D that has affected game play so much. AA/AF etc. don't affect game play. Physics actually does.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
We GET what you are talking about, we say that's exactly what we want, and we say PhysX can do it already to a limited degree (aka, it's a major step towards it).

I'm talking oil barrels you can shoot at the top, and they will only empty down to where the hole is.

PhysX can do it TODAY. It does not look 100% right (aka, gravity isn't right and the liquid is made out of noticeable droplets), but it CAN and IS being done.

If PhysX renders that only on the CPU, it will be at an unplayable frame rate; if on the GPU and CPU at the same time, it will work...

Question is, how many such oil barrels can be on the screen at the same time. But last week I ran a PhysX liquid demo that did exactly that on an impressively large scale (aka, amount of water) on my GPU.
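
Just to make the droplet idea concrete, here's a toy Python sketch of a barrel draining only down to the hole, with the liquid treated as discrete droplets. This is purely an illustration of the concept, not the PhysX SDK or its API, and every number and name in it is made up:

import random

BARREL_HEIGHT = 1.0   # metres, made up
HOLE_HEIGHT = 0.6     # the hole you shot into the barrel, 60% of the way up
NUM_DROPLETS = 1000   # coarse "noticeable" droplets, like in the demo

# fill the barrel: each droplet just gets a height inside it
droplets = [random.uniform(0.0, BARREL_HEIGHT) for _ in range(NUM_DROPLETS)]

def step(drops, drain_per_step=25):
    # only droplets sitting above the hole can escape; the rest stay put
    above = [h for h in drops if h > HOLE_HEIGHT]
    below = [h for h in drops if h <= HOLE_HEIGHT]
    return below + above[drain_per_step:]   # drain a handful per step

steps = 0
while any(h > HOLE_HEIGHT for h in droplets):
    droplets = step(droplets)
    steps += 1

print(f"stopped draining after {steps} steps, {len(droplets)} droplets left below the hole")

The real demo obviously adds gravity, pressure and droplet-to-droplet collisions on top of this, which is where the GPU horsepower goes, but the end effect is the same: the liquid level settles exactly at the hole.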
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
Originally posted by: Wreckage
I think the video of Cell Factor is exactly what you are talking about. No?
http://www.youtube.com/watch?v...JiXaYs&feature=related

It's a great example of how physics directly affect game play and not just eye candy like bloom or HDR.

In fact I can't think of any other technology since 3D that has affected game play so much. AA/AF etc. don't affect game play. Physics actually does.
Cell Factor can also run really smoothly without the PPU accelerator, but there's still no GPU PhysX support that I know of.

Still, I don't think this is what MarcVenice has in mind.

 

sourthings

Member
Jan 6, 2008
153
0
0
Every reference to physx in this thread is talking about demos, or online videos giving examples. Or a level of UT3, or two games, one awful, the other so-so. And then allusions to 'coming soon' garbage titles.

That right there is all you need to know to sum up physx at this point in time. By the time physx is actually worthwhile, if it even manages to get some quality market share, the current nv cards will be bargain cards.

Physx is not a reason to buy an nvidia card today or for a good six months to come.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
I know PhysX can do all that, taltamir, and it's ffing great. You don't get what I'm saying though. I'm saying "there are no games out there, and there won't be any games out there for a while, that will do all that kind of cool stuff, warranting an Nvidia videocard". Do you understand that? It's okay if you disagree, but then we'll have to agree to disagree, for a while, until some games DO start appearing that warrant an Nvidia videocard.

And no Wreckage, I'm not impressed with the CellFactor game. Yes, throwing crap around is fun, for 3 minutes, just like playing Pong is fun for 3 minutes. It does NOT enhance gameplay though. I'm talking FPS, or RTS, where stuff gets blown to pieces and leaves craters, where cover can be shot at and removed, or you can shoot THROUGH cover, like in GRAW 2. Those kinds of physics make you think twice before you blow something up, or leave you with more possibilities than just one. It forces you to think more about your actions: shooting a massive tower could block a road, shooting a building to pieces could kill its (civilian) inhabitants, and gets you bonus points, or not of course. It adds strategic value, and in a way, gameplay value. Just throwing around barrels because it looks cool is worthless. Fact is, most of those physics I described can already be found in some games, and don't need GPU acceleration. But those physics could be improved and expanded upon with the use of GPU acceleration; those games aren't here yet, though, and won't be out for a while. I do encourage Nvidia to try and get developers to use more physics. But it's still not a selling point.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
A reason to buy a card over all others? No...
A reason to tilt the perceived value of said card? Yes...

Would you say it is worth $5 to you when calculating price/performance? $0? $10? $20?
If it comes to the level of fruition that you suggest, it will give Nvidia such a huge overwhelming bonus that there will be absolutely no reason to buy a competitor card, at any price!
It is a question of how many dollars you are willing to "gamble", so to speak, on PhysX.

When comparing card A to card B, take the price of each, add to it the "added value" from things like speed disparity (which one is faster), PhysX, DX10.1, warranty, fan quality, etc., and see which one of them is worth more, and choose it.
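
In code form, the weighing described here is basically the following. A toy Python sketch; every dollar figure is a made-up placeholder, not a benchmark or a real price:

def effective_cost(price, bonuses):
    # subtract the dollar value you personally assign to each extra
    # (speed edge, PhysX, DX10.1, warranty, cooler, ...) from the price
    return price - sum(bonuses.values())

card_a = effective_cost(250, {"speed": 10, "warranty": 5, "dx10_1": 5})
card_b = effective_cost(250, {"physx": 10, "cooler": 5})

print("card A effective cost:", card_a)   # 230
print("card B effective cost:", card_b)   # 235 -> card A is the better buy with these numbers

Change the PhysX dollar figure to whatever you're willing to "gamble" and the comparison flips or holds accordingly.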

Originally posted by: MarcVenice
I know PhysX can do all that, taltamir, and it's ffing great. You don't get what I'm saying though. I'm saying "there are no games out there, and there won't be any games out there for a while, that will do all that kind of cool stuff, warranting an Nvidia videocard". Do you understand that? It's okay if you disagree, but then we'll have to agree to disagree, for a while, until some games DO start appearing that warrant an Nvidia videocard.

Actually I do finally understand. You say you understand PhysX on GPU could do that; there just aren't any actual games on sale today that use those features as of yet, only titles "in development". Very VERY valid point.
Future proofing in computer hardware is generally a bad idea, since by the time the "future" arrives, the hardware you bought is obsolete, can be had for peanuts, and is usually too weak to perform the future-proofed task at reasonable performance.

Which is why I personally assign a relatively low money value to PhysX when weighing competing cards. But it is more than $0.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: sourthings
Every reference to physx in this thread is talking about demos, or online videos giving examples. Or a level of UT3, or two games, one awful, the other so-so. And then allusions to 'coming soon' garbage titles.
Which is all far, far ahead of anything the competition offers. Of course it's a horse and carriage/chicken and egg situation: devs will simply not implement features for vaporware or until the tools are available. The difference is GPU PhysX is now a reality when it wasn't only a few months ago.

That right there is all you need to know to sum up physx at this point in time. By the time physx is actually worthwhile, if it even manages to get some quality market share, the current nv cards will be bargain cards.
When PhysX is actually worthwhile, it's going to be all or nothing if you don't have an NV card. And if it takes as long as you think and the current cards are bargain cards, you can just use them as a PhysX card instead of selling them at bargain basement prices on eBay. Win-win situation. :)

Physx is not a reason to buy an nvidia card today or for a good six months to come.
Not by itself, but it's certainly a consideration like any other feature if price and performance are similar.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
Here's the thing taltamir, you can say, OK, PhysX is worth $10 to me, so I don't mind if I pay $250 for a GTX260 instead of buying a $250 HD4870, which is slightly faster (let's say $10 worth faster). But if there are no games that really support PhysX, and the HD4870 gives you the exact same gaming experience with better framerates/more eyecandy, then you LOST $10 worth of performance. It's a gamble. If all things are equal, and you tell ppl to get a GTX260 because of PhysX instead of a HD4870, you are gambling with THEIR money.
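
Put as a toy expected-value calculation (Python again; the two $10 figures are the ones from the post above, while the probability p that PhysX games actually materialize during the card's life is a pure guess you'd have to supply yourself):

def expected_gain(p, physx_value_if_realized=10.0, perf_given_up=10.0):
    # you always give up the $10 of extra speed; you only collect the
    # PhysX value if supporting games actually show up
    return p * physx_value_if_realized - perf_given_up

for p in (0.1, 0.5, 0.9):
    print(f"p = {p}: expected gain from picking the PhysX card = {expected_gain(p):+.2f} dollars")

With these placeholder numbers the trade only breaks even at p = 1.0; you'd have to honestly value realized PhysX above the performance you gave up for the gamble to pay off, which is exactly the point being made here.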
 

sourthings

Member
Jan 6, 2008
153
0
0
Originally posted by: chizow
Originally posted by: sourthings
Every reference to physx in this thread is talking about demos, or online videos giving examples. Or a level of UT3, or two games, one awful, the other so-so. And then allusions to 'coming soon' garbage titles.
Which is all far, far ahead of anything the competition offers. Of course it's a horse and carriage/chicken and egg situation: devs will simply not implement features for vaporware or until the tools are available. The difference is GPU PhysX is now a reality when it wasn't only a few months ago.

That right there is all you need to know to sum up physx at this point in time. By the time physx is actually worthwhile, if it even manages to get some quality market share, the current nv cards will be bargain cards.
When PhysX is actually worthwhile, it's going to be all or nothing if you don't have an NV card. And if it takes as long as you think and the current cards are bargain cards, you can just use them as a PhysX card instead of selling them at bargain basement prices on eBay. Win-win situation. :)

Physx is not a reason to buy an nvidia card today or for a good six months to come.
Not by itself, but it's certainly a consideration like any other feature if price and performance are similar.

Physx is not some magical mystery, and it's a given that game developers will not program for only one brand of video card, shutting out a huge market to themselves. This in and of itself is why Physx, as it stands today, is going to go nowhere.

It's not a far stretch that if, a year from now, game developers en masse actually decide to bother with gpu physx and it becomes the norm, you will obviously see it on ATI hardware. And considering it runs on shaders, well, ATI and shaders...

A consumer who cares about performance will go the route of which card performs better, not offers useless features. A sensible consumer will look at framerates, not physx, a pointless feature.

If money is a concern, then they'll look at cost and contrast the price and performance of nv vs ati cards.

If they're sensible they will ignore physx and not let it influence their decision, which is exactly what nvidia wants: physx to influence purchasing decisions. Because at the moment, nvidia needs some gimmick to sway consumers.

Planning on using an nv card today to use as a secondary card for physx tomorrow is silly. If it ever does gain some market share, and it likely won't, developers will want to go the universal route, not leave people who want to buy their game out in the cold because they don't own nv. But if it does, you won't need a secondary card to run it; the performance hit should not be that huge. Not to mention you lose the ability for SLI and likely an add-in soundcard because your slots are cramped and full now.

Physx is a marketing trick to try and sell NV cards today, because ATI is doing a heck of a job trumping nvidia's offerings currently.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: sourthings
Physx is not some magical mystery, and it's a given that game developers will not program for only one brand of video card, shutting out a huge market to themselves. This in and of itself is why Physx, as it stands today, is going to go nowhere.
I've already gone through all this but I guess I can bludgeon it through one more thick skull. Devs will program for the bleeding-edge minority because they want to make the best possible games using all tools available to them. This has been proven time and time again, most recently with EAX5/X-Fi and DX10. The install base for GPU PhysX by NV's count has already reached 70 million in a PC market they dominate with a 2:1 edge on discrete GPU sales. By default there are more PhysX-capable machines than DX10-capable ones, as they use the same hardware but PhysX does not require Vista. We've seen numerous DX10 titles, and I'd estimate nearly 25-30% of my titles purchased over the last 2 years support DX10. Same goes for EAX, which is actually closer to 40-50% of my games; that's particularly impressive given Creative's market share is probably somewhere between 10-20% of gaming PCs based on the Valve survey.

It's not a far stretch that if, a year from now, game developers en masse actually decide to bother with gpu physx and it becomes the norm, you will obviously see it on ATI hardware. And considering it runs on shaders, well, ATI and shaders...
It's not obvious when ATI is blowing smoke up your ass about Havok and how they'll "support GPU physics when it's faster" and denying NGO samples to test on. NV has already stated numerous times they're willing to work with ATI to support PhysX on their parts... for a price of course. The other option is DX11, which won't release until next year, maybe 2010, and while it may be compatible with PhysX using some wrapper, that's certainly not a given.

A consumer who cares about performance will go the route of which card performs better, not offers useless features. A sensible consumer will look at framerates, not physx, a pointless feature.

If money is a concern, then they'll look at cost and contrast the price and performance of nv vs ati cards.
And when price and performance are similar, a sensible consumer will look at other features like PhysX; whether you consider them pointless or not is irrelevant.

If they're sensible they will ignore physx and not let it influence their decision, which is exactly what nvidia wants: physx to influence purchasing decisions. Because at the moment, nvidia needs some gimmick to sway consumers.
I think their products stand well on their own from top to bottom after the initial price adjustments; certainly the purchasing decisions on this forum indicate as much, and I'm sure the quarterlies/market share reports will as well. Don't mistake ATI being competitive again for Nvidia needing gimmicks to sell their cards.

Planning on using an nv card today to use as a secondary card for physx tomorrow is silly. If it ever does gain some market share, and it likely won't, developers will want to go the universal route, not leave people who want to buy their game out in the cold because they don't own nv. But if it does, you won't need a secondary card to run it; the performance hit should not be that huge. Not to mention you lose the ability for SLI and likely an add-in soundcard because your slots are cramped and full now.
It's not silly if you already own an NV card, like many here do/did. I'm sure there are plenty of people who will make nice use of an 8800GT or 9600GT in a year as a PhysX card when they upgrade, instead of trying to sell it for $30 on eBay or FS/FT.

Physx is a marketing trick to try and sell NV cards today, because ATI is doing a heck of a job trumping nvidia's offerings currently.
Really? I don't see it. NV still has the highest-performing single GPU and arguably the fastest overall solution with Tri-SLI. The only offering NV doesn't currently have an answer for is the 4870X2 at the very high end, but I certainly don't see the 40+ buyers we saw in that pre-release poll...... :)
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: sourthings
Every reference to physx in this thread is talking about demos, or online videos giving examples. Or a level of UT3, or two games, one awful, the other so-so. And then allusions to 'coming soon' garbage titles.
Well of course, just a few months after buying the company they should have inserted PhysX into every game ever made and every one currently in development. :roll:

That right there is all you need to know to sum up physx at this point in time. By the time physx is actually worthwhile, if it even manages to get some quality market share, the current nv cards will be bargain cards.
No, that right there is your limited view on the subject.

Physx is not a reason to buy an nvidia card today or for a good six months to come.
It's more than enough reason considering it gives them a huge edge over ATI, since they are about equal on price and performance. There are several games out there ready to play and more on the way. The fact that you can't, or refuse to, see that is not really an issue.

 

JACKDRUID

Senior member
Nov 28, 2007
729
0
0
Originally posted by: sourthings A sensible consumer will look at framerates, not physx, a pointless feature.

So you are saying you would prefer a card that gives you 200 fps with 0xAA over a card that gives 190 fps at 16xAA?
 

sourthings

Member
Jan 6, 2008
153
0
0
Originally posted by: chizow

And when price and performance are similar, a sensible consumer will look at other features like PhysX; whether you consider them pointless or not is irrelevant.

The 4870 is faster than the GTX260 at a similar price; the 4870 is 80-85% of a GTX280, sometimes faster, and $150 cheaper. The 4870X2 is faster than the GTX280 and, in many cases, faster than GTX280 SLI. 4870X2 CF is generally faster than GTX280 Tri-SLI, is $200 or so cheaper, and doesn't require the use of a horrible nvidia motherboard.

About the only answer ATI does not have to NV's lineup is something equal to the GTX280; they either have something with close to its performance, or more. The former is cheaper, the latter more expensive.

Have your PhysX, it's a revolution! It will change gaming as we know it. I'll revisit this happily six months from now, when the glorious PhysX has revolutionized the gaming market and no one is buying ATI cards anymore because every game is using PhysX and ATI has no hardware physics to speak of....

:roll:



 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: sourthings
Originally posted by: chizow

And when price and performance are similar, a sensible consumer will look at other features like PhysX; whether you consider them pointless or not is irrelevant.

The 4870 is faster than the GTX260 at a similar price; the 4870 is 80-85% of a GTX280, sometimes faster, and $150 cheaper. The 4870X2 is faster than the GTX280 and, in many cases, faster than GTX280 SLI. 4870X2 CF is generally faster than GTX280 Tri-SLI, is $200 or so cheaper, and doesn't require the use of a horrible nvidia motherboard.
I can find reviews refuting all these claims just as easily as you can find ones that support them. :)

About the only answer ATI does not have to NV's lineup is something equal to the GTX280; they either have something with close to its performance, or more. The former is cheaper, the latter more expensive.
The same can be said of the only part NV doesn't have a direct answer for, the 4870X2.

Have your PhysX, it's a revolution! It will change gaming as we know it. I'll revisit this happily six months from now, when the glorious PhysX has revolutionized the gaming market and no one is buying ATI cards anymore because every game is using PhysX and ATI has no hardware physics to speak of....

:roll:
Yep, I'm sure you will be back... but only if ATI is still competitive, right? ;) I've been through enough of these anti-progress threads to know they never come back because they're always wrong. But time will tell. :)


