NVIDIA, Epic add DX11 features to Unreal Engine 3


Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Funny you mention Unreal Engine 3, because I think this addition gives a real boost to PhysX. Assuming it's GPU accelerated, it will let developers more easily utilize hardware PhysX in games built on Unreal Engine 3. At that point I'd finally, after all these years, have to consider PhysX support a real point of interest in any video card I buy. As it stands today, PhysX is just another bullet point among the many features a video card supports, one I barely glance at. So after over three years and millions spent on "developer support", nVidia might finally have bought...errr...supported...the right developer to help push PhysX.

PhysX has been the default physics engine for Unreal Engine 3 since release, and it has supported hardware-accelerated (PPU) physics for nearly as long. AnandTech even did an article on it, which I linked earlier in the thread.

If it were going to magically drive GPU PhysX adoption, it would have done so across the hundreds of UE3 titles shipped so far, rather than the roughly three that make any use of it whatsoever. (Three UE3 titles, that is, not three titles overall across all engines.)


Some of the middleware supported by Unreal Engine 3:
PhysX
Scaleform GFx
TriOviz 3D
Steamworks
SpeedTree
Bink Video
FaceFX

And I'm sure there are others.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0


You are funny:
http://breachgame.com/game-info/

No matter how hard people try to pronounce PhysX dead...it's still kicking...and it's still the most used physics API :p
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
You are funny:
http://breachgame.com/game-info/

No matter how hard people try to pronounce PhysX dead...it's still kicking...and it's still the most used physics API :p

Funny? Why is it funny?
You basically said, "Look at this tech demo of APEX PhysX in Unreal Engine 3."
Then I said, "We had accelerated PhysX in Unreal Engine 3 doing accelerated destruction of objects 3+ years ago."
And that's funny?
What's funny is you acting like we're "getting there" because NV and Epic can show off things which are barely different from what we had 3 years ago, and calling it progress.

Oh, and I like your link to that game. I already linked it in your PhysX claims thread. It runs on the CPU and works on the Xbox 360 (apparently it's also crap).
So I don't get why you linked it, unless you are trying to show us that you can have destruction simulated by a physics engine without needing GPU power?

In fact, do you know what Breach does with the extra power offered by an NV GPU? It adds more particle effects. It doesn't actually improve the game-changing element; it just gives you more debris and so on.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Neither of those statements is correct.

You can't tack Dx10 onto a Dx9 game. You have to develop a completely separate Dx10 render path from scratch and offer it alongside the old Dx9 path.

The Dx11 API is NOT backwards compatible with the Dx10 API. The Dx11 API is capable of supporting Dx10- and Dx9-class hardware with a single render path, by disabling features that the lower-end hardware cannot use. You still have to write a brand-new Dx11 render path if you are trying to add support to an existing Dx9 or Dx10 game.

The only advantage Dx11 gives you is that if you are writing a new game or render path from scratch, you can potentially target all three feature levels at once with minimal additional work.
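To make the feature-level point concrete, here is a minimal sketch of the single-path device creation the Dx11 API allows, assuming the standard d3d11 headers (the function name is made up for illustration):

Code:
#include <d3d11.h>

// Minimal sketch: ask the runtime for the best feature level the installed
// GPU supports, from real Dx11 hardware down to Dx9-class hardware, all
// through one and the same Dx11 code path.
HRESULT CreateDeviceAnyLevel(ID3D11Device** device,
                             ID3D11DeviceContext** context,
                             D3D_FEATURE_LEVEL* got)
{
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0,  // Dx11 hardware
        D3D_FEATURE_LEVEL_10_1,  // Dx10.1 hardware
        D3D_FEATURE_LEVEL_10_0,  // Dx10 hardware
        D3D_FEATURE_LEVEL_9_3,   // Dx9-class hardware
    };
    return D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                             wanted, ARRAYSIZE(wanted), D3D11_SDK_VERSION,
                             device, got, context);
}

The game then switches features off (tessellation, compute, and so on) whenever 'got' comes back below D3D_FEATURE_LEVEL_11_0.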

Which part is not a fact? Look up Batman: AA or AA+deferred shading on Google and you will find that it can be done. I never said it was easy; I said it is possible, and it has been done multiple times in different games.

About the Dx11 API: it is backward compatible with Dx10 and even some Dx9-class hardware. In short, a game written against Dx11 can run on a Dx10 card, and that was the reason for the massive change between Dx9 and Dx10. A little history recap: Dx9 does everything Dx10 can do; in fact it is a bit more powerful than Dx10. The problem with Dx9 was that different vendors offered different features. To support them, Dx9 accumulated a lot of "capability bits" (CAPS) across its revisions. Powerful as that was, the CAPS got so messy that:
  • there were simply too many of them;
  • different video cards from the same vendor used different CAPS for the same feature, never mind different vendors;
  • no one really knew whether a new feature would work without testing every video card in existence;
  • no one really knew whether two or three features behind different CAPS could be used at the same time.
These are the reasons games were so unstable back then. Vista was supposed to fix this with Dx10, but unfortunately it ended up worse than Dx9. Remember?
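For anyone who never wrote against it, a minimal sketch of what that CAPS dance looked like in practice, assuming the standard d3d9 headers (the function name is made up):

Code:
#include <d3d9.h>

// Minimal sketch of the Dx9 CAPS mess: every feature had to be sniffed
// per adapter before use.
bool CheckSomeDx9Caps()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return false;

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // Two of the hundreds of capability bits a Dx9 game had to test:
    bool ps30     = caps.PixelShaderVersion >= D3DPS_VERSION(3, 0);
    bool npatches = (caps.DevCaps & D3DDEVCAPS_NPATCHES) != 0;

    d3d->Release();
    return ps30 && npatches;
}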

So, to simplify things, Dx10 was developed by removing all the old CAPS and laying out a new standard. That standard is simply a set of hardware capabilities the hardware must have, which is also known as Dx10 compatibility. On top of that sits a standard API that uses them: the Dx10 API. (Please note that dynamic tessellation was on ATI Dx9 cards, but Dx10 failed to support it because it was really a vendor-specific feature at the time, and therefore didn't get classified as a minimum hardware capability.)

By now you should be able to see that you don't need to use Dx10 to utilize Dx10 features. Programmers can make calls from the application layer down to the device layer (or even the graphics layer) as long as the hardware itself supports the feature (and of course the video card supports it if it is Dx10 compatible!).
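As one real-world illustration of that application-to-driver path, drivers expose Dx10-class depth-texture reads to Dx9 applications through the unofficial 'INTZ' FOURCC format. A minimal sketch, assuming the d3d9 headers (the function name is made up):

Code:
#include <d3d9.h>

// Sketch of a vendor "back door" in Dx9: probe for the unofficial 'INTZ'
// format. The Dx9 runtime knows nothing about the feature; the driver does.
bool DriverExposesIntz(IDirect3D9* d3d)
{
    const D3DFORMAT INTZ = (D3DFORMAT)MAKEFOURCC('I', 'N', 'T', 'Z');
    return SUCCEEDED(d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_DEPTHSTENCIL, D3DRTYPE_TEXTURE, INTZ));
}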

If you ask what happens to features exclusive to Dx11: sorry, those don't exist. Dx is nothing more than an abstraction layer over the device layer so that programming can be done easily. There will only be features that Dx doesn't support, never the other way around (otherwise the video card couldn't be called Dx11 compatible!).

So don't be shocked if AMD suddenly releases a driver that makes tessellation possible on the 4xxx series. It is possible. Ask yourself what happens if you are a vendor with new eye candy that only your hardware can run, and you'd like to call it PhysY... I believe you get the idea.


It's amazing how people who know nothing about coding (not talking about you) keep making the same uninformed claims :|

The myth about "tacked on" is really stupid and needs to die.
Personal attack: check.
No value: check.
I too am amazed how people who know nothing about programming keep making the same uninformed claims :|
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Dx9 does everything Dx10 can do
Most definitely not.
Dx9 does everything Dx10 can do; in fact it is a bit more powerful than Dx10
Actually, if they are doing the same thing, DX10 is cheaper (in terms of computational resources). DX10 games are typically heavier because they get fancy and do much more.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Which part is not a fact? Look up Batman: AA or AA+deferred shading on Google and you will find that it can be done. I never said it was easy; I said it is possible, and it has been done multiple times in different games.

About the Dx11 API: it is backward compatible with Dx10 and even some Dx9-class hardware. In short, a game written against Dx11 can run on a Dx10 card, and that was the reason for the massive change between Dx9 and Dx10. A little history recap: Dx9 does everything Dx10 can do; in fact it is a bit more powerful than Dx10. The problem with Dx9 was that different vendors offered different features. To support them, Dx9 accumulated a lot of "capability bits" (CAPS) across its revisions. Powerful as that was, the CAPS got so messy that:
  • there were simply too many of them;
  • different video cards from the same vendor used different CAPS for the same feature, never mind different vendors;
  • no one really knew whether a new feature would work without testing every video card in existence;
  • no one really knew whether two or three features behind different CAPS could be used at the same time.
These are the reasons games were so unstable back then. Vista was supposed to fix this with Dx10, but unfortunately it ended up worse than Dx9. Remember?

So, to simplify things, Dx10 was developed by removing all the old CAPS and laying out a new standard. That standard is simply a set of hardware capabilities the hardware must have, which is also known as Dx10 compatibility. On top of that sits a standard API that uses them: the Dx10 API. (Please note that dynamic tessellation was on ATI Dx9 cards, but Dx10 failed to support it because it was really a vendor-specific feature at the time, and therefore didn't get classified as a minimum hardware capability.)

By now you should be able to see that you don't need to use Dx10 to utilize Dx10 features. Programmers can make calls from the application layer down to the device layer (or even the graphics layer) as long as the hardware itself supports the feature (and of course the video card supports it if it is Dx10 compatible!).

If you ask what happens to features exclusive to Dx11: sorry, those don't exist. Dx is nothing more than an abstraction layer over the device layer so that programming can be done easily. There will only be features that Dx doesn't support, never the other way around (otherwise the video card couldn't be called Dx11 compatible!).

So don't be shocked if AMD suddenly releases a driver that makes tessellation possible on the 4xxx series. It is possible. Ask yourself what happens if you are a vendor with new eye candy that only your hardware can run, and you'd like to call it PhysY... I believe you get the idea.



Personal attack: check.
No value: check.
I too am amazed how people who know nothing about programming keep making the same uninformed claims :|

Try tessellating (on the GPU) in DX9/10...
Try multi-threading (on the CPU) in DX9/10... (see the sketch below)
Try looking at OpenCL:
http://en.wikipedia.org/wiki/Open_CL

It really is amazing, isn't it...

What is next...DX10 for XP?
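For reference, a minimal sketch of what multi-threading under DX11 actually is: worker threads record draw commands into deferred contexts, and the main thread replays the resulting command lists. Dx9/Dx10 have no equivalent (assumes the d3d11 headers; the function names are made up):

Code:
#include <d3d11.h>

// Worker thread: record commands into a deferred context.
void RecordOnWorkerThread(ID3D11Device* device, ID3D11CommandList** listOut)
{
    ID3D11DeviceContext* deferred = nullptr;
    device->CreateDeferredContext(0, &deferred);

    // ... issue draw calls / state changes on 'deferred' here ...

    deferred->FinishCommandList(FALSE, listOut);
    deferred->Release();
}

// Main thread: replay the recorded commands on the immediate context.
void ReplayOnMainThread(ID3D11DeviceContext* immediate, ID3D11CommandList* list)
{
    immediate->ExecuteCommandList(list, FALSE);
}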
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
So don't be shocked if AMD suddenly releases a driver that makes tessellation possible on the 4xxx series. It is possible. Ask yourself what happens if you are a vendor with new eye candy that only your hardware can run, and you'd like to call it PhysY... I believe you get the idea.

Personal attack: check.
No value: check.
I too am amazed how people who know nothing about programming keep making the same uninformed claims :|

That question was asked long ago, back when it became known that there was a tessellator on the GPU, in this article:
http://www.extremetech.com/article2/0,2845,2329315,00.asp
When Gee was asked if the hardware tessellator currently built into AMD Radeon HD series GPUs would support DX11 tessellation, the answer was "No."
Gee went on to explain that DX11 tessellation is more robust and general than the solution built into current AMD GPUs. The AMD hardware is essentially the same as the tessellation unit in the Xbox 360; DX11 tessellation is a superset of the AMD approach.
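In API terms, "more robust and general" means two fully programmable stages (hull and domain shaders) wrapped around the fixed tessellator. A minimal sketch, assuming the d3d11 headers (hsBlob/dsBlob stand in for compiled shader bytecode, and all names are hypothetical):

Code:
#include <d3d11.h>

// Bind the two programmable DX11 tessellation stages around the fixed
// tessellator; error handling omitted for brevity.
void BindTessellation(ID3D11Device* device, ID3D11DeviceContext* ctx,
                      ID3DBlob* hsBlob, ID3DBlob* dsBlob)
{
    ID3D11HullShader*   hs = nullptr;
    ID3D11DomainShader* ds = nullptr;
    device->CreateHullShader(hsBlob->GetBufferPointer(),
                             hsBlob->GetBufferSize(), nullptr, &hs);
    device->CreateDomainShader(dsBlob->GetBufferPointer(),
                               dsBlob->GetBufferSize(), nullptr, &ds);

    ctx->HSSetShader(hs, nullptr, 0);
    ctx->DSSetShader(ds, nullptr, 0);
    // The tessellator consumes patches, not plain triangles:
    ctx->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_3_CONTROL_POINT_PATCHLIST);
}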
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Try tessellating (on the GPU) in DX9/10...

DX9 Tessellation SDK
That is from AMD's website, in case you didn't know.
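For context, even stock Dx9 exposed a crude form of tessellation through N-patches, the path ATI's TruForm used; AMD's SDK goes further via driver extensions. A minimal sketch, assuming an already-created device and queried caps:

Code:
#include <d3d9.h>

// Enable Dx9 N-patch tessellation if the card advertises it.
void EnableNPatches(IDirect3DDevice9* dev, const D3DCAPS9& caps)
{
    if (caps.DevCaps & D3DDEVCAPS_NPATCHES)
        dev->SetNPatchMode(4.0f);  // subdivide each triangle into ~4 segments
}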
Try multi-threading (on the CPU) in DX9/10...
There was an Ageia tech demo some years ago. Look at the utilization across the cores. Turn the speakers on and hear the reason why the 4th core isn't completely utilized.
Read
It really is amazing isn't it...

What is next...DX10 for XP?
That's old news, bro.
 

HeXen

Diamond Member
Dec 13, 2009
7,835
37
91
OpenGL 4.1 FTW. :)
Let's get some real games on Linux for a change.
*edit: From my understanding, you can make any API do whatever hardware functions the hardware is capable of; you just have to code it. I think the part that makes DX easy for development is that it's already supported, so there's less coding and hassle?
I read that the UE4 engine will use an internal API, or I guess basically the engine will talk to the hardware itself. That's how it was put, anyway.

Also, remember software rendering? lol, they're supposed to bring it back, I guess.
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,551
136
PhysX has been the default physics engine for Unreal Engine 3 since release, and it has supported hardware-accelerated (PPU) physics for nearly as long. AnandTech even did an article on it, which I linked earlier in the thread.

If it were going to magically drive GPU PhysX adoption, it would have done so across the hundreds of UE3 titles shipped so far, rather than the roughly three that make any use of it whatsoever. (Three UE3 titles, that is, not three titles overall across all engines.)


Some of the middleware supported by Unreal Engine 3:
PhysX
Scaleform GFx
TriOviz 3D
Steamworks
SpeedTree
Bink Video
FaceFX

And I'm sure there are others.

Actually, I was under the impression that it used software physics, especially considering the recent nVidia and Epic announcement at GDC. Then again, I haven't paid as much attention to gaming recently. Two kids in the house will do that.

You are funny:
http://breachgame.com/game-info/

No matter how hard people try to pronounce PhysX dead...it's still kicking...and it's still the most used physics API :p

A sick horse will still kick. Doesn't mean it's healthy.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Actually, I was under the impression that it used software physics, especially considering the recent nVidia and Epic announcement at GDC. Then again, I haven't paid as much attention to gaming recently. Two kids in the house will do that.



A sick horse will still kick. Doesn't mean it's healthy.

It can use either. There were three special maps released to showcase hardware PhysX in UT3, but the core game uses CPU PhysX only. The engine integrates PhysX, so GPU acceleration is obviously available for anyone to use, but most people don't.

Lonbjerg likes to flog the dead horse of PhysX being super popular, acting as if that has any meaning whatsoever when it comes to GPU PhysX.

I have a pretty picture for that.
[attached image: physx.png]

I'm sure he'll tell me if I missed any games. (Games, not tech demos, benchmarks, or in-development titles.)
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
So which handles physics better/faster for AMD: the CPU, or GPU PhysX?

I say who cares, but if something gives me a free feature that I might be able to use in some games, I like that idea. Don't you guys?
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0

DX9 Tessellation SDK
That is from AMD website in case you don't know.

Comparing apples to oranges...or would you call the two implementations the same in capability and performance?


Ageia tech demo some years ago. Look at the utilization on cores. Turn on speak and hear about the reason why the 4th core isn't completely utilized.

Who is ignorant now?!
http://www.anandtech.com/show/2716/4

If you don't even know what multithreading under DX11 means, don't post all smarty-pants again.
But if I am to humor your ignorance...try running RealityMark or ANY other publicly available demo or PhysX game from the AGEIA era...and then report back to us.

I know the result...you obviously don't.

Read up:
  • New data types including 3-component vectors and additional image formats;
  • Handling commands from multiple host threads and processing buffers across multiple devices;
  • Operations on regions of a buffer including read, write and copy of 1D, 2D or 3D rectangular regions;
  • Enhanced use of events to drive and control command execution;
  • Additional OpenCL C built-in functions such as integer clamp, shuffle and asynchronous strided copies;
  • Improved OpenGL interoperability through efficient sharing of images and buffers by linking OpenCL and OpenGL events.
Most of these changes are possible due to capabilities introduced with DX11-class hardware.
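A quick way to check whether a given device exposes any of this is to query its reported OpenCL version; a minimal sketch, assuming an OpenCL 1.1 SDK and at least one GPU platform:

Code:
#include <CL/cl.h>
#include <cstdio>

// Query the first GPU's OpenCL version string; "OpenCL 1.1" means the
// features listed above (sub-buffers, rect copies, event callbacks...) exist.
int main()
{
    cl_platform_id platform;
    cl_device_id   device;
    if (clGetPlatformIDs(1, &platform, nullptr) != CL_SUCCESS) return 1;
    if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr)
            != CL_SUCCESS) return 1;

    char version[256] = {};
    clGetDeviceInfo(device, CL_DEVICE_VERSION, sizeof(version), version, nullptr);
    std::printf("%s\n", version);
    return 0;
}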

That's old news, bro.

Thanks for confirming that you truly are ignorant about the topic:
http://beyond3d.com/content/articles/55/

Now show me DX10 games running under XP with no artifacts...oh wait...you can't!

:thumbsdown::thumbsdown::thumbsdown::thumbsdown:

It really is old news that it is NOT possible, so perhaps you should do some serious reading up before posting debunked myths as facts...


This manner of posting is counter-productive and inflammatory. Please do not post in this manner.

This is a technical forum, not elementary school, and you are expected to conduct yourself accordingly.

Please familiarize yourself with the AnandTech Forum Guidelines:
We want to give all our members as much freedom as possible while maintaining an environment that encourages productive discussion. It is our desire to encourage our members to share their knowledge and experiences in order to benefit the rest of the community, while also providing a place for people to come and just hang out.

We also intend to encourage respect and responsibility among members in order to maintain order and civility. Our social forums will have a relaxed atmosphere, but other forums will be expected to remain on-topic and posts should be helpful, relevant and professional.

We ask for respect and common decency towards your fellow forum members.

(I'm going to keep quoting this same body of text, over and over again, because some of our VC&G forum members appear to have a really difficult time remembering it)

Moderator Idontcare
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
It can use either. There were three special maps released to showcase hardware PhysX in UT3, but the core game uses CPU PhysX only. The engine integrates PhysX, so GPU acceleration is obviously available for anyone to use, but most people don't.

Lonbjerg likes to flog the dead horse of PhysX being super popular, acting as if that has any meaning whatsoever when it comes to GPU PhysX.

I have a pretty picture for that.
[attached image: physx.png]

I'm sure he'll tell me if I missed any games. (Games, not tech demos, benchmarks, or in-development titles.)


Compared to what?

And your list is outdated by a major factor:

http://physxinfo.com/index.php?p=gam&f=all
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Compared?
The red line is how many titles are released per year with CPU-only PhysX; the blue line, how many titles per year with GPU PhysX.

Other physics middleware, perhaps?

And his list is still outdated...

And last I looked, 20 > 0...
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
Every single GPU PhysX title is here:

http://physxinfo.com/data/vreview.html

There are 15... BUT one of those games only uses GPU PhysX in a couple of levels, so it counts as only 0.3 of a game, and another is a free tech-demo download from nVidia, so it's off the list.

13.3 games in 5 years use GPU PhysX.

:D:D:D:D

^^ Assume these smilies represent me laughing at how ridiculous a joke GPU PhysX has proven itself to be.
 

PingviN

Golden Member
Nov 3, 2009
1,848
13
81
Other physics middleware, perhaps?

So PhysX hardware acceleration in 20 or so games over 5 years of existence makes it a success because other physics middleware doesn't have GPU acceleration?

Looking at the numbers, I can understand why no one but Nvidia invests heavily in GPU physics. It doesn't seem to take off despite Nvidia pushing it whenever they get the chance, does it?