Originally posted by: SSChevy2001
I don't see Tim Sweeney making any excuses.
Tim Sweeney is responsible for the engine used in Mirror's Edge, so what's your point?
Originally posted by: SSChevy2001
I don't see Tim Sweeney making any excuses.
Originally posted by: Cookie Monster
Edit - Misread!
Originally posted by: Scali
He said: "When not using 10.1". I think you owe him an apology.
Originally posted by: Scali
Originally posted by: SSChevy2001
I don't see Tim Sweeney making any excuses.
Tim Sweeney is responsible for the engine used in Mirror's Edge, so what's your point?
Originally posted by: Keysplayr
At two or three times the development time, it doesn't sound like he's actually "loving" it either. And it would appear that not all parts of the game are multithreaded, just the ones he mentioned here.
My point is UE4 is in the works to avoid single-threaded code. Instead of making excuses, he's working on a new engine that will be ready for future CPUs.
Originally posted by: Scali
Originally posted by: SSChevy2001
I don't see Tim Sweeney making any excuses.
Tim Sweeney is responsible for the engine used in Mirror's Edge, so what's your point?
Originally posted by: SSChevy2001
My point is UE4 is in the works to avoid single-threaded code. Instead of making excuses, he's working on a new engine that will be ready for future CPUs.
Originally posted by: SSChevy2001
Mirror's Edge is a title made to market GPU PhysX.
Originally posted by: SSChevy2001
Everything but glass breaking works fine with PhysX running on my CPU, yet there's no option to just enable the cloth effects, so why is that? IMO the extra glass effect in this title is useless anyway, as you don't have time to really look at glass on the floor.
Originally posted by: SSChevy2001
@Scali
I've read your post, but developers need to continue to try and make the most of these extra cores. What do you think PS3 developers are doing with the extra SPEs?
Originally posted by: SSChevy2001
One of DX11's features is multithreaded rendering.
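(For reference, D3D11 exposes that multithreaded rendering through deferred contexts and command lists. A minimal, hypothetical C++ sketch; the window-less device, the viewport numbers, and the trimmed error handling are illustrative only, not anything posted in this thread:)

// Build with d3d11.lib on Windows. A deferred context records commands
// (typically on a worker thread); the immediate context replays them.
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* immediate = nullptr;
    D3D_FEATURE_LEVEL level;
    D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                      nullptr, 0, D3D11_SDK_VERSION, &device, &level, &immediate);

    ID3D11DeviceContext* deferred = nullptr;
    device->CreateDeferredContext(0, &deferred);

    // Commands issued on the deferred context are recorded, not executed.
    D3D11_VIEWPORT vp = { 0.0f, 0.0f, 1280.0f, 720.0f, 0.0f, 1.0f };
    deferred->RSSetViewports(1, &vp);

    ID3D11CommandList* cmdList = nullptr;
    deferred->FinishCommandList(FALSE, &cmdList);

    // The main thread then plays the recorded list back on the immediate context.
    immediate->ExecuteCommandList(cmdList, FALSE);
    std::printf("command list replayed on the immediate context\n");

    cmdList->Release(); deferred->Release(); immediate->Release(); device->Release();
    return 0;
}

The point of the design is that several worker threads can each record their own command list while submission stays on the main thread's immediate context.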
Originally posted by: SSChevy2001
My point is Mirror's Edge is a good example of a title where PhysX effects were removed from the CPU path to market the need for GPU PhysX.
Originally posted by: SSChevy2001
That still doesn't change the fact that extra glass breaking effect in Mirror's Edge is IMO useless.
Originally posted by: SSChevy2001
That still doesn't change the fact that extra glass breaking effect in Mirror's Edge is IMO useless.
Originally posted by: Keysplayr
At two or three times the development time, it doesn't sound like he's actually "loving" it either. And it would appear that not all parts of the game are multithreaded, just the ones he mentioned here.
Yes, I know what taskmgr is. While I'm not a developer, I can understand that you just can't break up a game to run 100% on a quad core. What I expect to see is more titles favoring quad-core CPUs by more than just 5%.
Originally posted by: Scali
What makes you think developers aren't already doing that?
You DO understand that Task Manager doesn't mean anything, right? I mean, I didn't make that long post for nothing, I hope.
Both PhysX and Havok are already highly optimized for multicore systems. So at any rate it's not going to be the physics that aren't properly optimized.
And as Keysplayr already posted, UnrealEngine3 itself is also optimized for multicore. Just like most other major engines.
The bottom line is: multicore processors don't work as well as laymen think they do.
You might want to go to Wikipedia and read about Amdahl's Law to see why that might be.
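(For reference, since Amdahl's Law keeps coming up: the speedup from extra cores is capped by whatever fraction of the work stays serial. A minimal C++ sketch with a purely illustrative 60% parallel fraction, nothing measured from any actual game:)

#include <cstdio>

// Amdahl's Law: speedup = 1 / ((1 - p) + p / n)
// p = fraction of the frame's work that can run in parallel, n = core count.
static double amdahl_speedup(double p, int n) {
    return 1.0 / ((1.0 - p) + p / n);
}

int main() {
    std::printf("4 cores,  p=0.60: %.2fx\n", amdahl_speedup(0.60, 4));   // ~1.82x, not 4x
    std::printf("16 cores, p=0.60: %.2fx\n", amdahl_speedup(0.60, 16));  // ~2.29x, diminishing returns
    return 0;
}

So even perfectly threaded physics can't pull the whole frame to a 4x gain on a quad core when the rest of the frame stays serial.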
Agreed. Only some people will act like these extra cloth, smoke, and debris effects can't run on current CPUs, which is not the case.
Originally posted by: Scali
EA didn't remove anything. The console versions of Mirror's Edge are the same as the PC version without GPU PhysX.
They just *added* GPU PhysX effects for nVidia cards.
I couldn't care less what you like. It's a fast-paced running game, which leaves little time to enjoy some extra glass on the floor. What's ironic is that it's the only effect in the game that causes CPUs to crawl.
Originally posted by: Scali
It also doesn't change the fact that I don't care about your opinion.
For a fast-paced title like Mirror's Edge, yes, it's useless. Could you really see Nvidia trying to sell GPU PhysX with just that effect? :roll:
Originally posted by: Wreckage
Originally posted by: SSChevy2001
That still doesn't change the fact that extra glass breaking effect in Mirror's Edge is IMO useless.
Just as useless as AA/AF/HDR, heck even trees, clothes, etc.
Atari nailed it with the 2600 and everything since has just been useless fluff. :roll:
Originally posted by: Keysplayr
Originally posted by: Lonyo
Originally posted by: Keysplayr
Originally posted by: SickBeast
Originally posted by: Wreckage
I see, so a demo that no one but they can see = released games on the market?
Originally posted by: SickBeast
AMD has apparently been able to get Havok up and running on their GPUs using OpenCL, if you actually read the article, Wreckage. That's not too far off from what NV has with PhysX at this point.
Sure that sounds the same :roll:
Now I know you are trolling.
I said that they're not too far off, not that they are the same. :light:
It would be nice if you would actually read my posts and take what I say at face value.
The stuff that NV has done so far has not altered gameplay in any way. IMO there is not much difference between a demo and a few added effects that do not alter gameplay.
Some more PhysX games are not too far off as well. Does that count? Or is that only a perk for ATI and Havok?
BTW: I read your post. And this is the face value I got from it.
AA doesn't alter gameplay, but would you buy a card without it? Nahh...
Since you and Wreckage both like AA a lot, and (IIRC) DX10.1 allows much faster AA (or can do), then surely NV is the worst company, because they don't support DX10.1 and so can't run AA as high without slowing the frame rate down too much.
Who cares if it only applies to a handful of titles, it improves graphical quality and/or frame rate.
Why buy a card with PhysX but no DX10.1 over a card with DX10.1 but no PhysX? Surely they both have the same value?
Ah, thank you Lonyo. I was hoping somebody would go there. But I see that Wreckage and also Scali had some insight on this theory of yours. And I'd like to add that both ATI cards and Nvidia cards perform AA. This is not a debate on whether or not these cards can perform AA; it was a metaphor stating, if one DID perform AA and the other DID NOT AT ALL, which would you choose? Even though AA DOESN'T alter game content and behavior, you'd still want to have it. I don't care what planet you're from. Can you see the purpose of the analogy now? So, dude, you're kind of helping me with my points more than hindering me. Kudos.
Originally posted by: Wreckage
Originally posted by: SSChevy2001
For a fast-paced title like Mirror's Edge, yes, it's useless. Could you really see Nvidia trying to sell GPU PhysX with just that effect? :roll:
Originally posted by: Wreckage
Originally posted by: SSChevy2001
That still doesn't change the fact that extra glass breaking effect in Mirror's Edge is IMO useless.
Just as useless as AA/AF/HDR, heck even trees, clothes, etc.
Atari nailed it with the 2600 and everything since has just been useless fluff. :roll:
I remember when ATI fans were saying that adding AA to HDR was the greatest thing in the history of gaming. :laugh:
The hypocrisy is so thick here.
Originally posted by: SickBeast
Originally posted by: Keysplayr
Originally posted by: Lonyo
Originally posted by: Keysplayr
Originally posted by: SickBeast
Originally posted by: Wreckage
I see, so a demo that no one but they can see = released games on the market?
Originally posted by: SickBeast
AMD has apparently been able to get Havok up and running on their GPUs using OpenCL, if you actually read the article, Wreckage. That's not too far off from what NV has with PhysX at this point.
Sure that sounds the same :roll:
Now I know you are trolling.
I said that they're not too far off, not that they are the same. :light:
It would be nice if you would actually read my posts and take what I say at face value.
The stuff that NV has done so far has not altered gameplay in any way. IMO there is not much difference between a demo and a few added effects that do not alter gameplay.
Some more PhysX games are not too far off as well. Does that count? Or is that only a perk for ATI and Havok?
BTW: I read your post. And this is the face value I got from it.
AA doesn't alter gameplay, but would you buy a card without it? Nahh...
Since you and Wreckage both like AA a lot, and (IIRC) DX10.1 allows much faster AA (or can do), then surely NV is the worst company, because they don't support DX10.1 and so can't run AA as high without slowing the frame rate down too much.
Who cares if it only applies to a handful of titles, it improves graphical quality and/or frame rate.
Why buy a card with PhysX but no DX10.1 over a card with DX10.1 but no PhysX? Surely they both have the same value?
Ah, thank you Lonyo. I was hoping somebody would go there. But I see that Wreckage and also Scali had some insight on this theory of yours. And I'd like to add that both ATI cards and Nvidia cards perform AA. This is not a debate on whether or not these cards can perform AA; it was a metaphor stating, if one DID perform AA and the other DID NOT AT ALL, which would you choose? Even though AA DOESN'T alter game content and behavior, you'd still want to have it. I don't care what planet you're from. Can you see the purpose of the analogy now? So, dude, you're kind of helping me with my points more than hindering me. Kudos.
Look, PhysX is a really cool idea and can produce some incredible effects that are way more impressive than AA in my books. The problem is that AA is supported by all hardware and works with all games. PhysX only runs on NV hardware and is supported by a handful of games.
I can see why you would make the comparison, Keys. PhysX is some really amazing eye candy. CUDA will more than likely do some absolutely incredible things as well. The problem is that as a closed platform, CUDA and PhysX will never fully catch on. Please tell your NV masters to try to get PhysX established as a universal standard, or else cave and go with Havok if that's what AMD is using. We need a standard.
I don't think anyone is arguing the value of hardware physics and GPGPU applications. We're just saying that without an open standard it's pretty pointless if you look at the big picture.
Originally posted by: SickBeast
Originally posted by: SSChevy2001
For a fast-paced title like Mirror's Edge, yes, it's useless. Could you really see Nvidia trying to sell GPU PhysX with just that effect? :roll:
Originally posted by: Wreckage
Originally posted by: SSChevy2001
That still doesn't change the fact that extra glass breaking effect in Mirror's Edge is IMO useless.
Just as useless as AA/AF/HDR, heck even trees, clothes, etc.
Atari nailed it with the 2600 and everything since has just been useless fluff. :roll:
I think those PhysX effects are valuable enough. It would be better if AMD hardware could run them, though.
Originally posted by: SickBeast
I'm pretty sure that some people actually managed to hack PhysX to run on AMD GPUs a while back. Please correct me if I'm wrong. I think I remember reading that it actually ran faster on the AMD GPUs as well.
Originally posted by: Keysplayr
They probably can't. We talked about a theory in this thread, that ATI knew it could not run PhysX on its GPUs, and quickly turned it down and signed on with Havok. This was how long ago? I dunno. The thing is, I don't think ATI can run any sort of physics on its GPUs.
Unless they are severely limited with staff and programmers, we should have seen a whole lot of physics on their GPUs by now. Like we said earlier in the thread, this is just ATI postponing the inevitable, IMHO. Please let me be wrong.
Originally posted by: Keysplayr
Originally posted by: SickBeast
I'm pretty sure that some people actually managed to hack PhysX to run on AMD GPUs a while back. Please correct me if I'm wrong. I think I remember reading that it actually ran faster on the AMD GPUs as well.
One programmer, with Nvidia's assistance, was able to get PhysX to run on a 3-series ATI GPU.
But where did you get the idea that it was faster? Send a link if you have one. Anyway, the effort was abandoned after an unsupportive ATI declined to help the programmer.
When I say I don't think ATI can run PhysX on their GPUs, I mean "physics" in general (whether it's PhysX, Havok, Bullet, or whatever else is out there) properly or fast enough, IMHO.
