
AMD vs. Nvidia - Prefs & Why

Page 10
Some people truly believe a placebo is actually working for them; me, though, I prefer truth and facts. I have presented irrefutable, undeniable facts, with links provided, and anybody with reason can see that Nvidia PhysX is a sham after they read the articles I have posted.

From Semi-Accurate?.....CharLIE?...LMAO!
 
From Semi-Accurate?.....CharLIE?...LMAO!
It's more facts than 100 fanboy opinions could legitimately offer as tangible and truthful to the reality of things as they stand today. What, and how, are you going to prove that PhysX is not BOLLOCKS, as you stand there and cry blasphemy against precious Nvidia? LOL.
 
Regarding PhysX:

I think everyone agrees that GPU physics is coming with the next-gen engines/consoles. UE4 and the Agni tech demo have shown that quite beautifully. Problem is: until this tech becomes mainstream (i.e. consoles), no dev will code stuff like this only for the PC, because it just doesn't pay. So until we're there, there are only two possibilities:

1. Everyone does nothing. We don't get nice effects.
2. Someone adds advanced effects to games.

I prefer 2. I don't care if it is AMD, Nvidia or Santa Claus; it IS a plus. PhysX neither accelerates nor slows down the adoption of GPU-based physics, so it has only one single positive effect: effects vs. no effects. I don't get why people are so upset about that.
 
It's more facts than 100 fanboy opinions could legitimately offer as tangible and truthful to the reality of things as they stand today. What, and how, are you going to prove that PhysX is not BOLLOCKS, as you stand there and cry blasphemy against precious Nvidia? LOL.

http://physxinfo.com/news/199/physx-most-popular-physics-library/

http://physxinfo.com/news/5645/physx-sdk-in-top-5-middleware-libraries-used/

I think I have proved PhysX is not Bollocks!🙂 Gordon, you're wonderful for forums!🙂
 
Regarding PhysX:

I think everyone agrees that GPU physics is coming with the next-gen engines/consoles. UE4 and the Agni tech demo have shown that quite beautifully. Problem is: until this tech becomes mainstream (i.e. consoles), no dev will code stuff like this only for the PC, because it just doesn't pay. So until we're there, there are only two possibilities:

1. Everyone does nothing. We don't get nice effects.
2. Someone adds advanced effects to games.

I prefer 2. I don't care if it is AMD, Nvidia or Santa Claus; it IS a plus. PhysX neither accelerates nor slows down the adoption of GPU-based physics, so it has only one single positive effect: effects vs. no effects. I don't get why people are so upset about that.
I guess the real question is: does CPU physics really run faster and better than on a GPU? And what's wrong with physics by Havok? I think it's great.
 
I guess the real question is: does CPU physics really run faster and better than on a GPU? And what's wrong with physics by Havok? I think it's great.

When I see how an overclocked high-end CPU struggles with only a few thousand soldiers in games like Shogun 2, I doubt that it can handle a couple hundred thousand particles for water/gas simulation. Havok is okay, but only for the "normal" stuff. If you really want to see advanced particle effects, a GPU has much more horsepower.

I'm thinking a bit ahead here. I like what they have done with the fog in Batman or the water in Cryostasis, but that is only the tip of the iceberg. I want much, much more. Have you seen the particle demos in UE4? Havok can't even come close to doing that.
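For what it's worth, the reason GPUs pull ahead here is that a particle update is embarrassingly parallel: every particle can be integrated without looking at any other. A minimal sketch of one such frame step (plain Python, purely illustrative; the gravity-only model, function name, and particle counts are my own simplification, not any engine's actual code):

```python
# Sketch of a per-frame particle integration step (semi-implicit Euler).
# Each particle is updated independently of every other one; a GPU runs
# one thread per particle across thousands of cores, while a CPU grinds
# through the same loop a few cores at a time.

GRAVITY = -9.81   # m/s^2, acting on the y-axis
DT = 1.0 / 60.0   # one 60 Hz frame

def step(particles):
    """Advance every particle one frame. `particles` is a list of dicts
    with 'pos' and 'vel' entries, each an [x, y, z] list."""
    for p in particles:                  # on a GPU: one thread per particle
        p["vel"][1] += GRAVITY * DT      # integrate velocity
        for axis in range(3):
            p["pos"][axis] += p["vel"][axis] * DT  # integrate position
    return particles

# 100k particles of the kind discussed above, all starting at rest
particles = [{"pos": [0.0, 10.0, 0.0], "vel": [0.0, 0.0, 0.0]}
             for _ in range(100_000)]
step(particles)
```

The inner update touches only one particle's own state, which is exactly the data-parallel shape GPU compute is built for; collisions and fluid forces add neighbor lookups, but the per-particle structure stays the same.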
 
It's more facts than 100 fanboy opinions could legitimately offer as tangible and truthful to the reality of things as they stand today. What, and how, are you going to prove that PhysX is not BOLLOCKS, as you stand there and cry blasphemy against precious Nvidia? LOL.

I don't need to prove anything at all. There are plenty of people enjoying PhysX right now. And calm down, will you? The only person here acting precious is the one with most of the posts (Mr Freeman).
Semi-Accurate has more holes in its articles than a kitchen sieve, and CharLIE's hatred of NV is famous, FCS!
 
When I see how an overclocked high-end CPU struggles with only a few thousand soldiers in games like Shogun 2, I doubt that it can handle a couple hundred thousand particles for water/gas simulation. Havok is okay, but only for the "normal" stuff. If you really want to see advanced particle effects, a GPU has much more horsepower.

I'm thinking a bit ahead here. I like what they have done with the fog in Batman or the water in Cryostasis, but that is only the tip of the iceberg. I want much, much more. Have you seen the particle demos in UE4? Havok can't even come close to doing that.

The GPU particles were impressive.
 
The GPU particles were impressive.
Another problem is that physics does not mean particle/graphical effects. Physics is how everything relates to everything else in a given game scene.

Physics (from Ancient Greek: φύσις physis "nature") is a natural science that involves the study of matter and its motion through spacetime, along with related concepts such as energy and force.


PhysX: I don't know what the heck that is; seems like some sort of marketing ploy to me.
https://en.wikipedia.org/wiki/Physics#cite_note-4



 
The only thing I want to add is that this is false. The code worked on AMD, and someone smart shared a little trick before the GOTY edition came out:

If you used RadeonPro tools, you could spoof an Nvidia ID, and magically the AA worked like a charm.

But it's untested: if it breaks gameplay you're SOL, and if it has bad IQ you're also SOL. Zero support.

That's the thing.
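For context, a vendor lockout like this usually boils down to a simple ID check: the feature is keyed off the vendor ID the driver reports, not off what the hardware can actually do, which is why spoofing the reported ID flips it on. A toy illustration (the function and gating logic are my invention, not the actual game or driver code; the PCI vendor IDs themselves are real):

```python
# Real PCI vendor IDs; everything else here is a hypothetical sketch.
NVIDIA_VENDOR_ID = 0x10DE  # Nvidia
AMD_VENDOR_ID = 0x1002     # AMD/ATI

def aa_enabled(reported_vendor_id: int) -> bool:
    """Toy vendor gate: enable the AA path only when the driver
    reports an Nvidia vendor ID, regardless of actual capability."""
    return reported_vendor_id == NVIDIA_VENDOR_ID

assert not aa_enabled(AMD_VENDOR_ID)    # stock AMD setup: AA locked out
assert aa_enabled(NVIDIA_VENDOR_ID)     # spoofed ID: AA "magically" works
```

Since the check never exercises the AMD code path, the spoofed configuration is exactly the untested, unsupported state described above.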
 
Another problem is that physics does not mean particle/graphical effects. Physics is how everything relates to everything else in a given game scene.

Physics (from Ancient Greek: φύσις physis "nature") is a natural science that involves the study of matter and its motion through spacetime, along with related concepts such as energy and force.

PhysX: I don't know what the heck that is; seems like some sort of marketing ploy to me.
https://en.wikipedia.org/wiki/Physics#cite_note-4

There are three important areas to me where physics may enhance games, and these are fidelity, realism, and improvements in gameplay. I believe that physics and improved dynamics are the next frontier for immersion moving forward. I could be dead wrong.

I was excited when Nvidia and ATI were trying to bring GPU physics with HavokFX, and I'm still equally excited about GPU physics moving forward, even though it has been slow going.

Look at that Unreal 4 demo and investigate the dynamics and what they did for immersion.
 
There are three important areas to me where physics may enhance games, and these are fidelity, realism, and improvements in gameplay. I believe that physics and improved dynamics are the next frontier for immersion moving forward. I could be dead wrong.

I was excited when Nvidia and ATI were trying to bring GPU physics with HavokFX, and I'm still equally excited about GPU physics moving forward, even though it has been slow going.

Look at that Unreal 4 demo and investigate the dynamics and what they did for immersion.

I agree. Some people see it as useless eye candy; I see it as a new level of interaction with the art.
 
There are three important areas to me where physics may enhance games, and these are fidelity, realism, and improvements in gameplay. I believe that physics and improved dynamics are the next frontier for immersion moving forward. I could be dead wrong.

I was excited when Nvidia and ATI were trying to bring GPU physics with HavokFX, and I'm still equally excited about GPU physics moving forward, even though it has been slow going.

Look at that Unreal 4 demo and investigate the dynamics and what they did for immersion.
I was all for PhysX by Ageia, and I was seriously considering getting a dedicated PhysX card from Ageia, pending software developer support. Then Nvidia bought out Ageia and I was even more pumped. Some years later, I am now depressed by what a scam PhysX has become. I mean, I relegated my old GTX 275 to PhysX processing alone for Metro 2033, and I witnessed absolutely no increase or benefit in anything. I could upload a video, if you like, that would prove that PhysX, at least in Metro 2033, is a scam and does nothing.
 
Rage was a flop on Nvidia cards and drivers as well, man. It's the game that's a mess, not Nvidia or AMD, and my GTX 5xx is even more of a mess than my previous 2xx card in Rage, with the same driver (301.42) for both cards: more texture pop-ins and other bugs with the 5xx series card vs. the 2xx card from Nvidia.

Maybe you never played it, because on day 1 it worked on my GTX 295. No issues at all.


I was all for PhysX by Ageia, and I was seriously considering getting a dedicated PhysX card from Ageia, pending software developer support. Then Nvidia bought out Ageia and I was even more pumped. Some years later, I am now depressed by what a scam PhysX has become. I mean, I relegated my old GTX 275 to PhysX processing alone for Metro 2033, and I witnessed absolutely no increase or benefit in anything. I could upload a video, if you like, that would prove that PhysX, at least in Metro 2033, is a scam and does nothing.

Just like your "proof" of poor IQ on Nvidia cards?
 
Maybe you never played it, because on day 1 it worked on my GTX 295. No issues at all.

Just like your "proof" of poor IQ on Nvidia cards?
2xx cards had lesser IQ than 4xx cards and above. And why and how do I need to prove to you that I own Rage? LOL, who really cares, bud. Rage is a mess and a flop, and it is well known and documented that it was John Carmack's fail game. I hope Doom 4 is better, 'cause MegaTextures suck.
 
2xx cards had lesser IQ than 4xx cards and above. And why and how do I need to prove to you that I own Rage? LOL, who really cares, bud. Rage is a mess and a flop, and it is well known and documented that it was John Carmack's fail game. I hope Doom 4 is better, 'cause MegaTextures suck.

I agree with you that Rage sucks, and MegaTextures are awful (except for consoles), but for statements regarding IQ you'll want some type of citation or evidence to back them up. Honestly, AMD and Nvidia both look the same to me, with AMD having easier-to-enable SGSSAA (not everyone knows about NV Inspector for NV cards).

That said, Rage does suck; it's funny how a game can be beautiful and ugly at the same time. The textures are absolute crap up close, but at a long view distance it doesn't look bad. And this is more definitive proof that PC gaming isn't better than it was in the year 2000. Thanks to these consoles we have abortions like Rage which don't shine at all on the PC, further proof that PC gaming was at its peak in the year 2000. I'm glad you gave me the opportunity to point that out.
 
2xx cards had lesser IQ than 4xx cards and above. And why and how do I need to prove to you that I own Rage? LOL, who really cares, bud. Rage is a mess and a flop, and it is well known and documented that it was John Carmack's fail game. I hope Doom 4 is better, 'cause MegaTextures suck.
This is probably a dumb question, but what does the game Rage have to do with Nvidia IQ being poor? Not being funny; I just haven't heard about that.
 
2xx cards had lesser IQ than 4xx cards and above. And why and how do I need to prove to you that I own Rage? LOL, who really cares, bud. Rage is a mess and a flop, and it is well known and documented that it was John Carmack's fail game. I hope Doom 4 is better, 'cause MegaTextures suck.

You are totally avoiding the issue. The issue is: AMD cards took months to get the game working without user-developed hacks. Nvidia cards were playable immediately.
 
You are totally avoiding the issue. The issue is: AMD cards took months to get the game working without user-developed hacks. Nvidia cards were playable immediately.

The way I see it, AMD did their users a service by causing some of their userbase NOT to purchase Rage 😀

I definitely regret wasting a single cent on that game D:
 
The game's quality is not the issue. It was a big release (people didn't know it was bad), so not having things working was no good.

The shooting was quality; the textures and the lack of story (id never made good stories) weren't.

Not being funny, but how does the bugginess of Rage have anything to do with my ownership of it? GET OFF MY BACK.

You said Nvidia cards had issues at release. I challenged you on that because I owned a GTX 295 at that time and it worked flawlessly. AMD cards had missing textures, flashing textures and lots of stutter. I know because when I switched to a 6950 I tried it again and noticed those issues.

So if you think Nvidia cards had issues running the game, you don't know what you're talking about. SLI didn't work, but it didn't need SLI anyway.

This all goes back to drivers and support from the driver team on new titles.
 