Nvidia Fermi is recommended for Metro 2033 game

Page 4 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

SmCaudata

Senior member
Oct 8, 2006
969
1,532
136
System specs are NOT an appropriate place for advertising, and that is obviously what is going on here. As someone who has only ever owned one ATI card, I am certainly not an ATI fanboy.

System specs should read as:

Minimum:
AMD CPU:
Intel CPU:
ATI Video Card:
Nvidia Video Card:

Recommended:
AMD CPU:
Intel CPU:
ATI Video Card:
**Nvidia Video Card:


**Nvidia video cards are required to run hardware-accelerated PhysX effects.


How hard is that? It covers all the bases and leaves no one confused or feeling lied to.

Edit: The other option is that perhaps they feel the Nvidia folks are the ones that need prompting because they got suckered into paying more for a card with less performance. :p
 
Last edited:

CP5670

Diamond Member
Jun 24, 2004
5,697
797
126
I'm pretty sure most people are aware that Cryostasis got pretty mediocre reviews, hence the "Best Game No One Played" distinction. Based on the reviews I wouldn't have bought it either, but I got it free with an EVGA card, gave it a shot, and was pleasantly surprised.

Don't believe everything you read in reviews. Penumbra, another game somewhat similar to Cryostasis, got pretty bad reviews but was received much better by actual gamers.

That being said, the PhysX in Cryostasis still chokes even a single GTX 275, so my current status with this game is that I'm waiting for a card that can handle it before I continue.

I agree, Cryostasis is a pretty unique game despite its flaws (I wrote some comments on it here). The PhysX in it would probably be unusable on a single GPU though. It runs about the same as Crysis even without PhysX.
 

Blue Shift

Senior member
Feb 13, 2010
272
0
76
The biggest reason why PhysX doesn't matter to me is that it isn't accelerated by Radeons. AMD has been bringing the better price/performance parts to market over the past two years, and I'm always going to buy the best price/performance parts I can find.

If I had a choice of PhysX or no PhysX, the choice is easy. But nVidia's attempt to make PhysX unavailable to anyone but nVidia users is a slap in the face to gamers, IMO. I won't be buying nVidia cards or recommending them for the foreseeable future as a result.

Like it or not, Ageia was bought by the Big Green, so PhysX is theirs now. I'd like a common standard for GPU-accelerated physics as much as the next guy (which would make optimizing for it seem more worthwhile to developers), but this is what we've got: one company has hardware-accelerated physics effects, and the other doesn't.

The proprietary PhysX SDK was initially hardware-accelerated only by Ageia's Physics Processing Units (PPUs). After nVidia bought Ageia, they added that capability to their CUDA-based hardware.

Choosing not to buy from a company because they added a feature (one you believe is good) to their products just doesn't make much sense to me.




This coming from someone who wanted to put a $650 5970 with a 5000 X2 at 1440? lol
Oh, it's funny that games like RF Guerrilla and BC2 have destructible environments, yet pretty much any destructible stuff had to be removed from the shipping Batman game because it would cause too big of a framerate hit. Sorry, but hardware-level PhysX in its current state is a very inefficient POS.

This coming from the person who wanted me to buy an i7 in order to play Red Faction: Guerrilla? Your advice in that thread, followed by this comment, was the reason for the 'palm. I was about to edit that in, then realized that you had already replied.
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
This coming from the person who wanted me to buy an i7 in order to play Red Faction: Guerrilla? Your advice in that thread, followed by this comment, was the reason for the 'palm. I was about to edit that in, then realized that you had already replied.
What? How does that make any sense? In your thread you mentioned Red Faction Guerrilla, Prototype, and GTA 4 as some of the games you were playing. I told you that having the fastest card in the world would not make up for having a 5000 X2 in those CPU-intensive games, now didn't I?

Red Faction Guerrilla calculates all its physics on the CPU and is very demanding during destruction, especially when other action is going on too, so having a quad would help in that game as well as the other two you mentioned. Overall, getting something like a 5850/5870 and an i5/i7 would make much more sense than a 5970 and a 5000 X2.

Obviously you still haven't figured that out. But hey, if you want 25-30 fps with unplayable minimum framerates while using a $650 video card on your 5000 X2, then knock yourself out.
 
Last edited:

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
Choosing to not buy from a company because they added a feature (that you believe is good) to their products just doesn't make much sense to me.


It's really more due to the powers that be at nVidia making decisions that I feel are bad for me as a gamer. I want to be able to enjoy games fully on whichever graphics card suits my wallet.

It sets a bad precedent to have certain games using features that are locked to certain hardware; taken to the extreme, it would mean that gamers need GPUs from both competitors to ensure full functionality of PC games.
 
Last edited:

Blue Shift

Senior member
Feb 13, 2010
272
0
76
Oh, it's funny that games like RF Guerrilla and BC2 have destructible environments, yet pretty much any destructible stuff had to be removed from the shipping Batman game because it would cause too big of a framerate hit. Sorry, but hardware-level PhysX in its current state is a very inefficient POS.

What? How does that make any sense? [cut off-topic] Red Faction Guerrilla [...blah blah off-topic blah blah...] those CPU-intensive games, now didn't I?

Red Faction Guerrilla calculates all its physics on the CPU and is very demanding during destruction, especially when other action is going on too, so having a quad would help in that game as well as the other two you mentioned.

Obviously you still haven't figured that out. But hey, if you want 25-30 fps with unplayable minimum framerates while using a $650 card on your 5000 X2, then knock yourself out. [Oh look, he's making this personal!]


My point here, which you've done an awesome job of missing and/or ignoring, is that physics on the CPU (via Havok, non-accelerated PhysX, etc.) is at least as expensive as PhysX. I'd say "more expensive," but I don't have any evidence to back that up, much the same as how you lack evidence for the claim that PhysX is "inefficient." The fact that games with a lot of CPU-side physics need a lot of processing power to run points to the computing cost of physics in general.

What the exact balance between the GPU and the CPU should be, I can't say. But ruling out the GPU-accelerated solution because it lowers framerates stands in direct opposition to embracing CPU-side physics, which also lowers framerates.
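The cost argument above can be made concrete with a toy example. The sketch below is illustrative only (the names, constants, and particle counts are mine, not from any real engine or the thread): it steps a batch of debris particles with semi-implicit Euler integration on the CPU. Every added particle costs the same fixed amount of arithmetic per frame, which is exactly the kind of embarrassingly parallel work that maps naturally onto a GPU, where each particle can be handled by its own thread.

```python
# Hypothetical sketch: naive CPU debris simulation, one update per particle
# per frame, to illustrate why per-particle physics gets expensive at scale.

GRAVITY = -9.81        # m/s^2
DT = 1.0 / 60.0        # one frame at 60 fps

def step_particles(particles):
    """Advance every particle one frame with semi-implicit Euler integration."""
    for p in particles:
        p["vy"] += GRAVITY * DT      # apply gravity to vertical velocity
        p["x"] += p["vx"] * DT       # integrate position from velocity
        p["y"] += p["vy"] * DT
        if p["y"] < 0.0:             # crude ground collision: bounce
            p["y"] = 0.0
            p["vy"] = -0.5 * p["vy"] # lose half the speed on impact
    return particles

# 1000 pieces of debris falling from 10 m with a small sideways drift.
debris = [{"x": 0.0, "y": 10.0, "vx": 1.0, "vy": 0.0} for _ in range(1000)]
for _ in range(60):                  # simulate one second of game time
    step_particles(debris)
```

The per-frame cost scales linearly with the particle count, so a scene with tens of thousands of PhysX-style fragments multiplies this inner loop far past what a CPU core can finish in 16 ms.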
 

*kjm

Platinum Member
Oct 11, 1999
2,222
6
81
So, I can't tell if you were being extremely sarcastic or not, but at least my argument was somewhat realistic. The dev is probably just covering their collective butts: if they are touting PhysX, 3D Vision, and DX11, I think it's easier to list cards that perform all of the touted features instead of just one out of three. Just my opinion.

Keys, just FYI, ATI can do 3D with Bit Cauldron glasses.
http://www.youtube.com/watch?v=1DU4u5Z133k

Also, if Nvidia wants PhysX to be the standard, why not make it open? Not listing ATI GPUs or AMD CPUs is just plain BS.

If this war starts and we get nothing but closed standards, we will pay for it, either by paying big bucks for the winner's cards/CPUs or by having to get a console :(
 

Blue Shift

Senior member
Feb 13, 2010
272
0
76
It's really more due to the powers at nVidia making decisions that I feel is bad for me as a gamer. I want to be able to enjoy games fully on whichever graphics card suits my wallet.

It sets a bad precedent to have certain games using features that are locked to a certain hardware, taken to the extreme it would mean that gamers would need GPU's from both competitors to ensure full functionality of PC games.

Yes, that argument is perfectly valid. A common library for GPU-accelerated physics would certainly be great, as it would allow developers to put more time into developing effects that work on everyone's system.

If Havok FX happens, it would definitely be nice to see it implemented as a replacement for PhysX. Unfortunately, it's widely thought to have been cancelled. All we've got right now for physics on GPUs is PhysX, and the Green team paid for it fair and square.
 

sandorski

No Lifer
Oct 10, 1999
70,864
6,396
126
MS needs to implement Physics in DX and get this issue over with. Once that's done, Nvidia, AMD/ATI, Intel, or someone else can worry about producing Hardware that runs it best.

All this dickering around is just delaying the next big thing in PC Gaming.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
My point here, which you've done an awesome job of missing and/or ignoring, is that Physics on the CPU (via Havok, non-accelerated physX, etc) is at least as expensive as PhysX. I'd say "more expensive", but I don't have any evidence to back that up, much the same as how you lack evidence to support the claim that PhysX is "inefficient." The fact that games with a lot of physics done on the cpu need a lot of processing power to run points to the computing cost of physics in general.

What the exact balance is between what should be done on the GPU and on the CPU is, I can't say. But ruling out the gpu-accelerated solution because it lowers framerates stands in direct opposition to embracing on-cpu physics, which also lowers framerates.
I understand what you are saying, but I disagree. To competently run hardware PhysX you need a separate dedicated card, which is silly for the two or three games that add anything worth mentioning from a visual standpoint. Again, they removed some planned destructible parts of Batman because PhysX couldn't even run them worth a crap.

Now, on the other hand, you have a game like Red Faction Guerrilla or Bad Company 2 whose very gameplay is shaped by all the destructible environments. Since most people with a good GPU also have a fast dual core or quad core, they can get a nice experience with the settings cranked. Also, having a quad has other benefits for everyday users and gamers, where PhysX doesn't.

Just to be clear: when I see games like Red Faction Guerrilla or Bad Company 2, I'm amazed that all of that destruction, as well as other things, is being done pretty nicely on the CPU. Yet PhysX would not even be playable on the CPU, with effects that are nowhere near as gameplay-influencing. Heck, even running the current full PhysX effects on a very strong GPU annihilates the framerate for very little eye candy and no gameplay influence at all.
 
Last edited:

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
MS needs to implement Physics in DX and get this issue over with. Once that's done, Nvidia, AMD/ATI, Intel, or someone else can worry about producing Hardware that runs it best.

All this dickering around is just delaying the next big thing in PC Gaming.
I hate to sound like a broken record here, but physics simulation is middleware, it's not an API construct. The solution won't be from MS, it will be from someone like Havok offering a physics simulation package that runs on DirectCompute/OpenCL.
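The middleware point above can be sketched in code. The names below are purely hypothetical, not from Havok or any real SDK: the physics package is written against an abstract compute backend, so the same game-facing API could dispatch to a CPU loop today and to a DirectCompute/OpenCL kernel tomorrow, without the game changing at all.

```python
# Hypothetical sketch: physics as middleware layered on a vendor-neutral
# compute backend, rather than a feature of the graphics API itself.

class CpuBackend:
    """Stand-in for whatever compute layer the middleware targets."""
    def axpy(self, a, xs, ys):
        # y <- a*x + y: the kind of data-parallel primitive a GPU kernel
        # (DirectCompute/OpenCL) would run one element per thread.
        return [a * x + y for x, y in zip(xs, ys)]

class PhysicsMiddleware:
    """The package games link against; it never talks to hardware directly."""
    def __init__(self, backend):
        self.backend = backend  # swap in a GPU backend; game code is unchanged

    def integrate(self, positions, velocities, dt):
        # new_position = position + velocity * dt, delegated to the backend
        return self.backend.axpy(dt, velocities, positions)

engine = PhysicsMiddleware(CpuBackend())
new_pos = engine.integrate([0.0, 0.0], [1.0, 2.0], 0.5)
```

The design choice is the whole argument: because the hardware dependency lives behind the backend interface, any vendor that implements the compute API gets acceleration for free, which is why the fix would come from a middleware vendor rather than from Microsoft.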
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
Yes, that argument is perfectly valid. A common library for GPU-accelerated physics would certainly be great, as it would allow developers to put more time into developing effects that work on everyone's system.

If Havok FX happens, it would definitely be nice to see it implemented as a replacement for PhysX. Unfortunately, it's widely thought to have been cancelled. All we've got right now for physics on GPUs is PhysX, and the Green team paid for it fair and square.

Good points.
 

konakona

Diamond Member
May 6, 2004
6,285
1
0
also having a quad has other benefits for everyday users and gamers where physx doesnt.

I remember that iXBT article showing that quads had much less stutter, independent of the recorded fps. That alone should make quads worthwhile for gaming.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Yes, that argument is perfectly valid. A common library for GPU-accelerated physics would certainly be great, as it would allow developers to put more time into developing effects that work on everyone's system.

If Havok FX happens, it would definitely be nice to see it implemented as a replacement for PhysX. Unfortunately, it's widely thought to have been cancelled. All we've got right now for physics on GPUs is PhysX, and the Green team paid for it fair and square.
Good points.

Why would it be any better if Intel controlled the dominant physics API, as opposed to NVIDIA?

...it doesn't seem like this would help AMD/ATI get a fair shake at this one bit.
 

sandorski

No Lifer
Oct 10, 1999
70,864
6,396
126
I hate to sound like a broken record here, but physics simulation is middleware, it's not an API construct. The solution won't be from MS, it will be from someone like Havok offering a physics simulation package that runs on DirectCompute/OpenCL.

Whoever it is, I don't care, just as long as it's not controlled by a single party with a conflicting interest, so that all have equal footing in its implementation in the real world.
 

Phil1977

Senior member
Dec 8, 2009
228
0
0
I had a 9600GT before, and I played Mirror's Edge. The effects were nice, but seriously nothing that couldn't be done on the CPU...

Remember Far Cry and the "64-bit" patch? LOL, it was just a texture pack and some other mods, nothing to do with 64-bit. It was to promote AMD's Athlon 64 CPUs.

Someone mentioned Bad Company 2, and boy are the physics amazing. Same with Crysis, but BC2 takes it to the next level. You can blow a whole building up, with massive bits flying around, and all that on a 32-player server.


The first time I saw BC2 I was impressed. It really takes physics to the next level. When I played Mirror's Edge, all I was thinking was "where the hell are all these physics effects? Surely it can't just be the chopper blowing some bits in my face and glass shattering..." Glass shattering has been around for ages...

PhysX is a great marketing tool, however. It's a "nice" feature to have and likely a reason for many to buy an Nvidia card. Even if the effects are minor and don't offer better gameplay, I'd rather have them than not; I think that's a given. I am very surprised about the performance impact, though, especially considering that multi-core CPUs are so underused in games these days.

What I have seen from PhysX is very micro-oriented: small bits, leaves here, dust there, smoke over there. But nothing that is really gameplay-critical. PhysX will never offer gameplay-critical effects, because developers would be stupid to make a game that needs PhysX.

So what they do is just add a bit of eye candy and that's it.

And that's why PhysX has no future. It's not a technology you can build your game engine around to make a better game; it's a technology you run on top of your engine to add some eye candy.
 
Last edited:

Blue Shift

Senior member
Feb 13, 2010
272
0
76
Why would it be any better if Intel controlled the dominant physics API, as opposed to NVIDIA?

...it doesn't seem like this would help AMD/ATI get a fair shake at this one bit.

According to wikipedia, "The company was developing a specialized version of Havok Physics called Havok FX that made use of ATI and NVIDIA GPUs for physics simulations,[4] but may have been cancelled."

In this form, FX would have been great news for ATI.
 

Phil1977

Senior member
Dec 8, 2009
228
0
0
I believe physics should be done on the CPU...

Quad cores are quite mainstream now, and we'll soon have six cores and more... I really believe this is the way to go...
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Yea it got awesome reviews.

http://www.metacritic.com/games/platforms/pc/cryostasissleepofreason?q=%20Cryostasis

And it was you who brought up Cryostasis, not me.

When someone makes a good game that uses PhysX in a way that affects gameplay, that's great, but they haven't, and it doesn't look like they will any time soon. Games like Battlefield: Bad Company 2 use a physics API that everyone can use, and it actually affects gameplay.

Right, Metacritic. So you listen to industry-backed reviews, like for Modern Warfare 2, instead of looking at peer reviews?

Cryostasis Metacritic: 6.9
User reviews: 7.8

Modern Warfare 2 Metacritic: 8.5
User reviews: 3.3

Far Cry 2 Metacritic: 8.5
User reviews: 5.5

Bioshock Metacritic: 96
User reviews: 8.0


See the trend? Notice how all the massively huge games were either hated by actual gamers or just not liked as much as your industry reviewers liked them? Notice how Cryostasis's user scores are noticeably higher than its critic scores? You're trusting Metacritic industry reviews when the industry itself is largely influenced, or straight bought out, by hype, advertisements, and sometimes money. Want me to keep going?

Another smaller, lesser-known game:
Penumbra Overture Metacritic: 7.3
User reviews: 8.5

Penumbra Black Plague Metacritic: 7.8
User reviews: 9.0


Smaller, lesser-known games don't score as well with critics because reviewers aren't under pressure to keep good relations with their publishers and developers.

So once again: you've never played Cryostasis, not even the free demo, and yet you are openly talking about how horrible it is. And then you base your opinion on "industry" reviews, which are almost always slanted and unfair to begin with.

Nice! You're good at this!
 
Last edited:

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
According to wikipedia, "The company was developing a specialized version of Havok Physics called Havok FX that made use of ATI and NVIDIA GPUs for physics simulations,[4] but may have been cancelled."

In this form, FX would have been great news for ATI.


It's been over 4 years since that article, which was written two years before the acquisition of Havok by Intel... doesn't look like that's gonna happen.
 

Blue Shift

Senior member
Feb 13, 2010
272
0
76
It's been over 4 years since that article, which was written two years before the acquisition of Havok by Intel... doesn't look like that's gonna happen.

Yup. As long as Intel GMA and Laughabee exist, I doubt that Intel will be doing any favors for ATI and nV.
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
Well, now I'm interested in Penumbra.

Physics is great; I loved Red Faction Guerrilla for the destructible environments (recommended: a bottle of Excedrin for dealing with the game's issues at launch).

Ya, so I'ma try Penumbra, and physics is cool.