zerocool84
Lifer
- Nov 11, 2004
this coming from someone that wanted to put a $650 5970 with a 5000 X2 at 1440? lol
Round 1
I'm pretty sure most people are aware that Cryostasis got mediocre reviews, hence the "Best Game No One Played" distinction. Based on the reviews I wouldn't have bought it either, but I got it free with an EVGA card, gave it a shot, and was pleasantly surprised.
Don't believe everything you read in reviews. Penumbra, another game somewhat similar to Cryostasis, also got pretty bad reviews, but was received much better by actual gamers.
That being said, the PhysX in Cryostasis still chokes even a single GTX 275, so for now I'm waiting for a card that can handle this game before I continue it.
The biggest reason PhysX doesn't matter to me is that it isn't accelerated by Radeons. AMD has been bringing the better price/performance parts to market over the past 2 years, and I'm always going to buy the best price/performance parts I can find.
If I had a choice of PhysX or no PhysX, the choice is easy. But nVidia's attempt to make PhysX unavailable to anyone but nVidia users is a slap in the face to gamers IMO. I won't be buying nVidia cards or recommending them for the foreseeable future as a result.
Oh, it's funny that games like RF Guerrilla and BC2 have destructible environments, yet pretty much all the destructible stuff had to be removed from the shipping Batman game because it would cause too big of a framerate hit. Sorry, but hardware-level PhysX in its current state is a very inefficient POS.
what? how does that make any sense? in your thread you had mentioned Red Faction Guerrilla, Prototype, and GTA 4 as being some of the games you were playing. I told you that having the fastest card in the world would not make up for having a 5000 X2 in those CPU-intensive games, now didn't I?

This coming from the person who wanted me to buy an i7 in order to play Red Faction: Guerrilla? Your advice in that thread, followed by this comment, was the reason for the 'palm. I was about to edit that in, then realized that you had already replied.
Choosing to not buy from a company because they added a feature (that you believe is good) to their products just doesn't make much sense to me.
what? how does that make any sense? [cut off-topic] Red Faction Guerrilla [...blah blah off-topic blah blah...] those cpu intensive games now didnt I?
Red Faction Guerrilla calculates all its physics on the CPU and is very demanding during destruction, especially when other action is going on too, so having a quad would help in that game as well as the other two you mentioned.
Obviously you still haven't figured that out. But hey, if you want 25-30 fps with unplayable minimum framerates while using a $650 card on your 5000 X2, then knock yourself out. [Oh look, he's making this personal!]
So, I can't tell if you were using extreme sarcasm or not, but at least my argument was kind of realistic. The dev is probably just covering their collective butts. If they are touting PhysX, 3DVision, and DX11, I think it's easier to list cards that would perform all touted features instead of just one out of three. Just my opinion.
It's really more due to the powers at nVidia making decisions that I feel are bad for me as a gamer. I want to be able to enjoy games fully on whichever graphics card suits my wallet.
It sets a bad precedent to have certain games using features that are locked to certain hardware; taken to the extreme, it would mean that gamers would need GPUs from both competitors to ensure full functionality of PC games.
I understand what you are saying but I disagree. To competently run hardware PhysX you need a separate dedicated card. That is silly for the 2 or 3 games that add anything worth mentioning from a visual standpoint. Again, they removed some destructible parts of Batman that were planned because PhysX couldn't even run that worth a crap.

My point here, which you've done an awesome job of missing and/or ignoring, is that physics on the CPU (via Havok, non-accelerated PhysX, etc.) is at least as expensive as PhysX. I'd say "more expensive", but I don't have any evidence to back that up, much the same as how you lack evidence to support the claim that PhysX is "inefficient." The fact that games with a lot of physics done on the CPU need a lot of processing power to run points to the computing cost of physics in general.
What the exact balance should be between what's done on the GPU and what's done on the CPU, I can't say. But ruling out the GPU-accelerated solution because it lowers framerates stands in direct opposition to embracing on-CPU physics, which also lowers framerates.
MS needs to implement Physics in DX and get this issue over with. Once that's done, Nvidia, AMD/ATI, Intel, or someone else can worry about producing hardware that runs it best. All this dickering around is just delaying the next big thing in PC Gaming.

I hate to sound like a broken record here, but physics simulation is middleware, it's not an API construct. The solution won't be from MS, it will be from someone like Havok offering a physics simulation package that runs on DirectCompute/OpenCL.
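To illustrate why physics maps onto a compute API rather than needing a slot in DX itself: each particle or rigid body can be updated independently per step, which is exactly the one-thread-per-element shape a DirectCompute/OpenCL kernel runs. A minimal sketch (plain NumPy standing in for a GPU kernel; no real engine's API is being modeled, and the function name is made up):

```python
import numpy as np

def step_particles(pos, vel, dt=0.016, gravity=-9.81):
    """One integration step for n independent particles (positions and
    velocities are (n, 2) arrays of x/y components).

    Every particle's update depends only on its own state, so the whole
    update is data-parallel -- the workload shape a GPU compute kernel
    would run with one thread per particle.
    """
    pos, vel = pos.copy(), vel.copy()
    vel[:, 1] += gravity * dt        # apply gravity to y-velocity
    pos += vel * dt                  # explicit Euler position update
    # crude ground plane: clamp y >= 0 and zero out downward velocity
    below = pos[:, 1] < 0.0
    pos[below, 1] = 0.0
    vel[below, 1] = 0.0
    return pos, vel

# 4 particles at rest, 1 m above the ground
pos = np.zeros((4, 2)); pos[:, 1] = 1.0
vel = np.zeros((4, 2))
for _ in range(200):                 # ~3.2 simulated seconds
    pos, vel = step_particles(pos, vel)
# by now every particle has fallen and settled on the ground plane
```

The point of the sketch is the argument above: nothing here needs to live inside a graphics API; a middleware vendor can ship this same loop as a compute kernel for whatever hardware is present.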
Yes, that argument is perfectly valid. A common library for GPU-accelerated physics would certainly be great, as it would allow developers to put more time into developing effects that would work on everyone's system.
If havok FX happens, then it would definitely be nice to see it implemented as a replacement for physX. Unfortunately, it's widely thought to have been cancelled. All we've got right now for physics on GPUs is physX, and Green paid for it fair and square.
Also, having a quad has other benefits for everyday users and gamers, where PhysX doesn't.
Good points.
Why would it be any better if Intel controlled the dominant physics API, as opposed to NVIDIA?
...it doesn't seem like this would help AMD/ATI get a fair shake at this one bit.
Yea, it got awesome reviews.
http://www.metacritic.com/games/platforms/pc/cryostasissleepofreason?q=%20Cryostasis
And it was you that brought up Cryostasis, not me.
When someone makes a good game that uses PhysX in a way that affects gameplay, that's great, but they haven't, and it doesn't look like they will any time soon. Games like Battlefield: Bad Company 2 use a physics API that everyone can use, and it actually affects gameplay.
According to wikipedia, "The company was developing a specialized version of Havok Physics called Havok FX that made use of ATI and NVIDIA GPUs for physics simulations,[4] but may have been cancelled."
In this form, FX would have been great news for ATI.
4. Anton Shilov (2005). "Havok Intros Havok FX Engine to Compute Physics Effects on GPUs". Xbit Laboratories. http://www.xbitlabs.com/news/multimedia/display/20051028224421.html. Retrieved 2008-11-28.
It's been over 4 years since that article, which was written two years before the acquisition of Havok by Intel... doesn't look like that's gonna happen.
Yup. As long as Intel GMA and laughabee exist, I doubt that Intel will be doing any favors for ATI and nV.
