Cryostasis Retail.. Abysmal performance & poor graphics

Page 2 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Sureshot324

Diamond Member
Feb 4, 2003
3,370
0
71
Good to hear this game sucks, since Nvidia was using it as a major selling point of PhysX. I want PhysX to fail, because it would not be a good thing if Nvidia were in control of the de facto physics standard.
 

mwmorph

Diamond Member
Dec 27, 2004
8,877
1
81
Originally posted by: nRollo
Originally posted by: nismotigerwvu
You clearly accused him of joining the board to bash PhysX and tried to start a flame war in a thread that painted a negative view of a title that had potential to showcase nVidia technology.
For the consumer's sake, I hope the performance issues are cleared up with a timely patch.
Rollo, you're doing more harm than good for your cause acting so negative/combative.

You read a whole lot into my two sentences.

I'm looking forward to Cryostasis, I'll try to get an advance copy and either verify or refute this.

My guess is the drivers for the game aren't done or have a bug.

You are just so aggressive about how Nvidia is absolutely amazing that it gets annoying. I know it puts a bad taste in my mouth from time to time. Not to bash, but maybe you should work on your marketing/people skills. It would be nice if you could sound like a marketer, not some angry fanboy.

Being so hard-headed and combative got you in a fair amount of turmoil before your permaban, IIRC.

The forum is a place for public exchange of information. As much as you'd like to see nothing bad ever said about NV, it's not possible.
 

SunnyD

Belgian Waffler
Jan 2, 2001
32,675
146
106
www.neftastic.com
Originally posted by: Sureshot324
Good to hear this game sucks, since Nvidia was using it as a major selling point of PhysX. I want PhysX to fail, because it would not be a good thing if Nvidia were in control of the de facto physics standard.

Instead you'd prefer Intel in command of the de facto physics standard with Havok, right?
 

EvilComputer92

Golden Member
Aug 25, 2004
1,316
0
0
Originally posted by: mwmorph
Originally posted by: nRollo
Originally posted by: nismotigerwvu
You clearly accused him of joining the board to bash PhysX and tried to start a flame war in a thread that painted a negative view of a title that had potential to showcase nVidia technology.
For the consumer's sake, I hope the performance issues are cleared up with a timely patch.
Rollo, you're doing more harm than good for your cause acting so negative/combative.

You read a whole lot into my two sentences.

I'm looking forward to Cryostasis, I'll try to get an advance copy and either verify or refute this.

My guess is the drivers for the game aren't done or have a bug.

You are just so aggressive about how Nvidia is absolutely amazing that it gets annoying. I know it puts a bad taste in my mouth from time to time. Not to bash, but maybe you should work on your marketing/people skills. It would be nice if you could sound like a marketer, not some angry fanboy.

Being so hard-headed and combative got you in a fair amount of turmoil before your permaban, IIRC.

The forum is a place for public exchange of information. As much as you'd like to see nothing bad ever said about NV, it's not possible.

There's a reason he doesn't post anywhere but the video forum.
 

AdamK47

Lifer
Oct 9, 1999
15,820
3,619
136
I've been able to run the tech demo smoothly with these settings. I get these results with the system in my sig. The game looks fantabulous to me based on what I've seen in the tech demo. I can't imagine why some would think it looks bad. We'll see what people with a non-pirate slant think of the game once the US English version hits retail.
 

CP5670

Diamond Member
Jun 24, 2004
5,677
776
126
I haven't tried the tech demo but none of the screenshots in this thread look that great to me, certainly nothing to justify the sort of performance figures people are bringing up. They are nice enough, but not spectacular.
 

Sureshot324

Diamond Member
Feb 4, 2003
3,370
0
71
Originally posted by: SunnyD
Originally posted by: Sureshot324
Good to hear this game sucks, since Nvidia was using it as a major selling point of PhysX. I want PhysX to fail, because it would not be a good thing if Nvidia were in control of the de facto physics standard.

Instead you'd prefer Intel in command of the de facto physics standard with Havok, right?

I would prefer Microsoft controls it actually, since they are neutral as far as hardware goes. Hopefully they'll implement something in a future DirectX.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: AdamK47
I've been able to run the tech demo smoothly with these settings. I get these results with the system in my sig. The game looks fantabulous to me based on what I've seen in the tech demo. I can't imagine why some would think it looks bad. We'll see what people with a non-pirate slant think of the game once the US English version hits retail.
Exactly, which leads me to believe people either aren't running the same config or the tech demo actually has better graphics than the Russian retail version. Seeing that thread on Rage3D makes me think the former. Again, I don't think I own a better-looking game other than Crysis.

Originally posted by: CP5670
I haven't tried the tech demo but none of the screenshots in this thread look that great to me, certainly nothing to justify the sort of performance figures people are bringing up. They are nice enough, but not spectacular.
Well, if you're capable of running in DX10/SM4.0 I'd certainly check the demo out; it's only like 800MB. Many of the dynamic shader/PhysX/lighting effects look much better when animated. The screenshots don't fully do them justice, but I can say for sure the OP's version/settings do not look anything like what I'm seeing on my PC from the tech demo.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Sureshot324
Originally posted by: SunnyD
Originally posted by: Sureshot324
Good to hear this game sucks, since Nvidia was using it as a major selling point of PhysX. I want PhysX to fail, because it would not be a good thing if Nvidia were in control of the de facto physics standard.

Instead you'd prefer Intel in command of the de facto physics standard with Havok, right?

I would prefer Microsoft controls it actually, since they are neutral as far as hardware goes. Hopefully they'll implement something in a future DirectX.
LOL, is that a joke? The only gaming hardware Microsoft really cares about is the Xbox. Anyways, DX11 may potentially bring GPU physics acceleration with compute shader support, but it'll require Vista or W7. Good thing we don't have to wait for that to happen though, since Nvidia is offering PhysX support now.
 

Sureshot324

Diamond Member
Feb 4, 2003
3,370
0
71
Originally posted by: chizow
Originally posted by: Sureshot324
Originally posted by: SunnyD
Originally posted by: Sureshot324
Good to hear this game sucks, since Nvidia was using it as a major selling point of PhysX. I want PhysX to fail, because it would not be a good thing if Nvidia were in control of the de facto physics standard.

Instead you'd prefer Intel in command of the de facto physics standard with Havok, right?

I would prefer Microsoft controls it actually, since they are neutral as far as hardware goes. Hopefully they'll implement something in a future DirectX.
LOL, is that a joke? The only gaming hardware Microsoft really cares about is the Xbox. Anyways, DX11 may potentially bring GPU physics acceleration with compute shader support, but it'll require Vista or W7. Good thing we don't have to wait for that to happen though, since Nvidia is offering PhysX support now.

Games these days are using like 1-1.5 cores at the most and are mostly GPU bound. Why use GPU power for physics when you have all those cores going to waste?
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Sureshot324
Games these days are using like 1-1.5 cores at the most and are mostly GPU bound. Why use GPU power for physics when you have all those cores going to waste?
Because GPGPUs excel in areas where CPUs have traditionally been weak: floating-point operations (FLOPs), i.e. math. It just so happens physics involves a lot of math and floating-point calculations, and it can take advantage of a GPGPU's massively parallel computing abilities. As we've seen, the GPUs have excess compute capacity, so the additional load from PhysX acceleration generally does not adversely impact performance.

If CPUs could manage accelerated physics beyond what's been offered with software/CPU acceleration, we would've seen these effects years ago. But to give you an idea: Intel's recent Core i7 965, the world's fastest desktop processor, manages about 80 GigaFLOPs. Nvidia's GT200 manages 960 GigaFLOPs, and ATI's RV770 is capable of 1.2 TeraFLOPs.

Also, more games are taking advantage of more cores, and high-end multi-GPU solutions absolutely require the fastest CPU solutions to avoid bottlenecking.
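[Editor's note: the "massively parallel" point above can be made concrete with a toy sketch. This is an illustrative example, not code from the thread or from PhysX: a per-particle Euler integration step where every particle's update is independent of every other's, which is exactly the kind of workload that maps one-thread-per-particle onto a GPU's hundreds of cores, while a CPU must grind through it a few elements at a time.]

```python
# Toy physics step (hypothetical example, not PhysX code): explicit Euler
# integration of gravity for a set of independent particles. Because no
# particle reads another particle's state, every loop iteration could run
# as its own GPU thread -- this independence is what makes physics a good
# fit for a GPGPU's parallel FLOP throughput.

def euler_step(positions, velocities, gravity=-9.81, dt=0.016):
    """Advance every particle by one ~60 fps frame.

    positions, velocities: lists of (x, y, z) tuples.
    Returns the new (positions, velocities) lists.
    """
    new_positions = []
    new_velocities = []
    for (x, y, z), (vx, vy, vz) in zip(positions, velocities):
        vy += gravity * dt  # gravity accelerates the particle downward
        new_velocities.append((vx, vy, vz))
        # Move the particle using its updated velocity
        new_positions.append((x + vx * dt, y + vy * dt, z + vz * dt))
    return new_positions, new_velocities

# One frame for two particles starting at rest at different heights:
pos, vel = euler_step([(0.0, 10.0, 0.0), (1.0, 5.0, 0.0)],
                      [(0.0, 0.0, 0.0), (0.0, 0.0, 0.0)])
```

On a GPU (via CUDA or, later, DirectX compute shaders) the loop body would be the kernel and each particle a thread; the serial Python loop here just makes the data independence easy to see.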
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: Sureshot324
Originally posted by: chizow
Originally posted by: Sureshot324
Originally posted by: SunnyD
Originally posted by: Sureshot324
Good to hear this game sucks, since Nvidia was using it as a major selling point of PhysX. I want PhysX to fail, because it would not be a good thing if Nvidia were in control of the de facto physics standard.

Instead you'd prefer Intel in command of the de facto physics standard with Havok, right?

I would prefer Microsoft controls it actually, since they are neutral as far as hardware goes. Hopefully they'll implement something in a future DirectX.
LOL, is that a joke? The only gaming hardware Microsoft really cares about is the Xbox. Anyways, DX11 may potentially bring GPU physics acceleration with compute shader support, but it'll require Vista or W7. Good thing we don't have to wait for that to happen though, since Nvidia is offering PhysX support now.

Games these days are using like 1-1.5 cores at the most and are mostly GPU bound. Why use GPU power for physics when you have all those cores going to waste?

We are solidly in the dual core age. Most games wouldn't play well on even a 4Ghz single core Athlon 64 at this point.
It's going beyond dual core that we're having trouble with.

Though I don't believe single-card PhysX is a tenable solution; the performance hit is too much, even on a GTX 280. Until Nvidia fixes PhysX (either with hardware that handles context switching better or by improving it via software), PhysX is only a reasonable solution for those with dual video cards, even if the second card is lower-end hardware. Not IGPs, though; those are too slow. Though perhaps a dual-core CPU, plus an Nvidia IGP, plus an Nvidia graphics card would be decent? The IGP is slow, but it's at least a dedicated resource.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Originally posted by: idiotekniQues
so nRollo is the local nVidia fanboy?

LOL, where were you 3 years ago? FYI, stay on topic! We don't need another thread getting locked.
 

MTDEW

Diamond Member
Oct 31, 1999
4,284
37
91
Originally posted by: Grooveriding
To answer some questions. There is no option to enable or disable PhysX within the game; you can only disable it via the NV control panel, of course. Yes, this is a TWIMTBP title; it has the brief logo when you load up the game. The AA option in-game merely turns AA on or off, no 2x, 4x, etc. And it seems to only affect certain objects, not everything. I'll try to get some screens of that later tonight.

Performance is pretty dismal imo, considering how poor the visuals of the game are. I have a spare 8800gts 512 lying around; maybe I'll try offloading the PhysX onto it and try some more benching. Regardless, the PhysX effects are pretty underwhelming. But the game is a lot of fun! :)
If you do add the 8800GTS, please post the results.
I'm very interested in how using one dedicated to PhysX alongside a GTX 280 turns out.


 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
Originally posted by: MTDEW
Originally posted by: Grooveriding
To answer some questions. There is no option to enable or disable PhysX within the game; you can only disable it via the NV control panel, of course. Yes, this is a TWIMTBP title; it has the brief logo when you load up the game. The AA option in-game merely turns AA on or off, no 2x, 4x, etc. And it seems to only affect certain objects, not everything. I'll try to get some screens of that later tonight.

Performance is pretty dismal imo, considering how poor the visuals of the game are. I have a spare 8800gts 512 lying around; maybe I'll try offloading the PhysX onto it and try some more benching. Regardless, the PhysX effects are pretty underwhelming. But the game is a lot of fun! :)
If you do add the 8800GTS, please post the results.
I'm very interested in how using one dedicated to PhysX alongside a GTX 280 turns out.


Been really busy with work this week. I'll try to do some more benching tomorrow and provide some more screenies. I'll do some benches with PhysX disabled and some with the 8800gts 512 as a PhysX card.

Have to get it done this weekend; going to send in the 280 for a step-up to a 295, as I don't have an SLI mobo, so another 280 is out for me.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Sorry to revive this old thread, but with Cryostasis recently being released in the US and with a couple of patches, I thought it would be worth mentioning.

With the very recently released beta 186 Nvidia drivers (which also contain the latest PhysX software) and the 1.1 Cryostasis patch (there are several different patch versions depending on how you bought the game), I've observed a very noticeable increase in performance. I'm not using Fraps, but the game went from somewhat fluid at best (my guess is avg. 20-25 fps) to very playable now (avg. 30 fps, guess).

My rig is an e8400 @ 4ghz, GTX 260 core 216 @ 650mhz, 4 gigs 1066 RAM, Win 7 64 bit.
 

1ManArmY

Golden Member
Mar 7, 2003
1,333
0
0
Originally posted by: tviceman
Sorry to revive this old thread, but with Cryostasis recently being released in the US and with a couple of patches, I thought it would be worth mentioning.

With the very recently released beta 186 Nvidia drivers (which also contain the latest PhysX software) and the 1.1 Cryostasis patch (there are several different patch versions depending on how you bought the game), I've observed a very noticeable increase in performance. I'm not using Fraps, but the game went from somewhat fluid at best (my guess is avg. 20-25 fps) to very playable now (avg. 30 fps, guess).

My rig is an e8400 @ 4ghz, GTX 260 core 216 @ 650mhz, 4 gigs 1066 RAM, Win 7 64 bit.

What's the proper way to install the patch? I have it installed on my E drive, and when I attempt to install it, it states: software not found, could not be located on your system. Re-install Cryostasis and then run this patch again.

I didn't even get an option to point it to the appropriate drive. I have the free version from EVGA (a gift from Nvidia via Digital River) and I grabbed the Digital River patch. My current file version is 1.0.0.1. Is this already patched?
 

1ManArmY

Golden Member
Mar 7, 2003
1,333
0
0
Originally posted by: tviceman
I found the digital river patch in a google search. It does not look like your current version of Cryostasis updated with the patch.

http://www.google.com/url?sa=t...caKGfD3wX12x3mHbamswwg

That should be the file you need.

I already have this file, and when I attempt to install it, it states: software not found. I heard reports of people being pissed off that the Digital River Cryostasis was already patched and the retail version of the patch was being withheld. What is the file version of the patched version?
Like I previously stated, in my video options I can enable PhysX hardware and I have a 9th weapon, the "water cannon".
My video options look just like this, minus the 1920x1200 resolution. I play at 1680x1050.
video options
 

coloumb

Diamond Member
Oct 9, 1999
4,069
0
81
Sorry to dig up an old thread, but I found this on sale at Target the other day. ~6 hours later, I finished the game. Craptastic performance on an HD4870 1GB card [I had to play at 800x600 resolution with a modified config file], but a pretty cool storyline [kinda neat how you go into the past to fix the present].

I really wish they would've used an already established graphics engine and physics engine [ie: Source with Havok physics] rather than building the game engine to use a specific hardware platform to showcase the game.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
I too found poor performance; running with PhysX off at 1680x1050 with no AA still slide-showed in some places on my GTX285.

The game was reasonably fun and interesting to play though.