* Normal Map Compression. Requirements: NVIDIA: GeForce FX family or better; ATI: X800 card or better. This feature is disabled by default. To enable it, type r_TexNormalMapCompressed 1 in the console after loading a level. Enabling this feature during the game may take some time - the PC may appear to freeze. This variable is not saved when the game is restarted. The first run through a level after enabling normal map compression will be slower, because the initial compression happens in real time as you move through the level. Subsequent reloads of the same level perform better, so we recommend running any benchmark twice and using the second of the two runs, since it most closely represents the usual user experience. (A sketch of what this compression stores follows this list.)
* SM 3.0 and SM 2.0x are now enabled by default when graphics settings are set to "Very High". To see performance increases you must have DirectX 9.0c installed. (A capability check for SM 3.0 is sketched after this list.)
* Anisotropic filtering disabled for some textures (light maps, several lookup textures, fall-off maps) for increased performance. (See the per-texture filtering sketch after this list.)
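For readers wondering what the normal map compression actually stores: two-channel schemes along the lines of ATI's 3Dc keep only the X and Y of each tangent-space normal and rebuild Z in the shader. A minimal C++ sketch of the packing side - illustrative only, not Crytek's code, and the block-compression step itself is omitted:

#include <cstdint>

// Remap a normal component from [-1, 1] into an 8-bit channel.
// Hypothetical helper for illustration; a real 3Dc encoder then
// block-compresses the two channels, and Z is rebuilt at sample time.
uint8_t packComponent(float v) {
    float clamped = v < -1.0f ? -1.0f : (v > 1.0f ? 1.0f : v);
    return (uint8_t)((clamped * 0.5f + 0.5f) * 255.0f + 0.5f);
}

// Per texel: keep x and y, drop z entirely.
void packNormal(const float n[3], uint8_t out[2]) {
    out[0] = packComponent(n[0]);  // X channel
    out[1] = packComponent(n[1]);  // Y channel
}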
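On the SM 3.0 point: under DirectX 9.0c an application can verify shader-model support through the device caps. A minimal sketch using standard D3D9 calls - not Far Cry's actual startup code:

#include <d3d9.h>

// Returns true if the default adapter exposes both VS 3.0 and PS 3.0.
bool SupportsShaderModel3(IDirect3D9* d3d) {
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;
    return caps.VertexShaderVersion >= D3DVS_VERSION(3, 0) &&
           caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0);
}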
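The anisotropic-filtering change presumably just means the renderer now picks the filter per texture class instead of globally. A guess at the mechanism, in plain D3D9 terms:

#include <d3d9.h>

// Light maps and lookup textures gain little from anisotropy, so fall
// back to linear filtering there and save the bandwidth.
void SetTextureFiltering(IDirect3DDevice9* dev, DWORD stage, bool isLookupTexture) {
    dev->SetSamplerState(stage, D3DSAMP_MINFILTER,
                         isLookupTexture ? D3DTEXF_LINEAR : D3DTEXF_ANISOTROPIC);
    dev->SetSamplerState(stage, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    dev->SetSamplerState(stage, D3DSAMP_MAXANISOTROPY, 8);  // ignored when linear
}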
Originally posted by: Gamingphreek
Anyone with a GeForce 6 card... any gains noticed?
Does anyone know when a native SM 3.0 game is slated to come out?
Nice find, Skywalker.
-Kevin
Originally posted by: Brian48
Wow! First the Red Sox finally beat the Yankees, and now Crytek finally releases the v1.3 patch. Hell has indeed frozen over :Q
Originally posted by: jiffylube1024
BAH! HDR doesn't work on X800 cards, only 6800 cards, even though we've seen these kinds of effects on ATI hardware before. What utter nonsense!
F**king hell. Do these developers really need to pick sides when doing stuff like this? I can put up with it if they make it perform better on one vendor's hardware or another's, but to hold back features like this altogether!? A pox on them!
Originally posted by: otispunkmeyer
seems you have to enable normal map compression yourself... does it improve performance or image quality?
Originally posted by: Marsumane
Originally posted by: jiffylube1024
BAH! HDR doesn't work on X800 cards, only 6800 cards, even though we've seen these kinds of effects on ATI hardware before. What utter nonsense!
F**king hell. Do these developers really need to pick sides when doing stuff like this? I can put up with it if they make it perform better on one vendor's hardware or another's, but to hold back features like this altogether!? A pox on them!
It probably was impractical to run it on ATI's hardware. It can run, but it would require more passes to do the same thing compared to NV's 6xxx architecture. I also think they should have enabled it, but maybe they had a good reason not to, is what I'm saying. Or maybe they're working on it for the "next" patch, which we all know will come out within '04.
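On the "more passes" point: the 6800 can alpha-blend directly into a floating-point render target, while the X800 cannot, so HDR accumulation on R4xx hardware would have to ping-pong between two FP targets and do the add in the pixel shader. A rough sketch of that workaround, with hypothetical helper names in D3D9 style:

#include <utility>  // std::swap

// Hypothetical types/helpers, for illustration only.
struct FPRenderTarget;                      // an FP16 color buffer
void SetRenderTarget(FPRenderTarget* rt);   // bind as output
void BindAsTexture(FPRenderTarget* rt);     // bind as input sampler
void DrawLightPass();                       // PS computes: previous + contribution

void AccumulateLights(FPRenderTarget* a, FPRenderTarget* b, int numLights) {
    for (int i = 0; i < numLights; ++i) {
        SetRenderTarget(b);   // write the new running total into 'b'
        BindAsTexture(a);     // read the previous total from 'a'
        DrawLightPass();      // shader adds this light's contribution
        std::swap(a, b);      // 'a' now holds the latest total
    }
    // With hardware FP blending (GeForce 6800), each light is a single
    // additive draw into one target; no target switches are needed.
}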
Originally posted by: jiffylube1024
Originally posted by: Marsumane
Originally posted by: jiffylube1024
BAH! HDR doesn't work on X800 cards, only 6800 cards, even though we've seen these kinds of effects on ATI hardware before. What utter nonsense!
F**king hell. Do these developers really need to pick sides when doing stuff like this? I can put up with it if they make it perform better on one vendor's hardware or another's, but to hold back features like this altogether!? A pox on them!
It probably was impractical to run it on ATI's hardware. It can run, but it would require more passes to do the same thing compared to NV's 6xxx architecture. I also think they should have enabled it, but maybe they had a good reason not to, is what I'm saying. Or maybe they're working on it for the "next" patch, which we all know will come out within '04.
I lost the link but there was an interview (perhaps at Sharky Extreme) about it and their excuse is pretty weak, and a thinly veiled ploy.
They talk about how for quality they decided to code HDR for FP32 instead of FP16 (they neglect to even mention FP24, which half of the market uses).
Then the interviewer asks something to the effect of "oh, so if it supports FP32 then it should work on the FX 5900s (obviously with lower performance)," and the Crytek guy says something like "well, no, because the 6800 has a type of blending technique that we use (so it's 6800 only)".
It's bollocks - just a convenient excuse to cut out owners of older cards and ATI users for not having FP32 (funny, HDR seems to be working fine in HL2 on both ATI and Nvidia cards).
It shouldn't even be an issue of "ATI vs. Nvidia"; it's just common sense - support what your userbase runs. Perhaps FP24 would run faster on the X800 series than FP32 does on the 6800 series. So what? It's an apples-to-oranges comparison anyway, as the 6800 series is running it at higher precision (whether this makes a significant difference in actual visual quality is currently unknown). It's just annoying. ATI has the support for HDR right there. It runs fine in RTHDRIBL and pixel shader demos...
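For what it's worth, the blending technique being referred to is almost certainly FP16 render-target blending, which NV40 (GeForce 6800) exposes and R420 (X800) does not. A D3D9 application probes for it like this - the API calls and constants are real, though the wrapper function name is mine:

#include <d3d9.h>

// Ask the driver whether it can do fixed-function (post-pixel-shader)
// alpha blending into a 64-bit FP16 render target -- the capability a
// single-pass HDR accumulation path depends on.
bool SupportsFP16Blending(IDirect3D9* d3d) {
    HRESULT hr = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8,                          // current display format
        D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,  // "can you blend this format?"
        D3DRTYPE_TEXTURE,
        D3DFMT_A16B16G16R16F);                    // FP16 RGBA surface
    return SUCCEEDED(hr);                         // true on 6800, false on X800
}

If that check fails, an engine either falls back to a multi-pass scheme like the one sketched above or disables the effect - which appears to be the choice being argued about here.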
Originally posted by: jiffylube1024
Awesome. I just got this game the other day and I love it (man, I'm behind the times). I was waiting to get a next-gen video card before playing it. Finally got an X800 Pro and I'm loving it.
Also, why does it take an X800 card to run "normal map compression"?
Originally posted by: jiffylube1024
I lost the link but there was an interview (perhaps at Sharky Extreme) about it and their excuse is pretty weak, and a thinly veiled ploy...
Originally posted by: j1nx
Also, why does it take an X800 card to run "normal map compression"?
I think that is the ATi 3Dc feature.
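For context: 3Dc stores only the X and Y components of a unit-length tangent-space normal, and the pixel shader rebuilds Z at sample time. A minimal sketch of that reconstruction, written in C++ for clarity (a shader would do the same math in HLSL):

#include <cmath>

// Rebuild Z from a unit-length tangent-space normal's X and Y.
// Works because x^2 + y^2 + z^2 = 1 and tangent-space normals face
// outward, so z >= 0; the clamp guards against compression error.
void ReconstructNormal(float x, float y, float out[3]) {
    float z2 = 1.0f - x * x - y * y;
    out[0] = x;
    out[1] = y;
    out[2] = z2 > 0.0f ? std::sqrt(z2) : 0.0f;
}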