Silverforce11
Lifer
- Feb 19, 2009
Optional SSAA and 4K textures. These are definitely PC centric features, kudos to the developers to bother putting these in their games.
So true, and we should give credit to the developers for exclusive PC features.

Even if it says it requires 6GB, I bet a 3GB or 4GB card will run this game faster at 4K than trying to run Crysis 3!
Optional SSAA and 4K textures. These are definitely PC centric features, kudos to the developers to bother putting these in their games.
no! how dare they create options that won't allow us to ignorantly crank all the settings up!
That's not the point; the point is almost no one can crank the settings up. People using a 6GB card are most likely playing beyond 1080p; therefore, they can't afford the VRAM for the high-resolution textures. Given that that's the case, what's the point of even having such textures?
Also, even recent games don't use 6GB of video RAM at triple 4K.
As far as creating textures for a rare few goes, does it actually take much time to add those textures?
Textures are created at a high resolution then scaled down, not the other way around.
So what you are saying is that it doesn't take any time.
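The "authored high, scaled down" point is easy to put numbers on. A rough back-of-the-envelope sketch (my own figures, not from the thread): generating the downscaled mip levels of a texture is cheap, and the whole chain only costs about a third more memory than the base level alone.

```python
# Sketch: memory for an uncompressed RGBA8 texture plus its mip chain.
# Each mip level halves the previous one, so shipping the downscaled
# versions alongside the full-res original is nearly free.

def mip_chain(width, height, bytes_per_texel=4):
    """Return (level, w, h, bytes) for each mip level down to 1x1."""
    levels = []
    level, w, h = 0, width, height
    while True:
        levels.append((level, w, h, w * h * bytes_per_texel))
        if w == 1 and h == 1:
            break
        w, h = max(1, w // 2), max(1, h // 2)
        level += 1
    return levels

chain = mip_chain(4096, 4096)
base = chain[0][3]
total = sum(b for (*_, b) in chain)
print(f"base level: {base / 2**20:.1f} MiB")   # 64.0 MiB
print(f"full chain: {total / 2**20:.1f} MiB")  # 85.3 MiB (~4/3 of base)
```

So the extra work in shipping an "Ultra" pack is mostly authoring the source art at high resolution once; the lower settings fall out of the same pipeline.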
Perhaps proper use of technology like megatextures (partially resident textures)
http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/6
could put a plug in the "memory leaks" we're currently seeing in GPU VRAM.
I can't help but get the feeling this is one of those areas where "hey, let's just throw more hardware at it" applies instead of "hey, let's do this smarter".
A little pause in hardware evolution may not be a bad thing at all; it might just force the devs to compete on a different set of parameters.
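The idea behind the partially resident textures the link describes can be sketched in a few lines (a toy model, with illustrative names and tile sizes of my choosing): the full texture is split into fixed-size tiles, and only the tiles the renderer actually samples in a frame need to live in VRAM.

```python
# Toy sketch of partially resident textures: track which tiles of a
# huge texture are actually touched, instead of keeping it all in VRAM.

TILE = 128  # texels per tile edge (illustrative; real HW uses fixed page sizes)

def tiles_needed(samples, tile=TILE):
    """Given (u, v) texel coordinates sampled this frame, return the set
    of tile coordinates that must be resident."""
    return {(u // tile, v // tile) for (u, v) in samples}

# A 16K x 16K texture has (16384/128)^2 = 16384 tiles in total...
total_tiles = (16384 // TILE) ** 2
# ...but a frame that only samples a small region touches a handful:
samples = [(100, 100), (130, 100), (5000, 5000)]
resident = tiles_needed(samples)
print(f"{len(resident)} of {total_tiles} tiles resident")  # 3 of 16384
```

That is the "do this smarter" angle: VRAM holds a working set sized by what is on screen, not by the raw size of the texture pack.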
That's not the point; the point is almost no one can crank the settings up. People using a 6GB card are most likely playing beyond 1080p; therefore, they can't afford the VRAM for the high-resolution textures. Given that that's the case, what's the point of even having such textures?
Also, even recent games don't use 6GB of video RAM at triple 4K.
It isn't the point; everyone adjusts settings if they don't get the right performance. If you want better performance, you decrease the graphics settings; acting like people don't know that is, well, insulting.

No, that is the point:
1. we have yet to really know how much VRAM is being used at 1080p for each of these settings, for now we only know what is recommended (not required)
2. we don't know how good/bad the various settings are
3. Ultra textures is an extra, optional install...
4. why would it matter if they went a step further and created a SuperDuperUltraMega-High setting that required 12GB of VRAM in preparation for 8K, despite the fact that there are no consumer products available for such options?
Those are the worst comparison pics I have seen in a while. A good example of how you can waste VRAM.
Is the Ultra HD pack actually available yet?

People are reporting no need for 6GB of VRAM for Mordor's Ultra textures. They are saying usage is around 3GB.
Reason enough for us to examine what the requirements actually mean. One caveat up front: the "Ultra-HD Texture Pack", which retail owners have to download and install separately, was apparently already included in the test version provided to us by Warner. At the time these lines were written, we did not know where the texture pack can be downloaded; we have already sent an inquiry about this and will of course keep you up to date.

After naively enabling the "Ultra" texture setting in conjunction with our Radeon R9 290X/4G, we were quickly taught a lesson: the VRAM regularly filled up, leading to streaming stutters. The graphics card was therefore unceremoniously swapped for a GTX Titan "Classic" with 6 GiB, whose VRAM usage GPU-Z estimated at a good 5.3 GiByte. Switching from Ultra to the full Ultra HD setting even pushes that to 5.8. The other settings come in correspondingly lower: "High" lands at around 3.5 GiByte and "Medium" at 2.5.

Anyone who has been looking forward to crisp, high-resolution textures will, sorry to say, be disappointed. The quality varies quite strongly even on "Ultra": mountains, for example, look rather muddy, while walls are quite respectable. Since the rocks frequently take up a lot of screen space, this stands out significantly (negatively) while playing, and the ground textures, which are not exactly high-resolution either, do not help in these circumstances. Characters, however, consistently look good.
