What exactly happens when it's 6GB (which is impossible) and the GPU reaches its VRAM limit?
I had Watch Dogs on max settings except AA, which I turned down a few tiers, and it ran at an average of 45 fps. What did the game do with the 1.5GB of VRAM?
What the heck? Are devs really this lazy now? Ugh... What kind of idiot releases a game like this? Excuse me while I smash my head through a wall.
Man, that is some SLOPPY programming if I've ever seen it. Are they just taking advantage of PC users with their ports now?
Not laziness per se, but I'm sure developers aren't taking the PC into account when designing the game. That most people's discrete GPUs only have 2, maybe 3 GB of memory is of no concern to them.
which is why they have low (1GB), medium (2GB) and high (3GB)...? Seems they were concerned enough to create those tiers of textures, and were also good enough to spell out what is recommended...
6GB of VRAM is for Ultra textures at 1080p.
So if you're running a 2560x1440 or 4K monitor, you can't even buy a GPU today with enough VRAM to run Ultra textures.
So you'd basically be stuck running at a lower game resolution just to use higher-resolution textures... or running at a higher game resolution with lower-resolution textures.....
Hmmmm........yeah..ok........Good Plan! :thumbsup:
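To put rough numbers on how much the display resolution itself costs, here's a quick back-of-envelope sketch. It assumes 32-bit render targets and a made-up count of six full-resolution buffers, so treat the totals as illustrative rather than anything measured from the game:

```python
# Back-of-envelope: extra VRAM cost of the display resolution itself,
# assuming 4 bytes per pixel and six full-resolution render targets
# (back buffer, depth, a few G-buffer planes -- all assumed numbers).

BYTES_PER_PIXEL = 4   # RGBA8 / 32-bit color
RENDER_TARGETS = 6    # assumption, varies per engine

def render_target_mb(width, height):
    return width * height * BYTES_PER_PIXEL * RENDER_TARGETS / 2**20

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: ~{render_target_mb(w, h):.0f} MB of render targets")

# Prints roughly 47, 84 and 190 MB -- so going from 1080p to 4K adds well
# under 200 MB here; the multi-gigabyte requirement is almost all texture pool.
```

If those assumptions are anywhere near right, the display resolution is a rounding error next to the 6GB texture budget.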
I doubt they are creating different textures. I imagine it's more like the CPU cutting the resolution of the texture before sending it off to the GPU, as if you went into Photoshop and resized the image to some percentage smaller, then just tweaked the percentage based on the minimal testing that gets done.
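For what it's worth, here's a toy sketch of that "resize to a percentage" idea. The function name and the 2048x2048 texture are made up, and real engines more often ship full mip chains and simply skip uploading the top level(s) at lower settings, but the memory math works out the same either way: halving the resolution quarters the footprint.

```python
import numpy as np

def drop_mip_level(texture: np.ndarray) -> np.ndarray:
    """Halve a texture's resolution by averaging each 2x2 block --
    roughly what generating the next mip level does. Assumes even dimensions."""
    h, w, c = texture.shape
    return texture.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3)).astype(texture.dtype)

# Hypothetical 2048x2048 RGBA8 "Ultra" texture: 16 MB uncompressed.
ultra = np.random.randint(0, 256, (2048, 2048, 4), dtype=np.uint8)
high = drop_mip_level(ultra)    # 1024x1024 -> 4 MB
medium = drop_mip_level(high)   # 512x512   -> 1 MB
print(ultra.nbytes >> 20, high.nbytes >> 20, medium.nbytes >> 20)  # 16 4 1
```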
so by your "logic", if they just flat-out didn't include the Ultra textures, it would be a better game, because then you wouldn't feel like you were missing out on anything; you wouldn't have to compromise on settings, you could just ignorantly crank everything all the way up......Hmmmm........yeah..ok........Good Plan! :thumbsup:
Ah OK, I see how it is: as PC gamers we're just going to prejudge everything on frivolous speculation. It's OK, it's not uncommon that our pessimistic "hunches" turn out to be true, so we're justified in treating every situation the same.
Who knows, maybe the game will turn out to be a sloppy port to the PC, but I'm going to reserve my judgement until I can see how things actually play out.
The game will pause momentarily and resume as the VRAM is filled with new data. It hitches and is a really awful experience. Watchdogs did exactly this on my system if I used settings that were too much for my 3GB cards.
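Roughly what's going on when that happens: the working set no longer fits, so something has to be evicted and re-uploaded over PCIe later, and those re-uploads are the hitches. Here's a toy LRU model I made up just to show the shape of the problem, not how any particular driver actually manages residency:

```python
from collections import OrderedDict

class VramCache:
    """Toy LRU model of texture residency. When a frame touches more
    texture data than fits in VRAM, something gets evicted and has to be
    re-uploaded over PCIe later -- those re-uploads are the hitches."""

    def __init__(self, capacity_mb: int):
        self.capacity = capacity_mb
        self.resident = OrderedDict()   # texture id -> size in MB
        self.used = 0

    def touch(self, tex_id: str, size_mb: int) -> bool:
        """True if the texture was already resident (fast path),
        False if it had to be uploaded this frame (potential stutter)."""
        if tex_id in self.resident:
            self.resident.move_to_end(tex_id)
            return True
        # Evict least-recently-used textures until the new one fits.
        while self.used + size_mb > self.capacity and self.resident:
            _, evicted_size = self.resident.popitem(last=False)
            self.used -= evicted_size
        self.resident[tex_id] = size_mb
        self.used += size_mb
        return False

# A 3GB card asked to cycle through ~4.5GB of Ultra textures every frame:
cache = VramCache(capacity_mb=3 * 1024)
frame_set = [(f"tex{i}", 64) for i in range(72)]   # 72 * 64 MB = 4.5 GB
for frame in range(3):
    misses = sum(not cache.touch(t, s) for t, s in frame_set)
    print(f"frame {frame}: {misses} uploads")   # it never settles -> constant hitching
```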
Perhaps proper use of technology like megatextures (partially resident textures)
http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/6
could put a plug in the "memory leaks" we're currently seeing in GPUs' VRAM.
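For anyone who didn't click through, the idea in that article is to split one huge virtual texture into tiles and only back the tiles the camera can actually see with VRAM. Here's a made-up conceptual sketch with arbitrary tile and texture sizes, just to show how little of a big texture needs to be resident in a given frame; the real mechanism lives in the GPU/driver page tables, not application code:

```python
# Conceptual sketch of partially resident textures: a huge "virtual"
# texture is split into fixed-size tiles, and only the tiles touched by
# visible geometry get backed by VRAM. All sizes here are invented.

TILE = 128                 # tile edge in texels
TEXTURE = 16384            # a 16k x 16k virtual texture
BYTES_PER_TEXEL = 4

def tiles_needed(visible_uv_rects):
    """Map visible UV rectangles (u0, v0, u1, v1 in [0,1]) to the set of
    tile coordinates that must be resident this frame."""
    needed = set()
    tiles_per_side = TEXTURE // TILE
    for u0, v0, u1, v1 in visible_uv_rects:
        for ty in range(int(v0 * tiles_per_side), int(v1 * tiles_per_side) + 1):
            for tx in range(int(u0 * tiles_per_side), int(u1 * tiles_per_side) + 1):
                needed.add((tx, ty))
    return needed

full_mb = TEXTURE * TEXTURE * BYTES_PER_TEXEL / 2**20
resident = tiles_needed([(0.10, 0.10, 0.18, 0.15), (0.60, 0.40, 0.70, 0.45)])
resident_mb = len(resident) * TILE * TILE * BYTES_PER_TEXEL / 2**20
print(f"full texture: {full_mb:.0f} MB, resident tiles: {resident_mb:.1f} MB")
# ~1024 MB of virtual texture, only ~12 MB of it actually in VRAM.
```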
I can't help but get the feeling this is one of those areas where "hey, let's just throw more hardware at it" applies instead of "hey, let's do this smarter".
A little pause in hardware evolution may not be a bad thing at all; it might just force the devs to compete on a different set of parameters.
I think AMD and NV need to anticipate that next-gen console ports will use more memory and produce 6-8GB products. GM200 should have at least 6GB. I also don't see why NV can't ship an 8GB 980 for $600 when their mobile cards ship with 6-8GB!! It also sucks that a one-year-old 780 Ti was gimped so badly that it's nearly "outdated" just a year later due to VRAM gimping vs. the Titan Black.