The people have spoken. Whether 10GB is "enough" or not is case-dependent and is now beside the point. However, with a "flagship" card, there should be no question about it, and clearly there is nothing but questions about Nvidia's choice to screw people with 10GB of VRAM. People want more and aren't happy that a new "flagship" card has less memory than the previous two generations had. It's looking more and more likely that Dr. Su is about to kick their VRAM-skimping butts. They deserve to get rekt.
Bottom line: More VRAM won't future-proof current cards, because you will need to turn settings down sooner and more often for lack of bandwidth/cores than you ever will for lack of VRAM.
This is false and I have proof.
The HD 7970 vs. the GTX 680 – revisited after 7 years
The Retro Series – the HD 7870 vs. the GTX 680 compared with the RX 570 and the GTX 1050 Ti. The GTX 680 versus... (babeltechreviews.com)
Please, I implore you to read this whole article. Really pay attention to the summary and the tests at max settings, but the real bottom line is that more VRAM absolutely can contribute to the performance of a GPU. Back then, a 50% increase in memory was a big selling point, and while the 680 handily beat the 7970 at lower resolutions, it fell well behind at higher ones. Now, years later, the 7970 absolutely eclipses the 680 and handles high texture settings much, much better.
One could argue something along the lines of AMD "fine wine" or Nvidia gimping drivers/not adding new support. Maybe that's part of it, but given this massive historical divergence in performance, why would you want to artificially limit how "future-proof" your GPU is? Especially considering consoles will easily be using 12GB+ of VRAM within a few years.
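The resolution scaling the article observes also lines up with simple arithmetic: render-target memory grows linearly with pixel count, so 4K needs 4x the buffer space of 1080p before textures even enter the picture. A rough, illustrative sketch (the buffer count and bytes-per-pixel here are assumptions for the example; real engines allocate far more for textures, geometry, and caches):

```python
def framebuffer_mib(width, height, bytes_per_pixel=4, buffer_count=3):
    """Estimate memory for a set of full-resolution render targets.

    Assumes 4 bytes per pixel (RGBA8) and 3 targets -- illustrative
    numbers only, not any particular engine's actual layout.
    """
    return width * height * bytes_per_pixel * buffer_count / 2**20

resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
for name, (w, h) in resolutions.items():
    print(f"{name}: ~{framebuffer_mib(w, h):.0f} MiB")
```

The point isn't the exact numbers; it's that the 4K figure is exactly four times the 1080p one, and that same linear scaling applies to every per-pixel buffer a renderer keeps resident.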
Did you likewise criticize Nvidia and their sponsored games for "over-tessellation" of things like ground surfaces and "hidden" objects, just to spite AMD, when Nvidia cards had greater tessellation power? This is just AMD creating a setting that uses a little more VRAM than Nvidia cards have, as a slimy move.
The exact kind of thing that AMD fans whine about Nvidia doing.
There are no good guys when it comes to corporate behavior.
For starters, the intent of this posting is the memory-demand topic. Sure, it's an AMD-sponsored game, but they haven't really exploited their console dominance in the past. Nvidia, on the other hand, is a different story...
I'm leaning towards the consoles and the new AMD offerings having the same 16GB of memory for more than just the bigger number.
Did you likewise criticize Nvidia and their sponsored games for "over-tessellation" of things like ground surfaces and "hidden" objects, just to spite AMD, when Nvidia cards had greater tessellation power?
Thought not.
Same kind of thing. I am not defending Nvidia. It's the way corporations behave when they perceive an upper hand.
I am just pointing out one is just as bad as the other. As AMD rises expect more of this from them...
So are you saying that Godfall uses 12 GB of VRAM for zero visual benefit?
There is no real way to determine what could have been achieved had the goal been to do the same while conserving memory.
What do us PC gamers always want? We've always wanted developers to push games to use high-end PC hardware. AMD is leading the way in this area with larger frame buffers, and developers are using that ability. If Nvidia falls behind with their hilarious 8 and 10GB buffers, that's not my problem and it shouldn't be yours either. Just don't buy overpriced crap with inadequate RAM and you'll be in good shape.
Also, I wouldn't be happy with 10GB for 1440p either, just saying. The 6000 series has 16GB across the board, so you aren't forced to compromise in an area as critical as RAM. If you save money with a 6800 and have a 1440p monitor, you can upgrade to an ultrawide or 4K without stressing about RAM. Nvidia just screws you there. Want a decently priced Nvidia card? You get a slow 3070 with 8GB of RAM, LOL. Nice one there. The 6800 absolutely REKS that card with twice the RAM for only $80 more. Good Lord.
So your theory is that AMD convinced the developers to practice bad memory management just to be sure to exceed 10 GB of VRAM?