I think there is something to this, but I think it's more that Nvidia outsells AMD and developers have to cater to what the majority have. I can't imagine there being a big installed base for a game that requires 10GB or more of video card memory.
Yeah, that would be useful to compare with. Some VR games can go way beyond 12GB though, even over 20GB occasionally. I think that earlier comment about 12GB not being enough for 4K RT is even more true for VR. It's enough for most non-RT 4K games though.
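If you'd rather measure than guess, something like this logs peak VRAM usage while a game runs. A minimal sketch: it assumes an Nvidia card with nvidia-smi on the PATH, and note that memory.used reports what's allocated system-wide, which can be higher than what the game strictly needs.

```python
import subprocess
import time

def vram_used_mib() -> list[int]:
    """Used VRAM in MiB for each GPU, via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [int(v) for v in out.split() if v.strip()]

# Sample once a second while the game runs; Ctrl-C to stop.
peak = 0
try:
    while True:
        used = vram_used_mib()
        peak = max([peak] + used)
        print(f"used: {used} MiB  peak: {peak} MiB", end="\r", flush=True)
        time.sleep(1)
except KeyboardInterrupt:
    print(f"\npeak VRAM used: {peak} MiB")
```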
VR has both a much higher resolution and wider FOV than regular gaming, and I believe both eyes' frames are rendered separately. Many games run much worse in VR and need settings turned down even when the same game is great in regular 4K. Once you see it in VR though, you don't want to play it in regular flatscreen. 🙂
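For a rough sense of scale, here's a back-of-envelope pixel-throughput comparison. A minimal sketch: the headset numbers assume a Valve Index-class panel (1440x1600 per eye at 90Hz), and the ~1.4x per-axis render scale that compensates for lens distortion is an approximation.

```python
# Back-of-envelope: pixels per second, flat 4K vs. VR.
def pixels_per_second(width: int, height: int, hz: float,
                      eyes: int = 1, render_scale: float = 1.0) -> float:
    # render_scale is applied per axis (supersampling for lens distortion)
    return (width * render_scale) * (height * render_scale) * hz * eyes

flat_4k = pixels_per_second(3840, 2160, 60)
vr      = pixels_per_second(1440, 1600, 90, eyes=2, render_scale=1.4)

print(f"4K @ 60 Hz : {flat_4k / 1e6:,.0f} Mpx/s")
print(f"VR @ 90 Hz : {vr / 1e6:,.0f} Mpx/s  ({vr / flat_4k:.1f}x the 4K load)")
```

That works out to roughly 1.6x the pixel throughput of flat 4K60, and VR also has a hard frame-time deadline: a missed frame means reprojection or judder rather than a barely noticeable hitch.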
For RT 12GB should actually be enough, since none of the current cards do that well at native 4K anyway and you have to use DLSS.
Lulz. There are always sticks in the mud that go through the Kübler-Ross model whenever hardware requirements start to change. It never happens all at once, so why they get salty about it, deny it, and debate it seems silly to me. Just accept it and be prepared to act when you cannot avoid it any longer. 😉
To preface my remarks: I don't want to derail the thread, so I will make a very on-topic point too. Yeah, I am so glad I have been able to learn to judge whether a game looks good or not by looking at it, and not by hyper-evaluating benchmarks or FPS or whatever.
So much easier and so much less of a pain in the ass.
If I am satisfied with how it looks or runs, that's good enough for me; if I am not satisfied, I either don't play it or buy a new card.
Yes, there is often no perceptible difference between medium/high and ultra settings, but the latter tanks the framerate. It's useful to keep in mind for VR where ultra is usually too demanding even for top cards.
I came to the conclusion years ago that people who really enjoy video games are not the same people that obsess over gpu settings.
I'd say screenshot-wise I can usually tell a difference, though maybe not between high and ultra.
While actually playing the game, with 30, 40, or 60 frames passing per SECOND, I do not see a difference.
I came to the conclusion years ago that people who really enjoy video games are not the same people that obsess over gpu settings. The former just want a fun gaming experience, which for most means medium, 60 fps at 1080p or so; the latter don't really play games, they just have fun discussing the merits of 8gb vs 16gb of gpu memory for games they barely ever play or perhaps don't even own.

Your conclusion is wrong, I'm afraid: https://forums.anandtech.com/threads/games-you-finished-in-2020.2589014/
People aren't really talking about this given non-existent availability, but I came across two new examples where 8GB absolutely cripples performance.
So sitting here at 1080p, I'll just snap up a fast 8gb card and call it a day, eh?
If it takes 4k to break 8gb, I am feeling pretty comfy.
As someone who dropped money on a monitor instead of a GPU due to the shortages, I have no regrets and wish I would have moved to 4k sooner. Granted, the few pancake games I play are mostly sims and civ, so my 580 still does good enough to hang with turning down minimal settings. And boy does surfing in csgo feel nice at 4k144hz.

And this is exactly why a new monitor is off the table for me.
It's probably hard to answer without also having a comparison, but do you think it's a better experience at 4K with lower settings, or might 1440p at higher settings be better? I only ask because component availability and cost are making me look at other avenues of upgrade in the meanwhile, and a new monitor is on the list regardless.

Well, my main panel was a 27" 1440p IPS from back in the early days of cheap Korean monitors. I jumped to it from 23" 1080p, and while it was a nice upgrade, it never really had a wow factor. Now granted, I went from 27" 1440p to 43" 4k, so lol, the form factor is quite different and the PPI is the same. Even considering that, I immediately noticed an increase in visual fidelity; with the big 4k monitor it's really easy for me to see anything that isn't being rendered at native.

First step with civ and other games was installing 4k texture packs, and boy did it make a difference. At 1440p I never felt the need to do such things and felt like the diminishing returns on textures were essentially the same as at 1080p.

Now, in terms of games with lots of eye candy, idk, it can be a tradeoff depending on the game. With mgs:tpp the render distance causes the biggest frame drops with my 580, and turning that down isn't really an option in that game, so I definitely lost visual fidelity turning down settings compared to playing at 1440p, but rendering at native still looks really nice and 4k really lets texture detail shine. Other games like R6S, pretty much no visual difference that I can actively notice.
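For anyone sanity-checking the "PPI is the same" bit, the arithmetic from the panel sizes mentioned above is quick (a minimal sketch):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from native resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')  # ~109 PPI
print(f'43" 4K   : {ppi(3840, 2160, 43):.0f} PPI')  # ~102 PPI
```

So roughly 109 vs. 102 PPI: close enough that per-pixel detail looks about the same, and the 43" panel just gives you more of it.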