What is more important for gaming at high resolutions?

mohit9206

Golden Member
Jul 2, 2013
Most important is raw GPU power, followed by VRAM.
For example, a 1050 Ti won't be any good at 4K even if it has a 2 GHz core clock and 8 GB of VRAM.
Memory bandwidth is also important, as a 128-bit card will be bottlenecked more than a 256-bit card.
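As a rough sketch of why the bus width matters (the 7 GT/s effective memory clock below is an assumed GDDR5 figure for illustration, not a quote from any spec sheet):

```python
# Back-of-envelope theoretical memory bandwidth.
# bandwidth (GB/s) = (bus width in bits / 8 bits per byte) * effective clock (GT/s)
def memory_bandwidth_gbps(bus_width_bits: int, effective_clock_gtps: float) -> float:
    return (bus_width_bits / 8) * effective_clock_gtps

# Same assumed 7 GT/s GDDR5 on two different bus widths:
print(memory_bandwidth_gbps(128, 7.0))  # 112.0 GB/s
print(memory_bandwidth_gbps(256, 7.0))  # 224.0 GB/s
```

At the same memory clock, the wider bus simply moves twice as many bytes per second, which matters more and more as resolution goes up.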
 

Guru

Senior member
May 5, 2017
Raw GPU power. All of what you listed is secondary to the core architecture. You could have a GTX 580 with 20 GB of GDDR3 on a 512-bit memory bus running at 3000 MHz; that doesn't mean it would be any good.
 

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
Core clock?
Memory clock?
Amount of VRAM?
Memory interface?
Predominantly, "raw GPU power" is a combination of all of those plus the number of processing cores available to the GPU (CUDA cores/stream processors). And, hopefully it's obvious, a newer architecture with more cores will generally be better than an older architecture. The optimization of texture units and ROPs makes a difference as well.

LOL, basically, what makes a GPU better for high-resolution gaming is raw GPU power.
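To put an illustrative number on "raw GPU power", the usual back-of-envelope is theoretical FP32 throughput: cores × 2 operations per clock (one fused multiply-add) × clock speed. A minimal sketch, with plausible example core counts and clocks rather than any card's official specs:

```python
# Theoretical FP32 compute: cores * 2 ops per clock (one FMA) * clock (GHz) -> TFLOPS.
def fp32_tflops(shader_cores: int, clock_ghz: float) -> float:
    return shader_cores * 2 * clock_ghz / 1000.0

# A small part and a big part at broadly similar clocks:
print(fp32_tflops(768, 1.4))   # ~2.2 TFLOPS
print(fp32_tflops(3584, 1.5))  # ~10.8 TFLOPS
```

Note how little the clock speed moves the result on its own; the core count (and the architecture behind each core) dominates.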
 

urvile

Golden Member
Aug 3, 2017
A G-Sync or FreeSync monitor. :) To handle those frame drops. You can see in my sig what I am running; I am gaming at 3440x1440, and the monitor is FreeSync. I only now realise how effective FreeSync actually is, because I got much smoother gameplay with my Fury X cards. :) It's a little irritating because I now have to buy a G-Sync 3440x1440 monitor, but I am cool with that.....

If I run an FPS counter while gaming, I can see that I get stuttering when the frame rate drops by literally a couple of frames. It probably depends on the game as well, though.

First world problems, right?
 

Grubbernaught

Member
Sep 12, 2012
Raw power. But specifically for high resolutions: Pixel fill rate. Lots of fill rate.

This used to be calculated as ROPs × clock rate, but I'm not sure if that still holds true...
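For what it's worth, the theoretical number is still quoted that way. A minimal sketch (the 32 ROPs at 1.4 GHz below are assumed example figures, not any particular card's specs):

```python
# Theoretical pixel fill rate: ROPs * core clock (GHz) -> Gpixels/s.
def pixel_fill_rate_gpix(rops: int, core_clock_ghz: float) -> float:
    return rops * core_clock_ghz

print(pixel_fill_rate_gpix(32, 1.4))  # 44.8 Gpix/s

# For scale: one full-screen pass at 4K 60 Hz is about half a gigapixel
# per second, and overdraw plus multiple render passes multiply that fast.
print(3840 * 2160 * 60 / 1e9)  # ~0.50 Gpix/s
```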
 

urvile

Golden Member
Aug 3, 2017
Raw GPU power, and I find it can depend on the game engine/game as well as on FreeSync or G-Sync. I am playing Far Cry 3 on Ultra and find it will happily render at 100 fps, then drop to 60 fps, then into the 50s, ending up in the low 40s. So even though I am using a G-Sync screen, I still get some occasional stuttering. No tearing, though. :p So I turned MSAA down to 4x and now I get only occasional MINOR stuttering. On the other hand, I was playing Dishonored 2 with everything maxed out and it is buttery smooth.

I think I mentioned it earlier, but I am playing @ 3440x1440. I just couldn't go back to a 27" (2560x1440) or a 34" (2560x1080) screen, even though I know I would get more consistent performance across games. I did, however, buy the most beastly GPUs I could get my hands on, so I am in with a fighting chance.