
4K gaming: MOAR RAM NEEDED!!

Denithor

Diamond Member
http://www.anandtech.com/show/7120/some-quick-gaming-numbers-at-4k-max-settings

Good read; the one thing I take away is that 2GB cards are not going to cut it for 4K gaming.

In Metro 2033 a single 7950 (3GB) nearly matches the performance of either 2x GTX 680 (2GB) cards or a single GTX 690 (2x2GB) card. In terms of raw horsepower, this isn't even a competition - the 7950 should lose badly. But it doesn't, falling behind by only ~10%, indicating that the other cards are severely limited on RAM.

And this becomes even more painfully obvious in Sleeping Dogs, where the single 7950 manages to beat both the 2x GTX 680 and the GTX 690 by a noticeable margin.
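The VRAM argument above can be sketched with some back-of-envelope arithmetic. These figures are purely illustrative (the buffer count and bytes-per-pixel are assumptions, and real games also hold textures, geometry, and shadow maps in VRAM, which is where the 2GB vs 3GB difference actually bites), but they show how fast the fixed per-pixel cost grows with resolution:

```python
# Back-of-envelope estimate of render-target memory at different
# resolutions. Illustrative only: real games also store textures,
# geometry, and shadow maps, which dominate actual VRAM use.

def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Approximate memory for colour + depth + one intermediate target."""
    return width * height * bytes_per_pixel * buffers / 1024**2

for name, (w, h) in {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}.items():
    print(f"{name}: ~{framebuffer_mb(w, h):.0f} MB of render targets")
```

Since 4K has exactly four times the pixels of 1080p, every per-pixel cost (render targets, AA samples, shading work) scales by the same factor.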

I am also happy to see how well SLI and Crossfire are scaling these days.
 
And the textures in those games are still meant for 1080p gaming and 1GB cards. Roll in some 4K-suitable texturing and even that quad-Titan setup would struggle to stay in double-digit frame rates.
 
Seems 4K gaming isn't too far away. Perhaps 2016?

I think by 2016 we'll have the displays at reasonable prices, and graphics cards that can push those displays too. Right now, though, getting a 4K display is hard. Hoping TVs push prices down a bit. For me personally, 2016 is when I expect 4K HDTVs (above 60 inches) to hit affordability.

I agree with the review, though; I think consoles will hold back 4K quite a bit. Why develop a graphics card to push 4K when most developers will be putting out games that are playable across all three platforms? As has become the case now, the PlayStation and Xbox are holding back PC development on the graphics end quite a bit.
 
The premise of this thread is based around the fact that the 7950 gets 15 fps in one game and 11.1 fps in another?

Lawls...

2GB, 3GB, 6GB: none of it matters. No card out there has the power to drive this resolution at any realistic cost-to-FPS ratio.
 
I love how weak video cards are on a grand scale. I could buy a CPU and have it last me 5+ years, but a GPU will become useless in AAA gaming so quickly.

Interesting times are ahead!
 
I love how weak video cards are on a grand scale. I could buy a CPU and have it last me 5+ years, but a GPU will become useless in AAA gaming so quickly.

Interesting times are ahead!

I don't think that means video cards are weak, just that it is very easy to swamp them with seemingly tiny adjustments. The opposite is also true: I'm sure many people would be surprised at how well an 8800 GTX still plays games maxed out at lower resolutions.
 
I love how weak video cards are on a grand scale. I could buy a CPU and have it last me 5+ years, but a GPU will become useless in AAA gaming so quickly.

Interesting times are ahead!

They are obsoleted hyper fast. It sucks because I can't ever really look at my GPUs and think they are truly awesome, because in a year they will be outdated and in two years they will suck.
 
The article that you linked is pretty moot in my opinion. I think we should wait for non-antialiasing benchmarks. Because of AA, VRAM usage skyrockets (especially at 4K). When 4K support is more official, I have a feeling memory utilization may improve or current Anti-aliasing methods may be altered.
 
They are obsoleted hyper fast. It sucks because I can't ever really look at my GPUs and think they are truly awesome, because in a year they will be outdated and in two years they will suck.

Obsoleted is a strong word, and it's mostly in the mind of the gamer, I think. Up until I bought my 770s, I was able to play the latest games at very high to ultra IQ settings at 2560x1440 on my nearly 3-year-old 580 SLI setup.

If you don't go crazy with the AA, GPUs can age nicely. Overclocking can also obviously extend the lifespan of your GPUs in terms of performance.
 
I think the article made it painfully obvious that we need faster GPUs as opposed to more RAM. 3GB looks like plenty, which is pretty easy to come by today for as little as $250. Quad-Titan GPU power to almost get to 60fps is in a slightly different price range.
 
Bear in mind that full SSAA was mentioned, which, if I'm not mistaken, is the old-fashioned Super-Sampled Anti-Aliasing that renders at double or quadruple the native resolution and then downsamples back, eliminating most jaggies.

Now that SMAA has come out, there is no longer any need to use SSAA and the huge memory footprint it requires; you can make use of all those extra shader cores to get pretty much the same quality.

Watch the 2nd half of the video I linked to and you'll see what I mean.
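The gap the post describes can be put in rough numbers. A minimal sketch, assuming "4x SSAA" means doubling each axis and counting only pixels shaded (a post-process pass like SMAA works at native resolution, so its pixel count is the native one):

```python
# Illustrative comparison of the pixel counts (and hence render-target
# memory and shading work) behind 4x supersampling versus a
# post-process AA pass such as SMAA, which shades at native resolution.

def pixels(width, height, ss_factor=1):
    # 4x SSAA doubles each axis, i.e. ss_factor=2 per dimension
    return width * ss_factor * height * ss_factor

native = pixels(3840, 2160)          # 4K, no supersampling
ssaa4x = pixels(3840, 2160, 2)       # 4K with 4x SSAA: 7680x4320 internally

print(f"native 4K: {native:,} px")
print(f"4x SSAA:   {ssaa4x:,} px ({ssaa4x // native}x the memory and shading work)")
```

So at 4K, full SSAA effectively asks the card to render an 8K image, which is exactly where a 2GB card falls off a cliff.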
 
Obviously they used settings that were too high for the comparison. Faster GPUs would be good, but there is nothing wrong with lowering settings to the point where you get good FPS. At that point, you may find the lack of VRAM matters less.
 
Or use more modern techniques that avoid these issues in the first place (like most gamers who know what they are doing will do), so they can provide more realistic results instead of making us think we need to upgrade yet again!
 
Anandtech staff have failed; a quick run at 4K? We need detailed benchmarks...

1. They don't have enough AMD cards to benchmark, e.g. 2x 7990 or 4x 7970, to compare against the 4 Titans
2. Using any kind of filtering at 4K in current games that have textures suited for 1080p gaming is absolutely stupid!
3. No variations, only everything maxed out. I would love to see how it plays on lower settings!

I would like to see an article on an A10-6800K overclocked to 4.8GHz with its GPU overclocked to 950MHz, to see if it's a viable option for low-end 1080p gaming, or even higher if possible...
 
Straight off the bat is a bit of a shocker – to get 60 FPS we need FOUR Titans
😵

I was kinda hoping to ease into 4K by collecting 7970s as they become available at good prices. It's early to say, but I believe trifire 7970s should be pretty playable with the right settings.
 
Well, the point is to make it look good and play nice for less than $4K in video cards. Not gonna look real good on a 7850, I'd imagine. We'll have to see where the new knee is in the price/performance calculation. Obviously it will be a moving target, but it's always fun to speculate!
 
for desktop display - as in display on top of a desk being 18"-24" away. the old school 0.25 dot pitch still applies.

so until you need/want a ~46" screen for desktop display, there is no need for 4k resolution.

unless you simply want bragging rights.
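The ~46" figure is in the right ballpark, and the arithmetic is quick to check. A sketch assuming the 0.25 mm dot pitch mentioned above and a 3840x2160 panel:

```python
# Rough check of the "~46 inch" claim: at a classic 0.25 mm dot pitch,
# how large does a 3840x2160 panel end up? A sketch, not a display spec.

import math

DOT_PITCH_MM = 0.25
MM_PER_INCH = 25.4

w_in = 3840 * DOT_PITCH_MM / MM_PER_INCH   # panel width in inches
h_in = 2160 * DOT_PITCH_MM / MM_PER_INCH   # panel height in inches
diag = math.hypot(w_in, h_in)              # diagonal in inches

print(f'{w_in:.1f}" x {h_in:.1f}" -> {diag:.1f}" diagonal')
```

It comes out a shade over 43", so call it the low-to-mid 40s before 4K pixels stop being smaller than that old-school pitch.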

-----

as for the VRAM question. gonna need more than 2GB.
 
Looks doable to me. Three Titans seems to cut the mustard fine for 4K in these benches. If you're using a $4K monitor, I don't think you'd be hard-pressed to spend $3K on video cards to go with it.
 
for desktop display - as in display on top of a desk being 18"-24" away. the old school 0.25 dot pitch still applies.

so until you need/want a ~46" screen for desktop display, there is no need for 4k resolution.

unless you simply want bragging rights.

Are you the designated comedian for this thread?
 
Looks doable to me. Three Titans seems to cut the mustard fine for 4K in these benches. If you're using a $4K monitor, I don't think you'd be hard-pressed to spend $3K on video cards to go with it.

Yup, but I'd expect something like Crysis 3 to be a baseline for performance in next-gen titles. Considering 3x Titans only pull 60 FPS in Metro 2033, a game that's 3 years old, that doesn't exactly bode well.

Also, I think we will see some cheap monitors a year from now, probably around $1K. However, I think 4K gaming isn't going to be viable for most people.
 
Nothing against progress, but even today 2560x1600 is still a rarity. In my circle, the only folks who have anything higher than 1920x1200/1920x1080 are those who run 27" iMacs. I wish 2560x1600 would become more widespread (read: affordable).

Could the market largely skip 1600p/1440p and go directly from 1080p to 4K, with the help of 4K hype? I won't mind it but I doubt it.
 