Can we get to the bottom of 1920x1080 4xAA and video RAM usage?

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Since I'm away from my main rig, can someone please show me how much video memory these games use @ 1920x1080 with 4x AA? Please.

And are these games playable with a card like a 5870 or an overclocked GTX 460 1GB at these settings?

What single card do you need to play at these settings?

Metro
AVP
Stalker

Example:
Metro: 750MB VRAM used, not playable with a 5870 @ 1920x1080 4x AA.
or
AVP: 640MB VRAM used, not playable with a GTX 470 @ 1920x1080 4x AA.
Stalker: 550MB used, playable with a 5870 @ 1920x1080.

I'm more interested in 1GB cards, but any will do to get the video RAM usage.

Thanks.
 

digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76
You're really looking at two separate issues there. As far as playable vs unplayable goes, I suggest looking at the reviews for those cards on hardocp.com.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
You're really looking at two separate issues there. As far as playable vs unplayable goes, I suggest looking at the reviews for those cards on hardocp.com.

I understand what you are saying, but first and foremost I want some real evidence about how much video memory modern games use at these settings.

We know that GTA4 can/does use that much, but are there any others?

Thanks
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Whether or not 1GB is enough is fairly easy to tell with benchmarks. Framerates usually drop either linearly or on a curve as resolution/AA is increased. When framerates take a sudden drop, it's a good indication that you've run out of video memory.

From what I've seen with the latest benchmarks, 1GB is still plenty for the vast majority of games out there at 4x AA @ 1920x1200 or less.
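Creig's rule of thumb can be encoded directly: step through benchmark results and flag the first settings bump where the framerate collapses instead of scaling smoothly. A minimal sketch in Python, with made-up fps numbers and an arbitrary 0.6 cutoff for what counts as a "cliff":

```python
def vram_limited(fps_by_setting, threshold=0.6):
    """Return the first setting whose fps is less than `threshold` times
    the previous setting's fps, or None if scaling stays smooth.
    Encodes the heuristic that a sudden cliff (rather than a gradual
    decline) suggests the card ran out of video memory."""
    for (_, prev_fps), (setting, fps) in zip(fps_by_setting, fps_by_setting[1:]):
        if fps < prev_fps * threshold:
            return setting
    return None

# Hypothetical benchmark run: smooth scaling, then a cliff at 8xAA.
runs = [("0xAA", 90), ("2xAA", 78), ("4xAA", 66), ("8xAA", 22)]
print(vram_limited(runs))  # 8xAA: 22 fps is far below 66 fps at 4xAA
```

The threshold is a judgment call; the point is only that a linear or curved decline never trips it, while a framebuffer spill does.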
 

brencat

Platinum Member
Feb 26, 2007
2,170
3
76
Given my 16:10 LCD, I will either game at 19x12 or 16x10 (scaled) but no matter what, I WILL use a minimum of 4xAA. If it can't run at 19x12 with 4xAA, then I drop to 16x10.

I can't really see significant difference anyway between these 2 resolutions on a 24" monitor. I'm playing BFBC2 @ 8xAA/16xAF at 16x10 because that is the best setting for my rig with a GTX 260 @ 660mhz core.

Hopefully the GTX 560 will be priced right, and I'll finally be able to enjoy 19x12 gaming with 4xAA on everything. But if it's priced too high, I'll probably just hold off as 16x10 scaled is fine for me too.
 

Insomniator

Diamond Member
Oct 23, 2002
6,294
171
106
At 19x12 Enthusiast/0x AA/AF Crysis Warhead peaked at 1100MB video mem usage. I was surprised by this as everyone has said 1GB is easily enough unless you are at 25x16 and using AA/AF.

It was very playable on my 470 but I have no idea if the 1GB on the 460 would have a problem.
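For perspective, the render targets themselves are only a small slice of that 1100MB; the bulk is textures and geometry streamed by the engine. A rough back-of-envelope sketch (a simplification: real engines allocate many more intermediate buffers, so treat these as lower bounds):

```python
def framebuffer_mb(width, height, msaa=1, bytes_per_pixel=4):
    """Rough size of the color and depth render targets plus the
    resolved back buffer, in MB. Ignores textures, geometry, and
    driver-managed copies, which dominate real usage."""
    color = width * height * bytes_per_pixel * msaa   # multisampled color
    depth = width * height * 4 * msaa                 # 32-bit depth/stencil
    resolve = width * height * bytes_per_pixel        # resolved back buffer
    return (color + depth + resolve) / (1024 * 1024)

print(round(framebuffer_mb(1920, 1200)))          # ~26 MB with no MSAA
print(round(framebuffer_mb(1920, 1080, msaa=4)))  # ~71 MB with 4xAA
```

So Warhead's 1100MB at 19x12 with no AA is almost entirely assets, which is why usage can blow past 1GB even before AA enters the picture.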
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
At 19x12 Enthusiast/0x AA/AF Crysis Warhead peaked at 1100MB video mem usage. I was surprised by this as everyone has said 1GB is easily enough unless you are at 25x16 and using AA/AF.

It was very playable on my 470 but I have no idea if the 1GB on the 460 would have a problem.

Thanks a ton. How did you check the usage in game?
I googled it but there really is no answer there.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Thanks a ton. How did you check the usage in game?
I googled it but there really is no answer there.

EVGA Precision can do it.

I wonder who said anything about playability? Just because a game exceeds a card's framebuffer doesn't mean the game suddenly becomes unplayable. You could get sudden drops into single-digit FPS, but the game can still be playable. Look at the minimum framerates versus a card with a larger framebuffer.

I also know what this thread is about. You claimed a 6950 would never use 2GB of ram and I gave you a few examples of where it would. I never said 1GB wasn't enough for today's games. Talk about taking things out of context.
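On NVIDIA cards the same counter EVGA Precision reads is also exposed on the command line via nvidia-smi, so it can be logged from a script while a game runs. A sketch, assuming an NVIDIA card and driver are installed (the parser itself needs neither):

```python
import subprocess

def parse_vram_mb(line):
    """Parse one line of `nvidia-smi --query-gpu=memory.used
    --format=csv,noheader,nounits` output into an integer MB value."""
    return int(line.strip())

def current_vram_mb(gpu_index=0):
    # Only works on a system with an NVIDIA GPU and driver present.
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"], text=True)
    return parse_vram_mb(out.splitlines()[gpu_index])
```

Call `current_vram_mb()` in a loop with a timestamp to get a usage trace for a play session.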
 

Rhezuss

Diamond Member
Jan 31, 2006
4,118
34
91
What do you mean by "not playable"?

Is it that you have 20-29 FPS in games or 2-3 FPS?
If it's a little under 30 FPS...why would you call a game "not playable"?
 

Throckmorton

Lifer
Aug 23, 2007
16,829
3
0
Given my 16:10 LCD, I will either game at 19x12 or 16x10 (scaled) but no matter what, I WILL use a minimum of 4xAA. If it can't run at 19x12 with 4xAA, then I drop to 16x10.

I can't really see significant difference anyway between these 2 resolutions on a 24" monitor. I'm playing BFBC2 @ 8xAA/16xAF at 16x10 because that is the best setting for my rig with a GTX 260 @ 660mhz core.

Hopefully the GTX 560 will be priced right, and I'll finally be able to enjoy 19x12 gaming with 4xAA on everything. But if it's priced too high, I'll probably just hold off as 16x10 scaled is fine for me too.

Are you serious? You use a non-native resolution and accept a blurry, messy image so you can use 4x AA? That makes no sense...

You say you can't see a difference between native and upscaled/interpolated... yet you won't accept 2x AA, and you think upscaled 4x looks better than native 2x? Placebo effect...
 

brencat

Platinum Member
Feb 26, 2007
2,170
3
76
Are you serious? You use a non-native resolution and accept a blurry, messy image so you can use 4x AA? That makes no sense...

I have a 24" 16:10 aspect LCD, so 1920x1200 and 1680x1050 share the same aspect ratio, and the latter runs the game significantly faster. So I can have absolutely no jaggies and liquid-smooth gameplay, versus jaggies with slower FPS.

Not sure exactly what you mean about a messy blurry image. I'm not using a 16:9 aspect LCD.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
Because of the way pixels are shaped, any resolution that is not native will exhibit a loss in picture quality. This varies depending on the monitor and resolution, from hardly noticeable to glaringly obvious, but it is always there in my experience.

An LCD has a fixed number of pixels, so it has to rework the image to fit them; a CRT has no fixed pixel grid, so it doesn't have this issue.
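The blur has a simple arithmetic cause: unless the panel-to-image pixel ratio is a whole number, most source pixels straddle physical pixel boundaries and get interpolated. A quick check in Python:

```python
from fractions import Fraction

# 1680 source pixels spread across a 1920-pixel-wide panel: every 7
# source pixels map onto 8 physical pixels, so most source pixels land
# between two physical pixels and get blended -- that blending is the blur.
ratio = Fraction(1920, 1680)
print(ratio)                    # 8/7
print(ratio.denominator == 1)   # False: not an integer ratio, so it blurs
```

Only native resolution (ratio 1) or an exact integer multiple avoids the interpolation entirely.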
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
What do you mean by "not playable"?

Is it that you have 20-29 FPS in games or 2-3 FPS?
If it's a little under 30 FPS...why would you call a game "not playable"?

A few dips under 30 are OK, but choppy, stuttery gameplay is not.

Above 30fps 95% of the time, with a few dips under, is OK.
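That rule of thumb is easy to check against a frametime log (FRAPS's benchmark logging can produce one). A sketch with made-up frametimes:

```python
def playable(frametimes_ms, fps_floor=30, fraction=0.95):
    """Playable if at least `fraction` of frames render at or above
    `fps_floor` fps -- i.e. over 30fps 95% of the time, dips allowed."""
    fast = sum(1 for t in frametimes_ms if 1000.0 / t >= fps_floor)
    return fast / len(frametimes_ms) >= fraction

# 19 frames at ~60fps plus one 50ms hitch (20fps): 95% over the floor.
print(playable([16.7] * 19 + [50.0]))  # True
print(playable([40.0] * 10))           # False: constant 25fps is choppy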
 

brencat

Platinum Member
Feb 26, 2007
2,170
3
76
Because of the way pixels are shaped, any resolution that is not native will exhibit a loss in picture quality. This varies depending on the monitor and resolution, from hardly noticeable to glaringly obvious, but it is always there in my experience.

An LCD has a fixed number of pixels, so it has to rework the image to fit them; a CRT has no fixed pixel grid, so it doesn't have this issue.

1920 x 1200 = 16:10 aspect, and 1680 x 1050 = 16:10 aspect. I can visually attest that there is absolutely no blurriness or image degradation whatsoever at 1680 x 1050 with 8xAA in BFBC2 on my particular LCD.

Don't know what else to say, except at first I thought you guys thought I was trying to run 1680x1050 on a 1920x1080 native monitor, which is not the case.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
That monitor is nice.

from a review....
"Verdict: The V2400W does PC resolutions at all aspect ratios extremely well. All the aspect ratio scaling works as expected. Very good. It is a top notch PC gaming monitor with overdrive on/off, little blur, and very low input lag."

http://hardforum.com/showthread.php?t=1315565

Can we get back on topic now please?
 

zerocool84

Lifer
Nov 11, 2004
36,041
472
126
1920 x 1200 = 16:10 aspect, and 1680 x 1050 = 16:10 aspect. I can visually attest that there is absolutely no blurriness or image degradation whatsoever at 1680 x 1050 with 8xAA in BFBC2 on my particular LCD.

Don't know what else to say, except at first I thought you guys thought I was trying to run 1680x1050 on a 1920x1080 native monitor, which is not the case.

What you're describing is not what they are saying. Aspect ratio is different from resolution; just because the aspect ratio is the same doesn't mean anything. If you're fine with it, that's good and no harm done, but there is definitely an image quality loss at any non-native rez. Whether it's large or small, it's still going to be there.
 

Childs

Lifer
Jul 9, 2000
11,313
7
81
It may have the same aspect ratio, but it's not 1:1 with the native resolution of the monitor, and it will not look as good. You can test this by running your game at 1680x1050 windowed; you should see a noticeable difference. You have a fractional number of physical pixels, horizontally and vertically, trying to represent each source pixel, and it won't look as good.
 

Rhezuss

Diamond Member
Jan 31, 2006
4,118
34
91
A few dips under 30 are OK, but choppy, stuttery gameplay is not.

Above 30fps 95% of the time, with a few dips under, is OK.

Thanks for the explanation.

I have an HD 6850 OCed @ 825/1100, and Metro 2033, Mafia 2, ME2, BFBC2, etc. are all playable at 1920x1080 with AA and AF on, Vsync on, etc. (look at my sig, nothing to write home about...)

Don't understand why you say Metro 2033 is not playable with an HD 5870...
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Don't understand why you say Metro 2033 is not playable with an HD 5870...

This is what I'm being told by other forum members.
I'm being told that suddenly 1GB of video memory is not enough at 1920x1080.

Thanks for your input.
 

Aristotelian

Golden Member
Jan 30, 2010
1,246
11
76
This is what I'm being told by other forum members.
I'm being told that suddenly 1GB of video memory is not enough at 1920x1080.

Thanks for your input.

Care to provide a quote where people have claimed that 1GB of memory, in isolation, makes games unplayable at that resolution?

I have read some people saying that 1GB can hinder performance at some settings, in some games, at some resolutions (depending on the game). Can you please point me to quotes that support your take on the situation? Thanks.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
IMO, you can form a good understanding of this subject by looking at the GTX 460 768MB articles, and the SLI results as well.
IMO, 1GB is acceptable in 99% of situations at 1080.
Of course there are always going to be games and settings, like GTA IV, that can use more VRAM.
Metro 2033 is one game where dual-GPU setups produced abnormal results at some settings, from lack of RAM.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Care to provide a quote where people have claimed that 1GB of memory, in isolation, makes games unplayable at that resolution?

I have read some people saying that 1GB can hinder performance at some settings, in some games, at some resolutions (depending on the game). Can you please point me to quotes that support your take on the situation? Thanks.

I already did.

I say:
When a game runs out of video memory it becomes unplayable. Simple fact!

He says:
Quote:
"Metro2033 and stalker and AVP will easily go over 1GB at 1920x1080 with 4xAA."

Or...

Can it be said that with MANY games you can turn the details up high enough @ 1920x1080 to choke the gameplay without running out of video RAM first?
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
IMO, 1GB is acceptable in 99% of situations at 1080.
Of course there are always going to be games and settings, like GTA IV, that can use more VRAM.

Hurray for a straight answer, thank you.

I can agree with this 100%.

So the question stands: "is it time to have more than 1GB of memory on new cards that are geared to play at 1920x1080?"

Example: GTX 560, 6950 (~5870 speeds).

Are these cards even fast enough to use the extra RAM?
I was told a million times a 5870 cannot utilize the extra RAM, so it's a waste.
Besides Eyefinity/Surround, why do these cards have 2GB of RAM? Is it still a waste?

This is not an AMD vs Nvidia thing.
 

Aristotelian

Golden Member
Jan 30, 2010
1,246
11
76
Hurray for a straight answer, thank you.

I can agree with this 100%.

So the question stands: "is it time to have more than 1GB of memory on new cards that are geared to play at 1920x1080?"

Example: GTX 560, 6950 (~5870 speeds).

Are these cards even fast enough to use the extra RAM?
I was told a million times a 5870 cannot utilize the extra RAM, so it's a waste.
Besides Eyefinity/Surround, why do these cards have 2GB of RAM? Is it still a waste?

This is not an AMD vs Nvidia thing.

The GTX 480 had 1.5GB, the GTX 470 had 1.28GB, and the GTX 460 had 1GB. In the 5000 series there were no standard releases above 1GB, right? Look, both companies are now releasing cards with more than 1GB of memory, and this is not unique to the 6XXX/5XX series. In fact, Nvidia seems to have beaten them to the punch by offering 1.5GB on their single-GPU flagship at the time, while AMD had only 1GB on theirs.

I recall seeing enthusiast versions of the 5970 (2x2GB) and the 5870 (2GB). When we looked at benchmarks, I believe the 5970 gained quite a bit of performance and the 5870 did not. Speaking of the 5970 4GB: http://hardforum.com/showthread.php?t=1315565 was not it; http://www.overclockersclub.com/reviews/sapphire_toxic_hd5970/19.htm says "Cons:
4GB shows little improvement on non Eyefinity setups" - supporting happy_medium's claim that non-Eyefinity setups benefit little from the increased memory.

My problem here is that this discussion (if it isn't about AMD vs Nvidia) didn't take place back when the 480 shipped with 1.5 gigs. Were people creating threads going "Wow, Nvidia tacked on 50% more memory on their flagship card! Is it even used?" Now that AMD have 2GB on their single gpu flagship, it gets called into question whether the memory is used or not?

Well, perhaps through this testing we'll see that extreme settings use up to 1.5GB but never over that, rendering the extra 0.5GB useless, for now. (Aside: where are all the tessellation people and their 'futureproofing! support for future games!' now?)

If there is a way to test in-game memory usage that works the same way for both vendors' cards, I say we do it and see. The end result might indeed be that anything above 1GB is rarely ever used - but that'll be a stain on both parties, not just one.
 