Measure VRAM Usage on ATI Cards?

Rezident

Senior member
Nov 30, 2009
283
5
81
How do people measure actual VRAM usage on ATI cards (or AMD cards)?

I used to use RivaTuner on my Nvidia cards, but I installed RivaTuner 2.24 on my current 4890 and the hardware monitoring option is missing (is there a workaround to enable this for ATI cards?)


When I got my 4890 there was a special offer, so the 2GB card was only €20 more than the 1GB card. An extra 1GB of GDDR5 for €20, why not? (I would always prefer extra headroom rather than cutting it close.) I game at 1920x1200, but I still suspect that most games do not use more than 1GB of VRAM. Is this correct? I can't see these stats in most reviews.

I will probably get a 6XXX card depending on prices, so I am trying to decide between a 1GB and a 2GB card. If there are any decent games that will use more than 1GB of VRAM, then I will get the 2GB card just to be on the safe side. Thanks,
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
I'm pretty sure MSI Afterburner has this facility, as does GPU-Z. 1GB is not a limiting factor at the moment, and I assume you are not playing on a 30" display with 4xAA (which may hit a VRAM limit in one, maybe two games). At that res and those settings, VRAM usage will be the least of your troubles.

Wait for the reviews and decide on what best fits your budget/games you play.
 

Rezident

Senior member
Nov 30, 2009
283
5
81
Sylvanas said:
I'm pretty sure MSI Afterburner has this facility, as does GPU-Z. 1GB is not a limiting factor at the moment, and I assume you are not playing on a 30" display with 4xAA (which may hit a VRAM limit in one, maybe two games). At that res and those settings, VRAM usage will be the least of your troubles.

Wait for the reviews and decide on what best fits your budget/games you play.

GPU-Z only seems to show it for Nvidia cards and not for ATI ones; it must be something to do with the sensors. It's a 27.5" display. I know VRAM isn't the main concern, and I'm not worried about how the 6XXX cards will handle games (the GPU should be fine), but it would seem silly to have all that GPU horsepower and run out of VRAM.

If they were selling 2GB cards over a year ago, there must be some reason; maybe there's a market for cards with 1.5GB of VRAM? Either way, I would like to measure VRAM usage on my own system, and I never had a problem doing this on Nvidia cards. There must be some way to measure VRAM usage on ATI cards?
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Rezident said:
GPU-Z only seems to show it for Nvidia cards and not for ATI ones; it must be something to do with the sensors. It's a 27.5" display. I know VRAM isn't the main concern, and I'm not worried about how the 6XXX cards will handle games (the GPU should be fine), but it would seem silly to have all that GPU horsepower and run out of VRAM.

If they were selling 2GB cards over a year ago, there must be some reason; maybe there's a market for cards with 1.5GB of VRAM? Either way, I would like to measure VRAM usage on my own system, and I never had a problem doing this on Nvidia cards. There must be some way to measure VRAM usage on ATI cards?

Like I said, try MSI Afterburner (though its VRAM usage readout may be Nvidia-only), RivaTuner, and ATI Tray Tools.
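For what it's worth, on a modern Linux box the amdgpu driver exposes VRAM counters directly in sysfs, so no third-party tool is needed there. A minimal Python sketch (this assumes the `mem_info_vram_used`/`mem_info_vram_total` files of today's amdgpu driver, which did not exist for the Catalyst drivers of this thread's era):

```python
from pathlib import Path

def parse_vram(used_bytes: int, total_bytes: int) -> tuple[float, float, float]:
    """Convert raw byte counts into (used MiB, total MiB, percent used)."""
    used_mib = used_bytes / 2**20
    total_mib = total_bytes / 2**20
    pct = 100.0 * used_bytes / total_bytes if total_bytes else 0.0
    return used_mib, total_mib, pct

def read_vram(card: str = "card0") -> tuple[float, float, float]:
    """Read the amdgpu sysfs counters (Linux + amdgpu driver only)."""
    base = Path(f"/sys/class/drm/{card}/device")
    used = int((base / "mem_info_vram_used").read_text())
    total = int((base / "mem_info_vram_total").read_text())
    return parse_vram(used, total)

if __name__ == "__main__":
    used_mib, total_mib, pct = read_vram()
    print(f"VRAM: {used_mib:.0f} / {total_mib:.0f} MiB ({pct:.1f}%)")
```

On Windows, the GUI tools discussed above (Afterburner, GPU-Z, Everest) remain the practical option.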

Rezident said:
I know VRAM isn't the main concern, and I'm not worried about how the 6XXX cards will handle games (the GPU should be fine), but it would seem silly to have all that GPU horsepower and run out of VRAM.

When the reviews hit, you will see how it performs. AMD and Nvidia know the common target resolutions their cards aim to accommodate (high end is 1920+, midrange is 1680, etc.), test extensively prior to release, and quantify their results; if VRAM were a large problem, you'd know about it by now. Currently the only games that have been seen to be VRAM-limited are shader-intensive games at 2560x1600 with large amounts of AA (Crysis at 4xAA, probably Metro 2033 with AA), and at those settings the framerates are so poor (sub-20fps) that the game is unplayable anyway. Who cares if you are VRAM-limited when you are 'limited' by everything else before that?

Rezident said:
If they were selling 2GB cards over a year ago, there must be some reason; maybe there's a market for cards with 1.5GB of VRAM?

Because it's marketable. I'm pretty sure there were 2GB FX5200s back in the day: because it has a large framebuffer, it must be good, right? Unfortunately the average Joe does not know that the GPU in question was a low-end GPU limited by shader pipelines, bus width, core and memory frequency... the list goes on. To the average buyer, if two similarly priced cards are on the table and one has 1GB and the other 2GB, the uninformed would say the 2GB card must be better because it's twice as much!

Unfortunately this is not the case. In years gone by, vendors using this marketing tactic have actually used slower high-density chips that offer a larger framebuffer (2GB), but if it's not saturated (say a game uses 600MB), then the faster, higher-frequency 1GB card will actually be the faster card in general, yet they'll charge a premium for the 2GB card.
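The tradeoff above can be put in rough numbers: peak memory bandwidth is the effective (data-rate) memory clock times the bus width, so the faster 1GB card wins whenever the game fits in its framebuffer. A quick sketch with hypothetical clocks on the same 256-bit bus (the figures are illustrative, not any specific card):

```python
def bandwidth_gbs(effective_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: data-rate clock (MHz) x bus width (bytes) / 1000."""
    return effective_clock_mhz * (bus_width_bits / 8) / 1000

# Hypothetical: 1GB card with fast GDDR5 vs 2GB card with slower high-density chips.
fast_1gb = bandwidth_gbs(3900, 256)  # 124.8 GB/s
slow_2gb = bandwidth_gbs(3200, 256)  # 102.4 GB/s

# If a game only touches ~600MB of VRAM, both framebuffers hold it,
# and the extra ~22% bandwidth of the 1GB card is what you actually feel.
print(fast_1gb, slow_2gb)
```

The extra gigabyte only pays off once the working set actually exceeds 1GB; until then you are paying the premium for idle chips.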

Also, some games expose readouts like this through console commands while in game. See TweakGuides for more info in their game-specific guides.
 
Last edited:

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
Lavalys Everest can show you all kinds of utilization numbers for your GPU, including VRAM.

With optimizations turned off, transparency SSAA, vsync, and triple buffering, 1GB is pretty skimpy these days. Sure, if you don't care about image quality, then 1GB will get you by just fine.

But buying an HD6000 with all that horsepower, and then giving it barely enough VRAM to make use of that horsepower... seems like quite the contradiction to me.