Poll: What's the RAMDAC speed on your ATI Card?

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Actually, they're probably 400MHz. Since the RAMDACs are embedded in the core, it would be difficult for them to differ from ATI's 400MHz spec. Windows has misreported them off and on for some time now.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
RAMDAC speed is not the same as core (GPU/VPU) clock speed. It's the rate at which the RAM (memory) DAC (digital-to-analog converter) can convert the digital image stored in your card's front buffer into the analog signal sent to your analog monitor. FYI, LCDs that use DVI bypass the RAMDAC entirely, hence their cleaner signal and sub-pixel precision--no data loss or corruption because there's no extra translation.

You can calculate the maximum resolution your RAMDAC allows like this: horizontal res * vertical res * refresh rate (Hz), divided by 1,000,000 to convert Hz to MHz, gives the RAMDAC speed required in MHz. So 1024*768 at 100Hz = 78.6MHz RAMDAC required. Obviously you'll only start hitting the limits of current RAMDACs with very high-end monitors that can hit very high resolutions and refresh rates.
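
Here's that formula as a quick Python sketch (my own illustration; it ignores blanking intervals, so treat the result as a lower bound):

def ramdac_mhz_required(h_res, v_res, refresh_hz):
    # Pixel clock = pixels per frame * frames per second, converted to MHz.
    # Blanking intervals are ignored, so this understates the real requirement.
    return h_res * v_res * refresh_hz / 1_000_000

print(ramdac_mhz_required(1024, 768, 100))  # -> 78.6432 (MHz)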

I'm guessing this poll should be renamed, "What's your ATi card's core clock speed?" A poll on RAMDAC speed is rather pointless, as most cards offer 400MHz RAMDACs.
 

acemcmac

Lifer
Mar 31, 2003
13,712
1
0
Which one? My 9800 has dual RAMDACs and my 9100 has one too... will be swapping to a 9000 so I can have 4 tho... :beer:
 

PliotronX

Diamond Member
Oct 17, 1999
8,883
107
106
400MHz, but I overclocked it to eleventy jiggahertz. Colors have never been crisper.

;)
 

MDE

Lifer
Jul 17, 2003
13,199
1
81
Where's the "My GeForce will kick the snot out of your piddly Radeon" option?
 

MDE

Lifer
Jul 17, 2003
13,199
1
81
Originally posted by: PorBleemo
Originally posted by: MonkeyDriveExpress
Where's the "My GeForce will kick the snot out of your piddly Radeon" option?
Your wish is my command...
This is the only time you'll ever see me "w00t!"
 

MDE

Lifer
Jul 17, 2003
13,199
1
81
Originally posted by: PorBleemo
Originally posted by: MonkeyDriveExpress
Originally posted by: PorBleemo
Originally posted by: MonkeyDriveExpress
Where's the "My GeForce will kick the snot out of your piddly Radeon" option?
Your wish is my command...
This is the only time you'll ever see me "w00t!"
"w00t!" for Radeon 2D quality! :D
If it matters, my Mobility Radeon 9600 has a 350MHz DAC.
 

sandorski

No Lifer
Oct 10, 1999
70,633
6,196
126
Mine has a 400MHz RAMDAC, but one program (I don't recall which) claimed a 500MHz RAMDAC.
 

clicknext

Banned
Mar 27, 2002
3,884
0
0
400. What difference does it make anyway? I've never known what the RAMDAC does exactly. I bought a Voodoo3 back a long time ago because it had a RAMDAC speed that sounded high. =/
 

yhelothar

Lifer
Dec 11, 2002
18,409
39
91
Dual 400MHz RAMDACs on my GeForce FX 5900,
but what's the point, since 2048x1536x85Hz = ~267MHz RAMDAC?
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
Pointless, this. Why? Because the RAMDACs are an integral part of the Radeon chips themselves. Just go look at the chip specifications. End of story.

yhelothar, your math is flawed. You need to account for the fact that about 1/3 of the video signal is blanking periods. Do that, and your 2048x1536@85 is suddenly very close to a 400MHz pixel clock.
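
Here's a rough Python sketch of that correction (the one-third blanking share is just the ballpark figure above; exact GTF/CVT modelines will differ somewhat):

def pixel_clock_mhz(h_res, v_res, refresh_hz, blanking_share=1/3):
    # Visible pixels per second, scaled up because roughly a third of the
    # signal is horizontal/vertical blanking rather than visible pixels.
    visible = h_res * v_res * refresh_hz
    return visible / (1 - blanking_share) / 1_000_000

print(pixel_clock_mhz(2048, 1536, 85))  # -> ~401 MHz, right at a 400MHz RAMDAC's limit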
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
Besides, reason #2 why this is all moot:

RAMDACs are RAM based digital-to-analog converters, making a given stream of digital pixels into analog RGB signals for a CRT monitor.

They don't "run" at a certain frequency; they just have a maximum frequency up to which they produce quality analog output. This has absolutely nothing to do with performance - a higher frequency margin just lets you have a sharper picture at higher resolutions, provided the card's manufacturer didn't fvck up the analog circuitry that comes after the RAMDAC does its work - a complicated and (on cheaper cards) often neglected area of the fine art of making a proper VGA circuit board.
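
To put rough numbers on that margin, here's a hypothetical sketch (same one-third blanking approximation as above) of the highest refresh rate a 400MHz RAMDAC leaves room for at a few common resolutions:

def max_refresh_hz(h_res, v_res, ramdac_mhz=400, blanking_share=1/3):
    # Invert the pixel-clock estimate: how many full frames per second fit
    # into the RAMDAC's bandwidth once blanking overhead is accounted for.
    return ramdac_mhz * 1_000_000 * (1 - blanking_share) / (h_res * v_res)

for w, h in [(1280, 1024), (1600, 1200), (2048, 1536)]:
    print(w, h, round(max_refresh_hz(w, h)))  # -> ~203, ~139, ~85 Hz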