Graphics Card -> Screen... How?

Woodchuck2000

Golden Member
Jan 20, 2002
1,632
1
0
I've got my poor little brain in a twist trying to figure this one out.

My CRT is currently scanning at 85Hz. My question is, how does it know what to scan? Is it passed information a line at a time from my RAMDAC, or does it have some kind of buffer, so the same picture stays on screen until an update comes along?

In games as well, what happens if the Graphics card is producing 73 FPS? How does this synchronise with the 85Hz the monitor is trying to create? Also, what happens when the FPS exceeds the refresh rate?

Yes folks, I'm so confused I can't frame my question properly...
Any insight would be greatly appreciated.
 

Gosharkss

Senior member
Nov 10, 2000
956
0
0
Monitors draw images on the screen line by line, not frame by frame. A typical monitor running 1280 x 1024 at 85Hz draws a horizontal line every 11 microseconds, or 91146 times per second.
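
If you want to sanity-check those figures, a few lines of C do it, using only the numbers quoted above:

#include <stdio.h>

/* Sanity-check the figures above using only the numbers quoted in this post. */
int main(void) {
    const double refresh_hz = 85.0;     /* vertical refresh rate */
    const double line_rate  = 91146.0;  /* horizontal lines drawn per second */

    printf("time per line:   %.2f microseconds\n", 1e6 / line_rate);  /* ~10.97 */
    printf("lines per frame: %.1f\n", line_rate / refresh_hz);        /* ~1072.3 */
    /* ~1072 total lines vs. 1024 visible ones: the difference is the
       vertical blanking interval, while the beam flies back to the top. */
    return 0;
}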

Depending on the hardware and software used, it is possible to start drawing lines from frame number one, then update and start drawing lines from frame two, and so forth on a line-by-line basis. Although I doubt this process is linear, it seems realistic to me that more than 85 frames per second could be interpreted and displayed, and would give a better image on the screen.


Jim Witkowski
Chief Hardware Engineer
Cornerstone / Monitorsdirect.com

 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
Along with the red, green and blue pixel data, the CRT monitor receives horizontal and vertical synchronization pulses. The CRT must sync up to that, and run the beam across the screen accordingly, producing exactly those red, green and blue intensities that are on the cable at that very instant. Ain't no buffering going on.
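
To make those sync pulses concrete, here's a rough sketch of how such a mode breaks down. The porch and blanking widths below are assumed, VESA-style numbers for 1280x1024 at 85Hz, not taken from any particular card:

#include <stdio.h>

/* Assumed VESA-style timing for 1280x1024 @ 85 Hz. Each scan line is
   active video plus a blanking interval (front porch, hsync pulse, back
   porch); each frame is visible lines plus vertical blanking. */
int main(void) {
    const double pixel_clock = 157500000.0;    /* ~157.5 MHz dot clock   */
    const int h_active = 1280, h_blank = 448;  /* pixel clocks per line  */
    const int v_active = 1024, v_blank = 48;   /* lines per frame        */

    double h_freq = pixel_clock / (h_active + h_blank);  /* hsync pulses/sec */
    double v_freq = h_freq / (v_active + v_blank);       /* vsync pulses/sec */

    printf("hsync rate: %.0f per second\n", h_freq);  /* ~91146 */
    printf("vsync rate: %.2f per second\n", v_freq);  /* ~85 */
    return 0;
}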

regards, Peter
 

JustinLerner

Senior member
Mar 15, 2002
425
0
0
The FPS of rendered images doesn't directly correspond to the monitor's refresh rate in Hz, although many people understandably confuse the two.

FPS is a carryover from film projection (movies) and animation that stuck with video when it started, probably because of the need to convert film to video and vice versa, and some standardization. I believe standard NTSC video (TV/VCR) operates at approx. 30fps (29.97) interlaced (interlaced meaning two alternating adjacent fields per frame, each drawn at the refresh rate), while sound film operates at 24fps.

If you notice, it is true that the computer monitor draws the image on the screen line by line based upon the refresh rate (Hz), but video game makers refer to the fps of images. This perhaps has more to do with the video or animation concept (or both), as well as with how images are buffered, stored, retrieved and created on the card. Frames may refer to the way the buffered data is handled and sequenced by the video card to display the animated images, which are then physically drawn on the screen line by line. (Video monitors for computer use are generally non-interlaced, while TV is interlaced.)

Video cards buffer in order to store, compare, and arrange blocks of data that are to be displayed in sequence. This lets the card play back the data with less flicker, because it always has sufficient image data on hand to display. The more complex the image, the slower the FPS; the less complex, the faster the FPS.

So yeah, the video card in your computer controls the images sent to the monitor and essentially maintains the image, giving the appropriate image info at the right time to the monitor so that the scans and images are updated properly. The video card controls the images which the monitor actually draws (scans) line by line.

Also a couple of notes about DVD. With DVD it is possible to play progressive video (just like a non-interlaced computer monitor), which produces a better effect in conjunction with the DVD decompression and motion algorithms, because alternating fields are not compared in the algorithms, which would otherwise create extra artifacts from the inconsistencies between two different fields. So the motion looks better with the algorithms on progressive, and one field is displayed at the full resolution. DVD plays back at 30fps, but progressive scanning (same as non-interlaced) requires a special TV monitor, like a progressive HDTV. Typically, most DVD players play back interlaced for your regular TV, but some can also play back progressive scans for a progressive (and more expensive) TV.
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
To put that in short, the CRT monitor syncs up to the _output_ frame rate of the graphics card.
This in turn is not necessarily in sync with the rate at which the picture to be displayed is being changed
inside the computer (the _input_ frame rate, also referred to as FPS in benchmarking).

For best visual quality, the input frame rate should match the output frame rate ... that's what the
"VSync" feature often found in graphics card drivers does: Throttle the rate at which the scenario is
updated to be exactly in sync with the output frame rate. (That's also why we'll never see more than
30 or 25 FPS from a games console, these always run completely sync'ed to the TV refresh rate of 60 Hz
interlaced, 50 Hz in Europe.)
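
A minimal sketch of what that throttling amounts to (the absolute-time sleep stands in for the real wait on the vertical-blank interrupt, and render_scene is hypothetical):

#include <stdio.h>
#include <time.h>

/* Toy model of VSync throttling: cap scenario updates at the output
   frame rate. A real driver blocks on the vertical-blank interrupt;
   here a POSIX absolute-time sleep stands in for that wait. */
int main(void) {
    const double refresh_hz = 85.0;
    const long frame_ns = (long)(1e9 / refresh_hz);
    struct timespec next;
    clock_gettime(CLOCK_MONOTONIC, &next);

    for (int frame = 0; frame < 10; frame++) {
        /* render_scene(frame);  -- hypothetical: update and draw the scenario */

        /* wait until the next "vertical blank" before presenting */
        next.tv_nsec += frame_ns;
        if (next.tv_nsec >= 1000000000L) {
            next.tv_nsec -= 1000000000L;
            next.tv_sec  += 1;
        }
        clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
        printf("frame %d presented\n", frame);
    }
    return 0;
}
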
regards, Peter
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
Strictly speaking, only one pixel is drawn at a time.

On computer screens, the upper-left-most pixel is the first pixel drawn, then the pixel to its right, then the next pixel to the right, and so on until the end of the row. Then there's the horizontal retrace delay as the beam is repositioned at the left.
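
In loop form, the scan order is just this (a toy illustration, with tiny dimensions so the output stays readable):

#include <stdio.h>

/* The beam visits pixels in this order: left to right within a line,
   lines top to bottom. */
int main(void) {
    const int width = 8, height = 4;
    for (int y = 0; y < height; y++) {       /* one scan line at a time  */
        for (int x = 0; x < width; x++)      /* one pixel at a time      */
            printf("(%d,%d) ", x, y);        /* drive this pixel's R,G,B */
        printf(" <- horizontal retrace\n");  /* beam snaps back to left  */
    }
    /* vertical retrace: beam returns to the top-left for the next frame */
    return 0;
}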

And remember that all this is analog, not digital.
 

FishTankX

Platinum Member
Oct 6, 2001
2,738
0
0
The RAMDAC just outputs a signal: R, G, B, H, and V. Red, Green, Blue, Horizontal Sync and Vertical Sync. There's never a hiccup in the drawing process because your graphics card, I assume (unless it's one of those ancient half-meg cards), is double buffering. That means it draws a frame, puts it in buffer 1, draws another frame, puts it in buffer 2, and sends buffer 1 to the RAMDAC, repeating this over and over again. With triple buffering, you just add another buffer to the process, so that you have 2 completed frames done while the third one is drawing.

What happens when FPS exceeds refresh rate? Well, the buffer is updated before the monitor finishes scanning out the previous frame, causing the "tearing" you get with VSync off. With VSync on, the card feeds buffered frames to your monitor at your refresh rate (85Hz, in this situation), so if your graphics card were drawing 170FPS, half of the time it would be twiddling its thumbs: with triple buffering it's already got 3 frames done, but the current frame isn't finished being output to the screen, so it's got to wait for that to finish before it has an empty buffer to work on. If it's only drawing 45FPS, then your graphics card will keep repeating the last fully drawn frame until another fully drawn one is available.
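
Here's roughly what that double-buffered flipping looks like in code (a toy sketch; render_frame, wait_for_vblank and the buffers are stand-ins, not a real driver API):

#include <stdio.h>

#define NPIXELS 16                /* toy framebuffer size */

static int buffers[2][NPIXELS];   /* two framebuffers in video memory */

/* Pretend to draw frame number 'frame' into a buffer. */
static void render_frame(int *buf, int frame) {
    for (int i = 0; i < NPIXELS; i++)
        buf[i] = frame;
}

int main(void) {
    int front = 0;   /* buffer being scanned out to the RAMDAC */
    int back  = 1;   /* buffer the card is drawing into        */

    for (int frame = 0; frame < 4; frame++) {
        render_frame(buffers[back], frame);  /* draw off-screen */
        /* wait_for_vblank();  -- with VSync on, the flip waits here */
        int tmp = front; front = back; back = tmp;  /* flip buffers */
        printf("scanning out buffer %d (frame %d)\n",
               front, buffers[front][0]);
    }
    return 0;
}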
 

JustinLerner

Senior member
Mar 15, 2002
425
0
0


<< . . . For best visual quality, the input frame rate should match the output frame rate ... that's what the
"VSync" feature often found in graphics card drivers does: Throttle the rate at which the scenario is
updated to be exactly in sync with the output frame rate. (That's also why we'll never see more than
30 or 25 FPS from a games console, these always run completely sync'ed to the TV refresh rate of 60 Hz
interlaced, 50 Hz in Europe.)
regards, Peter
>>


However, when I set my computer monitor (15" SVGA) to run at a 60Hz refresh rate, my video card can display full-page images (640x480) at 180fps w/ 32bpp. Increasing the resolution to 800x600 pixels w/ 32bpp causes images to display at 170fps.

The higher the frame rate, the better the image. The higher the refresh rate (related to the scan rate), the better the image. [When I enabled VSync (called "wait for Vblank" on my video card), I didn't notice any appreciable frame rate difference (+/- 2 fps). Oh well, it doesn't work as well as it should, but then maybe it only works in OpenGL.]

Here's part of a technical book on video:
http://www.video-demystified.com/
Video Demystified: A Handbook for the Digital Engineer, by Keith Jack, 3rd ed., ISBN 1-878707-56-6
http://www.llh-publishing.com/

Excerpt:
Video Timing
Although it looks like video is continuous motion, it is actually a series of still images, changing fast enough that it looks like continuous motion, as shown in Figure 2.1. This typically occurs 50 or 60 times per second for consumer video, and 70 to 90 times per second for computers. Therefore, timing information, called vertical sync, is needed to indicate when a new image is starting. Each still image is also composed of scan lines, lines of data that occur sequentially one after another down the display, as shown in Figure 2.1. Thus, timing information, called horizontal sync, is needed to indicate when a new scan line is starting.
The vertical and horizontal sync information is usually transferred in one of three ways:
1. Separate horizontal and vertical sync signals
2. Separate composite sync signal
3. Composite sync signal embedded within the video signal.
. . .
 

CTho9305

Elite Member
Jul 26, 2000
9,214
1
81


<< The higher the frame rate, the better the image. The higher the refresh rate (related to the the scan rate) the better the image. >>



I'm not entirely sure what you're saying, but if you mean what I think you mean, I'd have to disagree. Higher framerates translate into smoother apparent motion, up to a certain limit. Usually 60-70fps will appear perfectly smooth to the human eye... some AnandTech'ers claim to see differences up to 100fps.

The refresh rate determines how much flicker is visible. Some people can't see flicker at a 60Hz refresh, but I need ~80Hz before the flickering goes away.

One exception to the 60-70fps rule is fast spins in games... sometimes you can see them chop. However, I think that if you actually displayed a framerate such that every pixel change created a new frame, your eye would just perceive streaked motion, such as when something moves very fast at close range. You don't really gain any information.
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
Sorry to tell you folks that on a CRT monitor running at an X Hz (e.g. 85Hz) refresh rate, all you get from an FPS rate above X is tearing artefacts; CTho has it right.

The technical reason behind it is that the monitor refreshes the display exactly X times per second, so generating more than X images per second inherently doesn't make sense. The electron beam passes each pixel spot X times per second; if the pixel changes more than once in that interval, no one will notice.

So with a graphics card that has enough crunch to exceed useful CRT frame rates (75 to 85), enable VSync to cap the framerate at your CRT's refresh rate and in turn get rid of tearing altogether.
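
To picture why, here's a toy model of what the scanout sees when frames arrive at twice the refresh rate (tiny numbers for readability; a real screen has on the order of a thousand lines):

#include <stdio.h>

/* Toy tearing model: the game swaps buffers in mid-scan when its frame
   rate is double the refresh rate. The line where the frame number
   jumps is where the visible tear sits. */
int main(void) {
    const int lines = 8;   /* stand-in for the real scan line count */
    int frame = 0;

    for (int refresh = 0; refresh < 2; refresh++) {
        for (int line = 0; line < lines; line++) {
            if (line == lines / 2)
                frame++;   /* buffer swapped halfway down the screen */
            printf("refresh %d, line %d: pixels from frame %d\n",
                   refresh, line, frame);
        }
        frame++;           /* another swap during vertical blanking */
    }
    return 0;
}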

regards, Peter

 

CTho9305

Elite Member
Jul 26, 2000
9,214
1
81
To add one more thing:
For a perfect setup, I would need 80fps even though I cannot see above ~60, just because I need that high a refresh rate for a flicker-free picture.
Of course, with VSync off, 80Hz/60fps works fine: since tearing occurs at a fast enough framerate, the tear isn't there long enough to notice.