
video mode affects CRT power consumption

glugglug

Diamond Member
With the CPU idle, my entire system, including speakers, monitor, router, and cable modem, is currently drawing 251W.

This varies depending on the image on the screen in addition to the resolution and even the refresh rate.

It seems to depend on 2 things:
a) average pixel brightness, and
b) horizontal sync frequency.

At 640x480@60Hz (hsync=30kHz), with a black background looking at all the icons on my desktop, the system uses only 218W.

At 2048x1536@70Hz (hsync=111kHz), with a white background and all windows minimized, this increases to 277W (251W with a black background).
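For reference, the hsync figures in parentheses work out to roughly the visible line count times the refresh rate, plus some vertical blanking overhead. A quick sanity check (the ~5% blanking fraction here is an assumption; real timings vary by mode):

```python
# Rough hsync estimate: visible lines x refresh rate, inflated by an
# assumed ~5% vertical blanking overhead (actual timings vary by mode).
def hsync_khz(visible_lines, refresh_hz, blanking=0.05):
    return visible_lines * (1 + blanking) * refresh_hz / 1000.0

for lines, hz in [(480, 60), (1536, 70), (480, 200)]:
    print(f"{lines} lines @ {hz} Hz -> ~{hsync_khz(lines, hz):.0f} kHz")
# ~30, ~113, and ~101 kHz -- in the same ballpark as the 30/111/106 kHz
# my monitor reports for these modes.
```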

The 26W difference based on the currently displayed image (black vs. white pixels) doesn't surprise me much; I mean, if the image is brighter, that light energy has to come from somewhere.

But the 33W difference from changing the resolution & refresh is a surprise, and yes, I tried different refresh rates and the energy consumption actually varies with refresh rate at any given resolution. 640x480@200Hz (106kHz) uses 248W with a black background, 30W more than the same image at 60Hz. This extra energy is obviously not going into light output at the screen, because the image isn't any brighter at a higher res/refresh - so in which parts of the monitor is all this extra heat generated, and how is the heat dissipation handled within the CRT, which has no fans? (I am guessing this extra heat is coming out of an area no larger than your CPU?....)
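Plotting my three black-background readings against hsync, the draw looks close to linear (a quick least-squares fit, assuming the relationship really is linear):

```python
# Fit whole-system power draw (black background) against hsync frequency.
hsync_khz = [30, 106, 111]   # 640x480@60, 640x480@200, 2048x1536@70
watts     = [218, 248, 251]  # measured at the wall

n = len(hsync_khz)
mx = sum(hsync_khz) / n
my = sum(watts) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(hsync_khz, watts))
         / sum((x - mx) ** 2 for x in hsync_khz))
print(f"~{slope:.2f} W/kHz over a ~{my - slope * mx:.0f} W baseline")
# -> roughly 0.4 W per kHz of hsync on top of a ~206 W baseline
```

So whatever is burning the extra power scales almost directly with how many lines per second the monitor draws.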
 
Originally posted by: glugglug
But the 33W difference from changing the resolution & refresh is a surprise, and yes, I tried different refresh rates and the energy consumption actually varies with refresh rate at any given resolution. 640x480@200Hz (106kHz) uses 248W with a black background, 30W more than the same image at 60Hz. This extra energy is obviously not going into light output at the screen, because the image isn't any brighter at a higher res/refresh - so in which parts of the monitor is all this extra heat generated, and how is the heat dissipation handled within the CRT, which has no fans? (I am guessing this extra heat is coming out of an area no larger than your CPU?....)

If the electron beam were a physical object with mass and inertia, I'd say that sweeping it faster is more work per unit time and thus more power (think about holding a broomstick and rapidly swinging it back and forth). It's not, and I don't know the energy requirements for sweeping an electron beam back and forth, but I'd suspect a similar effect.

If you look inside a monitor, you'll find a lot of big heatsinks. The hot components in monitors are also designed to operate at higher temperatures than CPUs. If you hold your hand over a monitor, you can sometimes feel the convection, which gives you a decent amount of air movement.
 
As CTho said, having to refresh the screen more rapidly makes all of the components in the monitor work more frequently. This is not just limited to the electron gun, but includes all of the components that process the signal before it is sent to be painted.

I wonder how much this test would vary from CRT to CRT. It would probably give you a good idea of which CRTs use better methods or higher-quality components to process signals. Additionally, I wonder how this would apply to other monitors (LCD, OLED, etc.).
 
Originally posted by: Captain_Howdy
As CTho said, having to refresh the screen more rapidly makes all of the components in the monitor work more frequently. This is not just limited to the electron gun, but includes all of the components that process the signal before it is sent to be painted.

Yes, I'd agree with that. It's likely that there would be a noticeable increase in the losses occurring in the horizontal deflection amplifiers.
 
Absolutely. The horizontal deflection coils can represent a significant part of a CRT's power consumption. Essentially, every time the beam is swept across the screen, the electromagnets have to be charged with electrical energy. To bring the beam back to start the next line, the electromagnets are discharged (most of the energy ends up as heat in the electronics, but some is captured and used to generate the tube anode voltage). The more frequently you charge the electromagnets up, the more power you need, and the hotter the electronics get.
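To put rough numbers on that: the energy parked in the yoke each sweep is E = 1/2 * L * I^2, and the loss scales with how many sweeps per second you do. The inductance, peak current, and loss fraction below are illustrative guesses, not measurements of any real monitor:

```python
# Back-of-envelope deflection loss. Yoke inductance, peak current, and the
# fraction of stored energy lost per line (rather than recycled by the
# flyback circuit) are all assumed values for illustration only.
L_yoke = 200e-6      # H, assumed horizontal yoke inductance
I_peak = 4.0         # A, assumed peak deflection current
loss_fraction = 0.2  # assumed share of stored energy dissipated per sweep

energy_per_line = 0.5 * L_yoke * I_peak ** 2  # joules stored per sweep

for hsync_hz in (30e3, 111e3):
    watts = energy_per_line * loss_fraction * hsync_hz
    print(f"{hsync_hz/1e3:.0f} kHz -> ~{watts:.0f} W lost in deflection")
# ~10 W at 30 kHz vs ~36 W at 111 kHz: the same order of magnitude as
# the ~30 W swing measured earlier in the thread.
```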

Very old CRT monitors didn't have any protection circuitry and would try to lock on to any resolution that was sent to them. It was possible to make such a monitor catch fire by setting the refresh rate too high. I believe there was even a virus which could do this.
 
Originally posted by: Mark R
Absolutely. The horizontal deflection coils can represent a significant part of a CRT's power consumption. Essentially, every time the beam is swept across the screen, the electromagnets have to be charged with electrical energy. To bring the beam back to start the next line, the electromagnets are discharged (most of the energy ends up as heat in the electronics, but some is captured and used to generate the tube anode voltage). The more frequently you charge the electromagnets up, the more power you need, and the hotter the electronics get.

Very old CRT monitors didn't have any protection circuitry and would try to lock on to any resolution that was sent to them. It was possible to make such a monitor catch fire by setting the refresh rate too high. I believe there was even a virus which could do this.

Wow, just, wow :Q
 