With the CPU idle, my entire system, including speakers, monitor, router, and cable modem, is currently drawing 251W.
This varies with the image on the screen, and also with the resolution and even the refresh rate.
It seems to depend on 2 things:
a) average pixel brightness, and
b) horizontal sync frequency.
At 640x480@60Hz (hsync=30kHz), with a black background but all my desktop icons showing, the system uses only 218W.
At 2048x1536@70Hz (hsync=111kHz), with a white background and all windows minimized, this increases to 277W (251W with a black background).
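(For anyone checking my hsync figures: hsync is just the refresh rate times the total scanline count per frame, i.e. visible lines plus vertical blanking. Quick Python sketch below; the 5% blanking overhead is a rough assumption, the exact figure depends on the modeline.)

# hsync = refresh rate x total scanlines (visible + vertical blanking).
# The 5% blanking overhead is an assumption; real modelines vary.
def hsync_khz(visible_lines, refresh_hz, blanking=1.05):
    return visible_lines * blanking * refresh_hz / 1000.0

for lines, hz in [(480, 60), (1536, 70), (480, 200)]:
    print(f"{lines} visible lines @ {hz}Hz -> ~{hsync_khz(lines, hz):.0f} kHz")
# -> ~30, ~113, ~101 kHz; close to the 30/111/106 kHz my monitor reports.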
The 26W swing based on the currently displayed image (black vs. white pixels) doesn't surprise me much; if the image is brighter, that light energy has to come from somewhere.
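As a rough sanity check (the figures below are typical ballpark numbers for a large CRT, not measurements from mine): beam power is anode voltage times beam current, and something like 25kV at up to 1mA on a full-white screen lands right around the difference I'm seeing.

# Electron-beam power = anode voltage x beam current.
# Both figures are ballpark assumptions for a large CRT, not measured values.
anode_kv = 25.0  # typical anode voltage, in kV
beam_ma = 1.0    # rough full-white beam current, in mA
print(f"beam power ~ {anode_kv * beam_ma:.0f} W")  # kV * mA = W; ~25W, close to the 26W swing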
But the 33W difference from changing the resolution and refresh rate is a surprise. And yes, I tried different refresh rates, and the power draw really does vary with refresh rate at any given resolution: 640x480@200Hz (hsync=106kHz) uses 248W with a black background, 30W more than the same image at 60Hz.

This extra energy obviously isn't going into light output at the screen, because the image isn't any brighter at a higher resolution/refresh rate. So in which parts is all this extra heat being generated, and how is that heat dissipated inside the CRT, which has no fans? (I'm guessing this extra heat is coming out of an area no larger than your CPU?)
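One more bit of evidence that it's the hsync frequency specifically, not the refresh rate as such: the three black-background readings fall almost exactly on a straight line against hsync. A quick check using only the numbers above (the linearity is my inference from three data points, nothing more):

# Black-background readings vs. horizontal sync frequency, from above.
base_khz, base_w = 30, 218            # 640x480@60Hz, black background
for khz, watts in [(106, 248), (111, 251)]:
    slope = (watts - base_w) / (khz - base_khz)
    print(f"{khz} kHz: {slope:.2f} W/kHz above the 30kHz baseline")
# Both come out near 0.4 W/kHz, so the extra draw scales with hsync.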