
Those using PowerStrip with their monitors

MiExStacY

Senior member
Okay, so today I picked up a Dell Trinitron 19" monitor from egay, and I was wondering: my monitor only supports up to 1600x1200@75Hz, but for games I run 1153x864@100Hz through PowerStrip, even though the monitor's default maximum for 1153x864 is 85Hz. I set it to 100Hz because it looks really nice that way. Would running 100Hz instead of the manufacturer-recommended 85Hz harm my monitor in the long run?
 
Hmm... interesting question. I don't pretend to know the technical details here, perhaps someone could elaborate for me?

My impression was that a monitor has an upper limit on its clock (in terms of pixels per second) that you don't want to exceed. You can estimate the pixel rate by multiplying the resolution by the refresh rate (this ignores blanking intervals, so the true pixel clock is somewhat higher, but it works for comparing modes). So if your monitor can do 1600x1200@75Hz, that means the screen has 1600x1200 = 1,920,000 pixels, refreshed 75 times per second, for a total of 1,920,000 x 75 = 144,000,000 pixels per second. By contrast, 1153x864x100 = 99,619,200 pixels per second, well under the pixel/sec rate at max resolution. This would seem to imply that 1153x864@100 is OK for the monitor. Now, presumably, the manufacturer has good reason to recommend 85Hz for that resolution. Is there another reason to run at a lower pixel/sec rate, or is the manufacturer just hoping people will settle for 85Hz, putting less stress on the hardware (?) and causing fewer returns?
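As a quick sanity check on the arithmetic above, here is a minimal Python sketch. The `pixel_rate` helper is just for illustration; it counts only visible pixels and ignores blanking intervals, so real pixel clocks run noticeably higher, but the comparison between the two modes still holds.

```python
def pixel_rate(width, height, refresh_hz):
    """Visible pixels drawn per second (excludes blanking overhead)."""
    return width * height * refresh_hz

# The monitor's rated maximum mode vs. the PowerStrip-forced game mode.
max_mode = pixel_rate(1600, 1200, 75)
game_mode = pixel_rate(1153, 864, 100)

print(max_mode)               # 144000000
print(game_mode)              # 99619200
print(game_mode < max_mode)   # True -- well under the rated throughput
```

So by raw pixel throughput the 100Hz mode looks safe, which is exactly why the manufacturer's 85Hz recommendation is puzzling; the per-mode limit is more likely tied to the monitor's maximum horizontal scan frequency than to total pixels per second.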
 