Hey y'all. I just finished building a new system. The mobo is a Gigabyte GA-G31M-ES2L, updated to its latest BIOS. My CPU is an Intel C2D E7400 Wolfdale that should run at 2.8 GHz (266x10.5). In the BIOS options, I can set the clock speeds, multipliers, etc. By default, the CPU seems underclocked to 2.0 GHz, running at 266x7.5. If I manually set the multiplier to 10.5 (the way it should be, right?), it goes back to 2.8 GHz. I've since reverted it, but I don't know why it was set this way in the first place.
The CPU speed in Windows seems to vary as well. As shown in CPU-Z, it idles around 1.6 GHz, which I believe is SpeedStep (the CPU throttling down when idle). At full load (when doing something intensive, e.g. playing video games), it jumps back up to 2.80 GHz. Before I made the switch, though, it only revved up to 2.0 GHz under load. That made me think it wasn't just Intel's throttling behavior, but that my CPU really was underclocked and not running as it's supposed to. During startup it also showed 2.0 GHz, which made me wonder: why would it run at a lower speed even before Windows started? I guess the multiplier was different in the first place.
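For what it's worth, all three speeds line up with simple FSB x multiplier arithmetic. The x6 idle multiplier is my assumption about the E7400's SpeedStep floor; the x7.5 and x10.5 values are what I saw in the BIOS:

```python
# Core clock (MHz) = FSB (MHz) x multiplier.
FSB = 266  # MHz (the E7400's 1066 MT/s bus divided by 4)

def core_clock(multiplier):
    """Effective core clock in MHz for a given multiplier."""
    return FSB * multiplier

print(core_clock(10.5))  # rated speed: 2793 MHz, i.e. ~2.8 GHz
print(core_clock(7.5))   # the BIOS default I saw: 1995 MHz, i.e. ~2.0 GHz
print(core_clock(6))     # assumed SpeedStep floor: 1596 MHz, i.e. ~1.6 GHz
```

So the 1.6 GHz idle reading is consistent with SpeedStep just dropping the multiplier, while the 2.0 GHz ceiling came from the x7.5 default.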
My question, then: why does the mobo underclock the CPU by default? Is it doing this on purpose to conserve energy, or is it just a bug? I find it strange that my mobo underclocks my CPU by DEFAULT. I'm also not sure whether there would be any consequences to running it at its "normal" supposed-to-be config (@ 2.80 GHz) rather than the default.