
Power efficiency of gaming PCs is bad

https://docs.google.com/viewer?a=v&...VlbmluZ3RoZWJlYXN0fGd4OjdhNTI3NzQ0MTJjYzY3MDE

See the paper. I don't necessarily agree with it, but it is fairly obvious that gaming PCs are a terrible waste of power if you use them for anything other than gaming. Most people would be better off with a cheap, power-efficient PC (even something like a Skylake-T chip, which is hardly slow, paired with an efficient motherboard and DC power supply) and saving the gaming PC for just games. Then again, if you only use your $1000+ gaming PC for games, isn't that basically a really expensive console?
 
I don't mind as long as PC/laptop power consumption scales with usage. If it's burning 300 watts at idle, yeah, that's bad. If it's hitting 1 kW for the few hours a day or week spent gaming, who cares, as long as it drops back down to under 100 watts while I'm typing this message.
 
Even 500 watts of idle power use doesn't offset the cost of buying another, more efficient PC just for low-load use, though odds are you already have one anyway, or at least a smartphone.
 
My i7-3770K overclocked to 4.5GHz with an R9 290 used about 480W while gaming; for a couple of hours a day, that's nothing.
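The arithmetic behind these posts is easy to sanity-check. A minimal sketch in Python: the 480W gaming draw, the 300W idle, and the sub-100W target are figures from the posts above, while the $0.12/kWh rate and the 8 hours/day of low-load use are illustrative assumptions.

```python
# Back-of-envelope energy cost: a short, heavy gaming session is cheap,
# while a high constant idle draw is what actually adds up over a year.
RATE_USD_PER_KWH = 0.12  # assumed electricity rate (illustrative)

def session_cost(watts, hours):
    """Cost in USD of running a load of `watts` for `hours`."""
    return watts / 1000 * hours * RATE_USD_PER_KWH

def annual_cost(watts, hours_per_day):
    """Annual cost in USD of drawing `watts` for `hours_per_day`, every day."""
    return session_cost(watts, hours_per_day) * 365

# Two hours of gaming at 480 W (the i7-3770K + R9 290 figure above):
print(f"Gaming session:      ${session_cost(480, 2):.3f}")        # ~$0.12

# Idling 8 h/day at 300 W vs. dropping to sub-100 W:
print(f"300 W idle, 8 h/day: ${annual_cost(300, 8):.2f}/yr")      # ~$105/yr
print(f"100 W idle, 8 h/day: ${annual_cost(100, 8):.2f}/yr")      # ~$35/yr
```

At these assumed rates, curing a high idle draw saves on the order of $70/yr, which takes years to pay for a second, more efficient machine, matching the point made above.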

But I am into more efficient systems as of late.
 
https://docs.google.com/viewer?a=v&...VlbmluZ3RoZWJlYXN0fGd4OjdhNTI3NzQ0MTJjYzY3MDE

See the paper. I don't necessarily agree with it, but it is fairly obvious that gaming PCs are a terrible waste of power if you use them for anything other than gaming. Most people would be better off with a cheap, power-efficient PC (even something like a Skylake-T chip, which is hardly slow, paired with an efficient motherboard and DC power supply) and saving the gaming PC for just games. Then again, if you only use your $1000+ gaming PC for games, isn't that basically a really expensive console?

The measured power consumption in the paper looks off. On pp. 13-14, the peak power consumption was 512W in gaming mode, and the graph on the following pages shows a little over 200W in web browsing/video streaming mode. Even for a 2011 platform, that seems way too high to be correct.

The system specs, from the paper's caption for Fig. 12:

Measured power and energy use for each mode of operation. The active gaming value is an average observed during the benchmark trials described below, with adjustments to reflect an 80% efficient PSU and 1.4 GPUs (average in use). Components: PSU (Seasonic G Series, 550W), CPU (Intel Core i7-4820K, quad core, 3.7GHz base), GPU (NVIDIA reference GeForce GTX 780, 900MHz boost), motherboard (ASUS P9X79-E WS), RAM (32GB (8×4GB) Kingston HyperX Beast, 1866MHz, 1.65V), display (Apple HD Cinema, 23 in.). Operating system: Windows 7 Professional 64-bit, with the "Power saver" energy management settings. Operating hours: active gaming (Open Gaming Alliance 2015), web browsing and video streaming (Short 2013), idle from Urban et al. (2014), and off/sleep is the residual divided equally. Assumes one display.
 
Well, according to my UPS, when just doing light computing, web browsing, etc., my system (in my sig) draws less than 130W, including my ROG Swift monitor but not my audio system. Seems pretty efficient. Not sure what it draws during gaming, but it would most likely be around 500W or more.

Edit: Just tested it quickly in Diablo 3; the game bounces between 330 and 450W, never over 500W. Admittedly D3 isn't the most graphically intense game, but my system still seems to be reasonably power efficient.
 
I think we all know that gaming PCs especially are Hummers in that regard.

I am surprised we don't have setups where the discrete GPU is turned off for anything other than gaming, given that the vast majority of systems now have IGPs.

My system idles (browsing, etc.) at around 40-50W, and I'd guess half of that is from the GPU alone.
 
Yeah. My G3258 @ 4.2GHz with a 7950 card idles in Win7 at around 78W according to my UPS, and that's including the 24" HDTV LCD monitor, which draws 25-30W by itself.

Full distributed-computing load (both cores + GPU) is around 280W.

Still not TOO bad. My Q9300 with an HD 4850 card was a bit higher, IIRC; load might have been similar, but idle was much higher with the Q9300.
 
I use my Xeon E3-1245 v3's integrated graphics because even for internet browsing, email, and other light tasks, a discrete graphics card adds an additional 100W or so.

Intel has been able to do amazing things with CPU power during idle and basic tasks, ramping up when needed. GPU companies need to get with it: idle/park the entire card for basic apps, and ramp it up when needed.
 
I saw this the other day, and yes, gaming PCs use power when run 24/7, but they are not even close to the largest power users in people's houses.

DVRs are still the kings of that; they use more power than a typical new fridge.
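Always-on devices add up precisely because they never drop to a low-power state. A quick sketch in Python; the 40W DVR draw is an illustrative assumption, not a measured figure from the thread:

```python
def annual_kwh_constant(watts):
    """kWh per year for a device drawing `watts` around the clock."""
    return watts / 1000 * 24 * 365

# A set-top DVR that never really sleeps, at an assumed constant 40 W:
dvr_kwh = annual_kwh_constant(40)
print(f"DVR: {dvr_kwh:.0f} kWh/yr")   # 350 kWh/yr
```

Under that assumption, the DVR lands in the same ballpark as the yearly energy rating of an efficient modern refrigerator, which is why a box that is "off" but always recording draws attention in these studies.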
 
At least it would stop AMD and NVIDIA wasting significant driver-development time on a feature that <1% of the market uses.
You can surely bet then that they would cap the power consumption of single cards, too. Any time the government gets involved, things go downhill.
 
I think we all know that gaming PCs especially are Hummers in that regard.

I am surprised we don't have setups where the discrete GPU is turned off for anything other than gaming, given that the vast majority of systems now have IGPs.

My system idles (browsing, etc.) at around 40-50W, and I'd guess half of that is from the GPU alone.


Yeah, I have never understood why the IGP can't be used when not gaming, something like how Optimus works for laptops. Or even a way to do it manually when you're not planning to game for a while.
 
I use my Xeon E3-1245 v3's integrated graphics because even for internet browsing, email, and other light tasks, a discrete graphics card adds an additional 100W or so.

Intel has been able to do amazing things with CPU power during idle and basic tasks, ramping up when needed. GPU companies need to get with it: idle/park the entire card for basic apps, and ramp it up when needed.

They already do. You must have been using a seriously outdated video card (8800 GT?), because even my friend's dated GTX 460 2GB card clocks WAY down (to around 50-100MHz core clocks) when not doing anything intensive.
 
They already do. You must have been using a seriously outdated video card (8800 GT?), because even my friend's dated GTX 460 2GB card clocks WAY down (to around 50-100MHz core clocks) when not doing anything intensive.

They still use quite a bit of power. And don't show me some long-idle mode with the screen turned off.
 
What they need to do is port NVIDIA Optimus or AMD Hybrid Graphics to the desktop. I've always thought it was absurd that the most power-inefficient thing you can do to a computer (besides heavy overclocking, where you add 0.1V for a mere 100 or 200MHz) is adding a video card that will always be annoying you with extra noise and power consumption. It would be interesting to get a working config where it can be turned on/off on demand, for people who don't mind some extra delay.
 

Look at figure 13. There's a graph comparing power usage between a GTX 780 (Kepler) and a GTX 970 (Maxwell).

Rather than regulations on SLI/CF, could we instead see a "cash for clunkers" program for older, less energy-efficient PC components like GPUs? A free gov't-sponsored GPU trade-in program, where everyone gets a free upgrade to 16nm Pascal?

I haven't read the whole paper yet, but do they analyze the historical energy-efficiency changes that have come about from node-shrinks?

I notice they mention that the PS3 drew, I think, 180W in game mode when it came out, but later iterations drew only 70W. They didn't explain further that the primary driver behind that efficiency gain was node shrinks.

At least in the USA, mandatory energy efficiency standards do not exist for any components found in gaming computers.

Maybe that's their angle? Instead of "mandatory energy-efficiency standards," I would prefer some sort of labeling standard, especially for video cards, stating their actual maximum as well as average power draw in watts. The free market will do the rest.

IMHO, the industry is already making leaps and bounds as far as energy efficiency is concerned. Those lucrative super-computer contracts sure are good motivators for perf/watt increases.
 
The system in my signature, with a mild overclock, draws around 45W (plus monitor and speakers) under normal desktop usage, and less than 200W even in worst-case scenarios. Granted, this isn't a high-end machine, but even moving up to a GTX 970 wouldn't change those numbers considerably; maybe 50W extra under load.

What they need to do is port NVIDIA Optimus or AMD Hybrid Graphics to the desktop. I've always thought it was absurd that the most power-inefficient thing you can do to a computer (besides heavy overclocking, where you add 0.1V for a mere 100 or 200MHz) is adding a video card that will always be annoying you with extra noise and power consumption. It would be interesting to get a working config where it can be turned on/off on demand, for people who don't mind some extra delay.

When not running a game, most modern video cards only add 10-15W to total system power consumption. Most modern components don't draw much power at idle; I imagine the big offenders at low load are desktop motherboards and power supplies. Moving from a full ATX board to ITX and from a 650W PSU to a 380W one cut my idle power consumption in half.
 