First off, the game I play, SC2, is extremely CPU bound. An increase of 100mhz yields roughly a 2-4fps gain, in a game where a stock i5-3570K hits a minimum of ~30fps (note: minimums are the most important thing in RTS, as they basically determine what FPS you get during big, micro-intensive battles).
On top of gaming, I stream H264, which is extremely CPU dependent. Even an overclocked i7-3770K at 4.5ghz on a standard system is only going to run around 40fps while streaming, and it's not even strong enough to stream 1080p@60fps.
Furthermore, I do a lot of computational work and HWbot submissions (no, I'm not some epeen bencher, I actually enjoy pushing a 24/7-stable setup as far as it can go), and I simply enjoy overclocking (imagine that!).
So I'm one of the rare people out there who can actually appreciate the power of more than an i5-2500K, although even I could do just fine with an i5-2500K, or even a Phenom X4 (and have). However, for me there is definitely a huge jump in performance going from a 4.8ghz to a 5ghz overclock, or from 1600mhz to 2400mhz RAM.
My point is that in a game that runs ~30fps on a standard overclocked i5, streaming just increases your CPU load tenfold, especially if you have less-than-optimal upload and have to drop to a slower preset, which raises CPU load further. In that situation, every extra 100mhz gains me a good 5+ fps or so, and it makes a huge difference (for reference, I can stream SC2 at 720p@60fps on the fast preset while holding 50+ fps in-game, which is very, very good).
Considering that even an i7-3770K@5ghz with 2400mhz CL8 RAM runs at 45-50fps, yes, every bit counts, because I'm still below the 60fps that's considered 'smooth' (god forbid I used a 120hz monitor). And gains of 2-5fps are actually quite large. So 100mhz by itself isn't big, but a 100mhz overclock plus high speed RAM makes a difference, and 200mhz with high speed RAM makes a very noticeable difference.
Furthermore, I really didn't spend much money to do all of this. I built my i7-3770K 5ghz@1.55v system for $500 total (Microcenter, watching deals). I used a $79 Z77X-UD5H from Microcenter, a $209 i7-3770K, and $50 PSC RAM, and got a system stronger than what the vast majority of people run, for about half of what most people spend. I also used an NH-D14 (yes, I know it's outdated and crappy, but you can commonly find them for only $40-50, as I did).
So with a delid (free), the NH-D14, and a high quality board, I was free to pump 1.55v into my chip; it stayed cool enough that I wasn't endangering it at all, and I got a great overclock.
My 230w+ figure comes from VTT, IMC, iGPU, and motherboard efficiency. Here's a picture of 5ghz@1.5v, where my chip core pulls about 167w (I know, it's a software reading, but the Kill A Watt said the same thing). You can see Power Input shows 185w pulled by the board. When I raised the voltage to 1.55v, this was more like 180w core / 200w input, and that isn't including the ~20w from VTT/IMC/iGPU or the ~30w for the RAM.
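Putting those figures together (these are my own rough estimates from above, so treat the exact numbers as ballpark rather than measurements), a quick sum shows where the 230w+ comes from:

```python
# Rough breakdown at 5ghz@1.55v, using the ballpark figures above
core_w = 180      # CPU core power (software reading, Kill A Watt roughly agreed)
uncore_w = 20     # ~20w for VTT / IMC / iGPU
ram_w = 30        # ~30w for the RAM

print(core_w + uncore_w + ram_w)   # 230 -> the "230w+" figure, before VRM losses

# The board's Power Input read ~200w for the ~180w core alone,
# i.e. roughly 20w lost in the VRM, which pushes the wall figure even higher.
print(200 - core_w)                # ~20w of VRM loss on the core rail
```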
You can also check Sin's Ivy Bridge overclocking guide; he has the following table:
There are a few other tables in his guide showing the relationship between overclock, voltage, temps, and power consumption, but 4.7ghz@1.3v is going to consume a good 130w+.
In all, my power consumption isn't going up 300%. Comparing to stock isn't really fair anyway; compare to a moderate overclock instead, and my power draw goes up by 50-80w. That sounds a lot less dramatic than a '300% increase'. It's definitely a sizeable bump for sure, but it's half a light bulb.
"(not to mention other components with increased draw, like mobo, chipset and RAM, and eventually also faster-spinning fans)"
Yes, that's exactly why buying a higher quality motherboard is important. That's my entire point. I'm not saying spend more money on a motherboard, I'm saying spend your money on the best board at that price point. A higher quality board, like a UD3H (or the UD5H I used, which is even better), is really not going to consume any more power on a high overclock than on a moderate one. My UD5H never went above 50C on the VRMs (I had both software and hardware thermal sensors on it).
However, like I said, my Z77A-G41 hit over 80C at just 4.5ghz@1.25v. And as we know from Sin's guide, the pics above, a Kill A Watt, and studies on the relationship between temperature and power draw, especially on MOSFETs (audiophiles have all this stuff documented, as they use the same MOSFETs we do), an extra 10C on a 150w component results in roughly a 20w+ increase in power consumption. That's quite a big deal, as it's certainly a couple bucks a month.
Now there is an increase in power consumption with higher voltage, sure, but a very small one. The difference is that with a high quality board, you're actually getting something for the increased power consumption (a higher overclock).
The chipset consumes basically zero power, not even 5-10w. Fans barely use any power either; you're talking ~2-3w for a 2200RPM 120mm fan. You literally save energy and money by running your fans at 100% and dropping temps, because the savings outweigh the 1-2w it costs to run a fan at 100% instead of 50%. Of course it depends on how much cooling that fan contributes: a 5th case fan might not save you money if it only drops your temps 1-2C, but the pull fan on a radiator dropping your temps by 5C definitely will.
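To put rough numbers on that (a sketch only: the 2w fan delta is from above, the 10w of component savings is a hypothetical figure for illustration, and it assumes 24/7 operation, which overstates real usage):

```python
RATE = 0.11                  # $/kWh, my electric rate
HOURS_PER_YEAR = 24 * 365

def annual_cost(watts):
    """Yearly cost in dollars of a constant extra load."""
    return watts / 1000 * HOURS_PER_YEAR * RATE

fan_extra_w = 2              # 120mm fan at 100% instead of 50% (figure from above)
component_saving_w = 10      # hypothetical: cooler VRM/CPU drawing ~10w less

print(round(annual_cost(fan_extra_w), 2))          # ~$1.93/yr extra for the fan
print(round(annual_cost(component_saving_w), 2))   # ~$9.64/yr saved from lower temps
print(round(annual_cost(component_saving_w - fan_extra_w), 2))  # ~$7.71/yr net savings
```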
And only stress testing comes anywhere close to fully heating up your system. Even with my 5ghz@1.55v overclock, sure, when stress testing for stability for 25+ hours I run 5 fans at full blast. But after that, I never run them at max again, as even in gaming I won't come within 20C of the max temps I had in Prime95 small FFT, so I'm running just a single fan most of the time (I even have them set to turn off at low load/idle).
"(not to mention other components with increased draw, like mobo, chipset and RAM, and eventually also faster-spinning fans). In that sense, buying cheaper boards, cheaper heatsinks, and non-K 4770s, and building 2 separate computers from them, would give you a 200% performance increase for much less acquisition cost; the power consumption of both of them together probably wouldn't even get over 200W, and the noise of their fans together would also be nowhere near the overclocked rig."
It all depends on what you can get a motherboard for, and it all depends on what your usage is, but I'm extremely value oriented and my setup reflects that.
So I spent $80 on my Z77X-UD5H. I could have instead spent $20 on an MSI Z77A-G41. For $60 more, I got a board capable of doing 5ghz instead of just 4.5ghz - 500mhz of extra overclock for only $60 is pretty good.
I would say that on AMD (ie Phenom X4, Bulldozer, etc), 100mhz = $10. On Intel, 100mhz = $20-30, so 500mhz of extra overclock = $100-150. I'd say that makes spending an extra $60 on a higher quality motherboard justified.
For those who don't live near a Microcenter, the choice is even easier. Spend $90 on a Pro3 or MSI Z77A-G41 on Newegg/etc, or spend $125 for a D3H/UD3H, and get that same 500mhz of extra overclock. At only a $35 price difference instead of $60, it's a no-brainer in value.
But that's assuming you got a good chip, so let's say the gain is only 300mhz. Again, 300mhz = $60-90, which still equals the $60 price difference and is more than the $35 one.
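Here's that value math spelled out; keep in mind the dollars-per-100mhz figures are just my rule of thumb from above, not market data:

```python
# My rule of thumb for what extra clockspeed is "worth" (not market data)
PER_100MHZ_INTEL = (20, 30)   # dollars per 100mhz, low/high estimate (AMD would be ~$10)

def oc_value(extra_mhz):
    """Low/high dollar value of extra overclock headroom on Intel."""
    lo, hi = PER_100MHZ_INTEL
    return (extra_mhz / 100 * lo, extra_mhz / 100 * hi)

print(oc_value(500))   # (100.0, 150.0) -> easily beats a $60 or $35 board premium
print(oc_value(300))   # (60.0, 90.0)   -> still matches $60 and beats $35
```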
A higher quality board also consumes significantly less power, as it runs significantly cooler. Let's calculate how much money you save using a higher quality board, using the following calculator and an electric rate of 11 cents/kWh (that's what my rate is):
http://www.handymath.com/cgi-bin/electric.cgi?submit=Entry
A 200w board running 24/7 is $192/yr. That's not really accurate at all, because even on a heavy gaming day you're at low load or idle 90% of the time (and you sleep 6+ hours too!); you only hit max load very rarely. But for the sake of brevity and argument, let's run with the 24/7 numbers, keeping in mind the real figures and differences are maybe a quarter of what follows.
Now a 257w board is $250/yr. Wait, where did I get that figure?
Let's say your chip needs 180w on a 5ghz@1.5v overclock, like mine did, as pictured. A 90% efficient VRM, like the Gigabyte UD3H's, will draw 200w from the wall. Meanwhile, a 70% efficient VRM like the Extreme4's (which is extremely generous; more than likely the board fries before it runs at 70%, at which point it runs at 0%) will need 257w to supply that same 180w to the chip.
Huge difference! So yea, you make that small price difference up, rather dramatically, with electricity costs. In reality, you run full load very rarely (just look at your averaged core load; it'll be more like 10-30% than 100%, even on a day of heavy gaming), so the cost difference is not ~$58/yr but more like $6-17 a year, but there's a difference nonetheless. 5 years, $10/yr, $50... I mean, which one saves you money?
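If you'd rather sanity-check those numbers yourself than trust the linked calculator, the math is simple. This sketch uses the same assumptions as above: 180w delivered to the chip, 90% vs 70% VRM efficiency, $0.11/kWh, then scaled to a more realistic 10-30% average load:

```python
RATE = 0.11            # $/kWh, my electric rate
HOURS = 24 * 365       # worst case: full load, 24/7

def wall_draw(cpu_watts, vrm_efficiency):
    """Watts pulled from the wall to deliver cpu_watts through the VRM."""
    return cpu_watts / vrm_efficiency

def annual_cost(watts, duty_cycle=1.0):
    """Yearly electricity cost in dollars, scaled by average load."""
    return watts / 1000 * HOURS * RATE * duty_cycle

cpu_w = 180
good_board = wall_draw(cpu_w, 0.90)    # ~200w  (UD3H/UD5H-class VRM)
cheap_board = wall_draw(cpu_w, 0.70)   # ~257w  (generous figure for an Extreme4)

print(round(annual_cost(good_board)))          # ~$193/yr at full load 24/7
print(round(annual_cost(cheap_board)))         # ~$248/yr
diff = annual_cost(cheap_board) - annual_cost(good_board)
print(round(diff))                             # ~$55/yr worst case
print(round(diff * 0.10), round(diff * 0.30))  # ~$6-17/yr at a realistic 10-30% load
```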
I'm all for buying whatever is the right value. Nowhere do I say you need to spend a lot of money on a motherboard; I say the exact opposite. All I'm saying here is: buy the better motherboard.
Not to mention, for the ~$30 price increase of the UD3H, you get way better aesthetics, better build quality, better RAM overclocking, etc etc...
Now, are you asking whether it's worth it to run 5ghz daily? It all depends on what kind of user you are. For me, it is definitely worth it, when even at 5ghz I can't hit 60fps, and I run a popular stream with lots of viewers in HD. I also paid only $50 for my PSC RAM (they still go for only $50), and only $56 for my 2600mhz CL9 Hynix CFR RAM; definitely worth it.
I paid $500 for my i7-3770K rig. If you can build anything stronger than it for the same amount of money, please, I'm all ears. Good luck building 2 x i5 rigs for under $500!
"If you're speaking of why to have 2 computers instead of 1 overclocked one, it is just as relevant as using a USB adapter on a board without enough USB ports. If you want a dedicated rig for distributed computing, OC is just a bad way to go."
I don't do distributed computing; I stream and play a heavily CPU-dependent game, in which case overclocking makes a huge difference and adds a ton of value. The extra light bulb's worth of electricity cost is definitely worth it to me, and I'm getting a very real, noticeable performance boost in return for it, whereas the Extreme4 and other low quality boards use more electricity for the same overclock, or similar electricity for less.