Rinaun
> Superior cooling increases efficiency. If you aren't yet familiar, review the concept of "voltage leakage". You cite this as an example of AMD's deficiency where I would instead cite it as an example of Intel's penny pinching. Another good example is the TIM on the 4770Ks.

Yes, I understand voltage leakage; that's why I suggested he buy a USED, older Intel-based chip. Intel has penny-pinched recently, but I'd still suggest their chips over AMD just throwing voltage at a part and hoping it sticks. Do you really think those chips have done 5-year runs at 200W? Add in overclocking and that sounds like a recipe for long-term failure. I'm also wondering whether you have experience with the newer AMD processors like I do. They really heat up a 10x10x9' room in 2 hours of gaming.
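For the curious, here's the back-of-envelope math on that. This is a minimal sketch assuming a perfectly sealed room with zero heat loss, and the 200W total system heat output is my illustrative assumption; real rooms shed heat through walls and ventilation constantly, so the actual rise is far smaller, but it shows why a couple hundred watts of sustained draw is very noticeable:

```python
# Worst-case warming of a sealed 10x10x9 ft room by an assumed ~200W
# system over 2 hours, ignoring all heat loss.

FT3_TO_M3 = 0.0283168                       # cubic feet -> cubic meters
room_volume_m3 = 10 * 10 * 9 * FT3_TO_M3    # ~25.5 m^3

AIR_DENSITY = 1.2                           # kg/m^3 at room temperature
AIR_CP = 1005.0                             # J/(kg*K), specific heat of air
air_mass_kg = room_volume_m3 * AIR_DENSITY  # ~30.6 kg of air

power_w = 200.0                             # assumed system heat output
energy_j = power_w * 2 * 3600               # 2 hours -> ~1.44 MJ into the room

delta_t = energy_j / (air_mass_kg * AIR_CP)
print(f"No-loss air temperature rise: {delta_t:.0f} K")  # ~47 K
```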
> No one building a gaming rig seriously considers thermal or electrical efficiency when designing these rigs.

Can you show me any statistics on this? I think you are confusing people who buy Titans (an incredibly small fraction of consumers) with people who buy $200 video cards, i.e., normal consumers. You're partly right that many people don't care about TDP; why would a 14-to-23-year-old living at their parents' house care about the electricity bill? But I live in CA and, like most here, I pay a premium for my electricity, so energy IS a concern for me. Just because everyone around you doesn't give a hoot about TDP doesn't mean that's the overall opinion of the market.
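To put a number on why CA rates make this matter, here's a quick sketch. The 60W efficiency gap, the 4 hours/day, and the $0.30/kWh rate are all illustrative assumptions I'm plugging in, not figures from this thread:

```python
# Annual cost of an extra 60W of draw at an assumed California
# residential rate. Adjust the assumptions to taste.

extra_watts = 60.0       # assumed efficiency gap between two builds
hours_per_day = 4.0      # assumed daily gaming time
rate_per_kwh = 0.30      # assumed CA rate, USD/kWh

kwh_per_year = extra_watts / 1000.0 * hours_per_day * 365.0
cost_per_year = kwh_per_year * rate_per_kwh
print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.2f}/year")
# ~88 kWh/year -> ~$26/year
```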
> What a big shot you are. Please teach me more about overclocking, OK? I've been watching everything I can on the utubez and I just can't get those 5 settings right.

So listing my relevant industry experience makes me a big shot? Heh, OK.
> This whole conversation degenerated from my point that no one building a gaming rig seriously considers thermal or electrical efficiency when designing these rigs. Nothing in your rant in this paragraph does anything to counter that point. It just makes you look like a kid.
>
> Dropping down to 2D mode, a 9800 GTX still consumes, what, 40W? That's 40W of totally wasted power, easily equivalent to the inefficiency of the FX-8320.
Yeah, see, you have no idea what you're talking about at all, and this is why this will be my last reply to you.
I gave the client a free card. My client DOES NOT run their PC 24/7 like I do, so there isn't any real reason for them to even consider the loss of "40 watts". Secondly, you might want to look up, you know, actual statistics, like I do.
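Here's the duty-cycle math in a minimal sketch; the $0.30/kWh rate and the 3 hours/day figure are assumptions for illustration, not numbers from this thread:

```python
# The same 40W of idle draw costs very different amounts depending
# on how long the machine is actually powered on.

RATE = 0.30  # assumed USD per kWh (illustrative CA-ish rate)

def yearly_cost(watts: float, hours_per_day: float) -> float:
    """Annual electricity cost of a constant draw at RATE."""
    return watts / 1000.0 * hours_per_day * 365.0 * RATE

print(f"24/7 machine:    ${yearly_cost(40, 24):.2f}/year")  # ~$105
print(f"3 h/day machine: ${yearly_cost(40, 3):.2f}/year")   # ~$13
```

As for those actual statistics: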
http://www.guru3d.com/articles_pages/geforce_9800_gtx_sli_review_(bfg),3.html
See that? Do you see the difference in idle power between one card and two? I'll type it out so we all understand you saw it 🙂
Configuration         Load system power (W)   Idle system power (W)
9800 GTX 512 MB       306                     165
9800 GTX 512 MB SLI   419                     168

(Those are total system power draws at the wall, not the cards' TDPs.)
And that's not even the GTX+, which is ACTUALLY more energy efficient. So no, your analogy is STILL completely wrong: for 3 watts at idle you get roughly 50% more performance for free. You can argue until you are blue in the face, but have fun, because I'm done 🙂.
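For the record, here's the arithmetic spelled out from the Guru3D numbers quoted above; the only inputs are the four figures from the review:

```python
# Deltas from the Guru3D system-power figures quoted above.
single_idle_w, single_load_w = 165, 306   # one 9800 GTX
sli_idle_w, sli_load_w = 168, 419         # two 9800 GTX in SLI

print(f"Idle penalty for the second card: {sli_idle_w - single_idle_w} W")  # 3 W
print(f"Load penalty for the second card: {sli_load_w - single_load_w} W")  # 113 W
# The 113W is only paid while gaming; sitting idle, the second card costs 3W.
```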