frozentundra123456 (Lifer, joined Aug 11, 2008)
I think the success or failure of Intel or AMD is only partially related to the quality and capabilities of their products. Largely, consumers buy computers and care more about the size of the monitor and the brand than anything else. The existence of Atom-based desktops should be enough to show that many consumers just don't care at all about what CPU they get. Intel does a much better job of selling its product to the OEMs, which results in lots of computer buyers ending up with Intel systems.
I assumed you were talking about AMD with your "real work" comment, because this has been gone over repeatedly already and you didn't address any of the points.
Think about total cost of ownership. Depending on your source of computers, you may save $100 or more buying AMD. You admit that an AMD computer would work; you just don't like the increased power usage.
If you rationally look at the big picture, you might find that the extra $8 per year you pay for your power-hungry AMD CPU still allows you to save money overall, because you paid $100 less upfront and you aren't going to use the computer for more than 10 years. This is the whole point that has been made throughout the thread. It's only in silly corner cases, like $0.40/kWh electricity, that Intel starts to look like the better value.
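To make that arithmetic concrete, here is a minimal break-even sketch in Python. The wattage delta, daily hours of use, and electricity price are illustrative assumptions chosen to land near the ~$8/year figure above, not measured numbers:

```python
# Rough break-even sketch for the upfront-savings vs. power-cost argument.
# All inputs below are illustrative assumptions, not measured figures.

def break_even_years(upfront_savings, extra_watts, hours_per_day, price_per_kwh):
    """Years until the extra electricity cost eats the upfront savings."""
    extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    extra_cost_per_year = extra_kwh_per_year * price_per_kwh
    return upfront_savings / extra_cost_per_year

# The post's numbers: $100 saved upfront, roughly $8/year extra power
# (about 60 W extra for 3 h/day at $0.12/kWh works out to ~$7.90/year).
print(break_even_years(100, 60, 3, 0.12))  # ~12.7 years to break even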
Again, I ask for an actual real-world example; I don't know why you refuse to answer me.
Does "work" means excel, word, web browsing, and powerpoint? If so, your power usage difference is going to be insignificant and/or possibly even favor AMD, as those leave the CPU largely idle. Does your work mean 24/7 video encoding? Then you have one of the corner cases where the Intel power usage makes a huge difference. I'd argue that such cases are rare though, and certainly not a standard consideration for the average CPU buyer.
The primary case is gaming, which in the overall scheme of things is a "corner case" as well I suppose, but not to users on these forums.
Disclaimer: my reasoning refers to gaming.
1. The 3570K is not $100 more expensive than the 8350; more like $20.
2. Except for a few "corner cases," as you call them, the 8350 performs worse in gaming than the 3570K.
3. The small initial cost advantage of the 8350 disappears over the life of the processor due to increased power cost (rough arithmetic below).
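With the gaming numbers from the list, the same break-even arithmetic flips against the 8350. Again, the load-power delta, daily hours, and electricity price are assumptions for illustration:

```python
# Same arithmetic as before, but with a ~$20 price gap and a larger
# load-power delta. Wattage, hours, and price are assumed, not measured.
extra_watts, hours_per_day, price_per_kwh = 100, 3, 0.12
extra_cost_per_year = extra_watts / 1000 * hours_per_day * 365 * price_per_kwh
print(20 / extra_cost_per_year)  # ~1.5 years to erase the $20 advantage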
Most people are not as hung up on power savings as AMD fans think they are. If the 8350 offered clearly superior gaming performance across a wide spectrum of games, I could accept the increased power consumption, as I expect most other users would as well. I have a quad-core CPU and a discrete card. That certainly uses more power than an i3 with no discrete card. I accept that for the vastly superior gaming performance.
Problem is, with the 8350, you get generally worse performance AND higher power consumption, and it is only slightly cheaper initially. If you overclock, the comparison becomes even worse.
It's like buying a car that is slower but also uses more gas.