a10-5800k+hd7770?

Discussion in 'CPUs and Overclocking' started by Reid Hershel, Dec 10, 2012.

  1. AtenRa

    AtenRa Lifer

    Joined:
    Feb 2, 2009
    Messages:
    12,129
    Likes Received:
    755
    @SlowSpyder

    Don't think so, the 22nm Core i3 will have lower power consumption even at 100% load. But, and that is a BIG but, the FX6300's power consumption in gaming will not be that much higher. Using the 970 chipset instead of the 990FX lowers power consumption by 10W to 15W at full load. At idle, an FX6300 at its default 3500MHz on a 970 chipset motherboard with an HD7770 + 128GB SSD was measured at ~50W.
     
  2. pauldun170

    pauldun170 Diamond Member

    Joined:
    Sep 26, 2011
    Messages:
    3,837
    Likes Received:
    137
    My i3-2100 and MSI HD7770 idle around 50W. This is with an SSD + 2 WD Black 1TB drives. I haven't measured load, but it certainly doesn't stress out my EA380 PSU. I use this box as an HTPC\gaming\general use box.

    An i3 is better for gaming on the stock cooler.
    You can overclock the A8-5600K (or comparable) to achieve parity or slightly better than parity with the i3, but then you really should think about aftermarket cooling, which pushes up the cost. You also need to consider the quality of the board you pair it with, to make sure it can handle the overclock.

    An i3 can be paired with the cheapest of cheapo boards and whatever vid card meets your needs, and still be a solid gaming box and an excellent general use PC.

    If you want to go the AMD route, that's fine, and it will get the job done. However, as those in the AMD camp have already noted, if you want to really compete you should overclock, and if you overclock you should set a little more money aside for a quality board and an aftermarket cooler.

    In summary... buy whatever is on sale and buy a nicer vid card if this will be a gaming box (screw CrossFire\SLI... just buy one good vid card for a budget box).
     
  3. frozentundra123456

    frozentundra123456 Diamond Member

    Joined:
    Aug 11, 2008
    Messages:
    9,408
    Likes Received:
    188
     
  4. AtenRa

    AtenRa Lifer

    Joined:
    Feb 2, 2009
    Messages:
    12,129
    Likes Received:
    755

    Really ???

    Post 20,
    http://forums.anandtech.com/showpost.php?p=34351894&postcount=20

    Two games from the AT review comparing the FX6300 and the Core i3 3220 show the FX6300 at default clocks (3500MHz) is faster.

    Post 24
    http://forums.anandtech.com/showpost.php?p=34352017&postcount=24

    I have provided the Hardwarecanucks review link; the FX6300 is on par with the Core i3 3220 in 1080p gaming. It wins some and loses others; at default clocks they are equals in gaming.

    Also, the uk.hardware.info review shows the FX6300 (default 3500MHz) on par with the Core i3 3220 in 1080p gaming.

    Post 39
    http://forums.anandtech.com/showpost.php?p=34356031&postcount=39

    Post 39 shows how fast FX6300 @ 4.2GHz is in MT applications.

    Again, the FX6300 at 4.2GHz is faster than the Core i3 in MT applications and in the majority of games. At the same price point there is no competition; the FX6300 is the clear winner.

    ps: for the 100th time, no need for better cooling for 4.2GHz OC.
     
  5. Mallibu

    Mallibu Senior member

    Joined:
    Jun 20, 2011
    Messages:
    243
    Likes Received:
    0
    You cherry-pick 3-4 graphs from each review to prove your point. Your first Skyrim graph has the i3 destroying the FX, and then you link another site where, in Skyrim again, the FX is 1 fps faster, essentially proving yourself wrong.
    The Hardwarecanucks review also consistently shows the i3 2100 faster than the 3220, and the i5 2400 faster than higher i5 CPUs, which points to errors and false results in their methodology.

    You also spam 7-Zip, x264 (2nd pass, no 1st :rolleyes:) and POV-Ray, repeating the same phrase all over again: "in MT apps".
    There aren't just "MT" and "ST" apps; it isn't only 0 and 1. 99% of real-world use consists of lightly to medium threaded apps (2-4 threads), where the FX is equal to or slower than the i3, and games, the vast majority of which are faster on the i3. (Don't bother linking the exceptions again.)

    Conclusion: yes, the FX is a nice competitor to the i3. You need an aftermarket cooler since the stock CPU cooler makes A LOT of noise. I would probably also take the FX 6300 over an i3 for the usage I do.
    However, you took your fanboyism too far by trying to say it's also better than an i5, which is not the case, since, minus some exceptions, the i5 destroys the FX in overall performance in apps and games, power consumption, and overclocking.

    Yes, the FX 6300 is sweet value for money, but your exaggerated attempts to market it aren't helping. I know you are selling AMD CPUs, but your hyperbole-filled advertising gets tiring after a while.
    A good product can stand on its own.
     
    #55 Mallibu, Dec 12, 2012
    Last edited: Dec 12, 2012
  6. pcsavvy

    pcsavvy Senior member

    Joined:
    Jan 27, 2006
    Messages:
    298
    Likes Received:
    0
  7. NTMBK

    NTMBK Diamond Member

    Joined:
    Nov 14, 2011
    Messages:
    7,217
    Likes Received:
    138
  8. SPBHM

    SPBHM Diamond Member

    Joined:
    Sep 12, 2012
    Messages:
    4,159
    Likes Received:
    46
    FX 6100 is $15 cheaper than the 4170
    http://www.newegg.com/Product/Produc...&Tpk=fx%206100

    Now, it's slower for gaming without an overclock, but you can probably overclock it quite easily to around 4GHz, and it will probably require the same level of motherboard/cooling to run stably (?) as the 4170, and I think it's a better CPU...
     
  9. NTMBK

    NTMBK Diamond Member

    Joined:
    Nov 14, 2011
    Messages:
    7,217
    Likes Received:
    138
    Meh, with the obvious improvements of Piledriver over Bulldozer I wouldn't recommend a 6100 over a 6300.
     
  10. inf64

    inf64 Platinum Member

    Joined:
    Mar 11, 2011
    Messages:
    2,583
    Likes Received:
    507
    The FX6300 is just a much better chip than any FX61xx/FX41xx. It clocks to the same level as the FX83xx and can do ~4.2GHz with minimal or no voltage increase (just a multiplier change). Couple the clocking headroom with higher IPC (games especially) and lower power draw (either stock or OCed), and you should never think of first-gen Bulldozer again :).
     
  11. sm625

    sm625 Diamond Member

    Joined:
    May 6, 2011
    Messages:
    7,849
    Likes Received:
    61
    If you game or encode for 4 hours a day, an FX6300 will easily consume an extra 400 watts per day. At 4.2 GHz it could be as much as 600 watts a day. Even at 400 extra watts a day, you're looking at an extra 5 cents a day or $60 over 3 years. So for anyone who games or encodes for more than a few hours a day, and intends to do so for 3 or more years, the FX6300 is directly competing with an i5-33xx or even a 3570k (it all depends on exactly how much power you're using). That's why so many enthusiasts are so hard on AMD right now. Once you factor in the amount of power that the average enthusiast is using, the comparisons become laughable.
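    (Back-of-the-envelope sketch of that arithmetic, assuming roughly 12.5 cents/kWh; the rate is my assumption, plug in your own:)

    Code:
    # Rough sketch of the arithmetic above: an extra ~400 Wh (0.4 kWh) per day,
    # at an assumed ~$0.125/kWh. The rate is an assumption; plug in your own.
    extra_kwh_per_day = 0.4
    price_per_kwh = 0.125
    per_day = extra_kwh_per_day * price_per_kwh   # ~ $0.05 per day
    over_3_years = per_day * 365 * 3              # ~ $55, in the ballpark of $60
    print(per_day, over_3_years)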
     
  12. NTMBK

    NTMBK Diamond Member

    Joined:
    Nov 14, 2011
    Messages:
    7,217
    Likes Received:
    138
    Given that you can't get the unit for energy right, I'm not inclined to trust your pulled-out-of-your-arse estimates for energy costs.

    Clue: watts are a unit of power.

    EDIT: Okay, let's do this right.

    First off, I'll assume a worst case scenario for power usage- running x264 constantly, meaning your CPU is at 100% load all of the time. This is significantly more than the power usage during gaming, might I add.

    According to Anand's review, the difference in power usage between the FX-6300 system and the i5-3570k system is 44.4W, or 0.0444kW. (About the same as a lightbulb.)

    According to Wikipedia, US energy prices range from $0.08/kWh to $0.17/kWh. I shall pick a number in the middle of this range, $0.12, for my calculations, but adjust this for your local energy prices.

    Finally, we shall use your figure of average 4 hours' use a day over 3 years, for 4380hrs. This gives us our final sum:

    Cost over 3 years = $0.12/kWh * 0.0444 kW * 4380 h ≈ $23

    So, far lower than your guesstimate. And let me reiterate, that is assuming the PC is at full load the entire time, a highly unrealistic assumption for anyone not running Distributed Computing, a rendering farm, etc. The difference in idle power consumption is only 14.7W, about a third of the load figure, and we can assume that average power consumption will be somewhere between the two.

    Next time, do the maths instead of making it up and hoping no-one will call you on it.
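    For anyone who wants to plug in their own numbers, here's the same sum as a quick Python sketch (the wattage deltas and the $0.12/kWh are the figures above; the script itself is just illustrative):

    Code:
    # Same calculation as above, parameterised so you can plug in your own numbers.
    def extra_cost(delta_watts, hours_per_day, years, price_per_kwh):
        """Extra electricity cost of a system drawing `delta_watts` more."""
        hours = hours_per_day * 365 * years           # total hours of use
        extra_kwh = (delta_watts / 1000.0) * hours    # W -> kW, then kWh
        return extra_kwh * price_per_kwh

    print(extra_cost(44.4, 4, 3, 0.12))   # ~$23, assuming 100% load the whole time
    print(extra_cost(14.7, 4, 3, 0.12))   # ~$8 at the idle delta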
     
    #62 NTMBK, Dec 13, 2012
    Last edited: Dec 13, 2012
  13. SPBHM

    SPBHM Diamond Member

    Joined:
    Sep 12, 2012
    Messages:
    4,159
    Likes Received:
    46
    Well, that's more expensive, while the 6100 is cheaper than the A8-5600K.
    The A8 might have an improved architecture, but it lacks L3 cache (significant for gaming) and an extra module.
     
  14. inf64

    inf64 Platinum Member

    Joined:
    Mar 11, 2011
    Messages:
    2,583
    Likes Received:
    507
    Where did you pull those 400W and 600W numbers from? They're just ridiculously high. Also, OCing the FX6300 at stock Vcore with just a multiplier change will not raise the power draw by 200W; that sort of OC would raise the power drawn by the CPU alone by a factor of 1.1-1.2 (depending on what clock you take as the reference point). And the stock FX6300 TDP rating is 95W, so at most 95 x 1.2 ≈ 115W, or about 20W more. A linear increase in clock without a Vcore change results in a (roughly) linear increase in power drawn.
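    (Rough sketch of that estimate: with Vcore unchanged, dynamic power scales roughly linearly with clock, so 3.5 to 4.2GHz is about a 1.2x factor. The 95W is the stock TDP, used here only as a stand-in for actual CPU draw:)

    Code:
    # Rough estimate: dynamic CPU power ~ C * V^2 * f, so with Vcore unchanged a
    # multiplier-only overclock scales power roughly linearly with frequency.
    stock_clock_ghz = 3.5
    oc_clock_ghz = 4.2
    stock_cpu_power_w = 95          # stock TDP, used as a stand-in for actual draw

    factor = oc_clock_ghz / stock_clock_ghz        # = 1.2
    oc_cpu_power_w = stock_cpu_power_w * factor    # ~114-115 W, i.e. ~20 W more
    print(factor, oc_cpu_power_w)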
     
  15. sm625

    sm625 Diamond Member

    Joined:
    May 6, 2011
    Messages:
    7,849
    Likes Received:
    61
    My unit of "watts per day" is perfectly valid when describing differences in power consumption. You're just nitpicking in an attempt to sound like you know something other than how to sound condescending. First of all, I was comparing an i3-3220 vs an FX6300, genius. Those are the chips the OP is trying to choose from. The difference in power between those two chips is about 80 watts under a gaming load. And second, Mr. Genius, no one pays 12 cents a kWh. When you itemize your electric bill you find that, with delivery and surcharges, everyone is paying 15-20 cents per kWh. My guesstimate is exactly that, a good guesstimate, and it is a hell of a lot more accurate than yours. Someone who games 4 hours a day on an i3-3220 will save very close to $60 on their electric bill over 3 years, not the $23 you claim as accurate. So take your condescending blathering and stick it.
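    (Taking those figures at face value, i.e. an assumed ~80W gaming delta, 4 hours a day, and 15-20 cents/kWh, the sum does land near $60; a rough sketch, swap in your own rate:)

    Code:
    # Plugging in the figures above: an assumed 80 W gaming delta, 4 h/day,
    # 3 years, at 15-20 cents/kWh. Adjust for whatever your bill actually says.
    delta_kw = 0.080
    hours = 4 * 365 * 3                          # 4380 hours over 3 years
    for price_per_kwh in (0.15, 0.20):
        print(price_per_kwh, delta_kw * hours * price_per_kwh)   # ~$53 to ~$70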

    The i5 was thrown in there to point out the fact that this is what you could buy with the extra money from electricity savings; I never made any claims that this much higher performing i5 would not consume extra power; of course it would.
     
    #65 sm625, Dec 13, 2012
    Last edited: Dec 13, 2012
  16. frozentundra123456

    frozentundra123456 Diamond Member

    Joined:
    Aug 11, 2008
    Messages:
    9,408
    Likes Received:
    188

    You are using the same units that he was. His calculation process is correct; I don't feel it was necessary to be so condescending about his mathematical capabilities. The difference is that you are estimating a different delta in power usage and using only 4 hours per day at load vs 8 hours per day for the other poster. If you estimate 8 hours per day at load, with your numbers rounded to 50 watts and 12 cents per kWh, I come up with approximately $18.00 per year. That is not including the taxes, surcharges, etc. which are invariably tacked onto energy bills and could increase the cost another 10 to 20 percent.
    And no one knows how much energy prices will increase in the future. Granted, they have been stable or declining recently, but that trend will reverse at some point.
     
  17. NTMBK

    NTMBK Diamond Member

    Joined:
    Nov 14, 2011
    Messages:
    7,217
    Likes Received:
    138
    No it isn't. Watts are a rate of consumption of energy. Watts per day is the rate of change of the rate of consumption of energy. Watts == energy/time; watts per day == energy/time^2. Watt-hours, on the other hand, would work.

    Heh, that one is a good spot. The conversation went into an i5 vs 6300 rant at some point, and I got muddled on which one you were referring to.

    I'd like to see some figures to back that one up. Given that (again according to the Anandtech review) the difference at 100% load is 65W, I sincerely doubt that it's about 80 watts in a gaming load.

    Hey, I was just using the figures I could find online from Wikipedia (sourced from a US Government report, so hey), and I deliberately shot for the middle of the range they quoted. I am a European though, so I will admit to a lack of knowledge of how US pricing tariffs work.

    Again, you are basing your figures on the energy consumption at 100% load on your CPU (in fact, slightly over that amount), which is not a realistic figure for gaming.

    His post stated that he was basing it on 4 hours a day, not 8. Not really sure where you got 8 from. :confused:

    I will admit I got a bit condescending and snippy. I did Physics as a degree, so seeing someone misuse power and energy units is like a red flag to a bull for me (I see it all the time in news reports).

    As I said above, I am a European, so I'm just basing my pricing of units off online figures.

    We'll see; the latest predictions actually have US energy prices dropping, due to the expansion of shale gas extraction providing cheap gas. A single misjudged war on Iran could easily cancel that out though, of course!
     
  18. frozentundra123456

    frozentundra123456 Diamond Member

    Joined:
    Aug 11, 2008
    Messages:
    9,408
    Likes Received:
    188
     
  19. AtenRa

    AtenRa Lifer

    Joined:
    Feb 2, 2009
    Messages:
    12,129
    Likes Received:
    755
    That would mean the FX6300 uses 100W more than the Core i3 while gaming:

    400 Wh / 4 hours = 100 W

    And it would mean the FX6300 @ 4.2GHz uses 150W MORE in gaming than the Core i3 (600 Wh / 4 hours). ARE YOU SERIOUS??? Where did you see those numbers??? o_O


    Even the FX8350 at full load DOESN'T consume more than 115W more than the Core i3 in x264, and it is 2X faster.
    Meaning the FX8350 will consume almost the same energy as the Core i3 to finish the same job, but in HALF THE TIME.


    ps: FX6300 @ 4.2GHz will not use more power than FX8350.
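    (The "same job" point is just energy = power x time. A quick sketch with illustrative numbers: the 75W and 60 minutes for the i3 are assumptions for the example, only the 115W delta and the 2X speed come from above:)

    Code:
    # Energy to finish the same encode = power draw * time taken.
    # Illustrative numbers: assume a Core i3 system drawing 75 W takes 60 minutes;
    # the FX system draws 115 W more but finishes the job in half the time.
    i3_power_w, i3_minutes = 75, 60
    fx_power_w, fx_minutes = i3_power_w + 115, 60 / 2

    i3_wh = i3_power_w * i3_minutes / 60   # 75 Wh for the job
    fx_wh = fx_power_w * fx_minutes / 60   # 95 Wh for the same job
    print(i3_wh, fx_wh)                    # comparable energy, half the wait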
     
  20. NTMBK

    NTMBK Diamond Member

    Joined:
    Nov 14, 2011
    Messages:
    7,217
    Likes Received:
    138
    Highly technical?! Any 14-year-old knows how to get their units right!

    EDIT: Dammit, I need to stop going into cranky physicist mode! That came out harsher than intended. :p It's still not "highly technical", though.

    EDIT 2: Okay, useful analogy time! The difference can be compared to the differences between distance, speed and acceleration.

    Distance == m ~ Energy == J
    Speed == m/s ~ Power == J/s == W
    Acceleration == m/s^2 ~ Power/time == W/s

    Where ~ indicates "is analogous to".

    [/ramble]
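    (And if you'd rather have a machine check the units, something like the pint library makes the distinction explicit; a small sketch, assuming pint is installed:)

    Code:
    # Dimensional sanity check with the pint unit library (assuming it's installed):
    # power * time is an energy; "watts per day" is not.
    import pint

    ureg = pint.UnitRegistry()

    energy = (44.4 * ureg.watt) * (4380 * ureg.hour)
    print(energy.to('kilowatt_hour'))       # ~194 kWh, an amount of energy

    odd = (44.4 * ureg.watt) / (1 * ureg.day)
    print(odd.to_base_units())              # kg*m^2/s^4, a rate of change of power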
     
    #70 NTMBK, Dec 13, 2012
    Last edited: Dec 13, 2012
  21. frozentundra123456

    frozentundra123456 Diamond Member

    Joined:
    Aug 11, 2008
    Messages:
    9,408
    Likes Received:
    188
    I have a minor in physics, with several semesters of calculus, so I know about rates of change (speed = first derivative, acceleration = second derivative). And I can balance equations and cancel out units as well as the next guy. But I don't really care about that. All I am saying is that he did the calculation correctly, albeit with some very strange estimates of the power differences between the two processors.

    Edit: I definitely agree with your first edit, and a bit of a sense of humor wouldn't hurt either. I was only being facetious about moving it to the highly technical forum. I thought that would be obvious from the quote marks and the way I made the statement.
     
    #71 frozentundra123456, Dec 13, 2012
    Last edited: Dec 13, 2012
  22. SlowSpyder

    SlowSpyder Lifer

    Joined:
    Jan 12, 2005
    Messages:
    10,170
    Likes Received:
    15
    Remember, gaming often only uses two cores. Sometimes it uses four. And when a game does want more than four cores, the FX6300 will use more power, but it should also deliver better performance. I think some of you overestimate CPU load while gaming... especially since the OP is looking at a 7770-level card, which will often be the limiting factor.
     
  23. NTMBK

    NTMBK Diamond Member

    Joined:
    Nov 14, 2011
    Messages:
    7,217
    Likes Received:
    138
    Heh, yeah. I need to stop going into "arguing with everyone" mode.
     
  24. lyssword

    lyssword Diamond Member

    Joined:
    Dec 15, 2005
    Messages:
    5,737
    Likes Received:
    4
    My buddy bought a $130 6300, running a mild OC @ 4.1GHz on stock volts/stock cooler, with a $70 970 chipset mobo.
     
    #74 lyssword, Dec 14, 2012
    Last edited: Dec 14, 2012
  25. infoiltrator

    infoiltrator Senior member

    Joined:
    Feb 9, 2011
    Messages:
    704
    Likes Received:
    0
    Which 970 motherboard, and is he happy?
    Does it play the games he wants to play as well as he wishes?
    Cheers