AMD A10-5800k


inf64

Diamond Member
Mar 11, 2011
3,698
4,018
136
How about losing in every test against an i3-3225 w/ 6670D, while consuming 40% more power with the same graphics card?

http://hexus.net/tech/reviews/cpu/46157-amd-a10-5800k-dual-graphics-evaluation/


Dat dual-GPU scaling: either a CPU bottleneck or AMD drivers, take your pick.
From the conclusion of the article:
Dual Graphics is a feature of AMD Trinity APU systems that enables users to add a circa-£50 discrete Radeon graphics card - HD 6600-series, preferably - and lash it alongside the similar HD 7660D graphics built into the chip. Going down this route facilitates CrossFire multi-GPU rendering, useful for potentially increasing gaming performance without any further financial outlay. It's important to understand this feature is not available if choosing a price-comparable Intel platform.
The usefulness of Dual Graphics as a means of providing more gaming performance is wholly dependent on how well the title scales through CrossFire software technology. The best-case scenarios, such as DiRT Showdown, show a 40 per cent frame-rate increase over using the discrete card alone, as you would do if installing it on, say, an Intel Core i3 machine. On the flipside, Batman: Arkham City is indifferent to the charms of Dual Graphics, to the extent that performance actually drops off.
Our examination also finds that an Intel Core i3-3225 platform performs a smidge better than an AMD A10-5800K when evaluated with a discrete HD 6670 in the PCIe slot. What's more, the Core i3's power consumption is better than AMD's. Swings and roundabouts, eh?
We believe that an AMD APU's Dual Graphics capability is a useful feature if you happen to have an add-in card that closely resembles the on-board graphics' architecture. There's a reasonable chance of gaining extra performance, as shown by our benchmarks, at no extra cost. Dual Graphics, then, makes most sense in fixed-specification A10-5800K-powered base units that ship with the necessary HD 6600-series supporting cards.
Readers who consider themselves proper gamers would still be best advised to spend at least an APU-matching £100 on a mid-range graphics card, because a Radeon HD 7850 or GeForce GTX 660 will assuredly knock the spots off anything a well-matched Dual Graphics configuration can offer.
As they correctly noted, "a smidge" better it really is. With a discrete card and no Dual Graphics, the performance delta is practically non-existent. So your "performs worse" is correct, but only if you consider margin-of-error differences to be "better". Both CPUs run the games (with a discrete GPU and no Dual Graphics on the APU) practically the same at 720p. At 1080p with Dual Graphics, the 5800K was substantially better.
The power-draw delta exists, and for power-conscious people it might be important. Under a gaming load it is around 43W according to the review. If 43W matters to someone, then it's the i3 all the way. If not, the 5800K is clearly the better chip, supporting Dual Graphics and offering better performance in modern game titles that use 4+ threads. It also performs better in increasingly threaded desktop workloads and has full support for OpenCL-accelerated GPGPU workloads.
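For the power-conscious, here is a quick back-of-the-envelope sketch of what that 43W delta amounts to; the hours-per-day and price-per-kWh figures below are illustrative assumptions, not numbers from the review:

```python
# Back-of-the-envelope cost of the ~43 W power-draw delta under game load.
# The 43 W figure is from the Hexus review; the gaming hours and
# electricity price below are illustrative assumptions.

delta_w = 43            # extra system draw under game load (W)
hours_per_day = 3       # assumed daily gaming time
price_per_kwh = 0.12    # assumed electricity price (USD per kWh)

kwh_per_year = delta_w / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh

print(f"{kwh_per_year:.1f} kWh/year, about ${cost_per_year:.2f}/year")
# -> 47.1 kWh/year, about $5.65/year
```

Under those assumptions the delta is a few dollars a year, which is why opinions differ on whether it matters.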
 

Abwx

Lifer
Apr 2, 2011
10,947
3,457
136
How about losing in every test against an i3-3225 w/ 6670D, while consuming 40% more power with the same graphics card?

http://hexus.net/tech/reviews/cpu/46157-amd-a10-5800k-dual-graphics-evaluation/


Dat dual-GPU scaling: either a CPU bottleneck or AMD drivers, take your pick.

Two games, that's every test, indeed....

You will notice that the A10 has playable framerates with the integrated GPU, contrary to the i3 (and likely with less power draw...), and that's what matters, as you won't find one i3 out of 20 with a discrete GPU in commercial offerings. Hence people would be much better served by an A10 than by an i3 that is already outdated, relative to the A10, given the direction current games are evolving.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
From the conclusion of the article:
As they correctly noted, "a smidge" better it really is. With a discrete card and no Dual Graphics, the performance delta is practically non-existent. So your "performs worse" is correct, but only if you consider margin-of-error differences to be "better". Both CPUs run the games (with a discrete GPU and no Dual Graphics on the APU) practically the same at 720p. At 1080p with Dual Graphics, the 5800K was substantially better.
The power-draw delta exists, and for power-conscious people it might be important. Under a gaming load it is around 43W according to the review. If 43W matters to someone, then it's the i3 all the way. If not, the 5800K is clearly the better chip, supporting Dual Graphics and offering better performance in modern game titles that use 4+ threads. It also performs better in increasingly threaded desktop workloads and has full support for OpenCL-accelerated GPGPU workloads.

What?

Two games, that's every test, indeed....

You will notice that the A10 has playable framerates with the integrated GPU, contrary to the i3 (and likely with less power draw...), and that's what matters, as you won't find one i3 out of 20 with a discrete GPU in commercial offerings. Hence people would be much better served by an A10 than by an i3 that is already outdated, relative to the A10, given the direction current games are evolving.


What?
 

Hubb1e

Senior member
Aug 25, 2011
396
0
71
People on forums are overly sensitive to power consumption. 8350 power vs. i5-3570K is a bit of a concern because the FX chip requires robust motherboards and PSUs to overclock, but with Trinity vs. the i3, you'd save more than 43W simply by turning off the lights in the room where you game.
 

svenge

Senior member
Jan 21, 2006
204
1
71
Or perhaps, you're just a troll that will mock AMD whenever possible. Take your pick. But you're a joke. :rolleyes:

No, the real joke is this:

[Chart: AMD share price over time]
 

inf64

Diamond Member
Mar 11, 2011
3,698
4,018
136
What does that chart have to do with the 5800K not being a bottleneck in that review?
I could post some unrelated charts too, just to spam the forums a bit ;). It would not change a thing in this matter.
 
Aug 11, 2008
10,451
642
126

Well, in the time periods pictured, AMD had dropped to 1/3 of its high, while Intel has dropped from 28 to 21, or to 75% of its comparison price, so I don't know if I would really show that chart to support AMD.

The shape of the charts looks the same, but any chart that does not start at zero on the vertical axis can be deceptive. Edit: not to mention that the y-axis on the AMD chart is not linear. So the apparently similar slopes of the lines mean nothing except that the trend over time is similar; AMD is declining by more than twice as much percentage-wise.
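To make the "more than twice as much" comparison concrete, here is a quick sketch of the arithmetic; Intel's 28-to-21 figures are from the post above, while AMD's peak is an assumed illustrative value, since the post only states it fell to 1/3 of its high:

```python
# Percentage declines behind the "more than twice as much" claim.
# Intel's 28 -> 21 is from the post above; AMD's high is an assumed
# illustrative value, since only "1/3 of its high" is stated.

def pct_decline(high, low):
    """Percent drop from high to low."""
    return (high - low) / high * 100

intel_decline = pct_decline(28, 21)    # 25.0%
amd_decline = pct_decline(9, 9 / 3)    # 66.7%, regardless of the high chosen

print(f"Intel: {intel_decline:.1f}%, AMD: {amd_decline:.1f}%, "
      f"ratio: {amd_decline / intel_decline:.1f}x")
# -> Intel: 25.0%, AMD: 66.7%, ratio: 2.7x
```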
 
Last edited:

Centauri

Golden Member
Dec 10, 2002
1,655
51
91
Well, in the time periods pictured, AMD had dropped to 1/3 of its high, while Intel has dropped from 28 to 21, or to 75% of its comparison price, so I don't know if I would really show that chart to support AMD.

The shape of the charts looks the same, but any chart that does not start at zero on the vertical axis can be deceptive. Edit: not to mention that the y-axis on the AMD chart is not linear. So the apparently similar slopes of the lines mean nothing except that the trend over time is similar; AMD is declining by more than twice as much percentage-wise.

So my irrelevant BS chart is as irrelevant as svenge's irrelevant BS chart. Agreed.

Moving on.
 

Maragark

Member
Oct 2, 2012
124
0
0
Well, in the time periods pictured, AMD had dropped to 1/3 of its high, while Intel has dropped from 28 to 21, or to 75% of its comparison price, so I don't know if I would really show that chart to support AMD.

The shape of the charts looks the same, but any chart that does not start at zero on the vertical axis can be deceptive. Edit: not to mention that the y-axis on the AMD chart is not linear. So the apparently similar slopes of the lines mean nothing except that the trend over time is similar; AMD is declining by more than twice as much percentage-wise.

I'm sick of seeing this kind of crap in this sub-forum. I come here to discuss CPUs, not sales and finance data.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
So tell us, how do you design a CPU without billions of dollars of sales to finance that CPU design?
 

Maragark

Member
Oct 2, 2012
124
0
0
So tell us, how do you design a CPU without billions of dollars of sales to finance that CPU design?

You don't. That doesn't mean I want to talk about sales or financial data in the CPU and Overclocking sub-forum. Have that discussion somewhere it's actually relevant. That is, of course, if you really do want to discuss it and are not simply using it as an excuse because you can't counter someone's point.
 
Aug 11, 2008
10,451
642
126
I'm sick of seeing this kind of crap in this sub-forum. I come here to discuss CPUs, not sales and finance data.

I didn't post the charts. I was just pointing out how they did not portray an accurate picture of the relative price declines. If you want to accuse someone of posting "crap", attack the original posters, not me.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
You don't. That doesn't mean I want to talk about sales or financial data in the CPU and Overclocking sub-forum. Have that discussion somewhere it's actually relevant. That is, of course, if you really do want to discuss it and are not simply using it as an excuse because you can't counter someone's point.

Which subforum would you suggest we talk about the sales of CPUs in? Maybe PC Gaming? :colbert:
 

Maragark

Member
Oct 2, 2012
124
0
0
Which subforum would you suggest we talk about the sales of CPUs in? Maybe PC Gaming? :colbert:

None. The topic for this forum is Hardware and Technology, not Sales and Marketing, nor Finance and Investing.

I've said my piece and will say no more about this.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
Intel mob in defense mode, LOL. A chart showing AMD in trouble is OK, but when another shows Intel not doing so well either, it's chaos. They had to intervene, as usual.
 

inf64

Diamond Member
Mar 11, 2011
3,698
4,018
136
I said it yesterday, and I will say it again: trolling against AMD on the AT forums is unfortunately getting out of hand. No matter what the topic is, their GPUs microstutter, their CPUs bottleneck, and they are going bankrupt every few months (for 6 years now!). It's amazing.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Or perhaps, you're just a troll that will mock AMD whenever possible. Take your pick. But you're a joke. :rolleyes:

Well, given my history with the 245, 555, 965, and 1090T, as well as the i3-540 and i5-2500K, and real-world actual usage, I can safely say I have no issue with your biased attempt to box me up.

Generally speaking, when GPU-limited, AMD's CPUs tend to get slightly higher fps than Intel's; that is not the case at any point in the review I linked. So, barring some logic-changing difference between the A10 and Phenom II/Bulldozer/Piledriver, it seems pretty obvious the AMD CPU is struggling in minimum framerates, which is dragging its average below the Intel CPU's.

Since it's impossible to find A10 w/ 6870 reviews, I used something else. It's also impossible to find 8350 reviews with SLI/CF; there is an obvious reason for this after the [H] SLI review of the 8150, but I'll assume diving further into this discussion will just result in more labels and less actual point-refuting from you.

That said, I wholly disagree with the general use of average fps in CPU reviews as currently practiced. From my own personal experience with many of the chips used in reviews, I find their results disingenuous to the actual realities of the games themselves. You'll often find reviews like WoW, where the AMD CPU is getting 70+ fps and the Intel 120+; people will of course say the AMD CPU is providing enough fps for it not to matter, but in my own experience in raiding, towns, and other high-population areas, the real performance of the CPU is considerably lower, to the tune of about 1/3 the reported average in many reviews.
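As a minimal sketch of the point about averages, consider some made-up frame times (not data from any review): a run of smooth frames with periodic heavy spikes can still post a high average fps while the worst frames tell the real story:

```python
# Made-up frame times (ms), not data from any review: 90 smooth frames
# plus 10 heavy spikes, to show how an average can hide stutter.

frametimes_ms = [8] * 90 + [50] * 10

# The naive per-frame fps average looks healthy:
avg_fps = sum(1000 / t for t in frametimes_ms) / len(frametimes_ms)

# The slowest 10% of frames tell a different story:
worst = sorted(frametimes_ms)[-10:]
low_fps = sum(1000 / t for t in worst) / len(worst)

print(f"average: {avg_fps:.0f} fps, slowest 10%: {low_fps:.0f} fps")
# -> average: ~114 fps, slowest 10%: 20 fps
```

A review quoting only the ~114 fps average would completely miss the 20 fps stutters, which is exactly the complaint about averaged CPU benchmarks.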
 
Last edited:

bononos

Diamond Member
Aug 21, 2011
3,889
158
106
Intel mob in defense mode, LOL. A chart showing AMD in trouble is OK, but when another shows Intel not doing so well either, it's chaos. They had to intervene, as usual.

Eh? The Intel chart was a misrepresentation, probably the first trick in the book of lying with statistics. AMD is fighting for air at the moment and had to slash prices by 1/3 to stay competitive. AMD being in trouble is the simple truth.
 

nerp

Diamond Member
Dec 31, 2005
9,866
105
106
Who cares? Until they're gone, let's talk about their chips. Personally, I'm buying one of these for a new box to play SimCity 5 and stream live HDTV using a networked tuner from my Ceton arrays, powered by my Llano machine upstairs. My i3 iMac doesn't cut it for games, even in Boot Camp. The A10-5800K will play the game just fine for me, handle streaming the TV better than the iMac does virtualized in W7 on OS X Mountain Lion, and the whole package (mobo, CPU, RAM) will cost me less than $250.

Intel has nothing comparable. Nothing.

(I'm reminded I need to update my sig.)