Xbitlabs: Comparison of current APUs


Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
I wonder why they didn't use any Intel CPUs faster than the i3-3225.
Was it kept under a certain dollar amount to give an idea of performance per dollar?

Anyway:
[Benchmark charts: Borderlands 2, F1 2012, Hitman, Sleeping Dogs]
AMD also leads in the OpenCL- and DirectCompute-accelerated programs.
In non-accelerated workloads, the Intel i3-3225 is around 5-25% faster than the A10-5800K.

Yeah, it's pretty much what you would expect.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
My i5-3470 and GTX 650 Ti use less than 165 W in a Unigine + IBT load test, and even less in BF3.

That's not just bad power consumption, it's abysmal, although their PSU will be running far from optimal efficiency.
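For context on that last point: a wall meter reads the PSU's input, so the DC load gets divided by the PSU's efficiency at that load. A minimal sketch, with made-up numbers (neither figure is a measurement from this thread):

Code:
# Rough wall-vs-DC power estimate. Both figures below are
# illustrative assumptions, not measurements from this thread.
dc_load_w = 140        # hypothetical DC-side draw of CPU + GPU + rest
psu_efficiency = 0.85  # assumed PSU efficiency at this load point
wall_power_w = dc_load_w / psu_efficiency
print(f"Estimated draw at the wall: {wall_power_w:.0f} W")  # ~165 W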
 

grimpr

Golden Member
Aug 21, 2007
1,095
7
81
Perfect chips that destroy the gimped i3s and the low-end lineup. I don't see the fuss about the power consumption in IDC's post; it was well known, and frankly no one cares about it compared to the prices and the graphics & CPU performance of the chip. This is supposed to be a desktop PC forum thread, where power-hungry Core i7 920s ruled the roost some years ago, not a laptop/tablet suggestions thread. What changed since then? Intel's 22nm process advantage, and everybody got sentimental about power consumption with their 800W PSUs?
 

AnonymouseUser

Diamond Member
May 14, 2003
9,943
107
106
I wonder why they didn't use any Intel CPUs faster than the i3-3225.

It has the HD 4000, and we all know there's virtually no difference between the Core CPUs when using the same graphics card in most games. It's also in the same price bracket as the APUs. The cheapest (only?) i5 with the HD 4000 is the 3570K.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
It's so bad, it's not even funny. My non-K 3770, same MB (P8Z77-V Deluxe), and HD 7870 use ~200W...


The "peak" of a 7870 is around 144 watts (GPU alone).
A Intel 3770k uses about 130watts when stressed abit (bench).

What you mean is your system if you stressed to max would probably be atleast 144watts + 130watts = 274watts (likely more).


Yes thats more than the A10-5800k's 165watts or so.
But you probably have alot more performance than the this APU does.



The reviewer should have shown the power usage of the APUs while gaming.
Then maybe done an average-FPS / average-watts = performance-per-watt comparison, as sketched below.

I think the A10-5800K would come out on top in performance per watt
among the chips compared.
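A minimal sketch of that metric, with placeholder FPS and wattage numbers standing in for the measurements the review didn't publish:

Code:
# Performance-per-watt comparison sketch. All FPS and wattage
# figures are placeholders, not data from the xbitlabs review.
systems = {
    "A10-5800K (IGP)": {"avg_fps": 30.0, "avg_watts": 110.0},
    "i3-3225 + dGPU":  {"avg_fps": 50.0, "avg_watts": 190.0},
}

for name, s in systems.items():
    print(f"{name}: {s['avg_fps'] / s['avg_watts']:.3f} FPS/W")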


From TechPowerUp: [chart: maximum power consumption]
 
Mar 6, 2012
104
0
0
The A10-5700 keeps looking strong; AMD would have been better off making it the flagship Trinity instead of the marginally better-performing but power-hungry 5800K.

I'm a bit surprised that the lowly A4-5300 actually beats out the i3-3225 in several scenarios; I had figured the HD 4000 would best it across the board.
 

grimpr

Golden Member
Aug 21, 2007
1,095
7
81
I wonder why they don't mention that AMD supports all the extensions while none of the Intel chips here supports the AES extension, since Intel wants you to pay up for a quad core, yet they bitch about AMD's single-thread performance and power consumption.
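If you want to check whether a given chip actually exposes AES-NI, a quick sketch (Linux only; reads the kernel's CPU flag list):

Code:
# Look for the "aes" flag in /proc/cpuinfo (Linux only).
# i3s of that era shipped with AES-NI disabled, so the flag
# would be missing there.
def has_aes_ni() -> bool:
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                return "aes" in line.split(":", 1)[1].split()
    return False

print("AES-NI supported:", has_aes_ni())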
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
The A10-5700 keeps looking strong; AMD would have been better off making it the flagship Trinity instead of the marginally better-performing but power-hungry 5800K.

I fully agree.

The 5700's performance is almost the same as the 5800K's,
yet the 5800K consumes a lot more power for some reason.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,695
136
An Intel 3770K uses about 130 watts when stressed a bit (benchmarking).

What you mean is that your system, if stressed to the max, would probably draw at least 144 W + 130 W = 274 W (likely more).

You're very much mistaken, my friend. A non-K 3770 minus the IGP uses ~51W under full load and almost exactly 69W at 4.3GHz... ;)

Edit:

The A10-6700 looks very promising: 5800K performance within a 65W TDP.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Perfect chips that destroy the gimped i3s and the low-end lineup. I don't see the fuss about the power consumption in IDC's post; it was well known, and frankly no one cares about it compared to the prices and the graphics & CPU performance of the chip. This is supposed to be a desktop PC forum thread, where power-hungry Core i7 920s ruled the roost some years ago, not a laptop/tablet suggestions thread. What changed since then? Intel's 22nm process advantage, and everybody got sentimental about power consumption with their 800W PSUs?

My post was about the products evaluated in the xbitlabs article, you can't say the same about yours.

If the power consumption of the reviewed products is as irrelevant as you claim, then why did xbitlabs bother to collect and present the data in their review?

For some reason xbitlabs felt it was relevant and so they included the data in their review article...why am I not afforded the same without being characterized as making a fuss and being sentimental?

If you have a problem with the message (the data graphs I embedded) then take it up with the people responsible for the message (AMD and xbitlabs).
 

AnonymouseUser

Diamond Member
May 14, 2003
9,943
107
106
My post was about the products evaluated in the xbitlabs article, you can't say the same about yours.

If the power consumption of the reviewed products is as irrelevant as you claim, then why did xbitlabs bother to collect and present the data in their review?

For some reason xbitlabs felt it was relevant and so they included the data in their review article...why am I not afforded the same without being characterized as making a fuss and being sentimental?

If you have a problem with the message (the data graphs I embedded) then take it up with the people responsible for the message (AMD and xbitlabs).

Considering the poor performance of the Intel iGPU, you won't be playing many games anyway, if any at all, and that will save even more electricity! :awe:
 

Galatian

Senior member
Dec 7, 2012
372
0
71
Looking at the FPS, I don't consider either processor to deliver enough power to run games at full HD, even on low quality. The 5800K might hit 30 FPS on average, but with a value that close to 30 FPS I bet you will definitely experience slowdowns during real gaming.
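To illustrate that point: an average near 30 FPS can hide nasty frame-time spikes. A small sketch with invented frame times:

Code:
# Average FPS can mask stutter; the worst frames tell the story.
# These frame times are invented for illustration.
frame_times_ms = [22, 23, 22, 21, 70, 22, 23, 85, 22, 23]

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
worst_fps = 1000 / max(frame_times_ms)
print(f"Average: {avg_fps:.1f} FPS, worst frame: {worst_fps:.1f} FPS")
# Prints an average of ~30 FPS even though two frames dip to ~12 FPS.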

What I'm still missing are comparisons between the mobile chips. It is my understanding that the mobile HD 4000 does not lose significant performance compared to the desktop version, whereas the graphics cores of the mobile APUs do.

Also, power consumption is a concern: the way I see it, the APU offers more graphics performance, but not enough to really make gaming on an HTPC at full HD possible. At the same time, it uses a lot more power to achieve this. Call me a fanboy, but I simply don't see the point of an APU... maybe somebody can enlighten me?
 
Aug 11, 2008
10,451
642
126
Perfect chips that destroy the gimped i3s and the low-end lineup. I don't see the fuss about the power consumption in IDC's post; it was well known, and frankly no one cares about it compared to the prices and the graphics & CPU performance of the chip. This is supposed to be a desktop PC forum thread, where power-hungry Core i7 920s ruled the roost some years ago, not a laptop/tablet suggestions thread. What changed since then? Intel's 22nm process advantage, and everybody got sentimental about power consumption with their 800W PSUs?

Borderlands 2 is barely playable (and the other games as well) at 1080p, low quality. I'll pass; give me a discrete card. I can only imagine how they will struggle with newer titles. They didn't even test graphically demanding current games.
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
I still don't hold APUs to be up to snuff for desktops. In laptops with piss-poor resolutions, they get the job done. But you're making one helluva compromise by going APU-only on a desktop.
 

Piroko

Senior member
Jan 10, 2013
905
79
91
I still don't hold APUs to be up to snuff for desktops. In laptops with piss-poor resolutions, they get the job done. But you're making one helluva compromise by going APU-only on a desktop.
To put it in context, the A10-5700 is still better in many games than an HD 4670, a GT 440, or any (ATI) HD 3000, 8800, or 9800 series card. It's not that helluva compromise, if you think about it.
 
Aug 11, 2008
10,451
642
126
To put it in context, the A10-5700 is still better in many games than an HD 4670, a GT 440, or any (ATI) HD 3000, 8800, or 9800 series card. It's not that helluva compromise, if you think about it.

How old is the HD 4670? Was it not a low/mid-range card even when it was new? Would anyone build or buy a modern system for gaming and put in a 9800 GT?

I have thought about it, and it is a "helluva compromise" when an i3, an FX-6300, or even a Pentium with an HD 7750 or HD 7770 will deliver one and a half to two times the performance for only a small additional cost and very little extra power usage.
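Putting rough numbers on that trade-off (hypothetical prices and FPS, just to show the value math):

Code:
# Performance-per-dollar sketch for the APU vs. CPU+dGPU argument.
# Prices and FPS are rough placeholders, not quotes from the thread.
builds = {
    "A10-5800K alone":   {"price_usd": 130, "avg_fps": 30},
    "Pentium + HD 7770": {"price_usd": 180, "avg_fps": 55},
}

for name, b in builds.items():
    print(f"{name}: {b['avg_fps'] / b['price_usd'] * 100:.1f} FPS per $100")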
 

Piroko

Senior member
Jan 10, 2013
905
79
91
How old is the HD 4670? Was it not a low/mid-range card even when it was new? Would anyone build or buy a modern system for gaming and put in a 9800 GT?

I have thought about it, and it is a "helluva compromise" when an i3, an FX-6300, or even a Pentium with an HD 7750 or HD 7770 will deliver one and a half to two times the performance for only a small additional cost and very little extra power usage.
How come people never look past their own situation...

http://store.steampowered.com/hwsurvey/videocard/
Look at that survey. It's not representative, but it covers a lot of cards and systems which are still in use. I'm not saying that they should replace their rigs with a Trinity system. But I am saying that there's quite a large group of gamers out there who don't share your performance expectations. Heck, I had an HD 4670 up until recently; it games just fine.

Btw, an HD 7770 alone draws 70-90W at load, just saying.
 

Torn Mind

Lifer
Nov 25, 2012
12,078
2,772
136
I wonder why they don't mention that AMD supports all the extensions while none of the Intel chips here supports the AES extension, since Intel wants you to pay up for a quad core, yet they bitch about AMD's single-thread performance and power consumption.

Would buyers care about superior data-encryption performance if they don't use that class of software?
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136