[Donanimhaber] AMD Trinity A10-5800K performance leaked


Jionix

Senior member
Jan 12, 2011
238
0
0
I thought 2133MHz dual channel would be ~34GB/s effective memory bandwidth versus the 72GB/s on the 7750. Btw, did anyone notice that the chip is called the 5800K? Perhaps they're trying to fool buyers into thinking it's better than the 3570K and 3770K.

AMD is already trying to fool buyers then, because their top APU part is called the 3870K.
 

frostedflakes

Diamond Member
Mar 1, 2005
7,925
1
81
I thought 2133MHz dual channel would be ~34GB/s effective memory bandwidth versus the 72GB/s on the 7750. Btw, did anyone notice that the chip is called the 5800K? Perhaps they're trying to fool buyers into thinking it's better than the 3570K and 3770K.
And remember that in practice, Llano also doesn't get anywhere close to the theoretical max bandwidth. AMD's IMC is just not very good, at least compared to Intel's (which is kind of amusing to me, considering AMD has been using an IMC much longer than Intel; you'd assume they would have come closer to perfecting it, but I guess not). Here are some Sandra bandwidth benchmarks at various memory speeds for the A8-3850.

http://www.legitreviews.com/article/1652/3/

And here's HardOCP Sandra benchmarks for the 2500K and 2600K using DDR3-1600.

http://www.hardocp.com/article/2011/01/03/intel_sandy_bridge_2600k_2500k_processors_review/3

Note that even with 2x4GB DDR3-1866 the Llano only had about 16GB/s memory bandwidth. With 2x4GB DDR3-1600, SB is able to achieve 21GB/s, though. Memory bandwidth has always been the Achilles' heel of AMD's APUs, IMO.
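As a back-of-the-envelope check on those figures: peak theoretical DDR3 bandwidth is transfer rate (MT/s) × 8 bytes per 64-bit transfer × channel count. A quick Python sketch, plugging in the Sandra results quoted above for the efficiency comparison (the 16GB/s and 21GB/s measured values are from the linked reviews, not computed):

```python
def ddr3_bandwidth_gbps(mts: float, channels: int = 2) -> float:
    """Peak theoretical bandwidth in GB/s: MT/s x 8 bytes per 64-bit transfer x channels."""
    return mts * 8 * channels / 1000

# Peak figures for the configurations discussed above:
print(ddr3_bandwidth_gbps(2133))  # ~34.1 GB/s -- the ~34 GB/s dual-channel figure
print(ddr3_bandwidth_gbps(1866))  # ~29.9 GB/s peak for the Llano DDR3-1866 setup
print(ddr3_bandwidth_gbps(1600))  # ~25.6 GB/s peak for the Sandy Bridge DDR3-1600 setup

# Measured-vs-peak efficiency using the Sandra numbers quoted above:
print(16 / ddr3_bandwidth_gbps(1866))  # Llano: ~0.54 of peak
print(21 / ddr3_bandwidth_gbps(1600))  # Sandy Bridge: ~0.82 of peak
```

Roughly 54% versus 82% of peak, which is the IMC-efficiency gap the post is pointing at.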
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
the launch of Haswell and Kaveri... will kill the GPU market D:
 

KompuKare

Golden Member
Jul 28, 2009
1,234
1,606
136
the launch of Haswell and Kaveri... will kill the GPU market D:

I think it will be a long time before they are able to kill the GPU market and have GPUs for lunch...

Admittedly, Intel is taking iGPUs far more seriously than ever before, and although Intel has 'near' infinite resources (compared to anyone else, anyhow), that is no guarantee of success. In the past the classic Intel blunders were of course the P4, Itanium, and frankly the original 8086 (compared to, say, the 68000), and even recently there were the SATA bug and the SB-E VT-d bug.
 

Ajay

Lifer
Jan 8, 2001
16,094
8,115
136
If the CPU performance is any indication of the improvement 'Piledriver' brings to the table, then we might as well forget the upcoming desktop CPUs altogether. :thumbsdown:

AMD did say that they were implementing new architectures in two steps: they implement part of a new arch in their Fusion products and the complete arch in the higher-end CPUs. We'll just have to see if they are able to actually execute that plan well.
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
I think it will be a long time before they are able to kill the GPU market and have GPUs for lunch...

Admittedly, Intel is taking iGPUs far more seriously than ever before, and although Intel has 'near' infinite resources (compared to anyone else, anyhow), that is no guarantee of success. In the past the classic Intel blunders were of course the P4, Itanium, and frankly the original 8086 (compared to, say, the 68000), and even recently there were the SATA bug and the SB-E VT-d bug.

GPU market != high-end GPUs

but Kaveri will kill mobile GPUs for breakfast, if we see another 50% bump and resolutions stay the same
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
resolutions won't stay the same, Apple is making sure of that

OLED would also make it extremely easy for resolutions to explode

I'd wager in a few years iGPUs will basically be back to square one in terms of being able to push resolutions to the fullest. Although the plus side to extreme resolution increases would be that if we can quadruple pixel density, we could then run lower resolutions with less impact on image quality than we have now with scaling blur.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Intel is a LONG way off from catching AMD. Especially so long as Intel's drivers and game support are as worthless as they are now.

But as mentioned above, with Apple now pushing high-resolution displays, even these faster iGPUs like Trinity will be hard pressed to provide good performance.

I will be curious to see what GPU Apple goes with for the new MBPs. Maybe they will actually go with AMD this time. They were close to using them in the MBAs, but AMD's low production capacity got in the way.

EDIT: Fixed typos
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Intel is a LONG way off from catching AMD. Especially so long as Intel's drivers and game support are as worthless as they are now.

But as mentioned above, with Apple now pushing high-resolution displays, even these faster iGPUs like Trinity will be hard pressed to provide good performance.

I will be curious to see what GPU Apple goes with for the new MBPs. Maybe they will actually go with AMD this time. They were close to using them in the MBAs, but AMD's low production capacity got in the way.

Not really that far off. Not like it was a few years ago. I agree that (unfortunately) we need Apple to drive monitor resolution gains. People are so freaking cheap; they just want the biggest (and crappiest) screen for the lowest price. Performance be d@mned.
 

Mopetar

Diamond Member
Jan 31, 2011
8,525
7,785
136
I will be curious to see what GPU Apple goes with for the new MBPs. Maybe they will actually go with AMD this time. They were close to using them in the MBAs, but AMD's low production capacity got in the way.

For the low-end models and the Airs, they'll just use Intel graphics. The larger models will probably use nVidia GPUs, since Apple basically alternates between AMD (ATI) and nVidia each generation.