Trinity prices leaked


inf64

Diamond Member
Mar 11, 2011
3,884
4,692
136
If you want cheap gaming a Celeron G530 and an HD 6670 DDR5 will horribly smash an A10-5800K.

They both use next to no power, too.
The Celeron G530 won't horribly smash the 5800K, not even close. It's a 2.4GHz, locked, dual-core SB with no HT. At default it would horribly lose to the 5800K (at stock, too) in MT (2 vs 4 threads, 2.4GHz vs 3.8-4.2GHz, no chance for SB) and would only barely beat it in ST workloads (2.4GHz vs 4.2GHz). Then you can push the 5800K fairly easily to 4.5+GHz on air, just like the FX-4100/4170, since it's unlocked.

The 5800K is the all-round better option, both from a price/perf/watt POV and from an enthusiast/"OC on a budget" POV. The 6670 may beat the iGPU in the 5800K, but that iGPU is also unlocked and can be paired with another GPU in its class for, again, much better perf/$ vs the Intel option.
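For what it's worth, here is a rough back-of-envelope sketch of that MT argument. The thread counts and clocks are the ones quoted above; the per-clock throughput factors are purely illustrative assumptions, not benchmark numbers:

```python
# Crude throughput model: threads x clock x assumed per-clock throughput.
# Thread counts and clocks are from the post above; the per-clock factors
# are illustrative assumptions, not measurements.

def rough_throughput(threads: int, clock_ghz: float, per_clock: float) -> float:
    """Very rough multithreaded throughput estimate in arbitrary units."""
    return threads * clock_ghz * per_clock

g530 = rough_throughput(threads=2, clock_ghz=2.4, per_clock=1.0)   # Sandy Bridge Celeron, no HT
a10 = rough_throughput(threads=4, clock_ghz=3.8, per_clock=0.75)   # A10-5800K, assumed per-clock deficit

print(f"G530 ~{g530:.1f} vs A10-5800K ~{a10:.1f} (arbitrary units)")
# Even with a sizeable per-clock deficit assumed for Piledriver, 4 threads at
# 3.8GHz come out well ahead of 2 threads at 2.4GHz in throughput-bound work.
```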
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
I doubt that it will "Horribly Smash" it. I owned a 6670 and it's not that great of a card. lol


I'd rather have a quad core than a dual core anyway. I played BF3 on a G530 and it's not that pleasant. At least I know that if I were to add a discrete card to the Trinity setup, I would get even more solid gaming performance rather than a CPU bottleneck due to a lack of cores in some games.

Seems to me like, rather, you aren't very informed in this subject. Games care more about single-threaded performance than multi-threaded performance and that's a FACT that is easily corroborated. If anything, the A8 would be more likely to bottleneck because of its low single-threaded performance.

And an HD 6670 DDR5 would mean it would smash it in gaming. Even if the A10's IGP were 50% faster than the HD 6550D, which it is not, it wouldn't be close to having the same performance. Just for a point of comparison, an HD 6550D is lower performance than an HD 6570 DDR3 and the HD 6670 DDR5 is around 2x faster than that card.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
The Celeron G530 won't horribly smash the 5800K, not even close. It's a 2.4GHz, locked, dual-core SB with no HT. At default it would horribly lose to the 5800K (at stock, too) in MT (2 vs 4 threads, 2.4GHz vs 3.8-4.2GHz, no chance for SB) and would only barely beat it in ST workloads (2.4GHz vs 4.2GHz). Then you can push the 5800K fairly easily to 4.5+GHz on air, just like the FX-4100/4170, since it's unlocked.

The 5800K is the all-round better option, both from a price/perf/watt POV and from an enthusiast/"OC on a budget" POV. The 6670 may beat the iGPU in the 5800K, but that iGPU is also unlocked and can be paired with another GPU in its class for, again, much better perf/$ vs the Intel option.

Maybe if you cared to actually read before making these uninformed posts.

I said in gaming a Celeron G530 and an HD 6670 DDR5 would smash an A10-5800K, and that is a fact. Seems simple enough.

And overclocking on a Bulldozer-derived architecture: horrible power consumption.
 

AnonymouseUser

Diamond Member
May 14, 2003
9,943
107
106
If you want cheap gaming a Celeron G530 and an HD 6670 DDR5 will horribly smash an A10-5800K.

LOL Wut? The 5800K will provide a much better overall PC experience, period. Sure, if all you want to do is play some games then OK, buy a console, but the G530 + 6670 will not outperform the 5800K. It can't even outperform the 3850K (click the "Performance" tab under the yellow bar), which itself is 100% faster than the G530 in many tests.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
The TDP of the two parts you're considering is higher than that of a Trinity counterpart. Power consumption, too, would be greater, as Trinity is actually more efficient than both Sandy and Ivy at idle and in low-clocked P-states. At full load you'll get more power consumption and heat with a discrete 6670 and a G530.



Platform cost also includes the other components, not just the discrete GPU and CPU. You're also going to need cooling for both chips as opposed to just one, a larger PCB to support a PCIe card, and a beefier PSU. All of these cost money.

A single-chip solution allows OEMs more flexibility in what they produce. A lot of new designs are touch-based all-in-ones, and having a single chip makes much more sense than a discrete GPU + CPU option. It saves them money on the platform, allows the product to be cheaper, and makes for a thinner and sexier look, all while offering similar performance figures.

You're looking at it from an enthusiast perspective: swappable components + better performance. OEMs don't look at it that way, and neither do the majority of consumers. If consumers and OEMs were like us, we wouldn't have Ultrabooks and tablets, nor would laptops outsell desktops at a >3-to-1 ratio like they do now.

Incorrect. TDP isn't a measure of power consumption, nor was it ever meant to be.

And the cooling requirements or PSU requirements aren't gonna be any higher if they consume similar amounts of power.

By itself, a Radeon HD 6670 DDR5 will consume 50W while gaming.

[chart: HD 6670 peak power consumption]


A system running a Celeron Sandy Bridge uses around 70W under full load.
[chart: Sandy Bridge Celeron system power consumption under load]


In comparison, a system running a Llano A6 (which would be a good point of comparison if AMD managed to lower power consumption compared to the Llano A8) consumes around 110W. Adding those together, the Celeron + HD 6670 combo comes to about 120W. Assuming the Trinity A10 has lower power consumption than the Llano A8, that puts it at around 110W.

10 WHOLE WATTS! THIS CHANGES EVERYTHING!!!
No, it doesn't.
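Spelling that arithmetic out as a quick sketch (the wattage figures are the rough system-level numbers cited in this post, not independent measurements):

```python
# Rough full-load power comparison using the approximate figures cited above.
hd6670_gaming_w = 50        # HD 6670 GDDR5 alone, gaming load
celeron_system_w = 70       # Sandy Bridge Celeron system, full load
trinity_estimate_w = 110    # assumed: Trinity A10 at or below the Llano A6 system figure

intel_combo_w = celeron_system_w + hd6670_gaming_w    # ~120W

print(f"Celeron + HD 6670: ~{intel_combo_w}W")
print(f"Trinity A10 (assumed): ~{trinity_estimate_w}W")
print(f"Difference at full load: ~{intel_combo_w - trinity_estimate_w}W")
```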
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
LOL Wut? The 5800K will provide a much better overall PC experience, period. Sure, if all you want to do is play some games then OK, buy a console, but the G530 + 6670 will not outperform the 5800K. It can't even outperform the 3850K (click the "Performance" tab under the yellow bar), which itself is 100% faster than the G530 in many tests.

Maybe if you cared to even read. Seems like that's asking too much for some people here...

If you want more CPU performance buy a Core i3-2120 or 3220 instead.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
You forget that since these processors get their work done much faster, they consume a lot less power overall.

Except the APU would be slower in gaming, of course.

Meaning your statement of "getting work done much faster" is completely inaccurate and therefore meaningless.

If you want productivity you buy an i3-3220 instead. Much lower power consumption than the A10 and better CPU performance overall.
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
Incorrect. TDP isn't a measure of power consumption, nor was it ever meant to be.

And the cooling requirements or PSU requirements aren't gonna be any higher if they consume similar amounts of power.

I was referring to the TDP, not power consumption. Don't assume I'm the one who doesn't know these things. Remember, it takes only a couple of minutes to pull up your glorious fail thread where you ranted on about IPC but didn't know what it meant :p

If you want to know how much of an effect TDP can have on cooling concerns, regardless of whether load power consumption is actually high, have a look at the Ivy CPUs labeled with a 95W TDP for OEMs rather than 77W.

In comparison, a system running a Llano A6 (which would be a good point of comparison if AMD managed to lower power consumption compared to the Llano A8) consumes around 110W. Adding those together, the Celeron + HD 6670 combo comes to about 120W. Assuming the Trinity A10 has lower power consumption than the Llano A8, that puts it at around 110W.

10 WHOLE WATTS! THIS CHANGES EVERYTHING!!!

That's 10 extra watts on a Llano; Trinity decreased power consumption significantly.

[charts: normalized battery life results]


That's normalized battery life as well. It skips ahead of SB as far as power consumption goes. The only place where it didn't push the bar on power consumption is video playback:

[chart: video playback battery life]


Although that depends on the review

[chart: video playback battery life (TechReport)]


TR shows that it was just a bit ahead of Llano. It's going to be more than 10 watts of savings.

Secondly, your prices are based on current retail prices. When OEMs buy the processors in 1k+ volumes they get deep discounts. Buying 1,000 6670s isn't going to net you a big discount on the GPU from an etailer, since those chips have already decreased in value tremendously since they were introduced. Opting to go with newer tech, a Trinity APU in this case, means larger savings for high-volume purchases, as well as the other benefits I've laid out here and above which you ignored :)
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
I was referring to the TDP, not power consumption. Don't assume I'm the one who doesn't know these things. Remember, it takes only a couple of minutes to pull up your glorious fail thread where you ranted on about IPC but didn't know what it meant :p

If you want to know how much of an effect TDP can have on cooling concerns, regardless of whether load power consumption is actually high, have a look at the Ivy CPUs labeled with a 95W TDP for OEMs rather than 77W.



That's 10 extra watts on a Llano; Trinity decreased power consumption significantly.

[charts: normalized battery life results]


That's normalized battery life as well. It skips ahead of SB as far as power consumption goes. The only place where it didn't push the bar on power consumption is video playback:

[chart: video playback battery life]


Although that depends on the review

[chart: video playback battery life (TechReport)]


TR shows that it was just a bit ahead of Llano. It's going to be more than 10 watts of savings.

Secondly, your prices are based on current retail prices. When OEMs buy the processors in 1k+ volumes they get deep discounts. Buying 1,000 6670s isn't going to net you a big discount on the GPU from an etailer, since those chips have already decreased in value tremendously since they were introduced. Opting to go with newer tech, a Trinity APU in this case, means larger savings for high-volume purchases, as well as the other benefits I've laid out here and above which you ignored :)

If they lowered the power consumption by what you showed then all that does is give credibility to the numbers I provided. Even if they're both Llano quad-cores, the A6 does consume less power than the A8.

And I don't care about OEMs... this is a computer enthusiast forum. OEMs are gonna go all wild over AMD's BS with their high core counts and high frequencies, which don't mean crap because IPC is still way too low, comparable to or lower than Llano's. Of course, the average Joe does not know this, so he just sees that the numbers for both frequency and cores are higher and says "OMG, THIS IS SO AWESOME!!!" OEMs are also gonna notice this and will charge higher prices than they need to.

I love how AMD was marketing against the MHz myth years ago when it was Intel in this situation. Now it's them releasing high-frequency, high-power-consumption, high-core-count CPUs because their current CPU engineers suck. The only thing I hope is that those same engineers don't get their hands on any Radeon product, because that'd be a disaster.

Anyway, to stop the rambling:

If you want high CPU performance on a budget, get a Core i3.
If you want high GPU performance on a budget, get a Celeron or Pentium + Radeon HD 6670 DDR5 or similar.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
I'm not sure this is true of Trinity/Piledriver. Unfortunately nobody's OCed one yet (that I know of) to find out.

Piledriver is but a mere refresh of Bulldozer. The IPC increase, from what I've seen, is limited to around 5%.

If we're to believe AMD then Steamroller will bring the big changes.
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
If they lowered the power consumption by what you showed then all that does is give credibility to the numbers I provided. Even if they're both Llano quad-cores, the A6 does consume less power than the A8.

Actually, it shows that Trinity would consume less power than a Sandy + discrete GPU solution. It already consumes less power than Sandy Bridge without a discrete GPU in the picture; adding a discrete GPU that doesn't have the hibernation ability offered by Optimus and the GCN architectures means power consumption would be even worse, both under load and at idle (where most computing is done).

And I don't care about OEMs... this is a computer enthusiast forum. OEMs are gonna go all wild over AMD's BS with their high core counts and high frequencies, which don't mean crap because IPC is still way too low, comparable to or lower than Llano's. Of course, the average Joe does not know this, so he just sees that the numbers for both frequency and cores are higher and says "OMG, THIS IS SO AWESOME!!!" OEMs are also gonna notice this and will charge higher prices than they need to.

OEMs care about the bottom line, and most people don't build their own PCs. I already said this is an enthusiast forum, but enthusiasts don't dictate the direction of the market anymore. If we did, we wouldn't have Ultrabooks or tablets, laptops wouldn't be outselling desktops at a 3:1 ratio, neither AMD nor Intel would be bothering with on-die graphics, and we wouldn't be heading towards efficiency rather than IPC + gigerjigglebitz. Yet these are the trends of modern computing. Choosing to ignore OEMs and the average consumer means turning a blind eye to where nearly all microarchitectures are headed.
 

sefsefsefsef

Senior member
Jun 21, 2007
218
1
71
Piledriver is but a mere refresh of Bulldozer. The IPC increase, from what I've seen, is limited to around 5%.

If we're to believe AMD then Steamroller will bring the big changes.

What does Steamroller have to do with Piledriver's overclocked power consumption? That's what I was talking about. Please stay focused.
 

ctsoth

Member
Feb 6, 2011
148
0
0
If they lowered the power consumption by what you showed then all that does is give credibility to the numbers I provided. Even if they're both Llano quad-cores, the A6 does consume less power than the A8.

And I don't care about OEMs... this is a computer enthusiast forum. OEMs are gonna go all wild over AMD's BS with their high core counts and high frequencies, which don't mean crap because IPC is still way too low, comparable to or lower than Llano's. Of course, the average Joe does not know this, so he just sees that the numbers for both frequency and cores are higher and says "OMG, THIS IS SO AWESOME!!!" OEMs are also gonna notice this and will charge higher prices than they need to.

I love how AMD was marketing against the MHz myth years ago when it was Intel in this situation. Now it's them releasing high-frequency, high-power-consumption, high-core-count CPUs because their current CPU engineers suck. The only thing I hope is that those same engineers don't get their hands on any Radeon product, because that'd be a disaster.

Anyway, to stop the rambling:

If you want high CPU performance on a budget, get a Core i3.
If you want high GPU performance on a budget, get a Celeron or Pentium + Radeon HD 6670 DDR5 or similar.

You talk like OEM CEOs and managers are complete idiots... You really have no idea how the market works; additionally, it seems you are unable to fathom the wants and interests of other people.

You hope the CPU engineers don't get their hands on Radeon products? What century are you stuck in? Have you paid attention at all to, I don't know, the processor at the center of this topic? Who cares that the first two iterations of 'APUs' aren't omgwtfbbqpwnz; they do what they are meant to do, and that is reach a very broad market at excellent pricing. My entire setup draws less power in a day than a single high-end graphics card. In complete silence.
 

The Alias

Senior member
Aug 22, 2012
646
58
91
Guys, I'm pretty sure a G530 would get slaughtered by Trinity. Just look at what the 5700 and 5800K do against an i3 here: http://www.tomshardware.com/reviews/a10-5800k-a8-5600k-trinity-apu,3241.html
In that whole review the SB i3-2100 was bested in all but 4 benchmarks (excluding Sandra).

Also, here: http://www.techspot.com/review/458-battlefield-3-performance/page7.html you can see Rvenger's complaint about dual-core CPUs bottlenecking in BF3, whereas the FX-4100 does not. In fact, its minimum framerate is actually higher than the 2500K's, but that's negligible.
 

Gigantopithecus

Diamond Member
Dec 14, 2004
7,664
0
71
If you want productivity you buy an i3-3220 instead. Much lower power consumption than the A10 and better CPU performance overall.

From the main site:
"In terms of its CPU performance, the A8-3870K trades blows with the Intel i3-2100 in real-world applications. (While we don't have the A8-3870K in Bench, that comparison of an A8-3850 with the i3-2100 should give you a good idea of how the A8-3870K compares to the i3-2100 since at stock clocks, the A8-3870K is only 100MHz faster than the A8-3850.) Intel CPUs that are slower than the i3-2100 either come close to the A8-3870K or simply can't keep up with it. In other words, in terms of CPU-centric tasks, AMD's $110 APU performs similarly to Intel's $110 CPU."

...Or do you know more about these matters than the editors of the site?

Maybe if you cared to even read. Seems like that's asking too much for some people here...

:rolleyes:

Do you personally have first-hand experience with the A8-3870K & A10? With both the SB & IB i3s? Have you used a G530 + 6670 DDR3 & DDR5 side by side with an A8 system? Have you benchmarked them for AnandTech? I have. I'd rather game on a G530 and a 6670. I'd rather do work on an A8. The difference is, for my needs (mostly productivity with some light gaming with less demanding titles), I'd much rather have an A8.

The bottom line is that a G530 + 6670 DDR3 does not 'smash' the A8. It's certainly better, but it's not smashingly better. And that advantage becomes much less meaningful at resolutions less than 1080p with less demanding titles like WoW and D3 where your eyes can't tell the difference in frame rates anyway. The trade-off in CPU capability to get better gaming performance, however, is absolutely 'smashing' - the A8 rolls the G530.

Anyone who says the i3-2120/3220s are better at productivity tasks than the A8/A10 is simply wrong. They go back and forth depending on the application, as Bench clearly illustrates. Anyone who says the Intel chips consume less power is also wrong, because AMD's A8 APU has lower idle consumption by 10-15% and higher load consumption by 10-15%. The A10 is lower at idle by about 15% and a bit (i.e. <10%) higher at load than the IB 3220.

Seems to me like, rather, you aren't very informed in this subject. Games care more about single-threaded performance than multi-threaded performance and that's a FACT that is easily corroborated.

This is also flat-out wrong. Some games absolutely benefit from having more cores and others don't. Hell, BF3 can choke even hex-core SB-E chips when it's really going. You should refrain from saying others aren't well-informed when you're the one making terribly erroneous blanket generalizations.
 

Rvenger

Elite Member, Super Moderator, Video Cards
Apr 6, 2004
6,283
5
81
Anyway, to stop the rambling:

If you want high CPU performance on a budget get a Core i3.
If you want high GPU perfomance on a budget get a Celeron or Pentium + Radeon HD 6670 DDR5 or similar.


This isn't an Intel debate, so you can start a new thread all about the G530 if you like. It's like you are here just to bash anything AMD and call everyone misinformed. I owned a G530 and 3 variants of Llano, and I know what performs better and what doesn't from first-hand experience. Next year, when they decide to start making more multithreaded titles, my Llano will still chug along like nothing happened, and all the Celeron and Pentium dual cores will hit a wall just like the old Semprons and Athlon 64s.



I am not going to sit here and start posting old, outdated review charts that show Intel in the lead in everything.
 

Centauri

Golden Member
Dec 10, 2002
1,631
56
91
Do you have any proof of this?

The parts I mentioned can fit in any conventional mid-tower case, even a micro-ATX case.

Also:
Celeron G530: $45
Radeon HD 6670 DDR5: $85
1x4GB DDR3-1333: $19
H61 motherboard: $60
Total: $209

AMD A10-5800K: $130
2x2GB DDR3-1600: $25
A55 motherboard: $60
Total: $214

Where is this supposedly higher platform cost? If you look online, prices say otherwise. If you want cheap gaming a Celeron G530 and an HD 6670 DDR5 will horribly smash an A10-5800K.

They both use next to no power, too.

I'm amazed that such a simple and obvious statement is going so far over your head.
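For reference, the totals from the price breakdown quoted above, worked out (these are the 2012 retail prices from the quoted post; they will obviously vary):

```python
# Build totals for the two configurations quoted above (prices from the post).
intel_build = {
    "Celeron G530": 45,
    "Radeon HD 6670 DDR5": 85,
    "1x4GB DDR3-1333": 19,
    "H61 motherboard": 60,
}
amd_build = {
    "AMD A10-5800K": 130,
    "2x2GB DDR3-1600": 25,
    "A55 motherboard": 60,
}

print(f"Celeron + 6670 build: ${sum(intel_build.values())}")  # $209
print(f"A10-5800K build:      ${sum(amd_build.values())}")    # $214
```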
 

Rvenger

Elite Member, Super Moderator, Video Cards
Apr 6, 2004
6,283
5
81
I'm amazed that such a simple and obvious statement is going so far over your head.


It even defeats the purpose of having good integrated graphics which is the purpose of this thread. lol
 

inf64

Diamond Member
Mar 11, 2011
3,884
4,692
136
I think it's useless to try and argue with that poster. Just let him believe his version, that a Celeron + discrete card is better, and be done with it.
 

meloz

Senior member
Jul 8, 2008
320
0
76
It's the OSS community and Linux. In fact, that's how your favorite distro was developed.

AMD is actually a lot more open than a lot of hardware vendors with regard to their approach. They're quite Linux-friendly.

You are ignorant beyond belief.
Watch some videos on YouTube by Linus or Greg and educate yourself on how the development works. Driver patches in the Linux kernel are all submitted by hardware vendors. The "community" then helps with testing and some bug fixing to make sure everything works together.

AMD released some documentation and said: "Guys, why don't you spend any free time you have and make us some drivers. We'll keep all the profits, though, when your drivers help sell our graphics cards."

People are always having a go at Intel on these forums (for not being as incompetent as AMD, mostly), but not even Intel have sunk that low. Intel make sure their hardware is well supported in the kernel. It is one of the reasons they are so dominant. AMD would do well to copy Intel here.
 

ctsoth

Member
Feb 6, 2011
148
0
0
You are ignorant beyond belief.
Watch some videos on YouTube by Linus or Greg and educate yourself on how the development works. Driver patches in the Linux kernel are all submitted by hardware vendors. The "community" then helps with testing and some bug fixing to make sure everything works together.

AMD released some documentation and said: "Guys, why don't you spend any free time you have and make us some drivers. We'll keep all the profits, though, when your drivers help sell our graphics cards."

People are always having a go at Intel on these forums (for not being as incompetent as AMD, mostly), but not even Intel have sunk that low. Intel make sure their hardware is well supported in the kernel. It is one of the reasons they are so dominant. AMD would do well to copy Intel here.

Should Linux driver devs be paid hardware royalties?
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
You are ignorant beyond belief. Watch some videos on YouTube by Linus or Greg and educate yourself on how the development works. Driver patches in the Linux kernel are all submitted by hardware vendors. The "community" then helps with testing and some bug fixing to make sure everything works together.

I was referring to the upstream flow with regard to testing. Most hardware vendors don't bother with Linux as far as drivers go, at least not compared to Windows, so this shouldn't surprise you. I don't like it either, but they only care about their bottom line, and currently Linux's market share doesn't justify the extra work they'd have to put in. Not to mention gaming on Linux hasn't changed in years, though Steam's current moves may shake that up a bit.

AMD released some documentation and said: "Guys, why don't you spend any free time you have and make us some drivers. We'll keep all the profits, though, when your drivers help sell our graphics cards."

People are always having a go at Intel on these forums (for not being as incompetent as AMD, mostly), but not even Intel have sunk that low. Intel make sure their hardware is well supported in the kernel. It is one of the reasons they are so dominant. AMD would do well to copy Intel here.

Here you wouldn't be comparing Intel and AMD but rather AMD and nVidia. As for that, here you go. And then there's their favoring of proprietary CUDA over OpenCL and their rather crappy OpenGL support. Thankfully Bumblebee's Optimus support works, though no thanks to nVidia.

I'd give Intel's RST a go on a Linux machine. Tell me how far you get. And when you're done with that give Quicksync a try.

Intel make sure their hardware is well supported in the kernel.

No.
 