Trinity review


frostedflakes

Diamond Member
Mar 1, 2005
7,925
1
81
Certainly interesting to at last see how the top-end Trinity performs... but that's by no means a complete picture as the rest of the line-up takes some pretty hefty hits. The A8-4500M and A10-4655M have roughly two-thirds the raw GPU power (whether or not that affects results much will depend upon how bandwidth starved the reviewed A10-4600M is) while the A6-4400M is at half and the 17W A6-4455M is at roughly 40%... Not to mention the only way they get that SKU down to its 17W TDP is by going with only a single module topping out at 2.6GHz turbo. It's going to be quite amusing to see how that compares with the 17W Ivy Bridge SKUs.
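Those ratios can be sanity-checked with back-of-the-envelope arithmetic, since peak single-precision throughput is just shaders × 2 ops/clock (MAD) × clock. The shader counts and top turbo clocks below are my assumption from launch spec listings, so treat them as approximate:

```python
# Rough peak GPU throughput for the mobile Trinity SKUs.
# Shader counts and max GPU turbo clocks (GHz) assumed from launch specs.
skus = {
    "A10-4600M (35W)": (384, 0.686),
    "A8-4500M (35W)":  (256, 0.655),
    "A6-4400M (35W)":  (192, 0.686),
    "A6-4455M (17W)":  (256, 0.424),
}

top = 384 * 2 * 0.686  # A10-4600M peak GFLOPS (MAD counts as 2 ops/clock)
for name, (shaders, ghz) in skus.items():
    gflops = shaders * 2 * ghz
    print(f"{name}: {gflops:5.0f} GFLOPS ({gflops / top:.0%} of A10-4600M)")
```

The resulting ratios (roughly 64%, 50%, and 41% of the A10-4600M) line up with the "two-thirds, half, roughly 40%" figures above.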
According to Wikipedia, 17W IB is supposed to have a 350MHz base GPU clock and between 1.05 and 1.15GHz turbo, compared to 650MHz base and 1.1-1.3GHz turbo for the 35-55W chips. It's definitely going to sacrifice a lot of performance to meet the 17W TDP as well, so I could see the 17W Trinity chips being very competitive in GPU performance. CPU performance will be no contest, though; a Piledriver module clocked in the mid-2GHz range isn't even going to compare to two IB cores w/HT at a high-1GHz/2GHz base and high-2GHz/low-3GHz turbo.

Anyway, not bad at all IMO, not a huge improvement over Llano but nothing to scoff at either, there's only so much you can do in only a year without transitioning to a new process node. Was especially worried about Piledriver, but it seems AMD has managed to improve performance and power consumption quite a bit compared to Bulldozer. They'll have to stay on top of their game and continue delivering now that Intel is taking integrated graphics seriously, though. HD 4000 improved a ton over HD 3000 and has nearly closed the performance gap between Intel and AMD. And it sounds like Haswell's graphics will be an even greater improvement over HD 4000 than it was over HD 3000.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Disappointing gaming performance. HD4000 saw a 30-50% REALIZED increase in performance, while Trinity is only 18% faster than Llano. They are pretty close to parity on GPUs, with AMD being slightly faster, but Intel has Quick Sync. A draw IMO. On the CPU side, Intel continues to spank Trinity. Desktop PD will continue to be disappointing, but hopefully will be more power efficient.

Disappointing ??? Intel 3720QM is a 45W Mobile CPU. Trinity A10-4600M is a 35W Mobile CPU. It is not only faster in Games than Intel, it is cheaper as well.

So to recap: a cheaper, lower-power AMD mobile CPU is faster (in games) than Intel's high-end mobile CPU, and you find it disappointing?? The A10-4600M is competing against Core i5, not Core i7.
The price/performance of Trinity A10-4600M is unbeatable as of now.

Mobile Haswell will probably be the first time we see Intel completely eclipse AMD in IGPs, based on this. Trinity will be AMD's mobile pony for a while now.

By the time Haswell is released, AMD will have released the third-generation APUs with GCN iGPUs on a new lithography process. Intel will be second best in that fight as well. ;)
 

frostedflakes

Diamond Member
Mar 1, 2005
7,925
1
81
Worth noting, though: the 35W parts should offer similar GPU performance. Unless Wikipedia is leading me astray, it looks like all the 35W parts will have the same 650MHz base clock, but the turbo clock will be slightly lower. The cheaper 35W Core i5 will also have less cache, which will probably hurt integrated GPU performance as well, although I don't know by how much.

http://en.wikipedia.org/wiki/Ivy_Bridge_(microarchitecture)#Mobile_processors
 

pcsavvy

Senior member
Jan 27, 2006
298
0
0
Not wanting to start a war of words over who's better, just my 2 cents here.
Trinity looks promising for those of us who are not :eek: gamers but would like a laptop that combines a good overall computing experience without spending a small fortune on overkill features or having to keep an eye out for the nearest electrical outlet.
I hope retail Trinity laptops come to market soon so there will be more reviews based on actual usage.
Some of us are more interested in getting the best bang possible for the lowest dollar possible, because that is all we really need or can afford: able to run all kinds of videos, play an occasional game, do homework or work, and run on a single charge for a full day or two.
For commuters: being able to take a laptop to work or school that's easy to carry around, without having to lug a power brick all the time.
 

Khato

Golden Member
Jul 15, 2001
1,305
383
136
According to Wikipedia, 17W IB is supposed to have a 350MHz base GPU clock and between 1.05 and 1.15GHz turbo, compared to 650MHz base and 1.1-1.3GHz turbo for the 35-55W chips. It's definitely going to sacrifice a lot of performance to meet the 17W TDP as well, so I could see the 17W Trinity chips being very competitive in GPU performance. CPU performance will be no contest, though; a Piledriver module clocked in the mid-2GHz range isn't even going to compare to two IB cores w/HT at a high-1GHz/2GHz base and high-2GHz/low-3GHz turbo.

No question that the 17W Ivy Bridge SKUs are going to be a bit less powerful than the 35W ones, but if it follows the trend of the Sandy Bridge SKUs then it won't even be a 10% difference. In fact, there's more variance to be found in the results on notebookcheck.net between notebooks based upon the same processor than there is between the 17W and 35W versions of Intel's HD 3000 graphics. (Yes, some amount of that is due to the startling number of systems that ship with only a single channel of memory populated.)

Anyway, my point is basically that Sandy Bridge didn't show a large sacrifice in graphics performance between the 35W and 17W SKUs and I don't see any reason why Ivy Bridge would be markedly different. The real question is how Trinity scales down - if it's memory bandwidth limited in many cases then the 17W part having under half the effective throughput might not be so bad. Just have to wait and see since there was no analysis done in that area.
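On the single-channel point above: the bandwidth penalty is easy to put a number on, since theoretical peak DDR3 bandwidth is just transfer rate × 8 bytes per 64-bit channel × channel count (a theoretical-peak sketch, not a measured figure):

```python
# Theoretical peak DDR3 bandwidth in GB/s:
# MT/s * 8 bytes per 64-bit channel * number of channels.
def ddr3_peak_gbs(mt_per_s: int, channels: int = 2) -> float:
    return mt_per_s * 8 * channels / 1000

print(ddr3_peak_gbs(1600, channels=2))  # dual-channel DDR3-1600 -> 25.6
print(ddr3_peak_gbs(1600, channels=1))  # single-channel -> 12.8
```

Shipping a notebook with only one SO-DIMM populated halves the iGPU's available bandwidth outright, which goes a long way toward explaining the spread in results between systems with the same processor.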
 

happysmiles

Senior member
May 1, 2012
340
0
0

meloz

Senior member
Jul 8, 2008
320
0
76
Quite underwhelming. :\

On the CPU side, IVB absolutely clobbers Trinity. People keep saying "CPU don't matter", but if you have an SSD the 'bottleneck' immediately shifts back to the CPU for certain tasks. It is always nice to have good single-thread performance, and Intel enjoys a huge advantage in this area.

On the iGPU side, the much derided and seemingly impotent HD 4000 manages to actually narrow the lead AMD enjoyed on the graphics side. If only Intel drivers were written by people who had any clue, they would get even better performance out of their silicon.

So it will all come down to pricing. This is as much in Intel's hand as AMD's. AMD can potentially sell a lot of these things, but at a price where they won't be making any profits.

Congratulations to JW on another fine review, but a very underwhelming new product from AMD, as I said earlier.
 
Mar 10, 2006
11,715
2,012
126
I'll be nabbing a Trinity laptop. My IBM T42 just isn't cutting it anymore. Would be really nice to be able to play games on a laptop...
 

KompuKare

Golden Member
Jul 28, 2009
1,232
1,603
136
Computerbase also have a review of this Asus with an A8-4500M (and that's what I was reading, since the AnandTech one still hasn't come through in RSS for me):
http://www.computerbase.de/artikel/notebooks/2012/bericht-amd-a8-4500m-trinity/
http://translate.google.co.uk/trans...y/&hl=en&safe=off&biw=1024&bih=607&prmd=imvns

Still not finished reading either of them, but for instance Computerbase only benched a few games, and the A8-4500M didn't do that well in them. Intel's iGPU may be poor, but HD4000 has two advantages: the age-old Intel process advantage and a better memory controller. AMD need to seriously improve the memory controller, but I doubt they have the capital or manpower to do so at this stage.

As it happens, Computerbase also recently did an article on IB and RAM speeds:
http://www.computerbase.de/artikel/arbeitsspeicher/2012/test-welchen-ram-fuer-intel-ivy-bridge/
http://translate.google.co.uk/trans...e/&hl=en&safe=off&biw=1024&bih=607&prmd=imvns

Intel's HD4000 (unlike Llano) doesn't seem too bothered about high-speed RAM. Obviously we'll have to wait for desktop Trinity reviews (unless some reviewer can overclock laptop RAM), but Trinity's iGPU will most likely be bandwidth starved again. I think this graph tells a lot:

[Image: trinity-vs-ivybridge-gaming-new.png]


Hm, disappointing overall I think. Unless AMD find some other markets (emerging markets etc.) I cannot see Trinity selling well if even in games it often ties with HD4000. This does not look good for competition in the CPU market - nor even for the GPU market, since GCN is gCompute-heavy precisely because of APU/Fusion, and Nvidia have a cheaper-to-make design because they're leaving gCompute to big Kepler.

Wonder how much more stuff Intel will lock down without any competition? Or Nvidia for that matter: GK104 already has limited overclocking, and with less competition they might feel emboldened to lock down more too: if you want 680 performance you might no longer be able to buy a 670 and overclock it.
 
Mar 6, 2012
104
0
0
I guess I must be one of the few who doesn't see a good increase on both CPU and GPU as underwhelming. If my Toshiba Satellite breaks (it's nearly falling apart) then I might very well end up getting a Trinity laptop.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
According to Anandtech

Trinity is:

20-25% slower than Intel CPUs

20%+ faster than IB in gaming

AMD is easily competitive with Intel on battery life. FINALLY

Conclusion
The improvements in Piledriver really appear to have saved Trinity. What was a very difficult to recommend architecture in AMD's FX products has really been improved to the point where it's suitable for mobile work. AMD couldn't push performance as aggressively as it would have liked given that it's still on a 32nm process and the APU needs to make money. A move to 2x-nm could help tremendously. Similarly the move to a more efficient VLIW4 GPU architecture and additional tuning helped give AMD a boost in GPU performance without increasing die size. Overall, Trinity is a very well designed part given the process constraints AMD was faced with.
As a notebook platform, Trinity's CPU performance isn’t going to set any new records but it’s certainly fast enough for most users; battery life isn’t at the head of the class, but it’s better than just about anything that doesn’t qualify as an ultrabook; and finally there’s the question of cost. That last item isn’t really in AMD’s control, as the final cost of a laptop is a product of many design decisions, so let’s do some quick investigation into laptop pricing.
At this point, AMD has done everything they can to provide a compelling mobile solution. The difficulty is that there's no longer a single laptop configuration that will be "best" for everyone, and Trinity only serves to further muddy the water. Intel continues to offer better CPU performance, and if you need graphics—which mostly means you want to play games—they have a good partner with NVIDIA. AMD on the other hand is delivering better integrated graphics performance with less CPU power, and depending on what you want to do that might be a more well rounded approach to mobile computing. What we need to see now are actual laptops and their prices. To trot out a tired old saying once more, "There are no bad products; only bad prices." Now it's up to AMD's partners to make sure Trinity laptops are priced appropriately.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
Quite underwhelming. :\

On the CPU side, IVB absolutely clobbers Trinity. People keep saying "CPU don't matter", but if you have an SSD the 'bottleneck' immediately shifts back to the CPU for certain tasks. It is always nice to have good single-thread performance, and Intel enjoys a huge advantage in this area.

On the iGPU side, the much derided and seemingly impotent HD 4000 manages to actually narrow the lead AMD enjoyed on the graphics side. If only Intel drivers were written by people who had any clue, they would get even better performance out of their silicon.

I agree with this. The problem AMD is facing is mainly in the CPU dept., where IMO it is bottlenecking the iGP unit. Imagine if Trinity's graphics had an i7 driving them in games? FPS would have been much higher.

Piledriver sucks for games and other cpu intensive tasks and as a result it's allowing HD 4000 to look better than what it really is.
 

zebrax2

Senior member
Nov 18, 2007
977
70
91
So has anyone done any comparison on how much faster the piledriver core is in comparison to BD?
 

MisterMac

Senior member
Sep 16, 2011
777
0
0
Being an Intel fanboy, I'm obviously coloured.


But am I the only one VASTLY concerned about how Trinity's GPU is faring?
HD4000 jumped significant percentages; Trinity... meh.

Let's remember they wanted to put the Haswell GT3 on IB - but chose to wait for whatever reason.

I'd say one is borderline naive to think a GCN APU somehow has the mass to make a 50%-or-above jump for the next generation.
Where is AMD's ace for making sure they keep the GPU lead for the next generation?
And certainly by an amount more significant than HD4000 vs Trinity.

We know Haswell will be a monster increase on the gpu side.



If intel gets both advantages - i fear for my cpu prices going the route of 28nm gpu's :\
 

KompuKare

Golden Member
Jul 28, 2009
1,232
1,603
136
I'd say one is borderline naive to think a GCN APU somehow has the mass to make a 50%-or-above jump for the next generation.

We know Haswell will be a monster increase on the gpu side.

If intel gets both advantages - i fear for my cpu prices going the route of 28nm gpu's :\

Exactly, much as I may like to support the underdog, I don't see myself buying a desktop Trinity. Also, since SB killed old-fashioned overclocking I simply haven't upgraded. While I don't question the value of Intel's 'K' series CPUs compared to the Extreme Edition or mid-level CPUs of years ago, my upgrade budget since the days of the Celeron 300A has been fairly consistent: about £50 for the CPU and the same for the mobo, and overclock both. Not possible on that budget anymore.

Regarding future Fusion APUs with GCN: well while GCN isn't bad, I've made the point a few times that it seems AMD used an unwise amount of the 28nm gain for gCompute. Therefore once GCN comes onboard with Kaveri, the gCompute will drag it down unless AMD can come up with a compelling use for gCompute - but then AMD generally has a poor reputation at getting software adopted (aside from AMD64).

The worst case would be that Kaveri will have a die which is 10-30% bigger than it needs to be, stuffed full of features which nobody wants. I just don't see the grand vision of Fusion making progress if its component parts (Bulldozer and GCN) are actually a step backwards in terms of efficiency. Bulldozer vs Stars certainly was a backward step, and while GCN is a good design, without the gCompute baggage it could have been far better (for instance, with reduced gCompute and that die space used elsewhere, Pitcairn might well have beaten GK104).
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
I think the reason the 3DMark gains are so much better than in actual games is the new Turbo Core on Trinity. The gain over Llano's 6620G is similar to going from a 5870 to a 6970. The iGPU Turbo is probably running at its 494MHz base rather than its 685MHz top a lot of the time.

In 3DMark11, the game tests run at single-digit frame rates and the CPU matters not a bit, so there's more headroom for the GPU. But in games it's running at playable frame rates and the CPU needs more power, thus less power for the iGPU.

That of course varies with how well multi-threaded the game is, too. It's the same with HD Graphics 4000's gain over HD Graphics 3000: less in games HD 3000 is already decent at, more in the ones it does badly at.
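The Turbo theory above is easy to put a number on: if the iGPU sits at its 494MHz base instead of the 685MHz top bin, and performance scales roughly linearly with clock (an assumption), that alone is a large swing:

```python
# GPU clock figures from the post above (MHz); linear scaling assumed.
base_mhz, turbo_mhz = 494, 685

headroom = turbo_mhz / base_mhz - 1
print(f"Clock headroom at full turbo: {headroom:.0%}")  # ~39%
```

So a workload that lets the iGPU sit at full turbo (light CPU load, as in 3DMark) can look nearly 40% better than a CPU-heavy game that keeps the iGPU near its base clock.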
 

Abwx

Lifer
Apr 2, 2011
11,897
4,879
136
Disappointing ??? Intel 3720QM is a 45W Mobile CPU. Trinity A10-4600M is a 35W Mobile CPU. It is not only faster in Games than Intel, it is cheaper as well.

;)

Nice catch, but useless with stubborn people...

Just look at the posts following yours, with people boasting about a 45W IB's graphics perf compared to a 35W Trinity, without even noticing that with 10 watts more that could be dedicated to the 7660 GPU, Trinity's graphics perf would increase by about 50%.....
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
for instance, with reduced gCompute and that die space used elsewhere, Pitcairn might well have beaten GK104).

It doesn't have to be that way. The upcoming dual 7970, aka the 7990, is rumored to have performance equal to the dual GK104, the GTX 690. Somehow the 7970 has lower power efficiency and performs worse than a single GK104, yet the dual version is on par with the dual one?

That also happened with Radeon 6970 vs GTX 580 and 6990 vs GTX 590 right?

Abwx said:
Nice catch, but useless with stubborn people...

Just look at the posts following yours, with people boasting about a 45W IB's graphics perf compared to a 35W Trinity, without even noticing that with 10 watts more that could be dedicated to the 7660 GPU, Trinity's graphics perf would increase by about 50%.....

http://www.anandtech.com/bench/Product/348?vs=327

Actually, in the case of Sandy Bridge, there's usually only a 10-20% difference in games between the 45W quad core and the 35W dual core.
 

KompuKare

Golden Member
Jul 28, 2009
1,232
1,603
136
It doesn't have to be that way. The dual 7970 aka 7990 coming is rumored to have performance equal to dual gk104, the GTX 690. Somehow they have lower power efficiency and performs less than single gk104, yet it is on par with dual versions?

That also happened with Radeon 6970 vs GTX 580 and 6990 vs GTX 590 right?

Guess time will tell on that one. The biggest problem with the 7970 was that it was released at such a low clock: Rory made some silly promise to financial analysts that AMD would ship 28nm GPUs in 2011. If they had kept that promise by shipping the 7750 while putting a bit more work into the 7970 and releasing it at 1GHz+, they'd be in a better position now. Cape Verde didn't have to run fast, while having your top-of-the-range running too slow has a trickle-down effect.

Nvidia are fast at responding to this kind of thing, AMD are not. Once NV knew the speed of 7970 they just had to overclock 680 to beat it. Maybe AMD do more validation, although you'd think after bumpgate Nvidia would be more cautious.

On topic: looking forward to seeing how the 100W desktop parts perform and whether AMD have come up with any compelling reason to go with an APU. Like hybrid CrossFire with ZeroCore (but then Trinity is VLIW4, so hybrid with what?). Or, and this would be new, get the iGPU to handle post-processing AA or physics when paired with a discrete GPU.
 

beginner99

Diamond Member
Jun 2, 2009
5,320
1,768
136
I'm with Alex here. I know no one who uses a laptop for gaming, and if they do game it's on consoles or their smartphone. So yeah, this is a niche market and 90% of users are best served with a fast dual-core and an SSD. And I use my desktop for gaming. Don't really see how playing an FPS "on the road" is possible or anything other than frustrating.
The thing the AMD defenders should not forget is that the one tested here is the top-end model. All the others will perform worse. And the top model wasn't the one readily sold at $500... in fact it was mostly in direct competition with i3 + discrete.
This thing also draws too much power in h.264 playback. Probably the thing people do most on a long flight...

Besides that, considering the fail BD was, CPU-wise I'm actually impressed it is a real upgrade and not a down- or side-grade.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Fixed that for ya :biggrin:

Good try Intel but again HD4000 is second best.

Do I look like I care about Midget MMA?
I don't care about IGPs either... a waste of die space for anything but office programs.
Give me a GPU... and cut the IGP from the CPU die; I don't need my CPU held back by a stupid IGP.

And if I wanted to retort, I would say that both Intel's and AMD's IGPs are too slow for serious gaming... but Intel makes AMD's encoding feature look like something that died a slow death in the corner...

You go all *beep* about useless die space... because it's ALMOST usable... kinda funny to observe.