
Richland & Kabini rumours

Page 37
Status
Not open for further replies.
"Better performance" because you picked a POS graphics card. With a HD7750 and the pentium you would get better gaming performance than trinity.

An HD7750 is $90-110 on Newegg, making the whole shebang far more expensive than the Trinity setup. And that Pentium is going to choke in a lot of games with only two threads. And it doesn't have AVX support enabled, unlike Trinity.

EDIT:

[Image: bf3-99th.gif - BF3 99th-percentile frame time chart]


From here: http://techreport.com/review/23662/amd-a10-5800k-and-a8-5600k-trinity-apus-reviewed/10

Modern engines are being optimised for more than two threads, and that Pentium seriously struggles.

EDIT 2: And yes, that's with a discrete GPU.
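For anyone unfamiliar with the metric in that TR chart: the 99th-percentile frame time is the time within which 99% of frames are rendered (lower is better), and it exposes stutter that average FPS hides. A rough sketch with made-up frame times:

```python
def percentile_frame_time(frame_times_ms, pct=99):
    """Frame time (ms) that `pct` percent of frames finish within (nearest-rank)."""
    ordered = sorted(frame_times_ms)
    idx = max(0, int(round(pct / 100 * len(ordered))) - 1)
    return ordered[idx]

# Hypothetical capture: mostly smooth ~16 ms frames plus a few stutter spikes.
times = [16.0] * 97 + [40.0, 55.0, 70.0]
print(percentile_frame_time(times))  # only the worst 1% of frames are slower than this
```

The average of that capture still looks fine, which is exactly why the 99th-percentile number is the more honest gauge of smoothness.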
 
An HD7750 is $90-110 on Newegg, making the whole shebang far more expensive than the Trinity setup. And that Pentium is going to choke in a lot of games with only two threads. And it doesn't have AVX support enabled, unlike Trinity.

EDIT:

[Image: bf3-99th.gif - BF3 99th-percentile frame time chart]


From here: http://techreport.com/review/23662/amd-a10-5800k-and-a8-5600k-trinity-apus-reviewed/10

Modern engines are being optimised for more than two threads, and that Pentium seriously struggles.

EDIT 2: And yes, that's with a discrete GPU.

Since we are into conjecture, those "modern engines" are also going to be more graphically demanding. Good luck running those on an IGP at any kind of decent resolution and image quality. And future titles will be even more demanding. Look at the Tom's Hardware article: the 5800K can't even run Skyrim at medium at 1680x1050 at 30 FPS. And believe me, medium on Skyrim is a big drop in image quality.

As far as price goes, I got an HD7770 for $90.00, so I am sure if you watch for a sale you can get the 7750 for less.
 
I would say he was wrong. Just look at 32->22nm.

It's not a valid comparison. A considerable amount of the savings comes from the FinFET design. On the following graph Intel attributes most of the low-voltage power savings to FinFET; the other end doesn't look so impressive now, does it?

[Image: power.jpg - Intel chart attributing 22nm low-voltage power savings to FinFET]


In addition, when comparing the IGP part, let's not forget that IVB's graphics processor is considerably more power efficient than SB's.
 
"Better performance" because you picked a POS graphics card. With a HD7750 and the pentium you would get better gaming performance than trinity.

You did see that the Trinity 5800K costs less than an Intel CPU + GT440 and yet performs better, with a lower TDP.

I'm sure that Trinity + HD7970 will smoke any Intel CPU paired with an HD7770 🙄
 
AMD stated that Bulldozer's CMT brought a per-thread performance deficit compared to two separate cores. That has nothing to do with IPC vs Hound.

Interesting. You are saying that single threaded performance went down as a trade off for better multi-threaded performance, and yet you don't correlate this with IPC. Care to explain your reasoning here?

CMT is far simpler; you don't have to worry about handling/switching registers etc. Look at how many tries it took Intel to get SMT right.

In your opinion, what are the merits of the Core architecture, and why does an Intel SMT core perform on par with an AMD module at lower clock speeds?
 
But this doesn't change the fact that Bulldozer is a very inefficient architecture in performance per area and performance per watt.

Not according to the AT review:

The FX8350 has almost the same performance/power efficiency as the Intel Core i5 2500K in heavy MT loads (both at 32nm).

http://www.anandtech.com/show/6396/the-vishera-review-amd-fx8350-fx8320-fx6300-and-fx4300-tested/6
[Image: power-95w.png - AnandTech power consumption chart]
[Image: 51142.png - AnandTech x264 encode time chart]


The FX8350 needs 900s to finish the job while consuming (900/3600) x 195.2 = 48.8Wh of energy.

The Core i5 needs ~1210s (Anand would need to give the exact number) to finish the job while consuming (1210/3600) x 111.3 = 37.4Wh.

So the FX8350 finishes 25.6% sooner (900s vs 1210s) and consumes roughly 30% more energy (48.8Wh vs 37.4Wh).
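The arithmetic in that post (average power x run time, watt-seconds converted to watt-hours) can be checked in a few lines; the wattages are the ones quoted from the AT charts, and the i5's run time is the poster's ~1210s estimate:

```python
def task_energy_wh(avg_watts, seconds):
    # Energy = average power x run time, converted from watt-seconds to watt-hours.
    return avg_watts * seconds / 3600

fx = task_energy_wh(195.2, 900)    # FX-8350: 195.2W average for 900s
i5 = task_energy_wh(111.3, 1210)   # i5-2500K: 111.3W average for ~1210s (estimated)

print(round(fx, 2), round(i5, 2))        # 48.8 37.41
print(round((fx / i5 - 1) * 100, 1))     # % more energy the FX burns for the task
print(round((1 - 900 / 1210) * 100, 1))  # % less wall-clock time the FX needs
```

Note the unit matters: the FX draws more instantaneous power *and* more total energy for the task here; the efficiency argument rests on how close those energy totals are despite the FX's much higher peak draw.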
 
Not according to the AT review:

The FX8350 has almost the same performance/power efficiency as the Intel Core i5 2500K in heavy MT loads (both at 32nm).

http://www.anandtech.com/show/6396/the-vishera-review-amd-fx8350-fx8320-fx6300-and-fx4300-tested/6

The FX8350 needs 900s to finish the job while consuming (900/3600) x 195.2 = 48.8Wh of energy.

The Core i5 needs ~1210s (Anand would need to give the exact number) to finish the job while consuming (1210/3600) x 111.3 = 37.4Wh.

So the FX8350 finishes 25.6% sooner (900s vs 1210s) and consumes roughly 30% more energy (48.8Wh vs 37.4Wh).
Is that seriously your argument? Look how awesome a fully-enabled Piledriver is! It can beat a disabled chip!

What a joke.

Then there's the fact that you're completely wrong about efficiency:
[Image: x264-power-total-energy.gif - TechReport x264 total energy chart]
[Image: x264-power-task-energy.gif - TechReport x264 task energy chart]


Give up already.

If you want to use FinFETs as an argument, you can also start with HKMG, SiGe and so on. There's always something new on that front.
Except those technologies you listed aren't fine-tuned for low power, whereas Ivy Bridge's FinFETs are.
 
Since we are into conjecture, those "modern engines" are also going to be more graphically demanding. Good luck running those on an IGP at any kind of decent resolution and image quality. And future titles will be even more demanding. Look at the Tom's Hardware article: the 5800K can't even run Skyrim at medium at 1680x1050 at 30 FPS. And believe me, medium on Skyrim is a big drop in image quality.

As far as price goes, I got an HD7770 for $90.00, so I am sure if you watch for a sale you can get the 7750 for less.

According to Anandtech's review, it runs it at 47.6fps. Likely different settings though.

http://www.anandtech.com/show/6332/amd-trinity-a10-5800k-a8-5600k-review-part-1/5

With that said, you're thinking in too much of a gamer's mindset. Having the graphical horsepower of a Radeon 7750/7770, while nice, simply isn't necessary to actually enjoy a game. Even new games can be played on lower level hardware. I don't primarily game on my laptop (which sports a Geforce 540m), but when I do, I can get the job done at a reasonable, enjoyable level.

But with that said, if you're buying a 5800K with the mindset to game on it full time, then you're better off budgeting your money elsewhere. People who buy the 5800K, whether they realize it or not, are purchasing it to get a cheap, capable computer. Buying an additional $100 piece of hardware not only defeats that purpose but contradicts the whole reason for the purchase.

I mean, let's be honest. The largest market for something like a 5800k is going to be HTPCs, all-in-ones (which are limited by space anyways), and people who go to Best Buy and purchase a cheap desktop computer. Not gamers.
 
You did see that Trinity 5800K cost less than Intel CPU + GT440 and yet performs better with lower TDPs.

Im sure that Trinity + HD7970 will smoke any Intel CPU paired with HD7770 🙄

Just another red herring to try to steer away from the truth. Of course an HD7970 will perform better than a 7770 in gaming, regardless of the CPU. That is an irrelevant comparison brought in to cloud the issue. In my post I did not mention either the HD7970 or HD7770.

If you read my post carefully, I said that a HD7750 plus pentium will perform better than A10-5800 without a discrete card for gaming. I stand by that statement.
 
Just another red herring to try to steer away from the truth. Of course an HD7970 will perform better than a 7770 in gaming, regardless of the CPU. That is an irrelevant comparison brought in to cloud the issue. In my post I did not mention either the HD7970 or HD7770.

If you read my post carefully, I said that a HD7750 plus pentium will perform better than A10-5800 without a discrete card for gaming. I stand by that statement.

It will also cost considerably more. 😕 (And you're one to talk about red herrings, when you use special offers to claim a lower price on parts.)
 
Just another red herring to try to steer away from the truth. Of course an HD7970 will perform better than a 7770 in gaming, regardless of the CPU. That is an irrelevant comparison brought in to cloud the issue. In my post I did not mention either the HD7970 or HD7770.

If you read my post carefully, I said that a HD7750 plus pentium will perform better than A10-5800 without a discrete card for gaming. I stand by that statement.

And still, an HD7750 + Pentium will cost more than the A10-5800K; that's why I said that Trinity + HD7970 will be faster than any Intel CPU + 7770.

Got it now?
 
Just another red herring to try to steer away from the truth. Of course an HD7970 will perform better than a 7770 in gaming, regardless of the CPU. That is an irrelevant comparison brought in to cloud the issue. In my post I did not mention either the HD7970 or HD7770.

If you read my post carefully, I said that a HD7750 plus pentium will perform better than A10-5800 without a discrete card for gaming. I stand by that statement.

You mentioned $50 - $100 earlier, now it's $90 - $100. However you *can* buy a $50 6670 and crossfire it with the 5800K.

Check out this - http://www.youtube.com/watch?v=lbvtxmb-PJ0

Suitably impressed I'm sure. Now how do you think that Pentium and 6670 will perform in the same title?

You mentioned Skyrim earlier right? How about this at high-max with the same setup? - http://www.youtube.com/watch?v=41HTKLjCw9s
 
"Better performance" because you picked a POS graphics card. With a HD7750 and the pentium you would get better gaming performance than trinity.

I don't think the point of the APU's is to compete with systems with discrete GPUs.

If you go to Costco, Best Buy, Sam's, or even Newegg, you'll find that the majority of systems on sale ship with nothing but integrated graphics.

Most users are not even going to think much about getting a discrete graphics card for their new computer, in the same way that most people don't go out and buy brand new rims and tires for a car they just bought. Some do, yes - the enthusiasts, but the vast majority do not.

Given that simple fact and the price points, most users are going to get a far better experience out of an A10-5800K than they will even out of a Core i3, and unless you buy a top-of-the-line i3 you're getting an HD 2500 (i3-3220), not the HD 4000 (i3-3225) bandied about here.

Talking about discrete GPUs with APUs is nonsensical to me, no question that if you're getting a discrete GPU then AMDs APUs are not a logical choice (unless you're getting a low end GPU like the 420 / 620 etc, which do sell quite a lot - and are a big fail compared to AMDs APU).

But I would think for an OEM or any vendor selling lower end systems where price is critical, the AMD APU's are kind of a no-brainer from a user experience perspective. The only reason they would not want to sell them imo is because they disrupt the price tiers at the lower level.


This is the one place where AMD should still be dominant, but because every reviewer and enthusiast insists on putting the APUs up against Intel machines with discrete graphics, they all conclude that it's slower. Yeah, the CPU is slower, but the overall platform is faster - when set up to reflect its target market.
 
For a new system build, Trinity is the better choice (over CPU + discrete) for OEMs (fewer parts, less power consumption, less heat etc.) and for DIY users.

If that were true then the OEM's would be buying a whole lot more AMD CPU's than they are.
 
I don't think the point of the APU's is to compete with systems with discrete GPUs.

If you go to Costco, Best Buy, Sam's, or even Newegg, you'll find that the majority of systems on sale ship with nothing but integrated graphics.

Most users are not even going to think much about getting a discrete graphics card for their new computer, in the same way that most people don't go out and buy brand new rims and tires for a car they just bought. Some do, yes - the enthusiasts, but the vast majority do not.

Given that simple fact and the price points, most users are going to get a far better experience out of an A10-5800K than they will even out of a Core i3, and unless you buy a top-of-the-line i3 you're getting an HD 2500 (i3-3220), not the HD 4000 (i3-3225) bandied about here.

Talking about discrete GPUs with APUs is nonsensical to me, no question that if you're getting a discrete GPU then AMDs APUs are not a logical choice (unless you're getting a low end GPU like the 420 / 620 etc, which do sell quite a lot - and are a big fail compared to AMDs APU).

But I would think for an OEM or any vendor selling lower end systems where price is critical, the AMD APU's are kind of a no-brainer from a user experience perspective. The only reason they would not want to sell them imo is because they disrupt the price tiers at the lower level.


This is the one place where AMD should still be dominant, but because every reviewer and enthusiast insists on putting the APUs up against Intel machines with discrete graphics, they all conclude that it's slower. Yeah, the CPU is slower, but the overall platform is faster - when set up to reflect its target market.

This is very true. And it's even more depressing when these shops tout a PC as a "gaming PC" because it has a few LEDs and a HD6450 installed. D:
 
techreport.com uses x264 HD 4.0; AT used the newer version, x264 HD 5.0.1.
http://techreport.com/review/23750/amd-fx-8350-processor-reviewed/2

Next time try to find the same application.

PS: x264 HD 4.0 needs more than 60 secs to finish the bench; I have no idea what techreport.com measured in those graphs.
Except the power consumption over time graphs are identical. There is no meaningful difference between those program versions.

Even if we assume that you're correct about efficiency (but let's not forget, as I've just demonstrated, that you're not), you've still ignored the other half of his argument involving performance in a given die space.
 
Talking about discrete GPUs with APUs is nonsensical to me, no question that if you're getting a discrete GPU then AMDs APUs are not a logical choice (unless you're getting a low end GPU like the 420 / 620 etc, which do sell quite a lot - and are a big fail compared to AMDs APU).

I will have to disagree with that. Have a look at the link below. The review uses a single GTX670, and the A10-5800K (FX4130) is on par with or better than the Core i3.

http://pctuning.tyden.cz/hardware/g...esoru-na-vykon-ve-hrach-od-phenomu-po-core-i7
 
Meh, I distrust Crossfire/Dual Graphics/SLI. Too flaky and driver-dependent, and not enabled in enough games.

Granted, however if you were going to spend another $50 on a discrete card, the 5800K simply blows away the Pentium (and the i3) at that level due to having the option of dual graphics.
 
If you want to use FinFETs as an argument, you can also start with HKMG, SiGe and so on. There's always something new on that front.

It's true that new stuff is being invented all the time; however, AFAIK older process nodes required much less effort to net additional energy savings.

Even with all these advancements, you'll get more and more transistors per mm^2 but smaller and smaller energy savings per transistor. This means you can't turn many more of them on at the same time.

Why do you think power gating is gaining prominence only now? Why did we get multi-core CPUs in the first place? Why is Intel integrating the GPU, NB and SB on die? Why are they adding quite large "hardwired" blocks like QuickSync to their mainstream CPUs?

EDIT:
Because the large majority of them won't be running at their peak capacity at any given time.
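The point about shrinking energy savings per transistor can be sketched with toy numbers (the scaling factors below are purely illustrative assumptions, not measured data): if each node doubles density but cuts energy per switch by only ~30%, the power needed to run the whole die at full tilt keeps climbing, so a growing fraction of transistors must stay gated off.

```python
# Toy model of the trend described above: ILLUSTRATIVE factors, not measured data.
density_gain = 2.0   # assume transistor count per mm^2 doubles each node
energy_scale = 0.7   # assume energy per transistor switch drops only 30% per node
budget = 1.0         # fixed power budget, normalised to the starting die

transistors, rel_energy = 1.0, 1.0
for node in range(1, 5):
    transistors *= density_gain
    rel_energy *= energy_scale
    full_power = transistors * rel_energy            # power if every transistor switches
    active_fraction = min(1.0, budget / full_power)  # share that fits the budget
    print(f"node {node}: full-tilt power x{full_power:.2f}, "
          f"only {active_fraction:.0%} can stay active")
```

Under these made-up factors the affordable active fraction falls every node, which is exactly the pressure that favours power gating, extra cores, and fixed-function blocks that sit dark until needed.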
 
And still, an HD7750 + Pentium will cost more than the A10-5800K; that's why I said that Trinity + HD7970 will be faster than any Intel CPU + 7770.

Got it now?

I "got" it. It is not a difficult concept to comprehend. You are still however making an apples to oranges comparison to try to show AMD in the best light.

And I never said the Pentium plus 7750 would be cheaper. But let's look at the numbers. The Pentium is about $80.00; the 7750, OK, I will assume $100.00, which is a very high estimate; the A10 is maybe $110.00. So a $70.00 difference, or maybe 15 to 20 percent of the cost of a system, IF you can find a well-priced A10. In gaming I would estimate at least a 50% improvement. So 20% more cost for a 50% improvement. Seems like a good trade-off to me.
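That cost arithmetic is easy to sanity-check. The component prices below are the poster's own rough estimates, and `base_system` is a hypothetical figure I've assumed for the rest of the build (case, board, RAM, PSU, drive):

```python
# Component prices are the poster's estimates; base_system is an assumed figure.
pentium, hd7750, a10 = 80.0, 100.0, 110.0
base_system = 350.0  # hypothetical cost of the rest of the build

cost_apu = base_system + a10
cost_combo = base_system + pentium + hd7750
extra_cost_pct = (cost_combo / cost_apu - 1) * 100

print(f"combo costs ${cost_combo - cost_apu:.0f} more "
      f"({extra_cost_pct:.1f}% of the APU system's price)")
```

With those inputs the Pentium + 7750 build comes out $70 dearer, around 15% of the system price, which is consistent with the poster's "15 to 20 percent" range (the exact percentage obviously depends on the assumed base-system cost).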
 
I "got" it. It is not a difficult concept to comprehend. You are still however making an apples to oranges comparison to try to show AMD in the best light.

And I never said the Pentium plus 7750 would be cheaper. But let's look at the numbers. The Pentium is about $80.00; the 7750, OK, I will assume $100.00, which is a very high estimate; the A10 is maybe $110.00. So a $70.00 difference, or maybe 15 to 20 percent of the cost of a system, IF you can find a well-priced A10. In gaming I would estimate at least a 50% improvement. So 20% more cost for a 50% improvement. Seems like a good trade-off to me.

Or you could use that money to upgrade from an HDD to an SSD instead, which would be the smarter option. Not to mention that Pentium system will be far worse in day-to-day use than the Trinity (i.e. spreadsheets, websites, that sort of thing)- it's a terribly unbalanced system aiming for gaming framerates above all else.

The A10 isn't the choice for a balls-out gaming nut, no. But it gives you a well balanced system which does everything well enough for most people.
 