
AMD A10-5800K preview - iGPU side only


Ferzerp

Diamond Member
Oct 12, 1999
Now you're going back and quoting out of context too much.

I said "THIS PRODUCT SUCKS AT PLAYING GAMES".

You are refusing to separate that from any other use of the product and trying to refute what I've said based on non gaming uses while I have repeatedly stated that this product, and all other igpus available new today are just fine for doing things other than gaming.

You are strawmanning me entirely.
 

pelov

Diamond Member
Dec 6, 2011
Now you're going back and quoting out of context too much.

I quoted your posts in context. It's not my fault your claims are all over the place.

Even your current argument sucks.

I said "THIS PRODUCT SUCKS AT PLAYING GAMES".

No it doesn't. It replaces a GT 540M, and that's a midrange GPU in a laptop. It also offers great performance on laptops, where the overwhelming majority run 1366x768.

Your argument isn't "THIS PRODUCT SUCKS AT PLAYING GAMES"; it's "THIS PRODUCT SUCKS AT PLAYING GAMES ON THE DESKTOP AT 1080p".

Yes. It does. Does that make it a crappy IGP? Or "unsuitable"? Worth ignoring the advancements and performance gains?

Why ignore it? Because the end result is not suited for what it's apparently being pushed as suitable for. I don't care how hard they tried, or how close you may feel they've come. This is neither horseshoes nor hand grenades.

And here's your actual post just to show you it's in context. Like I said too many times to count, you're still looking at the wrong segment of the market.

You are strawmanning me entirely.

I couldn't if I tried. You're all over the place. What argument would I possibly prop up? Who knows where you're going next.
 

alexruiz

Platinum Member
Sep 21, 2001
30s of fps is far from "good enough for gaming" on a PC.

What resolution and quality?

It doesn't magically become acceptable because it's a laptop. The stuttering performance on a laptop LCD is just as bad as it is on a desktop LCD.

30fps not good enough?
For a shooter, and if you are playing at a high level, OK, we'll accept it: 30 might not be enough (I know 30fps is OK for me, but maybe I am a lousy player). However, RTSs, MMOs, and other genres are perfectly fine at 30fps.

I am curious: what setup do you have? From how you have been talking, I am assuming you have an X79 platform with two HD 7970 GHz Editions in CrossFire (or dual GTX 680s in SLI) driving dual 30-inch 2560x1600 monitors. What do you play? What are your "real game" settings? 5120x1600 at 120fps, 8x MSAA, 16x AF? Or are you another one of those who lectures people about gaming but has a GTX 460 or HD 6850?

The other guys have already provided plenty of graphs showing several games running at good framerates. Your claim of the HD 7660G not being good enough is false. Guild Wars 2 and Skyrim run acceptably on an HD 6450; granted, you are playing on low, but you can play them. Diablo 3 and WoW also run very well. Most game developers code lightly so that a lot of machines can play their games; Metro 2033 and BF3 are the exception rather than the norm.

So, show us: what do you have, and what settings can you get? What settings are the minimum you want? I am curious.
 

Ferzerp

Diamond Member
Oct 12, 1999
I couldn't if I tried. You're all over the place. What argument would I possibly prop up? Who knows where you're going next.

What thread have you been reading? I have a singular, concise point. You keep adding to it, morphing it to fit the angle you want to come at it, whatever.

My point is that the product sucks for gaming.

No matter how many other claims you want to put into my mouth, it is never going to change. That is the claim. It won't change. You can claim I'm all over the place until you're blue in the face, but I don't think your definition of "all over the place" matches anyone else's in the world. I have to think, at this point, you're just screwing with me. Kudos, I guess.
 

pelov

Diamond Member
Dec 6, 2011
My point is that the product sucks for gaming.

It doesn't suck for gaming.

The desktop version sucks compared to desktop discrete GPUs that can be had for cheap. But what about laptops? You know, the focus of the architecture in the first place?

As for graphics performance, the comparisons are again a little clouded by the differences in laptop specifications. But we can still get a pretty good idea of what the AMD A10-4600M is capable of. One of the most popular titles of the moment, for instance, is the swords-and-sorcery epic known as Elder Scrolls V: Skyrim. It's a very expansive and very pretty game, so long as you use high-res texture packs. So configured, running high detail settings with antialiasing disabled, you're looking at a very healthy and playable 46 frames per second at 720p.
Up the ante to ultra quality, 4x antialiasing and 1,366 x 768 pixels and you'll still get a near-playable 25 frames per second. No previous integrated graphics core comes close to this. Even Intel's latest HD 4000 core as found in the new Ivy Bridge processor family is barely half as fast.

Add in the laptop sales figures and the diminishing discrete GPU sales figures (both laptop AND desktop, btw) and what do you get? A decent-performing GPU at 1366x768. Exactly what it was meant to do.
 
Aug 11, 2008
It doesn't suck for gaming.

The desktop version sucks compared to desktop discrete GPUs that can be had for cheap. But what about laptops? You know, the focus of the architecture in the first place?



Add in the laptop sales figures and the diminishing discrete GPU sales figures (both laptop AND desktop, btw) and what do you get? A decent-performing GPU at 1366x768. Exactly what it was meant to do.

Yes, but look at the title of the thread: A10-5800K preview...

That is a DESKTOP chip. I won't speak for Ferzerp, but my point through this entire thread has been that it makes no sense to game on the IGP of a desktop when a discrete card can increase performance so much for a very minimal cost compared to the cost of a system, buying games, and all the other things we spend disposable income on. You are trying to obfuscate the issue by bringing in laptops, non-gaming scenarios, and the general PC market.
 
Aug 11, 2008
Direct quote from the "AMD witchhunter" site:

"If you've been following the x86 processor wars lately, you might be surprised to see that the A10-5800K beats the Core i3-3225 in overall performance, while the A8-5600K ties it. (You probably aren't surprised to see the pitiful Pentium G2120 buried at the bottom left of the scatter plot.) "

[Image: trinity-scatter.png, the overall-performance scatter plot from the review]


You know what makes it more interesting? He used the MSI A85X motherboard. If you have been paying attention to what is going around (which I know you don't, but I had to ask anyway), you would already know that the MSI motherboard is, as of today, the slowest of the A85X mobos. Anand used the Gigabyte A85X mobo, and his results are lower than what others are getting with the same mobo. Scott, despite his emotional stance against AMD nowadays, is still tops at extracting performance out of the systems.

I can selectively quote from the same article as well:

"Forgive me if this sounds like our Llano review on replay, but it's hard to see where these Trinity APUs fit into the desktop PC landscape. Their 100W TDP disqualifies them from all-in-one systems, small-form-factor enclosures, and home theater PCs. Not being able to slide into those types of systems, where discrete graphics cards aren't practical, largely negates AMD's advantage in integrated graphics performance."

and

"The only remaining landing place for these 100W APUs is a budget desktop PC—but again, the Core i3-3225 will give you better single-threaded performance and lower power draw than the A10-5800K for just a few bucks more. I'm having a hard time envisioning a system guide build where the A10 makes more sense than the Core i3. What's the concept? A very budget desktop PC in which the user does a small amount of gaming with non-casual titles? That needle is hard to thread."

and

"We still think serious gamers will want to add discrete graphics cards to their systems. Expandability is one of the hallmarks of a desktop PC, after all. Even a relatively affordable card like the Radeon HD 7750, which sells for as little as $89 at Newegg, will run today's games competently at two-megapixel resolutions and decent quality levels. We're talking about a vastly superior experience to the one these integrated graphics processors can muster. What's more, the Radeon HD 7750's TDP is just 55W. Add that to the 55W of the Core i3-3225, and you've only exceeded the A10-5800K's max power rating by 10W. In terms of power draw, noise, and cooling demands, the two options will be practically the same."
 

alexruiz

Platinum Member
Sep 21, 2001
I can selectively quote from the same article as well:

"Forgive me if this sounds like our Llano review on replay, but it's hard to see where these Trinity APUs fit into the desktop PC landscape. Their 100W TDP disqualifies them from all-in-one systems, small-form-factor enclosures, and home theater PCs. Not being able to slide into those types of systems, where discrete graphics cards aren't practical, largely negates AMD's advantage in integrated graphics performance."

and

"The only remaining landing place for these 100W APUs is a budget desktop PC—but again, the Core i3-3225 will give you better single-threaded performance and lower power draw than the A10-5800K for just a few bucks more. I'm having a hard time envisioning a system guide build where the A10 makes more sense than the Core i3. What's the concept? A very budget desktop PC in which the user does a small amount of gaming with non-casual titles? That needle is hard to thread."

and

"We still think serious gamers will want to add discrete graphics cards to their systems. Expandability is one of the hallmarks of a desktop PC, after all. Even a relatively affordable card like the Radeon HD 7750, which sells for as little as $89 at Newegg, will run today's games competently at two-megapixel resolutions and decent quality levels. We're talking about a vastly superior experience to the one these integrated graphics processors can muster. What's more, the Radeon HD 7750's TDP is just 55W. Add that to the 55W of the Core i3-3225, and you've only exceeded the A10-5800K's max power rating by 10W. In terms of power draw, noise, and cooling demands, the two options will be practically the same."

I didn't quote it intentionally. While the numbers showed Trinity as the winner, Scott had to find a reason to disqualify it, as he is still on his "AMD witchhunt". How could you claim that a product cannot be recommended when your own numbers show it as better overall? Let's analyze the situation.

1) 100W TDP: TDP is the max rated wattage, meaning the APU will not go higher than that. Under what conditions will it be reached? Both CPU and GPU at full blast; Luxmark with OpenCL was one (CPU+GPU). What other app will do it? More importantly, what app in an HTPC environment will push it to the 100W number? Video decode? No, it won't: AMD GPUs have proper hardware acceleration, so the chip will sit near idle most of the time. Streaming? Not that either. Why didn't he also test power consumption in an HTPC task, like decoding some .mkv files?

2) The i3 has better single-threaded performance: You tell me, are the tasks usually encountered in an HTPC scenario single-threaded or multithreaded? You got it right: most of them are well threaded. Furthermore, with the A10 you get proper hardware acceleration, meaning the CPU doesn't have to get involved in the decoding. Try that on an i3.

3) Light gaming: You can add an HD 7750 to an i3 and get a better gaming machine, right? You can also add that HD 7750 to the A10 and get a similar gaming machine. Power consumption will be nowhere near the 100W max, as the iGPU will not be used. Did he test power consumption in that scenario? He didn't.

4) "But, but, it is still 100W TDP": Why not use the A10-5700, rated at 65W TDP? In his own words, the A10-5600K tied the i3-3225, so the A10-5700 will be faster, and at almost the same wattage as the i3. He tested only the 100W TDP parts.

5) You read this one, right? Anand's A10 in an HTPC environment. Unless I need new glasses, page 7 shows power consumption in HTPC duties. The max relevant number for power consumption is 62W (Blu-ray ISO playback from NAS). I see that 93W number, but that is because of the Blu-ray ODD, nothing to do with the A10. Oh, even better: he used the A10-5800K, the 100W part.


The conclusion given by Ganesh in Anand's review was exactly the opposite of what Scott said. Ganesh said Trinity makes a great HTPC chip, and he showed the numbers to prove it.
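(For reference, a quick Python tally of the TDP numbers from points 3 and 4 and from Scott's own quote. These are rated TDPs only, which are ceilings, not measured draw:)

Code:
# Rated TDPs in watts for the parts under discussion.
# TDP is a ceiling, not typical power draw.
a10_5800k = 100  # the 100W part Scott tested
a10_5700 = 65    # the 65W part he didn't test
i3_3225 = 55
hd7750 = 55

print(i3_3225 + hd7750 - a10_5800k)  # 10 -> the "10W over" figure from the quote
print(a10_5700 - i3_3225)            # 10 -> the A10-5700 is only 10W over the bare i3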
 
Aug 11, 2008
I didn't quote it intentionally. While the numbers showed Trinity as the winner, Scott had to find a reason to disqualify it, as he is still on his "AMD witchhunt". How could you claim that a product cannot be recommended when your own numbers show it as better overall? Let's analyze the situation.

1) 100W TDP: TDP is the max rated wattage, meaning the APU will not go higher than that. Under what conditions will it be reached? Both CPU and GPU at full blast; Luxmark with OpenCL was one (CPU+GPU). What other app will do it? More importantly, what app in an HTPC environment will push it to the 100W number? Video decode? No, it won't: AMD GPUs have proper hardware acceleration, so the chip will sit near idle most of the time. Streaming? Not that either. Why didn't he also test power consumption in an HTPC task, like decoding some .mkv files?

2) The i3 has better single-threaded performance: You tell me, are the tasks usually encountered in an HTPC scenario single-threaded or multithreaded? You got it right: most of them are well threaded. Furthermore, with the A10 you get proper hardware acceleration, meaning the CPU doesn't have to get involved in the decoding. Try that on an i3.

3) Light gaming: You can add an HD 7750 to an i3 and get a better gaming machine, right? You can also add that HD 7750 to the A10 and get a similar gaming machine. Power consumption will be nowhere near the 100W max, as the iGPU will not be used. Did he test power consumption in that scenario? He didn't.

4) "But, but, it is still 100W TDP": Why not use the A10-5700, rated at 65W TDP? In his own words, the A10-5600K tied the i3-3225, so the A10-5700 will be faster, and at almost the same wattage as the i3. He tested only the 100W TDP parts.

5) You read this one, right? Anand's A10 in an HTPC environment. Unless I need new glasses, page 7 shows power consumption in HTPC duties. The max relevant number for power consumption is 62W (Blu-ray ISO playback from NAS). I see that 93W number, but that is because of the Blu-ray ODD, nothing to do with the A10. Oh, even better: he used the A10-5800K, the 100W part.


The conclusion given by Ganesh in Anand's review was exactly the opposite of what Scott said. Ganesh said Trinity makes a great HTPC chip, and he showed the numbers to prove it.

I don't understand how you could quote it unintentionally. In any case, I agree the article seems to contradict itself, so I do not put much faith in its conclusions. I just didn't think it fair that you quoted the one part of the article that favored AMD while ignoring the rest.

I am still trying to figure out what "I didn't quote it intentionally" means.
 

alexruiz

Platinum Member
Sep 21, 2001
I don't understand how you could quote it unintentionally. In any case, I agree the article seems to contradict itself, so I do not put much faith in its conclusions. I just didn't think it fair that you quoted the one part of the article that favored AMD while ignoring the rest.

I am still trying to figure out what "I didn't quote it intentionally" means.

Major contradiction on his part, as you mentioned. It makes no sense that the numbers show it on top, yet it "cannot be recommended". Separation of facts (it performs better) and opinion ("I have a hard time recommending it...") and yes, you got me, some bias :oops:
 

Hacp

Lifer
Jun 8, 2005
I've done some analysis based on the Steam Video Card survey for GPUs.

35% of computers surveyed are better than the A10-5800K iGPU.
27% of computers surveyed are equivalent to or worse than the A10-5800K iGPU.
38% of computers surveyed were notebook chips, Intel integrated graphics, AMD integrated graphics, DX9 legacy cards, or labeled as "other".


It looks like a significant portion of the desktop market that uses dedicated cards is running solutions that are equivalent to or worse than the A10-5800K's GPU: around 45% of dedicated-GPU desktop users (27 of the 62 points identifiable as desktop cards). People do indeed play games, and can play games, with GPUs at the Trinity level. That discussion is pretty much settled.
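(If you want to check that arithmetic, here is a minimal Python sketch of how the ~45% figure falls out of the buckets above, assuming, as I did, that the first two buckets approximate the dedicated desktop pool:)

Code:
# Sketch of the share arithmetic behind the "around 45%" figure.
# Bucket percentages are the Steam survey totals quoted above.
better = 35.0          # % of surveyed machines better than the A10-5800K iGPU
equal_or_worse = 27.0  # % equivalent to or worse than it
other = 38.0           # % notebook / integrated / DX9 legacy / "other" (excluded)

# Assumption: the first two buckets approximate the dedicated desktop pool.
dedicated_pool = better + equal_or_worse      # 62.0
share = equal_or_worse / dedicated_pool       # 0.435...
print(f"{share:.1%} of dedicated desktop GPUs are at or below the A10-5800K")
# prints: 43.5% ... i.e. "around 45%"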

So what are your alternatives to the A10 if you're buying a whole new budget system to play Guild Wars 2 (or an equivalent game)? You can either:

1. Purchase a Celeron SB chip and a card equivalent to the 6670.
2. Purchase a Celeron SB chip and a card better than the 6670, such as the 7770.
3. Buy an i3-3220.


Of the three alternatives listed, option 2 is the only one that makes sense. The CPU side of the 5800K is generally better than the Celeron SB, and the i3-3220's graphics are far inferior to the 5800K's. The final decision on whether to go with option 2 or stick with an A10-5800K will come down to the games you want to play, the settings you wish to play at, and your overall budget.

With 45% of dedicated desktop GPUs being equivalent to or worse than the A10-5800K's GPU, it's safe to assume that a significant group of people, if placed in that situation, would opt for the A10-5800K.



Link to the formatted data I used:
http://www.streamfile.com/myid/YIwhAbPzxs4z
 

TerryMathews

Lifer
Oct 9, 1999
I've done some analysis based on the Steam Video Card survey for GPUs.

35% of computers surveyed are better than the A10-5800K iGPU.
27% of computers surveyed are equivalent to or worse than the A10-5800K iGPU.
38% of computers surveyed were notebook chips, Intel integrated graphics, AMD integrated graphics, DX9 legacy cards, or labeled as "other".


It looks like a significant portion of the desktop market that uses dedicated cards is running solutions that are equivalent to or worse than the A10-5800K's GPU: around 45% of dedicated-GPU desktop users (27 of the 62 points identifiable as desktop cards). People do indeed play games, and can play games, with GPUs at the Trinity level. That discussion is pretty much settled.

So what are your alternatives to the A10 if you're buying a whole new budget system to play Guild Wars 2 (or an equivalent game)? You can either:

1. Purchase a Celeron SB chip and a card equivalent to the 6670.
2. Purchase a Celeron SB chip and a card better than the 6670, such as the 7770.
3. Buy an i3-3220.


Of the three alternatives listed, option 2 is the only one that makes sense. The CPU side of the 5800K is generally better than the Celeron SB, and the i3-3220's graphics are far inferior to the 5800K's. The final decision on whether to go with option 2 or stick with an A10-5800K will come down to the games you want to play, the settings you wish to play at, and your overall budget.

With 45% of dedicated desktop GPUs being equivalent to or worse than the A10-5800K's GPU, it's safe to assume that a significant group of people, if placed in that situation, would opt for the A10-5800K.



Link to the formatted data I used:
http://www.streamfile.com/myid/VHT5coRT0eQF

Look, I don't care what no stupid survey says. I know that a majority of people have 7970s and 670s in their PCs, and you fanboys are just trying to slander Intel's good name by talking about this garbage iGPU.

Why would you even cite these Steam people? Who are they? Clearly I'm more of an expert than they are; I post on an internet forum. What have they ever done?

/ferzerp

On a more serious note, am I the only one excited to build a truly small-form-factor PC based on Trinity? It's bringing me back to the days of the VIA EPIA.

I want to make one small enough to mount to the VESA mount point on my LCD monitor.
 

dastral

Member
May 22, 2012
Just off Newegg:
Antec NSK3480 with PSU => 100$
Random Motherboard => 50$
A10-5800K => 130$
4GB RAM => 20$
HDD 320G => 70$
Acer 20" 1600*900 => 100$
Keyboard+Mouse => 30$
Windows Licence => 40$ OEM

So you can make a "basic gaming PC" for $540, monitor included... and that's with Newegg prices.
Obviously Dell/HP/Compaq & Friends can build it for less and sell it for $500 at retailers.
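(A quick Python sanity check on that total, using the prices listed above:)

Code:
# Sanity check on the build total, Newegg prices as listed above (USD).
parts = {
    "Antec NSK3480 with PSU": 100,
    "Motherboard": 50,
    "A10-5800K": 130,
    "4GB RAM": 20,
    "320GB HDD": 70,
    "Acer 20in 1600x900": 100,
    "Keyboard + mouse": 30,
    "Windows OEM license": 40,
}
print(sum(parts.values()))  # 540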

Of course, for $100 more you can get better, but that's 20% more...

But for $500 you COULD find a PC that will run "most popular games at medium quality".
The Steam census proves that gaming is not only about i7s and GTX 680s.

Would I build an A10 PC? Unlikely, because I know I can do better.
Would I recommend it for 14-year-old Timmy who wants to play WoW & D3 & L4D2 & Sims & TF2? Absolutely.
 

ShintaiDK

Lifer
Apr 22, 2012
Just off Newegg:
Antec NSK3480 with PSU => 100$
Random Motherboard => 50$
A10-5800K => 130$
4GB RAM => 20$
HDD 320G => 70$
Acer 20" 1600*900 => 100$
Keyboard+Mouse => 30$
Windows Licence => 40$ OEM

So you can make a "basic gaming PC" for $540, monitor included... and that's with Newegg prices.
Obviously Dell/HP/Compaq & Friends can build it for less and sell it for $500 at retailers.

Of course, for $100 more you can get better, but that's 20% more...

But for $500 you COULD find a PC that will run "most popular games at medium quality".
The Steam census proves that gaming is not only about i7s and GTX 680s.

Would I build an A10 PC? Unlikely, because I know I can do better.
Would I recommend it for 14-year-old Timmy who wants to play WoW & D3 & L4D2 & Sims & TF2? Absolutely.

A $50-60 Celeron/Pentium + discrete GFX will be much better at gaming.
 

Nemesis 1

Lifer
Dec 30, 2006
I quoted your posts in context. It's not my fault your claims are all over the place.

Even your current argument sucks.



No it doesn't. It replaces a GT 540M, and that's a midrange GPU in a laptop. It also offers great performance on laptops, where the overwhelming majority run 1366x768.

Your argument isn't "THIS PRODUCT SUCKS AT PLAYING GAMES"; it's "THIS PRODUCT SUCKS AT PLAYING GAMES ON THE DESKTOP AT 1080p".

Yes. It does. Does that make it a crappy IGP? Or "unsuitable"? Worth ignoring the advancements and performance gains?



And here's your actual post just to show you it's in context. Like I said too many times to count, you're still looking at the wrong segment of the market.



I couldn't if I tried. You're all over the place. What argument would I possibly prop up? Who knows where you're going next.

Your argument isn't sound at all. If I went back in the GPU section and reread many of your old ATI vs. NV posts, I could find you arguing 180 degrees from your present position that any iGPU is worthwhile at this time. To me it doesn't matter. If Haswell is better than Trinity, say by 20%, which I am sure it will be in the mobile space, then the debates are over, as the AMD fanbois have declared that Trinity is good enough for gaming. Whatever follows Trinity from AMD doesn't matter: Haswell beating Trinity will end this argument, since Haswell will be both a faster CPU and a faster iGPU, and whatever AMD does after Trinity doesn't change the fact that, by AMD fanboi decree, Trinity is good enough. So the gaming aspect will be removed from the debate.
 

TerryMathews

Lifer
Oct 9, 1999
I'm curious to see where Trinity prices settle. What we are seeing right now is top dollar. I think in a month the landscape is going to be different.
 

alexruiz

Platinum Member
Sep 21, 2001
A $50-60 Celeron/Pentium + discrete GFX will be much better at gaming.

Try playing BF3 MP on your "monster Pentium"...

While the Pentium might be faster in older, poorly threaded games, the A10 will be playable. Try telling me that BF3 MP will be playable on your "monster Pentium"... How about you show us some numbers of BF3 MP running acceptably on your "monster Pentium"?
The Pentium is simply unplayable in newer, well-threaded games. Recommending the Pentium is asking someone to flush money down the toilet.

You can also add the HD 7750 to the A10 and swap the 5800K for a 5600K.

Edit: Wait, it seems that DICE has updated the BF3 code. Now the "monster Pentium" is unplayable even in single player, and that is the newer IVB Pentium:
http://techreport.com/review/23662/amd-a10-5800k-and-a8-5600k-trinity-apus-reviewed/10

Edit 2: That means the minimum you need for a competent gaming machine is the i3 (with HT), and that is the same price as the A10s. Price parity.
 