AMD A10-5800K preview - iGPU side only


Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
I personally wouldn't buy the 7770 that you recommend; for me, a 7850 would be the minimum if I'm buying today. I plan to buy an A8-5600K for my dad, though.

I think you're confusing me with someone else. Other than saying "low end discrete video cards" I've not suggested any particular product.

You might be slightly disgusted at just how high I'd want to set the bar before I'd accept a card as low-end but suitable.
 

TerryMathews

Lifer
Oct 9, 1999
11,464
2
0
Price: 20% increase in price for 100% increase in performance: advantage discrete.

Size: Hmm, no computer I have ever added a discrete card to suddenly grew in size: advantage discrete.

Nope.

Your 20% is more like 30%, and the performance difference is irrelevant if it's not utilized.

Regarding size, I'm considering a slim microatx or mini itx, so a discrete card of the caliber the two of you are talking about would change both the case dimensions and the required psu.

I think it's worth noting that neither of you bothered to address my specific use case for Trinity. Trolling much?
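The price-versus-performance exchange above is easy to sanity-check with a quick perf-per-dollar calculation. A minimal sketch in Python; the $400 baseline and the relative performance figures are illustrative assumptions, not benchmarks:

```python
# Back-of-envelope value comparison using the thread's figures. All prices
# and relative performance numbers here are illustrative assumptions.

def perf_per_dollar(price, perf):
    """Relative gaming performance divided by total system cost."""
    return perf / price

# Assumed baseline: an APU-only build at $400 with relative performance 1.0.
apu = perf_per_dollar(400, 1.0)

# Adding a discrete card: +20% cost for +100% performance (first claim)...
discrete_20 = perf_per_dollar(400 * 1.20, 2.0)
# ...or +30% cost for the same +100% performance (the rebuttal's figure).
discrete_30 = perf_per_dollar(400 * 1.30, 2.0)

print(f"APU only:      {apu:.4f} perf/$")
print(f"Discrete +20%: {discrete_20:.4f} perf/$")
print(f"Discrete +30%: {discrete_30:.4f} perf/$")
```

Even at the rebuttal's +30% cost, doubled performance still wins on perf-per-dollar; the real dispute in the thread is whether that extra performance ever gets used.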
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
I don't hide my bias, but I suspect that you mistake my disdain for "budget" parts as disdain for AMD. If AMD were to produce superior products then I would be all for them! I'm interested in the performance, not who makes it, and it has to be good enough to actually do what I'm trying to do with it.

They don't have to produce superior products; they just need to be an alternative. The A10 is fighting i3's, not HD7770's.
 
Aug 11, 2008
10,451
642
126
They're also mostly 1366x768 :p The image quality and the "smoothness", aka response time, are equal to those of mobile discrete GPUs (FFS, look up. I've already posted that. Saying you've read what I've posted and actually reading it are two different things entirely). In both of those regards, Intel has a bit of work to do. Though a recent driver update has fixed some image quality issues that persisted with Ivy Bridge, they're still a bit away from discrete-level image quality.

So they're not bottom end anything. They're actually replacing mid-level mobile graphics cards already. AMD's 7660G compares roughly to a GT540m/GT630m while Intel is a bit behind, somewhere around the GT520m level.

Instead of going with your _____insert response here instead of reading the reviews_____, it might do you good to actually read the mobile reviews, specifically where the on-die GPU is concerned. The same scenario you're describing that you want on the desktop, discrete level performance that's actually respectable, is already there on laptops. You know... the segment of the market that makes up the majority of sales.

We are not talking laptops here. Laptops are difficult if not impossible to upgrade, and are very limited by power and cooling requirements. Desktops, on the other hand, are very easy to upgrade and have none of these limitations. Why would you want to live with a very mediocre to inadequate igp in a desktop when it is so easy to add a discrete card?
 

Abwx

Lifer
Apr 2, 2011
12,038
5,014
136
Will it display frames in sequence that approximate a game? Sure.

I made the assumption that most readers had the acuity to recognize that I mean that it would not display them quickly enough to create the illusion of smooth motion with reasonable resolutions and image quality.

Most of these chips will end up in SFFs or AIOs that are likely to have no upgrade path for GFX, so the average buyer will mainly have to choose between an AMD APU and an Intel SB/IB, both formulae with only the integrated GPU.

At this point the dilemma will be the same as in laptops, and as said ad nauseam, from my past experience the configurations with the better GFX did better on the durability front.
 

Abwx

Lifer
Apr 2, 2011
12,038
5,014
136
We are not talking laptops here. Laptops are difficult if not impossible to upgrade, and are very limited by power and cooling requirements. Desktops, on the other hand, are very easy to upgrade and have none of these limitations. Why would you want to live with a very mediocre to inadequate igp in a desktop when it is so easy to add a discrete card?

See above....
 
Aug 11, 2008
10,451
642
126
My logic is perfect. As for your logic, you only see one thing as being logical, and that's that AMD is crap.

I never said AMD was crap. You are putting words into my mouth. What I am saying is that any igpu in a desktop is barely adequate to run a limited number of games, and it makes no sense to not add a discrete gpu for 50 to 100 dollars that can easily double performance. I will stand by that.
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
We are not talking laptops here. Laptops are difficult if not impossible to upgrade, and are very limited by power and cooling requirements. Desktops, on the other hand, are very easy to upgrade and have none of these limitations. Why would you want to live with a very mediocre to inadequate igp in a desktop when it is so easy to add a discrete card?

You followed my responses but you didn't follow his responses.

It's the same exact chip in both; just the TDP and clocks are different. His claim that the GPU is "useless" is just flat out wrong. If you're looking at only the desktop arena, you're not looking at anything at all. Intel and AMD design desktop chips in mobile first, then push their clocks up and call it a desktop chip. Thus, stating that "it doesn't make sense" only means you don't understand how and for which market the chips were actually designed. And that's not just AMD here; Intel does the same thing. In fact, Intel started that trend, with AMD following years later; Llano was their first attempt.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
I never said AMD was crap. You are putting words into my mouth. What I am saying is that any igpu in a desktop is barely adequate to run a limited number of games, and it makes no sense to not add a discrete gpu for 50 to 100 dollars that can easily double performance. I will stand by that.

Who said it is an uber gaming machine that chews through everything you throw at it? What I've been trying to tell you is that it actually plays lots of games at reasonable settings. I game with my kids on my HTPC, but for the real stuff I only trust my 7970. And I only have a Llano APU.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
I think it's worth noting that neither of you bothered to address my specific use case for Trinity.

Probably because it's a loaded question, akin to asking if I've stopped beating small children.

You've said that your wife unequivocally cares nothing at all about image quality and frame rate.

There is literally no way to address that without causing extreme offense of one form or another, and as such I will abstain. Now you may feel that my abstinence means that you are correct and I am wrong; feel free to do so. However, I realize that, as you've dragged your wife into it, there is no harmonious path that can arise.

I'll give you some examples of the issues I could raise if this were a generality, issues that your wrapping the statement up with your spouse puts off-limits.

I could point out that the differences are obvious between them, and that given a choice between faster and prettier, anyone is going to choose faster and prettier. Since the topic is your wife, you could take offense and feel that I am suggesting that you don't know your spouse (or that she is less than what you feel she is). Alternatively, you could interpret it as my suggesting that you think so little of your spouse (about whom I know nothing, just as I know nothing about your relationship) that you are wrong.

No thank you, I'm not touching that land mine with a ten foot pole.
 
Aug 11, 2008
10,451
642
126
You followed my responses but you didn't follow his responses.

It's the same exact chip in both; just the TDP and clocks are different. His claim that the GPU is "useless" is just flat out wrong. If you're looking at only the desktop arena, you're not looking at anything at all. Intel and AMD design desktop chips in mobile first, then push their clocks up and call it a desktop chip. Thus, stating that "it doesn't make sense" only means you don't understand how and for which market the chips were actually designed. And that's not just AMD here; Intel does the same thing. In fact, Intel started that trend, with AMD following years later; Llano was their first attempt.

I fail to see what the origin of the chip has to do with it. My point is that a desktop is much easier to upgrade than a laptop, and has much greater power and thermal headroom. Is this not correct? Thus it is much easier to upgrade a desktop and not have to live with very mediocre performance. And btw, I would not be willing to accept a laptop with a GT540m as my main gaming machine either. That is a mid-to-low-end chip even for a laptop, and is quite old now.
 
Aug 11, 2008
10,451
642
126
Nope.

Your 20% is more like 30%, and the performance difference is irrelevant if it's not utilized.

Regarding size, I'm considering a slim microatx or mini itx, so a discrete card of the caliber the two of you are talking about would change both the case dimensions and the required psu.

I think it's worth noting that neither of you bothered to address my specific use case for Trinity. Trolling much?

Why would the performance difference not be utilized? We are not talking about moving from a GTX 670 to a GTX 680, or SLI, or something.

We are talking about moving from an igp that will run some games at low to moderate settings and some not at all, to something that will run everything at medium to high. How would that performance not be utilized?

I won't even address your name-calling.
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
My point is that a

Your mistake, right here: you don't quote my response to someone else and assume it's addressed to you.

So, because you weren't reading, let's try this again, a bit clearer this time for the oblivious:

His claim that the GPU is "useless" is just flat out wrong. If you're looking at only the desktop arena, you're not looking at anything at all.

LGA 2011 is server
LGA 1155 is laptop

AM3+ is server
FM1/FM2 is laptop

I don't need to explain this to you; you already know this. If you want to know why Trinity, or HD4000, or Ivy Bridge, or Haswell makes sense, you need to look at their intended market first. He didn't say that for desktop the IGP sucks and is pointless; he said that the same middle ground where Trinity's 7660D sits doesn't make sense. Of course it makes sense. He thinks it doesn't make sense because he's assuming, wrongly at that, that these chips are designed for the desktop.

Newsflash: we haven't had a single desktop chip in years.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
I am not assuming they are for desktop.

I made the statement in regards to gaming for all form factors.
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
I am not assuming they are for desktop.

I made the statement in regards to gaming for all form factors.

And that's exactly where you're wrong. Trinity's on-die GPU is a godsend for OEMs making laptops. Just like HD3000 was and just like HD4000 was. Haswell and Kaveri will be even better.

The A10-5800K is a strange chip that very few people will actually buy. It's got an unlocked multiplier, which makes it pretty much enthusiast-only, yet it's aimed at enthusiasts who won't buy discrete graphics cards? That's like 5 people.

The A10-5700 at 65W TDP is a great chip for OEMs. Now they can sell a thin all-in-one capable of gaming at decent frame rates at lower resolutions and completely bypass the discrete GPU. It also makes a very good HTPC chip.

You're still stuck in the mindset that everything must be made for the desktop for builders and enthusiasts. That's Neanderthal thinking. For AMD and Intel both, the desktop is an afterthought. They go where the money is, and currently that's elsewhere. Thus, to see where future architectures are going, you need to study market trends and current sales figures. Trinity makes sense because it makes sense for laptops. Ivy makes sense because it's great for laptops. Haswell goes one step below that and is specifically designed for Ultrabooks. If you don't want to take my word for it, listen to Anand's podcast where he goes into detail about that.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
You're still stuck in the mindset that everything must be made for the desktop for builders and enthusiasts. That's Neanderthal thinking. For AMD and Intel both, the desktop is an afterthought

That's not the case. I am stuck in the mindset that there are minimum levels of performance required for games, and being the fastest igpu available does not automatically place it above that bar.

It's still below that bar.

Oh, and by the way, the bar continually moves upwards as higher resolutions become more common.
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
That's not the case. I am stuck in the mindset that there are minimum levels of performance required for games, and being the fastest igpu available does not automatically place it above that bar.

It's still below that bar.

Oh, and by the way, the bar continually moves upwards as higher resolutions become more common.

There are several interesting takeaways from the latest graphics chip shipments and suppliers' market share data from Jon Peddie Research (JPR). The first one is that Intel continues to dominate the field with a commanding 59.1 percent share of the market, down from 60.4 percent sequentially but up from 52.5 percent one year prior. Intel's dominance is a testament to the concept of integrated graphics, which is an area NVIDIA ditched to focus on discrete graphics.

The second thing that's interesting is that AMD was the only major GPU player to increase its graphics chip market share sequentially. AMD ended the quarter with a 24.8 percent share, up from 23 percent in Q3. NVIDIA, meanwhile, dropped less than half a percentage point from 16.1 percent in Q3 to 15.7 percent in Q4.

No. Why not? Because the sales figures go against what you're stating. People don't buy computers with discrete GPUs. That number is steadily decreasing as IGP solutions become more common and more powerful. You can argue all you want about discrete gaming performance and what you want, but the fact is the numbers don't lie. Laptops outsell desktops by 2-to-1 or 3-to-1, and the vast majority of them don't have discrete GPUs.

Why don't I mention desktops? Because they're an afterthought in chip design.

-- that was at the beginning of the year. Expect those numbers to be even sharper in Intel and AMD's direction, and mainly due to their IGPs.

JPR says discrete GPU shipments declined 12 percent from last quarter and were down nearly 3.5 percent compared to last year.

:colbert:
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
No. Why not? Because the sales figures go against what you're stating. People don't buy computers with discrete GPUs. That number is steadily decreasing as IGP solutions become more common and more powerful.

Most PCs aren't for games. I never said that this (or any other available igpu) isn't just peachy for non-gaming use; in fact, I've explicitly stated (multiple times) that it and the Intel offering are both suitable for non-gaming. You can't bring up global PC sales and apply that to games.

If we're looking at that metric and pretending it tells us anything about the suitability of the product for games, we would have to accept that Intel must be the king of gaming GPUs. Obviously, this is not the case.
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
If we're looking at that metric and pretending it tells us anything about the suitability of the product for games, we would have to accept that Intel must be the king of gaming GPUs. Obviously, this is not the case.

It's a good thing Intel isn't focusing on offering more GPU performance then, isn't it?

Wait a minute...

To win this argument, you're going to have to design your own processor. I'm sorry.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
It's a good thing Intel isn't focusing on offering more GPU performance then, isn't it?

Wait a minute...


So... now your claim is that because Intel is looking at making faster igpus, this particular igpu that we are discussing is suitable for games?

I don't follow.
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
So... now your claim is that because Intel is looking at making faster igpus, this particular igpu that we are discussing is suitable for games?

I don't follow.

Oh...my...goodness.

You're right. Intel is just really into wasting die area needlessly. What else could it be for? I mean, HD3000 played 1080p video just fine. It didn't game all that well. I guess HD4000 played video like... 1000x better. Those videos were going at superfast speeds. I can't wait to see how quickly videos fly by with Haswell. And that compute power of those GPUs? WOW! The way they don't handle CUDA and OpenCL is utterly incredible.

You can't bring up global PC sales and apply that to games.

Yes you can. Look at the microarchitecture. Unless Intel is planning to play videos at hyperspeed, it's probably for gaming.

Just a thought. I guess you could be right and they're doing it to waste die space.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
Oh...my...goodness.

You're right. Intel is just really into wasting die area needlessly. What else could it be for? I mean, HD3000 played 1080p video just fine. It didn't game all that well. I guess HD4000 played video like... 1000x better. Those videos were going at superfast speeds. I can't wait to see how quickly videos fly by with Haswell.


I'm sorry, I have to really get to the bottom of this.

I am saying "this particular igpu is not good enough for gaming"

To provide evidence contrary to that, you say that Intel is looking at making better igpus.

In what world does Intel looking to make better igpus make the igpu we are talking about perform any differently than it does?

You seem to be arguing against a point that I am not making. I never said no igpu can ever perform acceptably in games (though they'll always be a pale shadow of discrete offerings). If that were what I was talking about, perhaps the Intel comment would make sense, but as is, it's like me saying "Bulldozer is slow" and you replying "Well, Intel is trying to make Haswell faster, so you're wrong". It makes no sense.
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
No, you said these things.

That's not the case. I am stuck in the mindset that there are minimum levels of performance required for games, and being the fastest igpu available does not automatically place it above that bar.

It's still below that bar.

Oh, and by the way, the bar continually moves upwards as higher resolutions become more common.

Higher resolution means bigger on-die GPUs. (Psst.. that part's important)

Why ignore it? Because the end result is not suited for what it's apparently being pushed as suitable for. I don't care how hard they tried, or how close you may feel they've come. This is neither horse-shoes nor hand-grenades.

...

This one's really good, though. I think it's the best pick of the bunch, personally:

My position is that those few who might accept this type of performance are outliers.

Apparently the "outliers" are in the majority.

No, I get what you're saying. You're assuming that there is no such thing as too slow because someone, somewhere might accept it.

You need to make things in the middle, sure, but don't expect people who want a fully baked product to buy them. If they had performance like a lower end gpu, I could see the argument that there could be some remote case where you might want to buy them, I guess, but right now, they are not a product suited for playing games.

People are buying these products that still suck at playing games. In fact, they're buying them at a far higher rate than those products meant specifically for playing games at high settings. As Intel and AMD get up there with IGP performance, the low end discrete cards disappear. Next step? Mid-range cards. After that? Your discrete GPU driving your 1080p monitor.

AMD and Intel are selling what you call "crappy IGP" chips at a high enough rate that discrete card sales are slipping and discrete cards are losing market share. Intel and AMD's IGPs won't be "crappy" forever. In fact, people bought these "crappy IGP" chips at pretty high rates, no?

Like I said, you're not going to win this argument. Your claims and statements reflect what enthusiasts and the small percentage of consumers like you think, and there aren't many of those people left. The sales figures reflect that: laptop sales outnumber desktop sales by a vast margin, and discrete GPU sales are falling.
 

Abwx

Lifer
Apr 2, 2011
12,038
5,014
136
I am saying "this particular igpu is not good enough for gaming"

I guess that you didn't game at the time of the first GeForce, nor in the following years, since the games were frankly not playable according to your standards up to 2008 or so...