ITT: We discuss future pricing and availability for AM3+ processors and mobos


Ramses

Platinum Member
Apr 26, 2000
2,871
4
81
Still seems like it'd be a lot easier and cheaper to leave the GPU separate. If you're gaming, you need a discrete GPU anyway; if you aren't gaming, a generic video card is cheap. I can see the general-population type buying a computer from a box store and being pleased they can play old games, or guys doing HTPC not having to take up the space for a video card, or for a server-type app, but beyond that I just don't see the attraction. There used to be plenty of boards with onboard video, and nobody wanted them unless they had a specific reason not to add a video card. What am I missing? Power savings for laptops? One of the reasons I bought an FX a couple of years ago in the first place was that it didn't have any video hardware built in that I wasn't going to use.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
The problem with AMD is that you cannot get their best CPU performance together with an iGPU. To get the best CPU performance, you have to either live with ancient motherboard graphics or add a discrete GPU. To get an iGPU, you have to sacrifice CPU performance. With Intel, you can pretty much choose whatever level of CPU performance you need and still get a usable iGPU with it.

Yes, Intel LGA 1150 processors have good iGPUs (even my G3258's iGPU is surprisingly capable for playing games).

But I think at this point it is best for AMD not to add a die-bloating iGPU to their big-core desktop CPUs.

Intel has the process-tech advantage to put large GT2 iGPUs on their desktop Core i3s (etc.), but AMD does not.

So I think by omitting the iGPU from the die (and reserving integrated graphics to the chipset for big-core desktop processors), AMD has a chance to catch up to Intel (in some ways) even if they are on an older node.

And as far as chipset integrated graphics go for AM3+, I don't see it being a problem for basic use. If a person wants to game, they can just add a dGPU, like most people with Core i3s do.

P.S. Kaveri and other big core APUs are best reserved for mobile IMHO (for reasons I have discussed throughout the thread).
 

turn_pike

Senior member
Mar 4, 2012
316
0
71
Still seems like it'd be a lot easier and cheaper to leave the GPU separate. If you're gaming, you need a discrete GPU anyway; if you aren't gaming, a generic video card is cheap. I can see the general-population type buying a computer from a box store and being pleased they can play old games, or guys doing HTPC not having to take up the space for a video card, or for a server-type app, but beyond that I just don't see the attraction.

I used to hold the same opinion.
However after being "forced" to buy a Kaveri, I found myself thinking that in a couple of years an APU might be all I need.

The games that I actually play for hours on end, like League of Legends, perform fine on Kaveri. While Kaveri is surely greatly underpowered for many other games, I can't help but think that if AMD could double or triple the iGPU performance in two years, then I would be very happy with it as my main box.

Gaming used to be an expensive hobby, but these days there are many great games from indie developers, plus F2P games that don't require a beefy GPU. Moreover, many newer titles will be ported over from consoles that are weak compared to flagship discrete GPUs. My hope is that the APU two years from now will be able to handle most games just fine.

Then I can splurge on stuff just as "useless" as a GPU ... like $500 headphones :p

PS
Actually I kinda question AT Forumgoers' definition of "gamers" or "serious gamers". The most popular games streamed on Twitch right now are LoL, Hearthstone, CS:GO, and Dota 2, and an APU is just fine for these games.
 
Last edited:

cbn

Lifer
Mar 27, 2009
12,968
221
106
Still seems like it'd be a lot easier and cheaper to leave the GPU separate. If you're gaming, you need a discrete GPU anyway; if you aren't gaming, a generic video card is cheap. I can see the general-population type buying a computer from a box store and being pleased they can play old games, or guys doing HTPC not having to take up the space for a video card, or for a server-type app, but beyond that I just don't see the attraction. There used to be plenty of boards with onboard video, and nobody wanted them unless they had a specific reason not to add a video card. What am I missing? Power savings for laptops? One of the reasons I bought an FX a couple of years ago in the first place was that it didn't have any video hardware built in that I wasn't going to use.

Yes, considering the CPU throttling issue on the A10 and A8 Kaveri APUs, I have to wonder how much better a value keeping the CPU and GPU separate would be for something like that big-box-store desktop you mentioned.

Maybe, if the value of separate dies (vs. an APU) is great enough, AMD could just bundle a Radeon R7 240 (or some other video card) with a low-power-binned AM3+ processor for the big-box-store slim-form-factor desktop OEMs (like Dell, Lenovo, HP, Acer)? In other use cases the AM3+ big-box-store desktop could just use chipset integrated graphics.
 
Last edited:

Ramses

Platinum Member
Apr 26, 2000
2,871
4
81
I used to hold the same opinion.
However after being "forced" to buy a Kaveri, I found myself thinking that in a couple of years an APU might be all I need.

The games that I actually play for hours on end, like League of Legends, perform fine on Kaveri. While Kaveri is surely greatly underpowered for many other games, I can't help but think that if AMD could double or triple the iGPU performance in two years, then I would be very happy with it as my main box.

Gaming used to be an expensive hobby, but these days there are many great games from indie developers, plus F2P games that don't require a beefy GPU. Moreover, many newer titles will be ported over from consoles that are weak compared to flagship discrete GPUs. My hope is that the APU two years from now will be able to handle most games just fine.

Then I can splurge on stuff just as "useless" as a GPU ... like $500 headphones :p

PS
Actually I kinda question AT Forumgoers' definition of "gamers" or "serious gamers". The most popular games streamed on Twitch right now are LoL, Hearthstone, CS:GO, and Dota 2, and an APU is just fine for these games.

Point taken.
Almost all I've ever played is FPS and the occasional space sim, which tends to blind me to less demanding games.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Actually I kinda question AT Forumgoers' definition of "gamers" or "serious gamers". The most popular games streamed on Twitch right now are LoL, Hearthstone, CS:GO, and Dota 2, and an APU is just fine for these games.

Regarding CS:GO, I am actually able to play it on a Xeon X3323, 2GB of RAM, and a GT 630 (GK208) at 1080p low with Linux Mint as the OS.

So yeah, basically hardware that costs about $85 on sale for the whole computer (see this post ---> http://forums.anandtech.com/showpost.php?p=36990967&postcount=11 )

However, Kaveri (on desktop) is a really expensive way of fixing a very low cost problem.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Regarding the gaming performance of the A8-7600: because of the Kaveri CPU throttling issue under iGPU load, I'd have to imagine even a Celeron G1820 plus a low-cost video card would beat it in performance.

This is especially true in an OEM situation where the A8-7600 is using DDR3-1600 RAM (even if it were a dual-channel kit).

Anyway, that is another reason why I don't like the big-core APUs on desktop. Mobile: yes. Desktop: no.
 
Last edited:

cbn

Lifer
Mar 27, 2009
12,968
221
106
K.A.V.E.R.I

Killing a very expensive Radeon iGPU

And here are the two ways the Kaveri APU hurts the performance of its very-expensive-to-integrate Radeon GPU:

1. CPU throttling.

2. Lack of memory bandwidth
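To put the bandwidth point in rough numbers, here is a quick sketch using the standard peak-bandwidth formula (transfer rate × bus width × channels). The DDR3 figures follow directly from the formula; the GDDR5 comparison is just an illustrative assumption about a typical budget 128-bit card of that era:

```python
# Peak DRAM bandwidth = transfer rate (MT/s) x bus width (bytes) x channels
def peak_bw_gbs(mt_per_s, bus_bytes=8, channels=2):
    """Theoretical peak bandwidth in GB/s (decimal GB), per 64-bit channel."""
    return mt_per_s * bus_bytes * channels / 1000

print(peak_bw_gbs(1600))   # DDR3-1600, dual channel -> 25.6 GB/s
print(peak_bw_gbs(2133))   # DDR3-2133, dual channel -> ~34.1 GB/s

# For comparison (assumed numbers): a budget GDDR5 card on a 128-bit bus
# at ~4.5 GT/s lands around 72 GB/s -- the iGPU sharing system DDR3 with
# the CPU is starved long before a cheap dGPU is.
print(peak_bw_gbs(4500, bus_bytes=16, channels=1))
```

Even the fastest desktop DDR3 kit leaves the iGPU with roughly half the bandwidth of a cheap discrete card, and the CPU cores are competing for that same pool.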
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Looking through the AM3+ Vishera line-up, all the chips (with the exception of the FX-4300, which has 4MB of L3 cache) have the full 8MB of L3 cache:

http://www.anandtech.com/show/6396/the-vishera-review-amd-fx8350-fx8320-fx6300-and-fx4300-tested

http://www.anandtech.com/show/8427/amd-fx-8370e-cpu-review-vishera-95w

However, looking at the die shot below, I'd imagine there is a lot of room for AMD to offer full octo-cores with just some (or even all) of the L3 cache disabled?

(With the L3 cache being so large, it would make sense to me that AMD is saving up chips that have eight perfectly good cores but a manufacturing defect in part of the L3 cache.)

[Vishera die shot]
 
Last edited:

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Looking through the AM3+ Vishera line-up, all the chips (with the exception of the FX-4300, which has 4MB of L3 cache) have the full 8MB of L3 cache:

http://www.anandtech.com/show/6396/the-vishera-review-amd-fx8350-fx8320-fx6300-and-fx4300-tested

http://www.anandtech.com/show/8427/amd-fx-8370e-cpu-review-vishera-95w

However, looking at the die shot below, I'd imagine there is a lot of room for AMD to offer full octo-cores with just some (or even all) of the L3 cache disabled?

(With the L3 cache being so large, it would make sense to me that AMD is saving up chips that have eight perfectly good cores but a manufacturing defect in part of the L3 cache.)

Cache always has built-in redundancy. There is more than 8MB of L3 physically on the chip: it's 8MB plus a bit extra that is normally deactivated, and which gets activated to stand in for any faulty parts of the cache, still giving a full 8MB.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Cache always has built-in redundancy. There is more than 8MB of L3 physically on the chip: it's 8MB plus a bit extra that is normally deactivated, and which gets activated to stand in for any faulty parts of the cache, still giving a full 8MB.

By how much do you think the L3 cache is over-provisioned?

2MB?

4MB?

It makes sense to me that with too much extra L3, the cost of each chip goes up too much, while with too little extra L3, AMD ends up with more chips than they wanted at a lower-than-desired spec.

With that mentioned, these Vishera dies have been in production for a long time now, maybe even longer than AMD expected when they originally did the die layout. So with this possibly longer-than-expected production run, I have to wonder how many extra dies with less than 8MB of working L3 cache have piled up by now (that haven't been used for the FX-4300).
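The salvage argument can be made a bit more concrete with a toy Poisson yield model. All the inputs below (defect density, the L3's share of the die) are assumptions for illustration only, not AMD data; the ~315 mm² figure is the commonly published Vishera die size:

```python
import math

def good_fraction(defect_density_per_cm2, area_cm2):
    """Poisson yield model: P(zero defects in a region) = exp(-D * A)."""
    return math.exp(-defect_density_per_cm2 * area_cm2)

die_area = 3.15   # Vishera is roughly 315 mm^2 = 3.15 cm^2
l3_share = 0.25   # assumption: L3 occupies about a quarter of the die
d0 = 0.3          # assumption: 0.3 random defects per cm^2

fully_good = good_fraction(d0, die_area)
# Dies with at least one defect in the L3 region but none elsewhere --
# candidates for a reduced-L3 (or L3-disabled) full octo-core part:
salvageable = (1 - good_fraction(d0, die_area * l3_share)) * \
              good_fraction(d0, die_area * (1 - l3_share))

print(f"fully good: {fully_good:.1%}")
print(f"salvageable via L3 disable: {salvageable:.1%}")
```

With these made-up numbers, roughly one die in ten would have all eight cores intact but a defect somewhere in the L3, which is exactly the bin such a hypothetical SKU would soak up.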
 
Last edited:

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
By how much do you think the L3 cache is over-provisioned?

2MB?

4MB?

It makes sense to me that with too much extra L3, the cost of each chip goes up too much, while with too little extra L3, AMD ends up with more chips than they wanted at a lower-than-desired spec.

With that mentioned, these Vishera dies have been in production for a long time now, maybe even longer than AMD expected when they originally did the die layout. So with this possibly longer-than-expected production run, I have to wonder how many extra dies with less than 8MB of working L3 cache have piled up by now (that haven't been used for the FX-4300).

No way that much. Probably a few KB (2-12 KB or something like that, arranged in sections).
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
No way that much. Probably a few KB (2-12 KB or something like that, arranged in sections).

Wow, that is a very, very small amount.

And let me guess: that ultra-tiny amount of extra is all it takes to guarantee 8MB of L3 cache on nearly every die?
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Wow, that is a very, very small amount.

And let me guess: that ultra-tiny amount of extra is all it takes to guarantee 8MB of L3 cache on nearly every die?

Bulldozer, for instance, used a 64-byte cache line for L1, L2, and L3. That means every time the CPU accesses something in cache, it reads off those 64 bytes and selects the important bits. Thus really only these 64-byte cache lines need redundancies. Higher-order cache structures are likely duplicated to some extent as well.

I'm not sure of the exact amount of redundant logic needed for cache, but it is likely very low: well below 1%, probably below 0.1%, of the total L3 cache size. Cache is simply very easy to make (generally one of the first things built when developing a new process), and if you are having problems with cache that runs at 2.2 GHz when the CPU hits 4 GHz, then you are not going to be producing a working CPU at all.

I'm not sure of the exact specifications, and if someone wants to step in and correct me please do, but I do know that cache redundancies are nowhere near 25-50%.
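The row-redundancy idea described above is easy to sketch in code. This is a hypothetical toy model (the class and all the numbers are illustrative, not anything AMD documents): the SRAM array ships with a handful of spare rows, and at test time fuses remap any faulty rows onto spares, so almost every die still presents the full capacity:

```python
class RedundantCacheArray:
    """Toy model of row redundancy in an SRAM array.

    `rows` logical rows are backed by `rows + spares` physical rows;
    rows found faulty at wafer test are fused over to spare rows.
    """
    def __init__(self, rows, spares, faulty_rows=()):
        if len(faulty_rows) > spares:
            raise ValueError("not enough spare rows; die gets binned down")
        spare_ids = iter(range(rows, rows + spares))
        self.remap = {bad: next(spare_ids) for bad in faulty_rows}
        self.rows = rows

    def physical_row(self, logical_row):
        # A faulty row is transparently redirected to its spare.
        return self.remap.get(logical_row, logical_row)

# 8 MB of L3 in 64-byte lines = 131072 lines; a handful of spare rows
# covers the typical point defects, so the part still sells as full-8MB.
l3 = RedundantCacheArray(rows=131072, spares=16, faulty_rows=(5, 4242))
print(l3.physical_row(5), l3.physical_row(6))   # 131072 6
```

The point of the sketch is the ratio: 16 spare lines out of 131072 is about 0.01% extra capacity, which lines up with the "well below 1%" estimate above.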