suppose... (STARS fusion)

nonameo

Diamond Member
Mar 13, 2006
5,902
2
76
Suppose AMD has an APU that has a bad CPU but a good GPU. Is there any evidence or reasoning to suggest that they might build a video card that uses only the GPU part of the APU, or is this not possible at all?
 

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
Is there any evidence or reasoning to suggest that they might build a video card that uses only the GPU part of the APU, or is this not possible at all?

I am assuming you mean Hybrid CrossFire between a discrete card and the GPU part of the APU?


X-bit labs: But the GPU integrated into Llano APU consumes a lot less power than a standalone graphics processor...
Godfrey Cheng: Fairly, we have invested heavily to make sure that we do the right thing with power gating and so forth. The point is that when you plug in an AMD APU and an AMD GPU, you get better performance than you would with the same GPU and an Intel processor.
http://www.xbitlabs.com/articles/cpu/display/amd-fusion-interview-2010_5.html

I won't be surprised if they implement this, since they already have plenty of experience with Hybrid CFX and frankenfire setups.

Llano is going to be the last implementation of the STARS core; future versions will be based on either the Bobcat or Bulldozer microarchitecture.

[Image: amd_roadmap_2010.jpg]
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Do you mean bad as in not functional or broken?

If so, then that's a very technical question. They would have to design a board to fit this chip, since the chip is originally designed to go into a motherboard socket, not onto a dedicated PCB. It may or may not be possible; I really don't have an answer for that. And even if it is possible, it may or may not be economically feasible. Just about anything is possible, so the real question is whether it's feasible.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
Llano is supposed to have:

CPU performance of roughly a quad-core Athlon II X4 640, or better.
GPU performance of roughly a discrete Radeon 5600-series graphics card.

To put it into perspective, GPU performance on Llano is probably going to be 2-3x that of the i7-2600K's HD 3000 graphics.

It's not going to replace discrete cards in any foreseeable future; it's just meant to be really cheap and give good performance for its price.


Say you want to build a small PC and get:

GPU: AMD Radeon HD 5670, ~$64 on Newegg.
CPU: AMD Athlon II X4 640, ~$99 on Newegg.
= total cost of $64 + $99 = $163

With a Llano it could be:
APU (CPU/GPU in one): $99
= total cost of $99


Performance between the two would be almost the same; the difference is that the Llano would be the cheaper buy. That, I believe, is the idea of Fusion to some extent: to offer a lot of value for the price.

Again, for perspective: the i7-2600K's HD 3000 GPU is going to be about 1/3 as powerful, but the i7-2600K will cost about 3x as much. This isn't a problem, because most people who buy these CPUs probably do so for their CPU power, not for the GPU Intel put on them.
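The cost math above can be restated as a quick sketch (prices are the ones quoted in this post, circa 2011, purely for illustration):

```python
# Compare the two small-PC builds from the post: a discrete CPU + GPU pair
# versus a single Llano APU at an assumed $99 launch price.
discrete_build = {"Radeon HD 5670": 64, "Athlon II X4 640": 99}
llano_build = {"Llano APU (CPU + GPU in one)": 99}

discrete_total = sum(discrete_build.values())  # 64 + 99 = 163
llano_total = sum(llano_build.values())        # 99

# The APU route saves the full price of the discrete card.
print(discrete_total, llano_total, discrete_total - llano_total)  # 163 99 64
```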

Suppose AMD has an APU that has a bad CPU but a good GPU. Is there any evidence or reasoning to suggest that they might build a video card that uses only the GPU part of the APU, or is this not possible at all?

From reading the post above, I think I understand what you mean now.

Could the APU chips be put onto a discrete graphics card? Hmm.

But these are small, cheap chips. I'm guessing if they find a bad chip, they throw it in the bin instead of trying to save it or making some other product to put a partly functioning chip into.
 
Last edited:

Ares1214

Senior member
Sep 12, 2010
268
0
0
Llano is supposed to have:

CPU performance of roughly a quad-core Athlon II X4 640, or better.
GPU performance of roughly a discrete Radeon 5600-series graphics card.

To put it into perspective, GPU performance on Llano is probably going to be 2-3x that of the i7-2600K's HD 3000 graphics.

It's not going to replace discrete cards in any foreseeable future; it's just meant to be really cheap and give good performance for its price.


Say you want to build a small PC and get:

GPU: AMD Radeon HD 5670, ~$64 on Newegg.
CPU: AMD Athlon II X4 640, ~$99 on Newegg.
= total cost of $64 + $99 = $163

With a Llano it could be:
APU (CPU/GPU in one): $99
= total cost of $99


Performance between the two would be almost the same; the difference is that the Llano would be the cheaper buy. That, I believe, is the idea of Fusion to some extent: to offer a lot of value for the price.

Again, for perspective: the i7-2600K's HD 3000 GPU is going to be about 1/3 as powerful, but the i7-2600K will cost about 3x as much. This isn't a problem, because most people who buy these CPUs probably do so for their CPU power, not for the GPU Intel put on them.

From reading the post above, I think I understand what you mean now.

Could the APU chips be put onto a discrete graphics card? Hmm.

But these are small, cheap chips. I'm guessing if they find a bad chip, they throw it in the bin instead of trying to save it or making some other product to put a partly functioning chip into.

This sounds about right, but don't forget GPU acceleration. In some ways, the 3x more powerful GPU can make the CPU much faster.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
"In some ways, the 3x more powerful GPU can make the CPU much faster."

I had forgotten about that. I can see reviews where the compute power of the CPU is benchmarked; with GPGPU power in the mix, the Llanos will thrash the i7s in those.

But again, most people who buy a Sandy Bridge don't do it because of the onboard GPU; they buy an extra high-end discrete GPU to put in alongside it. So it's not really an issue. OEMs might not be happy about it though, the ones that sell PCs without additional discrete cards.

That said, CPU vs. CPU, an i7 will eat alive a Llano that doesn't have the GPGPU advantage.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
I believe the OP is saying to take the Fusion chip, the pure silicon, with a non-working CPU portion, stick that chip onto a graphics board, and use just the graphics portion of the die.

I don't think he's talking about replacing discrete cards. He's kind of alluding to the opposite: turning the Fusion chip into a discrete video card and leaving the CPU portion unused (since it's busted?).
 

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
I don't think he's talking about replacing discrete cards. He's kind of alluding to the opposite: turning the Fusion chip into a discrete video card and leaving the CPU portion unused (since it's busted?).

I see.. that makes a lot of sense.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
"I don't think he's talking about replacing discrete cards. He's kind of alluding to the opposite: turning the Fusion chip into a discrete video card and leaving the CPU portion unused (since it's busted?)."

Busted or not, it might hold merit. I'm not sure how much it costs to configure different fabs for different chips, or how much R&D it costs to make a unique graphics chip at a certain speed versus just reusing the same chip from the APUs. It's bound to cost something, but then again, so are the unused mm² that the CPU part takes up on the chip.

Or they could just take chips where they find some issue with the CPU part of the APU, and use the GPU on a discrete card.

It's actually an interesting idea, and something that had not occurred to me before reading this thread. I'm willing to bet AMD would do it if they think they can make more money this way, as they should; there's no use wasting a chip that could be used.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Well, I'd only see them do it if:

-It's not expensive (or impossible, as the OP is asking) to put Fusion onto a graphics board.

-They even have enough chips with a busted CPU and a functional GPU. If they don't have enough chips that meet these criteria, then it's probably not going to be worth their time.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Suppose AMD has an APU that has a bad CPU but a good GPU. Is there any evidence or reasoning to suggest that they might build a video card that uses only the GPU part of the APU, or is this not possible at all?

Definitely plausible, but the absolute yields of such a specific defect category are not going to be high enough to support the economics of actually building a business strategy and product lineup around the idea.

For an analogy that demonstrates this, you need look no further than the existing lineup of "fused" things.

When AMD integrated the memory controller, you did not see them sell chips that had a dead CPU but a good memory controller as some new discrete northbridge memory controller, right?

And likewise, when Intel first integrated the FPU back in the 486 days, you didn't see them sell a 486 DX-33 as a stand-alone x87 FPU when the integer CPU portion of the chip failed to function.

There are many things that can kill a CPU/GPU, but only a few that will kill solely the CPU portion in a way that leaves the chip still binnable and sellable as a GPU. They might get 2 or 3 such dice per wafer, not enough in total to actually build a product lineup from.
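The yield argument above can be sketched with a back-of-the-envelope Poisson defect model. Every number here is a made-up assumption for illustration (die counts, defect rates, and the salvage fraction are not real Llano figures):

```python
# Rough Poisson yield model: how many "dead CPU, working GPU" dice might
# come off one wafer. All parameters are illustrative assumptions.
import math

dice_per_wafer = 400       # assumed candidate dice per wafer
defects_per_die = 0.3      # assumed mean random defects landing on one die
cpu_fraction = 0.5         # assumed share of die area occupied by the CPU
gpu_fraction = 0.5         # assumed share occupied by the GPU

# Poisson model: P(a region is defect-free) = exp(-mean defects in region)
p_cpu_hit = 1 - math.exp(-defects_per_die * cpu_fraction)
p_gpu_clean = math.exp(-defects_per_die * gpu_fraction)

# Most defects that hit the CPU region also wreck shared structures
# (power delivery, I/O, interconnect), so only a small assumed fraction
# of those dice are cleanly salvageable as GPU-only parts.
cleanly_isolated = 0.05

salvageable = dice_per_wafer * p_cpu_hit * p_gpu_clean * cleanly_isolated
print(round(salvageable, 1))  # prints 2.4
```

With these (hypothetical) inputs the model lands on roughly two or three salvageable dice per wafer, which is the order of magnitude that makes a dedicated product line uneconomical.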
 

flexcore

Member
Jul 4, 2010
193
0
0
Definitely plausible, but the absolute yields of such a specific defect category are not going to be high enough to support the economics of actually building a business strategy and product lineup around the idea.

For an analogy that demonstrates this, you need look no further than the existing lineup of "fused" things.

When AMD integrated the memory controller, you did not see them sell chips that had a dead CPU but a good memory controller as some new discrete northbridge memory controller, right?

And likewise, when Intel first integrated the FPU back in the 486 days, you didn't see them sell a 486 DX-33 as a stand-alone x87 FPU when the integer CPU portion of the chip failed to function.

There are many things that can kill a CPU/GPU, but only a few that will kill solely the CPU portion in a way that leaves the chip still binnable and sellable as a GPU. They might get 2 or 3 such dice per wafer, not enough in total to actually build a product lineup from.

And I would think the GPU part of the APU is more likely to have issues than the CPU part anyway. So really, if anything, we would see a part with the GPU disabled. I would hope we don't see either of these, because that would mean AMD is having manufacturing problems. I hope that after all these delays they have everything ironed out.
 

dac7nco

Senior member
Jun 7, 2009
756
0
0
And likewise, when Intel first integrated the FPU back in the 486 days, you didn't see them sell a 486 DX-33 as a stand-alone x87 FPU when the integer CPU portion of the chip failed to function.

Strangely enough, the 487 was a full 486DX that, when installed, disabled the 486SX already present.

Daimon
 

nonameo

Diamond Member
Mar 13, 2006
5,902
2
76
Definitely plausible, but the absolute yields of such a specific defect category are not going to be high enough to support the economics of actually building a business strategy and product lineup around the idea.

For an analogy that demonstrates this, you need look no further than the existing lineup of "fused" things.

When AMD integrated the memory controller, you did not see them sell chips that had a dead CPU but a good memory controller as some new discrete northbridge memory controller, right?

And likewise, when Intel first integrated the FPU back in the 486 days, you didn't see them sell a 486 DX-33 as a stand-alone x87 FPU when the integer CPU portion of the chip failed to function.

There are many things that can kill a CPU/GPU, but only a few that will kill solely the CPU portion in a way that leaves the chip still binnable and sellable as a GPU. They might get 2 or 3 such dice per wafer, not enough in total to actually build a product lineup from.

Thanks, that makes sense.