Is there any evidence/reasoning to suggest that they might build a videocard that uses only the GPU part of the APU or is this not possible at all?
http://www.xbitlabs.com/articles/cpu/display/amd-fusion-interview-2010_5.html
X-bit labs: But the GPU integrated into Llano APU consumes a lot less power than a standalone graphics processor...
Godfrey Cheng: Fairly, we have invested heavily to make sure that we do the right thing with power gating and so forth. The point is that when you plug in an AMD APU and an AMD GPU, you get better performance than you would with the same GPU and an Intel processor.
Llano is supposed to have:
CPU performance of roughly a quad-core Athlon II X4 640 or better.
GPU performance of roughly a discrete Radeon 5600-series graphics card.
To put it into perspective, GPU performance on Llano is probably gonna be 2-3x as fast as the HD 3000 graphics on an i7 2600K.
It's not gonna replace discrete cards in any foreseeable future... it's just meant to be really cheap and give good performance for its price.
Say you want to build a small PC and get:
GPU: AMD Radeon HD 5670 ~ $64 on Newegg.
CPU: AMD Athlon II X4 640 ~ $99 on Newegg.
= total cost of $64 + $99 = $163
With a Llano it could be:
APU (CPU/GPU in one): $99
= total cost $99
Performance between the two would be almost the same; the difference is that the Llano would be the cheaper buy. That, I believe, is the idea of Fusion to some extent: to offer a lot of value for the price.
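The cost comparison above is simple arithmetic; here it is as a tiny sketch using the prices quoted in this thread (~2011 Newegg figures, and the Llano price is the thread's guess, not an announced number):

```python
# Prices quoted in the thread (Newegg, ~2011); the Llano price is an
# assumption from the thread, not an announced figure.
discrete_build = {"Radeon HD 5670": 64, "Athlon II X4 640": 99}
apu_build = {"Llano APU (CPU + GPU in one)": 99}

discrete_total = sum(discrete_build.values())
apu_total = sum(apu_build.values())

print(f"discrete build: ${discrete_total}")               # $163
print(f"APU build:      ${apu_total}")                    # $99
print(f"saved:          ${discrete_total - apu_total}")   # $64
```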
Again, for perspective: the i7 2600K's HD 3000 GPU is gonna be ~1/3 as powerful, but the i7 2600K will cost ~3x as much. This isn't a problem, because most people who buy these CPUs probably do so for their CPU power, not the GPU Intel puts on them.
from reading the guy above, I think I understood what you mean now..
Could the APU chips be smacked onto a discrete graphics card...? Hmmm.
But these are small, cheap chips. I'm guessing if they find a bad chip, they throw it in the bin instead of trying to save it or make some other product to put a partly functioning chip into.
I don't think he's talking about replacing discrete cards. He's kinda alluding to the opposite: turning the Fusion chip into a discrete video card and keeping the CPU portion unused (since it's busted?)
"I don't think he's talking about replacing discrete cards. He's kind alluding to the opposite: Turning the Fusion chip into a discrete video card, and keeping the CPU portion unused (since it's busted?)"
Suppose AMD has an APU that has a bad CPU but good GPU. Is there any evidence/reasoning to suggest that they might build a videocard that uses only the GPU part of the APU or is this not possible at all?
Definitely plausible, but the absolute yields of such a specific defect category are not going to be high enough to support the economics of actually building a business strategy and product lineup around the idea.
To give you an analogy as proof of this, you need look no further than the existing lineup of "fused" things.
When AMD integrated the memory controller, you did not see them sell chips that had dead CPUs but good memory controllers as some new discrete northbridge memory controller, right?
And likewise, when Intel first integrated the FPU back in the 486 days, you didn't see them sell a 486 DX-33 as a stand-alone x87 FPU when the integer CPU portion of the chip failed to function.
There are so many things that can kill a CPU/GPU, but only a few that will kill solely the CPU portion in a way that leaves the chip still binnable/sellable as a GPU. They might get 2 or 3 per wafer, not enough in total to actually build a product lineup from.
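The yield argument above can be put in rough numbers with a toy Poisson defect model. Every figure here (die area, defect density, the share of the die that is CPU-only, and especially the fraction of CPU-region defects that would still leave a cleanly binnable GPU) is a made-up assumption for illustration, not AMD data:

```python
import math

# Hypothetical 32 nm-era numbers -- assumptions, not AMD data.
wafer_diameter_mm = 300
die_area_mm2 = 228        # rough Llano-class die size (assumption)
defect_density = 0.4      # defects per cm^2 (assumption)
cpu_fraction = 0.35       # share of the die that is CPU-only logic (assumption)
binnable_fraction = 0.05  # CPU-region defects that leave a sellable GPU (assumption)

defects_per_die = defect_density * die_area_mm2 / 100.0  # cm^2 -> mm^2

# Poisson yield model: P(region is defect-free) = exp(-lambda_region)
p_die_clean = math.exp(-defects_per_die)
p_gpu_clean = math.exp(-defects_per_die * (1 - cpu_fraction))
# "defect landed somewhere in the CPU region, GPU region clean"
p_cpu_only_hit = p_gpu_clean - p_die_clean

dies_per_wafer = int((math.pi * (wafer_diameter_mm / 2) ** 2) / die_area_mm2)
salvageable = dies_per_wafer * p_cpu_only_hit * binnable_fraction

print(f"dies per wafer:           ~{dies_per_wafer}")
print(f"fully good dies:          ~{dies_per_wafer * p_die_clean:.0f}")
print(f"CPU-dead, GPU-sellable:   ~{salvageable:.0f}")
```

With these assumed inputs the salvageable bin comes out to roughly 2 dies per wafer, which is in line with the "2 or 3 per wafer" guess above. Most defects hit shared structures (power delivery, I/O, uncore), which is why `binnable_fraction` has to be so small.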