AMD Fusion graphics processor specifications

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
According to Taiwan's motherboard industry, the graphics core of AMD's first-generation Fusion processor, "Swift", will be based on the RV710 architecture, manufactured on TSMC's 45nm process, with built-in DirectX 10.1 support and a UVD video processor. Expected performance is 1.5x the RS780 or more.

According to the reports, the first-generation Fusion "Swift" will not fully integrate the GPU into the CPU core; instead, the CPU and GPU will sit together on the same package, with the GPU die, code-named "Kong", manufactured on TSMC's 45nm process.

"Kong" will be based on the RV710 core structure of shifting values, with 40 Stream Processing Unit, 8?Texture unit, 4?ROP is RV710 half, and built-in support for Direct X10.1 UVD video processor, is expected to??Fusion processor Will be in volume production in mid-2009.

AMD Fusion graphics processor specifications exposed: based on RV710 architecture with 40 stream processors
(google translated link)
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
I wonder if it will be possible to do SLI using the integrated GPU + an external PCIe card.
I mean, it'd almost be a bit pointless in many situations, and the CPU won't be hugely powerful anyway, but in a laptop, a discrete GPU which can power off (as many do today) and which can run in SLI mode with the integrated GPU when graphical power is needed might be potentially interesting.
Maybe a little silly, and not cost, power or space efficient, but interesting anyway.

Fusion may also prove to be useful if AMD manage to get things like GPGPU working nicely, giving more computational power in the same package than typical current multi-core CPUs offer.
That's the way of the future anyway (for Intel at least), and Fusion just brings AMD closer to it, with the added advantage (over Intel, if they make a single-package Atom) of GPGPU support.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
I really am wondering just how well these Fusion-type products can be expected to operate without access to the uber-fast and copious (in GBs) graphics RAM that discrete cards require.

If they lag in performance as badly as modern IGPs do (relative to modern discrete cards), then even for laptops it may simply never be worth trying to boost the performance of the discrete card in your laptop by complicating things and doing SLI/CF with your Fusion GPU (or the one that Intel plans to use with Havendale).
 

KingstonU

Golden Member
Dec 26, 2006
1,405
16
81
Someone beat me to posting this! Oh well, I think it was said from the start that Fusion was going to start off as a low-end product. I hope it's the great value-product breakthrough that AMD has invested everything in.

If AMD hadn't talked up Fusion so early before its release, do you think Intel would still have developed Larrabee around the same time?
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Originally posted by: Idontcare
I really am wondering just how well these Fusion-type products can be expected to operate without access to the uber-fast and copious (in GBs) graphics RAM that discrete cards require.

If they lag in performance as badly as modern IGPs do (relative to modern discrete cards), then even for laptops it may simply never be worth trying to boost the performance of the discrete card in your laptop by complicating things and doing SLI/CF with your Fusion GPU (or the one that Intel plans to use with Havendale).

Isn't Fusion DDR3? Assuming it also has an integrated memory controller, it'll probably have pretty quick access to memory, at least compared to today's IGPs. 128-bit DDR3 is probably enough for an entry-level card.
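
Quick back-of-envelope, treating peak bandwidth as bus width x transfer rate. The DDR3-1333 and DDR2-800 figures below are my own assumptions for illustration, not confirmed specs:

    /* Back-of-envelope peak memory bandwidth: bytes per transfer * transfers/sec.
       Memory speeds here are assumed for illustration, not confirmed Fusion specs. */
    #include <stdio.h>

    int main(void)
    {
        double fusion = (128 / 8) * 1333e6 / 1e9; /* 128-bit DDR3-1333: ~21.3 GB/s */
        double igp    = (64 / 8)  * 800e6  / 1e9; /* 64-bit DDR2-800 share: ~6.4 GB/s */

        printf("Fusion (assumed 128-bit DDR3-1333):    %.1f GB/s\n", fusion);
        printf("Typical IGP (assumed 64-bit DDR2-800): %.1f GB/s\n", igp);
        return 0;
    }

So roughly 3x the memory bandwidth of a typical IGP today, which is entry-level discrete card territory.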
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: SlowSpyder
Originally posted by: Idontcare
I really am wondering just how well these Fusion-type products can be expected to operate without access to the uber-fast and copious (in GBs) graphics RAM that discrete cards require.

If they lag in performance as badly as modern IGPs do (relative to modern discrete cards), then even for laptops it may simply never be worth trying to boost the performance of the discrete card in your laptop by complicating things and doing SLI/CF with your Fusion GPU (or the one that Intel plans to use with Havendale).

Isn't Fusion DDR3? Assuming it also has an integrated memory controller, it'll probably have pretty quick access to memory, at least compared to today's IGPs. 128-bit DDR3 is probably enough for an entry-level card.

I was responding to the speculation on the viability of running "hybrid graphics in an SLI/CF arrangement".

I agree with you, Fusion with DDR3 will be a stellar IGP solution... but I can't imagine coupling a Fusion IGP with a discrete high-performance card like a 4870 would be a good idea.

Or is my perspective on hybrid graphics misguided?
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: Idontcare
Originally posted by: SlowSpyder
Originally posted by: Idontcare
I really am wondering just how well these Fusion-type products can be expected to operate without access to the uber-fast and copious (in GBs) graphics RAM that discrete cards require.

If they lag in performance as badly as modern IGPs do (relative to modern discrete cards), then even for laptops it may simply never be worth trying to boost the performance of the discrete card in your laptop by complicating things and doing SLI/CF with your Fusion GPU (or the one that Intel plans to use with Havendale).

Isn't Fusion DDR3? Assuming it also has an integrated memory controller, it'll probably have pretty quick access to memory, at least compared to today's IGPs. 128-bit DDR3 is probably enough for an entry-level card.

I was responding to the speculation on the viability of running "hybrid graphics in an SLI/CF arrangement".

I agree with you, Fusion with DDR3 will be a stellar IGP solution... but I can't imagine coupling a Fusion IGP with a discrete high-performance card like a 4870 would be a good idea.

Or is my perspective on hybrid graphics misguided?

There are two types of coupling. In one, the IGP and discrete card both do graphics calculations in SLI/CF to add performance, usually IGP + low-end card. In the other, you have a high-end discrete card like an HD 4870 and a low-end IGP: the IGP does the display output and is active all the time, while the discrete card comes into play as the graphics powerhouse when required and gets its output routed through the IGP, which acts as a display buffer. This means you have full discrete power when necessary, but the card can be disabled and the IGP used when you're just doing 2D/desktop work.
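
As a sketch of how that second mode's logic might flow (every function name here is hypothetical, invented purely for illustration; none of this is a real driver API):

    /* Hypothetical hybrid-graphics flow: the IGP always owns the display,
       and the discrete card is only powered up when there is real 3D work.
       Every function here is a made-up stub, not a real driver call. */
    #include <stdbool.h>
    #include <stdio.h>

    static void discrete_power(bool on) { printf("discrete card %s\n", on ? "on" : "off"); }
    static void render(const char *gpu) { printf("frame rendered on %s\n", gpu); }
    static void blit_to_igp(void)       { printf("frame copied into IGP display buffer\n"); }
    static void igp_scanout(void)       { printf("IGP drives the panel\n"); }

    void present_frame(bool heavy_3d_load)
    {
        if (heavy_3d_load) {
            discrete_power(true);  /* wake the powerhouse */
            render("discrete");    /* it does the heavy lifting */
            blit_to_igp();         /* output gets routed through the IGP */
        } else {
            discrete_power(false); /* save power on the desktop */
            render("IGP");         /* IGP handles 2D/light work itself */
        }
        igp_scanout();             /* the IGP is the display buffer either way */
    }

    int main(void)
    {
        present_frame(false); /* desktop/2D: discrete card stays off */
        present_frame(true);  /* game launches: discrete card takes over */
        return 0;
    }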

The concept of CF with the Fusion graphics is pretty pointless, I would agree, but using the Fusion IGP as the display output while having a discrete card able to lend its power for graphics (which can be done now using regular northbridge IGP chips) could be an advantage.

I would agree that CF using IGP + discrete card might be silly, as performance would be low anyway, but that was more fun speculation.
A hybrid power arrangement using the Fusion GPU might be more interesting, but the CPU component of the chip could be a limiting factor, since these chips are aimed at lower performance segments anyway, and those who do want the power savings can get them from other solutions, such as a regular CPU + IGP on the mobo + discrete card.

I think the most interesting element of a GPU on the same package as the CPU comes from the advancements in GPGPU, which could potentially be a nice boost for AMD in terms of SoC designs if they can leverage the additional power of a GPU on their chip vs Intel, who don't seem able to support things like OpenCL with their current solutions.
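
For anyone who hasn't poked at OpenCL yet, a toy kernel in standard OpenCL C looks like this (nothing AMD-specific, just to show the kind of work that would map onto those stream processors):

    /* Toy OpenCL C kernel: one work-item per array element. This is the
       kind of data-parallel job an on-package GPU could take off the
       CPU's hands instead of looping over the array on one core. */
    __kernel void vec_add(__global const float *a,
                          __global const float *b,
                          __global float *out)
    {
        size_t i = get_global_id(0); /* this work-item's element index */
        out[i] = a[i] + b[i];
    }

The host side just enqueues N work-items and the runtime spreads them across whatever compute units it finds, whether that's 40 stream processors on a Fusion IGP or 800 on a discrete card.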
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
I don't see the point of arguing about the performance impact of such a product, since Fusion is obviously not going to be benched on 3DMark and the like when it arrives. What I see here is (given the 45nm process, I would assume) a low-power, perfect HTPC chip that will be all you need for 2D work, decoding HD content (and transcoding, when the intended iteration of PowerDirector arrives), etc. I should think this would also be a good product for AMD's mobile space.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: Lonyo
The concept of CF with the Fusion graphics is pretty pointless, I would agree, but using the Fusion IGP as the display output while having a discrete card able to lend its power for graphics (which can be done now using regular northbridge IGP chips) could be an advantage.
Yep, I definitely see that as a winning arrangement for both laptops and desktops (upgrade path).

Originally posted by: Sylvanas
I don't see the point of arguing about the performance impact of such a product, since Fusion is obviously not going to be benched on 3DMark and the like when it arrives.
I hope no one is arguing about it here (I don't see that in the posts above), but it was something I was hoping to discuss so I could be enlightened. I had been under the impression that Fusion was for performance, not bargain-bin budget systems.

Didn't AMD buy ATi specifically to enable Fusion? Surely they didn't spend $5B just to beat Intel to the low-ASP fusion-type CPU/GPU market. Or was that really the plan?
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: Idontcare
Originally posted by: Lonyo
The concept of CF with the Fusion graphics is pretty pointless, I would agree, but using the Fusion IGP as the display output while having a discrete card able to lend its power for graphics (which can be done now using regular northbridge IGP chips) could be an advantage.
Yep, I definitely see that as a winning arrangement for both laptops and desktops (upgrade path).

Originally posted by: Sylvanas
I don't see the point of arguing about the performance impact of such a product, since Fusion is obviously not going to be benched on 3DMark and the like when it arrives.
I hope no one is arguing about it here (I don't see that in the posts above), but it was something I was hoping to discuss so I could be enlightened. I had been under the impression that Fusion was for performance, not bargain-bin budget systems.

Didn't AMD buy ATi specifically to enable Fusion? Surely they didn't spend $5B just to beat Intel to the low-ASP fusion-type CPU/GPU market. Or was that really the plan?

First gen of Fusion is supposed to be similar to current IGP solutions, i.e. low-end GPU performance. However, succeeding generations are supposed to get better and better until they finally (cue triumphal music here) completely replace PCIe discrete graphics cards with Fusion part IV or whatever they end up calling it.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: VirtualLarry
This sounds like an awesome solution for a laptop.

I agree, along with the mini-ITX market. But honestly, I can't see it going much farther than that, even @ 32nm.

What blows my mind is that both of these companies, with their billions of dollars per year worth of EEs, haven't come up with the much better solution that I saw someone mention over in the Video forum a while back (I can't remember who it was, or I'd give them credit*): the GPU mounted in a socket, right next to/under the CPU. It could use the HT bus for AMD, or whatever Intel is calling their high-speed bus, to link to the CPU. All it would require is a VDIMM slot or two, so it wouldn't have to share system RAM.


*Was it you that mentioned that, Bryan?
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: myocardia
Originally posted by: VirtualLarry
This sounds like an awesome solution for a laptop.

I agree, along with the mini-ITX market. But honestly, I can't see it going much farther than that, even @ 32nm.

What blows my mind is that both of these companies, with their billions of dollars per year worth of EEs, haven't come up with the much better solution that I saw someone mention over in the Video forum a while back (I can't remember who it was, or I'd give them credit*): the GPU mounted in a socket, right next to/under the CPU. It could use the HT bus for AMD, or whatever Intel is calling their high-speed bus, to link to the CPU. All it would require is a VDIMM slot or two, so it wouldn't have to share system RAM.


*Was it you that mentioned that, Bryan?

That would require all GPU manufacturers to agree on a single socket, and you would be restricted to a single VRAM type and bus width.
It's just really an unworkable solution, because graphics CARDS as a whole (not just GPUs) change at a rapid rate.
We've already gone from GDDR to GDDR5 in the time desktop memory has gone from DDR to DDR3, and bus widths range from 64-bit to 512-bit.
It's just an unworkable idea given how much difference there is between cards, since the only way it would work is for low-end GPU cores, but then we already have IGPs for that.

Upgrading the whole motherboard to get more power/pins/bus width/a new RAM type would have to happen a lot more often than with GPUs. But the use of an additional socket for computation-assisting chips is something I believe they have been thinking about anyway, so that you can add a coprocessor to a motherboard to ease the main CPU's workload, and HT/QPI links would be involved in that. But that wouldn't feasibly be able to act in the same way as a discrete card.

Torrenza is already similar in terms of using HT to provide extra power, though not with GPU cores (yet), and Fusion is a different thing from that anyway.

And Fusion will go further, because that's the way things are going. It just won't really replace high-end, multi-chip systems for a good long while.
 

biostud

Lifer
Feb 27, 2003
19,818
6,907
136
Combine this with AMD's external video card box for laptops and you have a very versatile solution.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: myocardia
Originally posted by: VirtualLarry
This sounds like an awesome solution for a laptop.

I agree, along with the mini-ITX market. But honestly, I can't see it going much farther than that, even @ 32nm.

What blows my mind is that both of these companies, with their billions of dollars per year worth of EEs, haven't come up with the much better solution that I saw someone mention over in the Video forum a while back (I can't remember who it was, or I'd give them credit*): the GPU mounted in a socket, right next to/under the CPU. It could use the HT bus for AMD, or whatever Intel is calling their high-speed bus, to link to the CPU. All it would require is a VDIMM slot or two, so it wouldn't have to share system RAM.


*Was it you that mentioned that, Bryan?

That was me, from the Shanghai/Bulldozer thread:
http://forums.anandtech.com/me...AR_FORUMVIEWTMP=Linear
Originally posted by: Idontcare
I know it isn't going to happen, but I have long wished that the GPU market would become more like the CPU market, in that you'd have a GPU "socket" on the mobo alongside GDDR3/4/5 DIMM slots (just one type, obviously) and an open market of GPU speeds and GDDR speeds/sizes.

Obviously the discrete video card sellers would prefer to sell you the package, just as Abit and Asus would love to sell you nothing less than a packaged mobo with a soldered-on CPU and soldered-on pre-selected RAM.

I wonder how Fusion is going to handle the graphics-RAM end of the equation. Will the mobo have onboard GDDR? (Hardwired or customer-configurable?)
 
superunknown98

Oct 19, 2006
194
1
81
As mentioned before, Fusion isn't made for gaming. It's really a GPGPU on the CPU, there to strengthen the CPU's performance where it falls short.

Granted, it has display applications, like providing integrated video for desktops and laptops, but it could also be used to accelerate video encoding or any other mathematically intense application. Maybe it will bring back 3D-only video cards: have Fusion provide basic video functionality and a dedicated 3D card provide DX and OGL, just like the Voodoo cards did.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: superunknown98
As mentioned before, Fusion isn't made for gaming. It's really a GPGPU on the CPU, there to strengthen the CPU's performance where it falls short.

Granted, it has display applications, like providing integrated video for desktops and laptops, but it could also be used to accelerate video encoding or any other mathematically intense application. Maybe it will bring back 3D-only video cards: have Fusion provide basic video functionality and a dedicated 3D card provide DX and OGL, just like the Voodoo cards did.

And for the ability to do this, AMD was willing to spend $5B?

I could have sworn the argument for buying ATI was that the future of all CPUs (not just low-performance budget builds) was the monolithic CPU/GPU, and that without ATI's technology giving AMD a leg up in that footrace to the monolithic CPU/GPU, Intel would surely beat them out forever and never look back.

I suppose I could have just been under the wrong impression this entire time, and AMD really did buy ATI so they could sell more $80 Fusion processors instead of $60 non-Fusion chips... it seems like they could have gotten to that point by spending less than $5B on enhancing their internal design teams, though. So confusing.
 

Foxery

Golden Member
Jan 24, 2008
1,709
0
0
Originally posted by: Idontcare
I could have sworn the argument for buying ATI was that the future of all CPUs (not just low-performance budget builds) was the monolithic CPU/GPU...

I suppose I could have just been under the wrong impression this entire time, and AMD really did buy ATI so they could sell more $80 Fusion processors instead of $60 non-Fusion chips... it seems like they could have gotten to that point by spending less than $5B on enhancing their internal design teams, though. So confusing.

Maybe things aren't working out as they'd planned, just like the rest of their entire CPU roadmap... :(

The concept of an IGP solution that is low-budget, low power/heat, and upgradeable is great, but you're right: where's the profit?
 

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
Because a large chunk of the motherboard cost is adding the GPU. If AMD can combine a CPU core with a GPU included, then it's one process; they won't have to have another fab create the GPU. This is also the makings of the ultimate mobile chip, not only for laptops but for PDAs, thin clients, portable video game consoles, etc...
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: Idontcare
That was me, from the Shanghai/Bulldozer thread:
http://forums.anandtech.com/me...AR_FORUMVIEWTMP=Linear
Originally posted by: Idontcare
I know it isn't going to happen, but I have long wished that the GPU market would become more like the CPU market, in that you'd have a GPU "socket" on the mobo alongside GDDR3/4/5 DIMM slots (just one type, obviously) and an open market of GPU speeds and GDDR speeds/sizes.

Obviously the discrete video card sellers would prefer to sell you the package, just as Abit and Asus would love to sell you nothing less than a packaged mobo with a soldered-on CPU and soldered-on pre-selected RAM.

I wonder how Fusion is going to handle the graphics-RAM end of the equation. Will the mobo have onboard GDDR? (Hardwired or customer-configurable?)

Ahh, I knew it was someone with intellect, I just couldn't remember which someone (or which forum I saw it in, or what I had for breakfast this morning...). If done properly, I can see it utterly revolutionizing the graphics industry in more than one way: 1) ATI/nVidia would make more profit, because you've now cut out one of the middlemen, along with all of their suppliers. 2) It would cost ~1/3 as much to buy a GPU (solely the GPU), so it would be much easier to entice away your competitor's customers; honestly, at 1/3 the cost, I think a lot of us would own one of each (brand, not model) at our given price points. 3) The latest and greatest GPUs would be bought in much higher quantities at 1/3 the cost (think of how many 8800GTs and 4850s have sold compared to 4870 X2s and GTX 280s, for instance).

Sure, you'd have to buy the VRAM, but only once in a while, instead of with each new card. And okay, it raises the cost of non-IGP motherboards by a few dollars each. And yeah, I think it would only be a few dollars each, since IGP motherboards only cost ~$10 more than comparable non-IGP boards today, and they have a lot more to them than just a socket and a couple of capacitors.

Originally posted by: Lonyo
That would require all GPU manufacturers to agree on a single socket, and you would be restricted to a single VRAM type and bus width.

Let me know when you think of something bad to say about it, because not a one of those is a bad thing.

We've already gone from GDDR to GDDR5 in the time desktop memory has gone from DDR to DDR3, and bus widths range from 64-bit to 512-bit.

And yet, in that same time period, I've personally bought 10-12 motherboards. Sure, you couldn't use the same motherboard for 10 years, but you wouldn't want to anyway, would you? You wouldn't want to drop a 4870 X2 into your Pentium 233 MMX, IOW, right? :D

It's just an unworkable idea given how much difference there is between cards, since the only way it would work is for low-end GPU cores, but then we already have IGPs for that.

I think it's highly workable, especially since it would be just as easy to have two different groups of motherboards (and their requisite video socket/caps/etc.). You would only need a mainstream and a high-end version, just like the new Intel-based boards are going to be. I mean, we've already got three or more different types of motherboard today: IGP, mainstream without IGP, high-end/overclocking, high-end with Crossfire, and high-end with SLI.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Yeah, there is absolutely no doubt that it could be done and is viable... the CPU/mobo/RAM industries are living proof.

However, no company likes to give up degrees of engineering freedom, as that means giving up product differentiation, which means lower profit margins.

So in a consumer segment with just two major players, there is zero chance of either player making a move in this direction as a tactic for getting a leg up on the competition... this would have had to happen some 10 years ago if it were going to happen.

The consumer wins when things are made out of commodity components.