
ATI's next generation Crossfire (asymmetric GPUs)

cbn

Lifer
Mar 27, 2009
12,968
221
106
http://www.rage3d.com/articles/ati_catalyst_10/index.php?p=3

The link is to page three of a seven page article originally posted by Grimpr.

It looks like ATI is developing a much more sophisticated asymmetric Crossfire system (which is good news, because the new Llano IGP is supposed to have 480 stream processors according to one review site).

"Alternate Frame Rendering could be adjusted such that a 4:1 ratio could be used between an Enthusiast and Premium product, and need not be the only mode available - split frame rendering could be implemented with one GPU processing all the vertices, and the second all pixels. Non-3D (i.e. DirectCompute, OpenCL) workloads could be scaled appropriately, and dynamically."

I think this could really add a lot of value to their line-up.
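As a rough illustration of the 4:1 alternate-frame-rendering split the article describes, here's a minimal sketch. The function name and the round-robin scheduling policy are illustrative assumptions, not ATI's actual driver logic:

```python
# Hypothetical sketch of a 4:1 AFR split between an "Enthusiast" (fast)
# and "Premium" (slow) GPU. The repeating-cycle policy is an assumption
# for illustration only, not ATI's real scheduler.

def assign_frames(num_frames, ratio=(4, 1)):
    """Assign each frame index to the fast or slow GPU in a repeating ratio."""
    fast_share, slow_share = ratio
    cycle = fast_share + slow_share
    schedule = []
    for frame in range(num_frames):
        gpu = "fast" if (frame % cycle) < fast_share else "slow"
        schedule.append((frame, gpu))
    return schedule

for frame, gpu in assign_frames(10):
    print(f"frame {frame} -> {gpu} GPU")
```

With a 4:1 ratio, the slow GPU only ever renders every fifth frame, which is why the ratio can be tuned per product pairing.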
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
In the RV870 Anandtech article, "sideport" was mentioned as a way of synchronizing GPUs. I wonder how long it will be until we see that happening between an APU and a discrete card?
 

Daedalus685

Golden Member
Nov 12, 2009
1,386
1
0
They promised me asynchronous GPU loading with CF when they first released it, about damn time ;).

I had figured it must be coming, as they already run the GPUs asynchronously, just without the ability to load them asynchronously... thus the faster card just does a lot of waiting.

It really isn't talked about much, but currently the primary difference between SLI and CF is that ATI does not downclock the cards, and so forth. I have been excited at the prospect of a company taking real advantage of that, beyond allowing 'Johnny old GPU' to run his 4830 with his 4890 and complain that the performance is worse than the 4890 alone.

When the 10.3 preview first came out this was the most exciting feature for me... true scalability in crossfire/SLI would be epic on my wallet at upgrade time.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
About time ATI realized the importance of async multi-GPU. This is a great way not only to have otherwise slower CPUs become the gamers' choice but also to increase the draw of their GPUs. Suddenly having an all-AMD platform is a big advantage over mix and match.

Because just about every AMD GPU benchmark would run better with an AMD CPU+GPU, you'd suddenly see game reviews featuring AMD CPUs. They've all been nothing but i7 for the past year. Marketing win: mindshare.

Can't get here fast enough for them I imagine.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
Would be kinda nice if it works out as planned. Not sure what the point of doing it with a 5750/5770 combo would be though, as the price difference is minimal anyway.

Wonder how it'll work with something like a 5850/5770 combo, as it would be a whole lot cheaper than two 5850s.

Of course, this would be an Nvidia-type move if it only worked with AMD CPUs. To clarify, I couldn't care less if the IGP only had AMD support. The discrete cards would be my concern.
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
Would be kinda nice if it works out as planned. Not sure what the point of doing it with a 5750/5770 combo would be though, as the price difference is minimal anyway.

Wonder how it'll work with something like a 5850/5770 combo, as it would be a whole lot cheaper than two 5850s.

No, the attractive combo would be the onboard AMD GPU + discrete AMD video card.
 

Daedalus685

Golden Member
Nov 12, 2009
1,386
1
0
Would be kinda nice if it works out as planned. Not sure what the point of doing it with a 5750/5770 combo would be though, as the price difference is minimal anyway.

Wonder how it'll work with something like a 5850/5770 combo, as it would be a whole lot cheaper than two 5850s.

Of course, this would be an Nvidia-type move if it only worked with AMD CPUs. To clarify, I couldn't care less if the IGP only had AMD support. The discrete cards would be my concern.

The vendor-specific aspects of this are with the future APUs. Crossfire will certainly never work with Intel IGPs, so one would need an AMD APU.

The future of this kind of thing really is in the APU front. If it works well it might bring better scaling to vanilla CF as well. The benefit to upgrade paths is nice, but not really the greatest benefit of this.

If this kind of thing is built well it could be big news though. The ability to use a low end card to augment physics, tessellation, etc. would be great for many folks.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
No, the attractive combo would be the onboard AMD GPU + discrete AMD video card.

I guess, if AMD could make a CPU/GPU combo that is more competitive with Intel's.

It would be of more benefit and more profitable if the technology isn't nvidia-ized :D
 

deimos3428

Senior member
Mar 6, 2009
697
0
0
About time ATI realized the importance of async multi-GPU. This is a great way not only to have otherwise slower CPUs become the gamers' choice but also to increase the draw of their GPUs. Suddenly having an all-AMD platform is a big advantage over mix and match.
A lot more people will be interested in CF if this works out. Instead of buying two (nearly-)identical cards at once, you can leap-frog GPU generations: 4xxx+5xxx, 5xxx+6xxx, etc.
 

at80eighty

Senior member
Jun 28, 2004
458
5
81
Probably a Captain Obvious statement, but it's a very clever business decision - it could open CF up to more consumers than just the crowd already inclined to try it.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
If this supports cross-generation with decent performance scaling AMD would be assured vendor lock-in.

Yeah, that new NV card may perform a hair better, but I already have an ATI card so I'll just get another from this generation to replace the one I have from two generations back rather than replace both cards. Complete and utter win for AMD if they can pull it off.

Might force NV to implement heterogeneous multi-GPU as well.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
This is a great technology that - much like Nvidia's Optimus - should have been brought to market several years ago. I applaud ATI for pursuing a way to use non-identical GPUs together simultaneously.
 

blanketyblank

Golden Member
Jan 23, 2007
1,149
0
0
Nvidia is so horrible with "multi-generation" compatibility. You can only SLI two of the exact same models, even though they may be essentially the same card (i.e. all of those rebadged G92s: 8800, 9800, 250, etc.). Strangely enough, I think they do so on purpose to sell more cards. It's the exact opposite of ATI's strategy.

By purposely disabling SLI between two "generations", NV can sell 2 new rebadged cards or 1 more expensive card instead of just 1 new card if someone wanted SLI. The other option is finding an old card, so even their old cards still command a premium price.

With ATI's strategy of allowing anything to crossfire, someone wanting more performance can buy whatever the cheapest available card is at the time, as long as they have free slots. That could mean smaller profits, as people buy midrange cards, cheap used cards, or free-after-rebate 4350s to add to their old card instead of the expensive high-end cards.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Thing is, the artificial arm-twisting doesn't always work. If I could have added a $60 AR 9800GT to my OC'd 8800GT, I might have been tempted to pick one up. Better yet, the $50 GT250 hot deal from last year. If I have to throw away my current video card anyway, I am certainly going to consider the best price/performance outside of Nvidia. Artificially obsoleting my existing video hardware may result in a sale for ATI, in other words.
 

blanketyblank

Golden Member
Jan 23, 2007
1,149
0
0
Thing is, the artificial arm-twisting doesn't always work. If I could have added a $60 AR 9800GT to my OC'd 8800GT, I might have been tempted to pick one up. Better yet, the $50 GT250 hot deal from last year. If I have to throw away my current video card anyway, I am certainly going to consider the best price/performance outside of Nvidia. Artificially obsoleting my existing video hardware may result in a sale for ATI, in other words.

That's exactly my point. NV wouldn't make money, or would make very little, on those sales, so they don't actually want you to pick up a clearanced 9800GT or GT250 hot deal. The goal is to force you to get something like a single 280, or at the very least make you buy two of the hot-deal cards. Sure, it may result in a sale for ATI, but that's something they always have to worry about regardless, and is why they need to hype PhysX and CUDA to differentiate themselves.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
If this kind of thing is built well it could be big news though. The ability to use a low end card to augment physics, tessellation, etc. would be great for many folks.

Would split frame rendering used in this fashion suffer from micro-stutter?

I wonder if ATI is planning on using split frame rendering to develop more versatile architectures in the future. For example, could the newest-generation card do the tessellation, but then let the older card do anti-aliasing?

Is split frame rendering going to be a technology that eventually needs GPU-to-GPU communication (i.e., synchronization)?
 

CurseTheSky

Diamond Member
Oct 21, 2006
5,401
2
0
I would gladly buy an AMD platform if I could CF a GPU for gaming (for example, an HD 5770 or 5850), an onboard GPU (785G or its successor), and even throw in a lesser or midrange GPU (HD 5570 or 5670) for a little extra oomph if I needed it or found a great deal on one. Of course, it all depends on how well it scales.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
This is a great technology that - much like nvidia's optimus - someone should have been brought to market several years ago. I applaud ATI for going after a way to create a method to use nonidentical gpu's together simultaneously.

NVIDIA tried with Hybrid SLI a few years ago.

http://www.tomshardware.com/reviews/nvidia-hybrid-sli,1924.html

It was unsuccessful...

Hybrid SLI was the ability of the IGP to take over in a system with both an IGP and a discrete GPU when the discrete card was not needed. This was put down by MS for some reason. IIRC, they just wouldn't support it in WDDM 1.1.

GeForce Boost was the ability to share the work between lower end discrete (GeForce 8400GS and 8500GT) cards and the IGP. This just never really took off on the desktop.

We haven't heard much about these technologies from NV in a while, but it looks like they are still around in the mobile sector: http://www.nvidia.com/object/hybridsli_notebook.html

Personally, I think ATI's asymmetric CF and NV's Hybrid SLI are interesting technologies. It remains to be seen whether ATI will be more successful on the desktop than NV was. Perhaps running asymmetric discrete cards will be the key to success.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Would split frame rendering used in this fashion suffer from micro-stutter?

If it does, it won't be for the same reasons as with AFR (alternate frame rendering). With SFR (split frame rendering), both GPUs work on the same frame -- if there is a gain to be had from using multiple GPUs, that single frame simply renders faster, reducing any stutter.

I wonder if ATI is planning on using split frame rendering to develop more versatile architectures in the future. For example, could the newest-generation card do the tessellation, but then let the older card do anti-aliasing?

Highly unlikely for FSAA but more possible for things like GPU compute and physics. They already mentioned splitting geometry vs texture workloads though.

Is split frame rendering going to be a technology that eventually needs GPU-to-GPU communication (i.e., synchronization)?

It needs significantly more inter-GPU communication than AFR.
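One way to picture how SFR could balance non-identical GPUs: divide each frame's scanlines between the two cards in proportion to their relative speed, so both finish at roughly the same time. This proportional-split heuristic is an assumption for illustration, not ATI's actual balancing algorithm:

```python
# Illustrative SFR sketch: split one frame's scanlines between a fast
# and a slow GPU proportionally to their throughput. The heuristic and
# function name are assumptions, not ATI's real load balancer.

def split_scanlines(height, fast_rate, slow_rate):
    """Return (fast_gpu_lines, slow_gpu_lines) proportional to throughput."""
    total = fast_rate + slow_rate
    fast_lines = round(height * fast_rate / total)
    return fast_lines, height - fast_lines

# e.g. a 1080-line frame shared by GPUs with a 4:1 speed difference
print(split_scanlines(1080, 4, 1))
```

Keeping that split correct as scene complexity shifts across the screen is exactly where the extra inter-GPU communication comes in.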
 

Hyperlite

Diamond Member
May 25, 2004
5,664
2
76
I would gladly buy an AMD platform if I could CF a GPU for gaming (for example, an HD 5770 or 5850), an onboard GPU (785G or its successor), and even throw in a lesser or midrange GPU (HD 5570 or 5670) for a little extra oomph if I needed it or found a great deal on one. Of course, it all depends on how well it scales.

That would be great - hooking up some onboard like the 4200... it wouldn't make much difference in high-end systems, but it would give my 4670 a little oomph.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
It needs significantly more inter-GPU communication than AFR.

This is why I am beginning to wonder if "sideport" or some other type of GPU-to-GPU communication hardware will be making its way onto the Llano APU or Northern Islands.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
After reading the article, it looks like the first generation is simply "render every Xth frame on the feeble GPU." While guaranteed to increase benchmark scores, this will stutter and/or have input lag like nothing seen before.

Think of it this way: GPU A can render a frame in 16 ms (running at 60 fps). GPU B takes 60 ms (about 17 fps). If load-balanced 4:1, you'd have to be willing to put up with around 120 ms of input lag (pre-rendering all 5 frames and then displaying them at a smoothed ~70 fps) to avoid awful stuttering.

The grand vision of load sharing, if ever implemented, still holds lots of promise.
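A quick back-of-the-envelope check of that 4:1 scenario (16 ms vs 60 ms frame times). The buffering model here, rendering a whole 5-frame cycle before display, is a simplifying assumption used only to show why smoothing trades stutter for input lag; exact figures depend on the real queueing behavior:

```python
# Simplified model of the 4:1 pairing discussed above: a fast GPU at
# 16 ms/frame and a slow GPU at 60 ms/frame, rendering in parallel.
# The whole-cycle buffering model is an assumption for illustration.

FAST_MS = 16.0                 # fast GPU frame time
SLOW_MS = 60.0                 # slow GPU frame time
FAST_FRAMES, SLOW_FRAMES = 4, 1  # 4:1 load balance

# Both GPUs work concurrently; a cycle ends when the slower side finishes.
cycle_ms = max(FAST_FRAMES * FAST_MS, SLOW_FRAMES * SLOW_MS)
frames_per_cycle = FAST_FRAMES + SLOW_FRAMES
smoothed_fps = 1000.0 * frames_per_cycle / cycle_ms
print(f"cycle: {cycle_ms} ms for {frames_per_cycle} frames "
      f"(~{smoothed_fps:.0f} fps if displayed evenly)")

# Without smoothing, display intervals are just the per-frame render
# times -- the jump from 16 ms to 60 ms is the micro-stutter:
naive_intervals = [FAST_MS] * FAST_FRAMES + [SLOW_MS] * SLOW_FRAMES
print("naive frame intervals (ms):", naive_intervals)
```

Under this simple model the smoothed throughput lands in the same ballpark as the estimate above, while the naive intervals make the stutter problem obvious.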