ATI not supporting OpenGL with CrossfireX


Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
The same people keep casting the bait line away. And the same mouthpieces keep the propaganda rolling. How sad.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: Wreckage

This is not false, check out all the reviews, they could not benchmark Quake Wars on XfireX because OpenGL is not supported.

Here I will even back up my statements with a link and quote. Something you should do as well.
http://www.techreport.com/articles.x/14355/6

I've excluded the three- and four-way CrossFire X configs here since they don't support OpenGL-based games like this one.

Wrong. It works, it just doesn't scale properly with more than 2 cards.

Tweaktown

ExtremeTech

Didn't search very hard, did you? ;) Those were the first two hits on Google for "enemy territory quake wars crossfire x" :confused:


 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: nRollo
For single GPU cards, NVIDIA has no competition either.

Haha, what?
For single cards, nvidia has competition.
AMD might not outperform nvidia outright, but when you look at VALUE of products, they do have competition.

Obviously if you want performance, and can afford to spend money, then nvidia is the only way to go, but if you are looking at budget/mid range, then AMD has price competitive offerings no matter how you look at it. They even just cut their prices because nvidia did (see Dailytech).
If both companies cutting prices doesn't indicate competition, I have no idea what does.

Much like the CPU market for the lower end (specifically non-overclocking folk), both main competitors have products which match each other fairly well in terms of value, even if nvidia hold a lead in terms of absolute maximum performance (as do Intel for CPUs).

You may have a GX2, but you are only one person.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
This is not false, check out all the reviews, they could not benchmark Quake Wars on XfireX because OpenGL is not supported.
Firstly, Crossfire X refers to two or more GPUs. Secondly, even if three or four are used, it'll run; it just won't scale past two.

That's a bit different to saying it's "not supported". Saying it's not supported means it doesn't work. It does work, it just doesn't scale.
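The run-versus-scale distinction can be illustrated with a toy calculation (all fps figures below are invented for illustration, not taken from any review):

```python
# Toy sketch: "supported" = the game renders at all at a given GPU count;
# "scales" = frame rate keeps improving as GPUs are added.
# The fps figures are hypothetical, purely for illustration.
fps_by_gpu_count = {1: 60.0, 2: 105.0, 3: 104.0, 4: 103.0}

def scaling_factors(fps, baseline=1):
    """Speed-up of each configuration relative to the single-GPU baseline."""
    base = fps[baseline]
    return {n: round(f / base, 2) for n, f in sorted(fps.items())}

factors = scaling_factors(fps_by_gpu_count)
print(factors)  # {1: 1.0, 2: 1.75, 3: 1.73, 4: 1.72}

# Every configuration is "supported" (it produces frames); the game merely
# stops "scaling" past two GPUs, where the speed-up factor plateaus.
```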

Here I will even back up my statements with a link and quote. Something you should do as well.
LOL.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: nRollo

Note: This post is not directed at you Apoppin, nor any person in particular. It's only meant to be my commentary on debates such as the one in this thread.

Sure it is .. no problem, my old friend; you mention me by name to tell me your message is not for me?
:confused:

You already know how i feel - GX2 is awesome HW, but a little pricey - my personal opinion. Do i need to restate it 300 times? My own opinion? and defend it to the death?

naw ... i really don't care .. people here are smart - they know what they need and they can read the reviews. The reviews are pretty clear, GX2 is the new King-for-a-day! [month, quarter, year - whatever]

i am dropping out of these kinds of loud and somewhat angry discussions for they do no good for our forum.


my personal feelings - from my heart - are here - open for discussion:

http://forums.anandtech.com/me...=2168418&enterthread=y

You guys feel free to carry on without me - just don't bring my name up [please] - this old apoppin-monkey doesn't like his cage rattled.




 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: schneiderguy
Originally posted by: Wreckage

This is not false, check out all the reviews, they could not benchmark Quake Wars on XfireX because OpenGL is not supported.

Here I will even back up my statements with a link and quote. Something you should do as well.
http://www.techreport.com/articles.x/14355/6

I've excluded the three- and four-way CrossFire X configs here since they don't support OpenGL-based games like this one.

Wrong. It works, it just doesn't scale properly with more than 2 cards.

Tweaktown

ExtremeTech

Didn't search very hard, did you? ;) Those were the first two hits on Google for "enemy territory quake wars crossfire x" :confused:

I've been here for a while and I've seen how nVidia-biased Wreckage is. If ATi didn't support OpenGL, the game wouldn't work at all; not all engines scale well with multiple GPUs because not all engines are equal.

If it weren't for AMD or ATi, we would still be using Pentium 4s with GeForce FX. If it weren't for nVidia or Intel, we would still be using single-core Athlon 64s and Radeon 9800 derivatives (including the X800 series).
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
CrossfireX enables a 3rd and 4th GPU over standard Crossfire, except of course in OpenGL games. So that would be unsupported, as so many sites have said.
 

pcgamer321

Member
Jan 22, 2008
179
0
0
Originally posted by: mwmorph
Originally posted by: Sable
Originally posted by: Wreckage
I really hate AMD/ATI
Yeah, we got it. :thumbsup:

At least he's honest about it, though it does mean he has little to no credibility in anything he says.
Haha, basically.

Myself, I prefer AMD/ATi, but that doesn't mean I totally disregard Nvidia.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Wreckage
CrossfireX enables a 3rd and 4th GPU over standard Crossfire, except of course in OpenGL games. So that would be unsupported as so many sites of said.

How does that translate into "ATI still not supporting OpenGL?"
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Wreckage
Originally posted by: munky


How does that translate into "ATI still not supporting OpenGL?"

Since you did not read the first post in this thread.
http://www.techreport.com/articles.x/14355/6

I've excluded the three- and four-way CrossFire X configs here since they don't support OpenGL-based games like this one.

And other people already explained what it really means - that OpenGL apps currently do not scale beyond 2 gpu's. The title of this thread means something else entirely.
 

Piuc2020

Golden Member
Nov 4, 2005
1,716
0
0
Originally posted by: Martimus
Originally posted by: nRollo

At the end of the day, for buyers right now, only two things matter:

For dual GPU cards, NVIDIA offers a product that is faster than two of ATi's best dual GPUs combined, and offers comparable (if not better) IQ.

For single GPU cards, NVIDIA has no competition either.

In my opinion, nVidia has better gaming cards, while AMD has better HTPC cards. To say that nVidia has no competition is to sweep the features that non-gamers (90% of the computer builders that I know) care about under the rug.

Also, people generally start to care about those other features that you seem to sweep away because they needed a similar feature in the past, and didn't have it.

*People hated the noise of their previous card, because it kept them up at night, so they want a quieter video card for their next purchase.

*People want DX10.1, because their last card only supported DX9.0B, and they couldn't play some games because their card didn't support Shader Model 3.0 (which was my case), so they are afraid of something similar happening in the future.

*People had general issues with their computer, because their new Video Card drew too much power and it took a long time to debug that they needed a bigger PSU to fix their new problems, so they want to keep power draw down in future purchases. (I don't understand the money savings argument that goes with this. If you are saving a couple dollars a month, you are likely to spend it on something else anyway.)

Just making sweeping generalizations about people's reasons for what they want is a good way to insult others, but it doesn't mean much.


*Thermals are not a brand issue, both sides have cards with custom coolers pre-installed and if noise is much of an issue you can always get a 3rd party cooler.

*DX10.1 won't be a problem since it's not a major shader model update; the main feature it allows is AA with deferred shading on DX10. It's not a huge deal, and I assure you no developer is ever going to make a DX10.1-exclusive game; NV cards just won't get the benefits (which are minimal, and the HD 38xx cards are too slow to use AA anyway) and will run the game just fine. SM2.0 to SM3.0 was a HUGE jump; it probably was a bad decision by ATI, but it didn't hurt them in the long run, it just hurt the customers reluctant to upgrade for new games (running Bioshock on an X800XL... what a joke).

*That's a good point, power draw is not really an issue of economics but rather the problem it poses for people with weak PSUs and system stability. The only time where money matters with power draw is when you need to buy a new PSU for a card but that's about it, G92 cards are pretty efficient anyways so no issues there.

As for the actual matter at hand, I don't think ATI is in a rush to fix OGL; after all, ET:QW is the most demanding OGL game and it runs perfectly fine on a single HD 3870. However, ATI should look into this problem and fix it as soon as possible before the release of id Tech 5 and their game Rage. I don't know when it's coming, but it's most likely using OpenGL (as is traditional with id's engines, and likely given that it's on PS3 and Mac as well), and it looks awfully demanding, like, HD3870 Crossfire demanding.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: munky


And other people already explained what it really means - that OpenGL apps currently do not scale beyond 2 gpu's. The title of this thread means something else entirely.
It means CrossfireX does not support OpenGL.

CrossfireX is for 3 or 4 GPUs. However OpenGL is not supported using 3 or 4 GPUs.

Hence the use of the term not supported.

I don't think I can explain this in any simpler terms for you.



 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: Piuc2020
Originally posted by: Martimus
Originally posted by: nRollo

At the end of the day, for buyers right now, only two things matter:

For dual GPU cards, NVIDIA offers a product that is faster than two of ATi's best dual GPUs combined, and offers comparable (if not better) IQ.

For single GPU cards, NVIDIA has no competition either.

In my opinion, nVidia has better gaming cards, while AMD has better HTPC cards. To say that nVidia has no competition is to sweep the features that non-gamers (90% of the computer builders that I know) care about under the rug.

Also, people generally start to care about those other features that you seem to sweep away because they needed a similar feature in the past, and didn't have it.

*People hated the noise of their previous card, because it kept them up at night, so they want a quieter video card for their next purchase.

*People want DX10.1, because their last card only supported DX9.0B, and they couldn't play some games because their card didn't support Shader Model 3.0 (which was my case), so they are afraid of something similar happening in the future.

*People had general issues with their computer, because their new Video Card drew too much power and it took a long time to debug that they needed a bigger PSU to fix their new problems, so they want to keep power draw down in future purchases. (I don't understand the money savings argument that goes with this. If you are saving a couple dollars a month, you are likely to spend it on something else anyway.)

Just making sweeping generalizations about people's reasons for what they want is a good way to insult others, but it doesn't mean much.


*Thermals are not a brand issue, both sides have cards with custom coolers pre-installed and if noise is much of an issue you can always get a 3rd party cooler.

*DX10.1 won't be a problem since it's not a major shader model update; the main feature it allows is AA with deferred shading on DX10. It's not a huge deal, and I assure you no developer is ever going to make a DX10.1-exclusive game; NV cards just won't get the benefits (which are minimal, and the HD 38xx cards are too slow to use AA anyway) and will run the game just fine. SM2.0 to SM3.0 was a HUGE jump; it probably was a bad decision by ATI, but it didn't hurt them in the long run, it just hurt the customers reluctant to upgrade for new games (running Bioshock on an X800XL... what a joke).

*That's a good point, power draw is not really an issue of economics but rather the problem it poses for people with weak PSUs and system stability. The only time where money matters with power draw is when you need to buy a new PSU for a card but that's about it, G92 cards are pretty efficient anyways so no issues there.

As for the actual matter at hand, I don't think ATI is in a rush to fix OGL; after all, ET:QW is the most demanding OGL game and it runs perfectly fine on a single HD 3870. However, ATI should look into this problem and fix it as soon as possible before the release of id Tech 5 and their game Rage. I don't know when it's coming, but it's most likely using OpenGL (as is traditional with id's engines, and likely given that it's on PS3 and Mac as well), and it looks awfully demanding, like, HD3870 Crossfire demanding.

It's true that developers will not make a DX10.1-exclusive game because the difference is just too minimal. It's also true that the HD 38XX takes a greater performance hit when anti-aliasing is used, but most games are still playable with it enabled. That architecture does anti-aliasing through shaders; in games like Call of Juarez, where shader anti-aliasing is used, the HD 38XX series pulls ahead of the GeForce 8 series at most resolutions. That shows the Radeon HD 38XX has quite powerful but hard-to-optimize shader units; if the GeForce 8 used shader anti-aliasing, its performance hit would be far greater. In simple words, it's a feat for that architecture to retain playable FPS using shader anti-aliasing in most games.

Bear in mind that the Radeon X800 series is Shader Model 2.0b, which uses the same 512 shader instruction slots per component as Shader Model 3.0; the difference is that SM 3.0 has dynamic branching, which allows unlimited effective shader length. No game really pushes more than 500 shader instructions per frame because it would be unplayable, whether it's 2.0b or 3.0; only tech demos (DX 9.0, of course) barely reach over 300 instructions. I was able to run Bioshock with the hack posted on some forum with my old X800XT PE, on high at 1024x768; the main issue is that the levels take too long to load because that hack compiles the shaders in real time. Gears of War also runs on high at the same resolution on that card, uses the same engine as Bioshock, and looks even better.
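A back-of-the-envelope sketch of why dynamic branching matters against a 512-slot static limit (the per-iteration and loop counts below are invented for illustration):

```python
# Hypothetical shader loop: 12 instructions per iteration, 64 iterations.
# Without dynamic branching (SM 2.0b style) a compiler must unroll the loop,
# multiplying the static instruction count; with SM 3.0 dynamic branching
# the loop stays a loop, so the static program remains short.
SLOT_LIMIT = 512      # per-component static instruction slots (both models)
per_iteration = 12    # illustrative guess
iterations = 64       # illustrative guess

unrolled = per_iteration * iterations  # 768 static instructions when unrolled
looped = per_iteration + 4             # loop body plus a few branch instructions

print(unrolled <= SLOT_LIMIT)  # False: exceeds the static budget
print(looped <= SLOT_LIMIT)    # True: fits easily with dynamic branching
```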
 

mwmorph

Diamond Member
Dec 27, 2004
8,877
1
81
Originally posted by: Piuc2020
Originally posted by: Martimus
Originally posted by: nRollo

At the end of the day, for buyers right now, only two things matter:

For dual GPU cards, NVIDIA offers a product that is faster than two of ATi's best dual GPUs combined, and offers comparable (if not better) IQ.

For single GPU cards, NVIDIA has no competition either.

In my opinion, nVidia has better gaming cards, while AMD has better HTPC cards. To say that nVidia has no competition is to sweep the features that non-gamers (90% of the computer builders that I know) care about under the rug.

Also, people generally start to care about those other features that you seem to sweep away because they needed a similar feature in the past, and didn't have it.

*People hated the noise of their previous card, because it kept them up at night, so they want a quieter video card for their next purchase.

*People want DX10.1, because their last card only supported DX9.0B, and they couldn't play some games because their card didn't support Shader Model 3.0 (which was my case), so they are afraid of something similar happening in the future.

*People had general issues with their computer, because their new Video Card drew too much power and it took a long time to debug that they needed a bigger PSU to fix their new problems, so they want to keep power draw down in future purchases. (I don't understand the money savings argument that goes with this. If you are saving a couple dollars a month, you are likely to spend it on something else anyway.)

Just making sweeping generalizations about people's reasons for what they want is a good way to insult others, but it doesn't mean much.


*Thermals are not a brand issue, both sides have cards with custom coolers pre-installed and if noise is much of an issue you can always get a 3rd party cooler.

*DX10.1 won't be a problem since it's not a major shader model update; the main feature it allows is AA with deferred shading on DX10. It's not a huge deal, and I assure you no developer is ever going to make a DX10.1-exclusive game; NV cards just won't get the benefits (which are minimal, and the HD 38xx cards are too slow to use AA anyway) and will run the game just fine. SM2.0 to SM3.0 was a HUGE jump; it probably was a bad decision by ATI, but it didn't hurt them in the long run, it just hurt the customers reluctant to upgrade for new games (running Bioshock on an X800XL... what a joke).

*That's a good point, power draw is not really an issue of economics but rather the problem it poses for people with weak PSUs and system stability. The only time where money matters with power draw is when you need to buy a new PSU for a card but that's about it, G92 cards are pretty efficient anyways so no issues there.

As for the actual matter at hand, I don't think ATI is in a rush to fix OGL; after all, ET:QW is the most demanding OGL game and it runs perfectly fine on a single HD 3870. However, ATI should look into this problem and fix it as soon as possible before the release of id Tech 5 and their game Rage. I don't know when it's coming, but it's most likely using OpenGL (as is traditional with id's engines, and likely given that it's on PS3 and Mac as well), and it looks awfully demanding, like, HD3870 Crossfire demanding.

I agree with you on all the issues except on idTech5.

I'd like to add that power draw is pretty much a nonissue. As reviews point out, single and even SLI graphics cards don't even break the 400W barrier on total system power.

I mean, look at it this way: how many of us don't run at least a quality 400W PSU for single-card setups and 550W for SLI/Xfire? Quality 400W/550W PSUs are very capable of running 98% of setups very well. I ran a 9800 Pro overclocked and flashed to the XT BIOS on a 240W quality PSU, and I've run a 7900GT (formerly, sold it) and an x1950GT overclocked to Pro speeds (up until last week) on a 350W el cheapo POS power supply with no problems.

Now I have a quality FSP/Sparkle 400W PSU that never breaks a sweat despite dual optical drives, dual HDDs, an overclocked graphics card, and an old, comparatively power-hungry 130nm AMD64 CPU.
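The wattage argument above can be sanity-checked with simple addition (the per-component draws below are rough illustrative guesses, not measurements):

```python
# Rough headroom check for a single-GPU box on a quality 400W PSU.
# All draw figures are illustrative guesses, not measured values.
draw_watts = {
    "cpu_130nm_a64": 90,
    "gpu_overclocked": 110,
    "optical_drives_x2": 25,
    "hdds_x2": 20,
    "board_ram_fans": 55,
}
total = sum(draw_watts.values())  # 300 W estimated system draw
psu = 400
headroom = psu - total            # 100 W to spare

print(total, headroom)  # 300 100
```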

As for id, they are more about pushing GPU limits than designing decent games. Doom III was hardly a groundbreaking game gameplay-wise, and I think the public will realize that sooner or later and sales will drop. What we should be looking forward to on the gameplay front is the games after Rage, based on the idTech5 engine, that will actually be fun.
 

Piuc2020

Golden Member
Nov 4, 2005
1,716
0
0
I actually enjoyed Doom 3 a lot, it wasn't groundbreaking but it was simple and it was fun (plus it made my jaw drop the first time I saw it, no other game since then, not even Crysis, has produced the same effect on me) and it had great graphics and sound, I thought it was a nice game. Quake 4 was pretty mediocre though.

Rage looks much more elaborate and is blending a bunch of genres. In any case, however you look at it, this will be a major title, and if ATI doesn't sort out its OGL issues by the time it comes out, then they will be in trouble.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: Wreckage
Originally posted by: munky


And other people already explained what it really means - that OpenGL apps currently do not scale beyond 2 gpu's. The title of this thread means something else entirely.
It means CrossfireX does not support OpenGL.

CrossfireX is for 3 or 4 GPUs. However OpenGL is not supported using 3 or 4 GPUs.

Hence the use of the term not supported.

I don't think I can explain this in any simpler terms for you.

So any games that don't scale above 2 GPU's are "unsupported"?

You should make a thread titled "ATI and nvidia still not supporting 90% of D3D and OpenGL games" to let everyone know ;)
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
It means CrossfireX does not support OpenGL.

CrossfireX is for 3 or 4 GPUs. However OpenGL is not supported using 3 or 4 GPUs.

Hence the use of the term not supported.

I don't think I can explain this in any simpler terms for you.
So then by your reasoning SLI doesn't support Direct3D or OpenGL games in the instances it doesn't scale?

If so can I make a thread titled "nVidia still not supporting Direct3D or OpenGL"?
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: BFG10K
It means CrossfireX does not support OpenGL.

CrossfireX is for 3 or 4 GPUs. However OpenGL is not supported using 3 or 4 GPUs.

Hence the use of the term not supported.

I don't think I can explain this in any simpler terms for you.
So then by your reasoning SLI doesn't support Direct3D or OpenGL games in the instances it doesn't scale?

If so can I make a thread titled "nVidia still not supporting Direct3D or OpenGL"?

CrossfireX does not support any OpenGL games. It does not support OpenGL at all.

Are you saying SLI does not support DirectX or OpenGL at all? Are you saying that all games using either OpenGL or DirectX are not supported by SLI? That's a pretty bold accusation there BFG!

I'm not sure why you would defend this? Well I sort of have an idea but the facts are pretty clear in this case.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: schneiderguy
[

So any games that don't scale above 2 GPU's are "unsupported"?

You should make a thread titled "ATI and nvidia still not supporting 90% of D3D and OpenGL games" to let everyone know ;)

Actually, unlike Crossfire, you can create your own profiles for games in SLI, so just about all games can be supported.

However I would love to see a link to where you came up with 90% and what you consider that number to be for Crossfire.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
CrossfireX does not support any OpenGL games. It does not support OpenGL at all.
Err, no. It doesn't scale past two GPUs but it'll still physically run them.

Saying it doesn't support them means they won't run period, which is false.

Are you saying SLI does not support DirectX or OpenGL at all? Are you saying that all games using either OpenGL or DirectX are not supported by SLI?
No, you are. You're claiming Crossfire X doesn't support OpenGL games and your evidence is that it doesn't scale past 2 GPUs.

Using that reasoning you can say exactly the same about nVidia's SLI for games that don't scale past one GPU.

Why the double standard?

That's a pretty bold accusation there BFG!
But you made it.

I'm not sure why you would defend this?
What am I defending? Your topic title is a troll; it's inaccurate and designed to elicit a response, AKA baiting.

Furthermore you aren't applying your own definition of "support" (not scaling) to nVidia, so I'm calling you out on it with facts and logic, but as a regular poster rather than a moderator.

Actually unlike Crossfire you can create your own profiles for games in SLI so that just about all games can be supported.
Just because you make an SLI profile does not mean you'll get error-free scaling.

Do you think for example I can use an nVidia driver 18 months old, make an SLI profile for Crysis, and everything will be peachy?

However I would love to see a link to where you came up with 90% and what you consider that number to be for Crossfire.
IIRC nVidia's master list of profiles contains 200, maybe 300 games. Compared to the tens of thousands of 3D games on the PC, that is a drop in the bucket.
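The coverage claim can be roughed out with the post's own ballpark numbers (300 profiles, "tens of thousands" of games taken at its lower bound):

```python
# Rough coverage arithmetic using the thread's own ballpark figures.
profiles = 300            # upper end of the "200, maybe 300" estimate
total_3d_games = 10_000   # conservative reading of "tens of thousands"

covered = profiles / total_3d_games
print(f"{covered:.0%} covered, {1 - covered:.0%} without a stock profile")
# -> 3% covered, 97% without a stock profile
```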
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: BFG10K

Err, no. It doesn't scale past two GPUs but it'll still physically run them.

Saying it doesn't support them means they won't run period, which is false.
CrossfireX is specifically for scaling past 2 GPUs, which it does not support for OpenGL. Why are you trying to spin that? Do you not know the difference between standard Crossfire and CrossfireX?

No, you are. You're claiming Crossfire X doesn't support OpenGL games and your evidence is that it doesn't scale past 2 GPUs.

Using that reasoning you can say exactly the same about nVidia's SLI for games that don't scale past one GPU.

Why the double standard?

Are you saying that SLI does not scale for a specific API like CrossfireX?


What am I defending? Your topic title is a troll; it's inaccurate and designed to elicit a response, AKA baiting.
No it's not. CrossfireX is for 3 or more GPUs except that it does not support OpenGL. Even that Anandtech review states this. Go yell at them. Go call them a troll.
Furthermore you aren't applying your own definition of "support" (not scaling) to nVidia, so I'm calling you out on it with facts and logic, but as a regular poster rather than a moderator.
Please list which specific API, DirectX or OpenGL, SLI does not scale on. Your agenda is clear, now back it up.

Just because you make an SLI profile does not mean you'll get error-free scaling.
Whoa now, don't back down from saying SLI does not support a certain API. You are backpedaling here.
Do you think for example I can use an nVidia driver 18 months old, make an SLI profile for Crysis, and everything will be peachy?
Is Crysis an API? Are you now saying that Crossfire has no bugs in any game and has profiles for every game? Or that you can at least make profiles for unsupported games?

IIRC nVidia's master list of profiles contains 200, maybe 300 games. Compared to the tens of thousands of 3D games on the PC, that is a drop in the bucket.
You can make your own, so yeah it does cover pretty much every game. However this thread has nothing to do with NVIDIA. So why change the subject?

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
No it's not. CrossfireX is for 3 or more GPUs except that it does not support OpenGL.
CrossfireX is specifically for scaling past 2 GPUs.
False; this has been repeatedly pointed out to you but you insist on perpetuating inaccurate arguments.

Let's quote ATi's product page on the issue, shall we?

http://game.amd.com/us-en/crossfirex_about.aspx

ATI CrossFireX™ is the ultimate multi-GPU performance gaming platform. Enabling game-dominating power, ATI CrossFireX technology enables two or more discrete graphics processors to work together to improve system performance.

Like I said earlier, TWO or more. Are you now going to retract your fallacious claims given your arguments have been proven wrong beyond a shadow of a doubt?

My question also becomes where it says anything about that in your thread title?

All I see is:

Topic Title: ATI still not supporting OpenGL
Topic Summary: Come on AMD you should be better than this

Now even if Crossfire X was three or more GPUs like you claim, your thread title is still wrong by miles. It's wrong because tri/quad Crossfire will support OpenGL apps (i.e. they'll run), they just won't scale past two GPUs.

AFAIK at no time will tri/quad Crossfire demand "your active configuration does not support OpenGL games, please disable two or more GPUs and try again" when someone tries to launch an OpenGL title.

Your thread title is trolling to the extreme degree, and a casual reader might be forgiven for thinking ATi doesn't support OpenGL period.

A more accurate thread title might be "OpenGL games don't scale past two GPUs on Crossfire X". Of course we already know this given ATi told us so, so such a thread title is unlikely to stir up the hive as much as yours does, eh Wreckage?

Are you saying that SLI does not scale for a specific API like CrossfireX?
But that isn't even the case for Crossfire X.

Even that Anandtech review states this. Go yell at them. Go call them a troll.
Pardon? Where did Anandtech state "ATI still not supporting OpenGL" like you did? They said nothing of the sort. Put up evidence or retract your claim.

Please list which specific API either DirectX or OpenGL that SLI does not scale on. Your agenda is clear, now back it up.
:roll:

Whoa now don't back down from saying SLI does not support a certain API. You are back pedaling here
Okay if that's the way you want it, using your reasoning that support = scaling, SLI doesn't support 90% of games out there.

Is Crysis an API? Are you now saying that Crossfire has no bugs in any game and it has profiles for every game? or that you can at least make profiles for unsupported games?
You're the one that told us making a profile would give you scaling.

So no, it's not an API and no, I didn't say that.

What I said is that using your criteria of support (i.e. scaling) SLI doesn't support 90% of games out there. In fact it's probably more than that, given there are tens of thousands of 2D/DOS/emulated games out there I didn't account for the first time.

You can make your own, so yeah it does cover pretty much every game.
Except it doesn't. Demonstrate to me how to get SLI scaling in Starcraft. And again, even for 3D games, making a profile is no guarantee of error-free scaling; in fact nVidia themselves ship profiles with SLI disabled.

However this thread has nothing to do with NVIDIA. So why change the subject?
Because it demonstrates the bias and double standards of your arguments.

Edits: bad grammar.