Support for GX2 / GTX 295

alcoholbob

Diamond Member
May 24, 2005
6,380
448
126
How long do you think nVidia is gonna support this GPU?

The 9800GX2 was released not that long ago (March 2008) and apparently support has already been abandoned from a driver perspective (EOL'ed in 3 months lol...). For example, Far Cry benchmarks show that it's slower than an 8800GTS (since each core is clocked slower).

So will the GTX295 be completely irrelevant in a few months and have even less resale value than a GTX280...since it will be the slower card?

I'm building a new rig in the coming weeks and I'm torn between getting a GTX295 or a GTX285 SC...the price difference isn't huge but the performance difference for most "current" games is huge...but some reviewers have benchmarked less mainstream games, and without nVidia writing specific drivers for those games, the second GPU seems completely worthless...falling quite a bit behind the GTX280.

So in the long run it seems SLI will be better than a GX2 card since SLI support is gonna affect cards across the board...not so for GX2-type cards?
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Astrallite
How long do you think nVidia is gonna support this GPU?

The 9800GX2 was released not that long ago (March 2008) and apparently support has already been abandoned from a driver perspective (EOL'ed in 3 months lol...). For example, Far Cry benchmarks show that it's slower than an 8800GTS (since each core is clocked slower).

So will the GTX295 be completely irrelevant in a few months and have even less resale value than a GTX280...since it will be the slower card?

I'm building a new rig in the coming weeks and I'm torn between getting a GTX295 or a GTX285 SC...the price difference isn't huge but the performance difference for most "current" games is huge...but some reviewers have benchmarked less mainstream games, and without nVidia writing specific drivers for those games, the second GPU seems completely worthless...falling quite a bit behind the GTX280.

So in the long run it seems SLI will be better than a GX2 card since SLI support is gonna affect cards across the board...not so for GX2-type cards?

Why do you say driver support is EOL for the 9800GX2 when it is not?

Second, NVIDIA doesn't have to "write drivers" for the GTX295 (or the 9800GX2, or any SLi setup).
SLi users have always had the ability to force AFR, AFR2, or SFR on any game, and they can write their own game profiles in the drivers. This is the primary advantage of SLi over Crossfire.
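For readers unfamiliar with the render modes named above, here is a minimal conceptual sketch (plain Python, not real driver code) of how AFR and SFR divide work between two GPUs: AFR alternates whole frames, SFR splits each frame into bands. The function names and the even band split are illustrative assumptions, not anything from NVIDIA's drivers.

```python
# Conceptual sketch only -- not real driver code.

def afr_assign(frame_index, gpu_count=2):
    """Alternate Frame Rendering: whole frames alternate between GPUs."""
    return frame_index % gpu_count

def sfr_split(frame_height, gpu_count=2):
    """Split Frame Rendering: each frame is divided into horizontal bands,
    one band per GPU (real drivers balance the split dynamically)."""
    band = frame_height // gpu_count
    return [(gpu, gpu * band, min((gpu + 1) * band, frame_height))
            for gpu in range(gpu_count)]

# Under AFR, frames 0,2,4... go to GPU 0 and frames 1,3,5... to GPU 1:
print([afr_assign(f) for f in range(6)])   # [0, 1, 0, 1, 0, 1]
# Under SFR, a 1080-line frame is split into two 540-line bands:
print(sfr_split(1080))                     # [(0, 0, 540), (1, 540, 1080)]
```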

A single GTX295 GPU is not "quite a bit behind" a GTX280, as it is a GTX280 GPU with GTX260 memory.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
The card was EOL, not the drivers.

SLI is a perfect example of buying for the now, not for futureproofing. It will be faster now in some specific very intensive games and at very high resolutions, but it is NOT an upgrade path, and it has lower resale value in the future (due to its obscene power and heat requirements, combined with lower performance than a single high-end card of the future, while a single current high-end card will compete with a single mid-range card of the future).

Usually SLI/CF performance goes UP in games as the game ages... Typically a brand new release has lower SLI than single-card performance. At about 3 months after release it is about 30% scaling, at about 6 months it's about 50%, and at about 12 months after the game's release it is about 70-90% scaling, as high as that game will ever go. The reason it doesn't scale well with Far Cry is because Far Cry is so new.
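Those scaling percentages translate into frame rates like this; a quick sketch using the figures above, with a hypothetical 40 fps single-GPU baseline (the baseline number is made up for illustration):

```python
# Rough arithmetic for the scaling-over-time claim above:
# effective SLI frame rate = single-GPU frame rate * (1 + scaling).

def sli_fps(single_gpu_fps, scaling):
    """Effective frame rate for a given SLI scaling fraction."""
    return single_gpu_fps * (1 + scaling)

single = 40.0  # hypothetical single-GPU average fps
for months, scaling in [(3, 0.30), (6, 0.50), (12, 0.80)]:
    print(f"~{months} months after release: {sli_fps(single, scaling):.0f} fps")
# 3 months  -> ~52 fps
# 6 months  -> ~60 fps
# 12 months -> ~72 fps
```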

Anyway, if money, resale value, and cost of operation are issues for you... then you should not be bothering with multi-GPU setups. Simply buy a decent single-GPU card.
 

instantcoffee

Member
Dec 22, 2008
28
0
0
Support for the 9800GX2 isn't dropped completely, but I can't remember seeing any optimizations or bug fixes in the last months.

Edit: I withdraw that, there have been bug fixes. :)

nRollo:
I am interested in this question as well. It's not true that cards automatically have support and don't need it, especially in drivers. Does the 7950GX2 ring a bell? Here's a refresher:

• Important Note: SLI support for the e-GeForce 7950 GX2 will be provided through a future NVIDIA ForceWare driver release, and will only be available initially through authorized system builders.
http://www.ubergizmo.com/15/ar...with_hdcp_support.html

And people already had SLI support for their 7800GTX cards. It needed to be written specifically for the dual-GPU cards.

Here's another one from the newest driver set from Nvidia:
What's New in Release 181
• Added support for NVIDIA SLI technology on SLI-certified Intel X58-based motherboards with the following GPUs: GeForce GTX 280, GeForce GTX 260, GeForce 9800 GX2, GeForce 9800 GTX+, and GeForce 9800 GTX.
• Added support for GeForce GTX 295 and GeForce GTX 285.
• Added NVIDIA SLI Multi-monitor support, giving you the ability to use two monitors with your GeForce graphics cards in SLI mode.

http://us.download.nvidia.com/...orce_Release_Notes.pdf

Does this mean that they added support for ALL Nvidia cards on the X58?
Does this mean that every other driver supports the GTX295 and Nvidia actually didn't add support for it as they said they did?

Go to the products supported list here:
http://www.nvidia.com/object/w...a_x64_181.22_whql.html

Most of those cards only have legacy support. Check with the driver department and they will confirm it.

This is what the OP asks. When will the support of the GTX295 be reduced to legacy support only?

It is an issue for people:
http://forums.guru3d.com/showthread.php?p=2918295

It deserves a good answer. :)


 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: instantcoffee
Support for the 9800GX2 isn't dropped completely, but I can't remember seeing any optimizations or bug fixes in the last months.

Edit: I withdraw that, there have been bug fixes. :)

nRollo:
I am interested in this question as well. It's not true that cards automatically have support and don't need it, especially in drivers. Does the 7950GX2 ring a bell? Here's a refresher:

• Important Note: SLI support for the e-GeForce 7950 GX2 will be provided through a future NVIDIA ForceWare driver release, and will only be available initially through authorized system builders.
http://www.ubergizmo.com/15/ar...with_hdcp_support.html

And people already had SLI support for their 7800GTX cards. It needed to be written specifically for the dual-GPU cards.

Here's another one from the newest driver set from Nvidia:
What's New in Release 181
• Added support for NVIDIA SLI technology on SLI-certified Intel X58-based motherboards with the following GPUs: GeForce GTX 280, GeForce GTX 260, GeForce 9800 GX2, GeForce 9800 GTX+, and GeForce 9800 GTX.
• Added support for GeForce GTX 295 and GeForce GTX 285.
• Added NVIDIA SLI Multi-monitor support, giving you the ability to use two monitors with your GeForce graphics cards in SLI mode.

http://us.download.nvidia.com/...orce_Release_Notes.pdf

Does this mean that they added support for ALL Nvidia cards on the X58?
Does this mean that every other driver supports the GTX295 and Nvidia actually didn't add support for it as they said they did?

Go to the products supported list here:
http://www.nvidia.com/object/w...a_x64_181.22_whql.html

Most of those cards only have legacy support. Check with the driver department and they will confirm it.

This is what the OP asks. When will the support of the GTX295 be reduced to legacy support only?

It is an issue for people:
http://forums.guru3d.com/showthread.php?p=2918295

It deserves a good answer. :)

1. I don't think we can draw parallels between when NVIDIA launched their first SLi-in-one-slot effort and their third. For example, you would never see a "refer to your OEM" for drivers message now; NVIDIA launches unified drivers on their website.

2. If you think there's not more driver work done for new products than old, I'd suggest you don't understand business. Companies are in business to make money, and trying to get another fps or two out of old products likely isn't high on any hardware vendor's list.

I sort of doubt ATi spends much time these days trying to optimize for 3870X2s compared to 4870X2s, for example.

Beyond this, the drivers have already been tweaked for those parts.

3. 7950GX2s, 9800GX2s, and GTX295s are all still SLi, and as such you can still create SLi profiles and force all render modes. So I answered his question.

 

alcoholbob

Diamond Member
May 24, 2005
6,380
448
126
Hmm I think I'll lean towards SLI 285 over GTX295 right now...I don't want to worry about scaling issues in the future for games that may or may not receive optimization, at least it won't take a total dump in certain games like the 9800GX2 is doing, vs SLI 8800s. nVidia will keep supporting SLI optimization regardless.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Astrallite
Hmm I think I'll lean towards SLI 285 over GTX295 right now...I don't want to worry about scaling issues in the future for games that may or may not receive optimization, at least it won't take a total dump in certain games like the 9800GX2 is doing, vs SLI 8800s. nVidia will keep supporting SLI optimization regardless.

Whatever you say, but my old 7950GX2 and 9800GX2 are on loan to good friends, still serving them well.

The reports of dual GPU card driver support being bad are greatly exaggerated.

EDIT- I don't really care what video card you buy, just don't want you to make the choice based on bad info.
 

legcramp

Golden Member
May 31, 2005
1,671
113
116
I wouldn't touch another nvidia sandwich card since the 7950gx2, what a nightmare.
 

instantcoffee

Member
Dec 22, 2008
28
0
0
Originally posted by: nRollo
1. I don't think we can draw parallels between when NVIDIA launched their first SLi-in-one-slot effort and their third. For example, you would never see a "refer to your OEM" for drivers message now; NVIDIA launches unified drivers on their website.

2. If you think there's not more driver work done for new products than old, I'd suggest you don't understand business. Companies are in business to make money, and trying to get another fps or two out of old products likely isn't high on any hardware vendor's list.

I sort of doubt ATi spends much time these days trying to optimize for 3870X2s compared to 4870X2s, for example.

Beyond this, the drivers have already been tweaked for those parts.

3. 7950GX2s, 9800GX2s, and GTX295s are all still SLi, and as such you can still create SLi profiles and force all render modes. So I answered his question.

1. It was not meant as an example of "refer to your OEM", but to show that the 7950GX2 didn't benefit from the existing SLI support and needed to get its own in a newer driver release. Since SLI support was already there for two 7800GTX cards, it should have been enabled for the 7950GX2 if it was automatic. Especially since the drivers supported 7950GX2 cards, but they didn't support SLI with them.

2. I know that businesses prioritize the newest products that they wish to sell over the old ones. The problem for people who have had the GX2s is that they feel support has dropped pretty quickly to little beyond legacy support and a couple of fixes here and there (as I linked in the thread about some feeling this). I've been reading this so much that even I have become skeptical about GX2s, and I'm pretty critical of all the "issues" people have, especially considering how many of them are not even related to the graphics cards.

That's why I was glad to see this question pop up and that a Focus member/NVIDIA forum moderator joined the thread. People experience a lack of support for the GX2s and it doesn't seem it's without reason. That's why I wish you could provide a good answer without dismissing that they have issues with it. An answer that might explain why so many feel this way. :)
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: instantcoffee

1. It was not meant as an example of "refer to your OEM", but to show that the 7950GX2 didn't benefit from the existing SLI support and needed to get its own in a newer driver release. Since SLI support was already there for two 7800GTX cards, it should have been enabled for the 7950GX2 if it was automatic. Especially since the drivers supported 7950GX2 cards, but they didn't support SLI with them.

Cards like that (with bridge chip) had not existed up to that point in time, now we're on our 3rd generation of them.

Originally posted by: instantcoffee
2. I know that businesses prioritize the newest products that they wish to sell over the old ones. The problem for people who have had the GX2s is that they feel support has dropped pretty quickly to little beyond legacy support and a couple of fixes here and there (as I linked in the thread about some feeling this). I've been reading this so much that even I have become skeptical about GX2s, and I'm pretty critical of all the "issues" people have, especially considering how many of them are not even related to the graphics cards.
And yet my friends are using my old GX2s with no problems, and I was before them. GX2 cards don't go on "legacy support" any sooner or later than the other cards in their generation.


Originally posted by: instantcoffee
That's why I was glad to see this question pop up and that a Focus member/NVIDIA forum moderator joined the thread. People experience a lack of support for the GX2s and it doesn't seem it's without reason. That's why I wish you could provide a good answer without dismissing that they have issues with it. An answer that might explain why so many feel this way. :)

"So many" don't feel this way. What you need to understand about posts of "problems" on forums is 1. People who don't experience problems don't post "Hey, I ran a game with no problem!" Seeing ten guys on a forum post they have a problem seems like a big deal until you realize there are 100s of 1000s who don't. 2. Often people create their own "problems" by OCing, bad procedures, etc.. 3. All cards have outstanding driver issues at all times, this isn't console gaming.
Etc.

I used the 7950GX2 for several years, my buddy still is. I used the 9800GX2 for 6 months, my buddy still is. I'm currently using the latest "GX2".

The three of us have never experienced the woeful lack of driver support you seem to be trying to allege.

How much experience have you had with GX2s?

 

alcoholbob

Diamond Member
May 24, 2005
6,380
448
126
nRollo,

None of us (I think) are trying to be inflammatory. If it comes off like we are, sorry for the misunderstanding. I know not every reviewer is using the same systems, or may have flaws with their testing conditions; hell, one may even be using old drivers on one system and not the next. Sometimes the details may be lost, but we still rely a lot on reviews as authoritative sources.

There are a few cases where the 9800GX2 in particular runs even slower than an 8800GT, and while there are games where GX2 and SLI don't scale well, it seems like SLI is less of a candidate for failure since it's a staple of nVidia generation after generation. I suspect being a little proactive and disabling SLI may help in these cases, but it's not always entirely obvious. Forcing AFR might also work...but you never know.

GX2 cards also seem to occasionally suffer from pretty disastrous minimum framerates even though the averages equal or even surpass the flagship single-GPU cards. Of course, I'm not claiming that SLI doesn't suffer from this as well; the benchmarks I look at are scattered across multiple reviews, so conclusions drawn by comparing them may not hold the way results from a single controlled test system would.

Anyway, I know the way I went was costlier (a lot) but I feel better in the long run (I'm the kind of guy that builds a new comp every 6 years or so).
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: Astrallite
Hmm I think I'll lean towards SLI 285 over GTX295 right now...I don't want to worry about scaling issues in the future for games that may or may not receive optimization, at least it won't take a total dump in certain games like the 9800GX2 is doing, vs SLI 8800s. nVidia will keep supporting SLI optimization regardless.

That is a safe choice. The cards can always be split up and sold or used individually.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Astrallite
nRollo,

None of us (I think) are trying to be inflammatory. If it comes off like we are, sorry for the misunderstanding. I know not every reviewer is using the same systems, or may have flaws with their testing conditions; hell, one may even be using old drivers on one system and not the next. Sometimes the details may be lost, but we still rely a lot on reviews as authoritative sources.

There are a few cases where the 9800GX2 in particular runs even slower than an 8800GT, and while there are games where GX2 and SLI don't scale well, it seems like SLI is less of a candidate for failure since it's a staple of nVidia generation after generation. I suspect being a little proactive and disabling SLI may help in these cases, but it's not always entirely obvious. Forcing AFR might also work...but you never know.

GX2 cards also seem to occasionally suffer from pretty disastrous minimum framerates even though the averages equal or even surpass the flagship single-GPU cards. Of course, I'm not claiming that SLI doesn't suffer from this as well; the benchmarks I look at are scattered across multiple reviews, so conclusions drawn by comparing them may not hold the way results from a single controlled test system would.

Anyway, I know the way I went was costlier (a lot) but I feel better in the long run (I'm the kind of guy that builds a new comp every 6 years or so).

I'm not going to fault you for buying the best dual GPU config out there dude, congrats.

I only disagree with the assertion GX2s get different driver support, or have "many issues".
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,256
126
Originally posted by: Astrallite
GX2 cards also seem to occasionally suffer from pretty disastrous minimum framerates even though the averages equal or even surpass the flagship single GPU cards.

Here's an example of this happening:
http://www.guru3d.com/article/...s-performance-review/5
Lol, slower than a 3850 256MB card at 2560. :)

This was with fairly new 180.42 drivers and the 260 SLI scaled while the GX2 didn't.

I'm sure this had been fixed now but it does happen.
 

alcoholbob

Diamond Member
May 24, 2005
6,380
448
126
Yeah I did see that benchmark: SLI was doing fine, but the 9800GX2 is trailing behind the 8800GT at some resolutions, and behind the 8800GTS at higher resolutions (since it's clocked higher).
 

alcoholbob

Diamond Member
May 24, 2005
6,380
448
126
Originally posted by: Zap
Originally posted by: nRollo
I sort of doubt ATi spends much time these days trying to optomize for 3870X2s compared to 4870X2s, for example.

Where's the Rage Fury MAXX support these days? :confused:

Hmm, well it's only capable of DirectX 7, right? So it should be fully optimized for Half-Life ^.^
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Originally posted by: nRollo

Second, NVIDIA doesn't have to "write drivers" for the GTX295 (or the 9800GX2, or any SLi setup).
Yes they do. Without performing application-specific optimizations to work around individual games' issues with multi-GPU, SLI simply wouldn't work properly in said games.

SLi users have always had the ability to force AFR, AFR2, or SFR on any game, and they can write their own game profiles in the drivers. This is the primary advantage of SLi over Crossfire.
Being able to force SLI in a profile does not guarantee proper scaling or error-free rendering. If you think it does, then roll back to the launch driver for your SLI configuration, make any SLI profiles you please, and then let me know how the latest games run.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: BFG10K
Originally posted by: nRollo

Second, NVIDIA doesn't have to "write drivers" for the GTX295 (or the 9800GX2, or any SLi setup).
Yes they do. Without performing application-specific optimizations to work around individual games' issues with multi-GPU, SLI simply wouldn't work properly in said games.

SLi users have always had the ability to force AFR, AFR2, or SFR on any game, and they can write their own game profiles in the drivers. This is the primary advantage of SLi over Crossfire.
Being able to force SLI in a profile does not guarantee proper scaling or error-free rendering. If you think it does, then roll back to the launch driver for your SLI configuration, make any SLI profiles you please, and then let me know how the latest games run.

1. AFAIK the SLi of the GTX295 is the same as the SLi of the GTX280, and they don't need to write individual code for every game for the GX2 cards. I will verify this (or not) and post.

2. I don't think your second point applies at all to what I said. If you agree that having the ability to create profiles and force any render mode is better than not being able to create profiles and only forcing one render mode (which it is, by definition), what does the fact that some games have issues with multi-GPU have to do with anything? :confused:

What I said is true: driver flexibility is the primary advantage of SLi over CF these days. Why obfuscate the issue by adding "some games have errors with multi-GPU"?

I would have countered that point with "the lack of CF profiles at launch and the ability to create them" being the primary advantage of SLi if you wanted to disagree; it's more applicable.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Originally posted by: nRollo

1. AFAIK the SLi of the GTX295 is the same as the SLi of the GTX280, and they don't need to write individual code for every game for the GX2 cards.
I didn't say they had to write code for every game, just the ones that resist scaling properly.

And while the GTX295 and GTX280 may share traits because they're the same chipset, the 9800 GX2 is a different chipset, so driver optimizations that work on one may not work as well on the other (e.g. making certain optimizations based on the assumption of a certain memory size being installed on each GPU).

2. I don't think your second point applies at all to what I said. If you agree that having the ability to create profiles and force any render mode is better than not being able to create profiles and only forcing one render mode (which it is, by definition), what does the fact that some games have issues with multi-GPU have to do with anything?
It has everything to do with it. You claimed nVidia doesn't have to write drivers for SLI because you can make an SLI profile, which of course is utter nonsense.

Why obfuscate the issue by adding "some games have errors with multi-GPU"?
Why obfuscate the issues of multi-GPU - specifically by brushing away the massive application-specific optimizations required for them - by adding "you can make profiles on nVidia"?

Again, that you can make a profile does not guarantee you'll get proper and error-free scaling.
 

alcoholbob

Diamond Member
May 24, 2005
6,380
448
126
Hmm, so wait, are you saying SLI and GX2 are exactly the same, and if a game is optimized for SLI (for the sake of discussion, let's say a game from before the GX2 existed, and we are using drivers that were *only* optimized for SLI) it will run equally well on a GX2 - i.e., a 9800GX2 will exhibit identical performance (minimum framerate & average framerate) to 8800GTs in SLI?
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
It should be... a GX2 is two SEPARATE cards bolted together, connected via a regular SLI bridge (the one you normally need to connect two separate cards with)... there is simply a PCIe splitter that splits a 16x PCIe link into two 8x links, so each card only gets 8 lanes...
That is why a GX2 card always performs slightly below two individual cards in SLI: unlike the two individual cards, the GX2 card has lower PCIe bandwidth (and typically lower clocks too, for cooling purposes).
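The bandwidth cost of that split is simple arithmetic. A sketch assuming a PCIe 2.0 link at roughly 500 MB/s per lane per direction (the actual generation and overhead of the card's internal switch may differ, so treat the numbers as illustrative):

```python
# Back-of-the-envelope PCIe bandwidth under the x16 -> 2 x8 split
# described above. Assumes PCIe 2.0: ~500 MB/s per lane per direction.

MB_PER_LANE = 500  # PCIe 2.0, per direction

def link_bandwidth_gb(lanes, mb_per_lane=MB_PER_LANE):
    """Bandwidth of a PCIe link in GB/s, per direction."""
    return lanes * mb_per_lane / 1000

print(link_bandwidth_gb(16))  # 8.0 GB/s for a full x16 slot
print(link_bandwidth_gb(8))   # 4.0 GB/s for each GPU behind the splitter
```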

The AMD solution is a little more elegant, but essentially the same; it's two separate cards. At some driver version they just started hiding it from the user by disabling the "enable crossfire" button for it. So now it APPEARS like a single card, but to the OS it is still two completely separate cards.

The thing is... I see no reason why a game optimized for "SLI" would work better with an older SLI set of cards; they are different cards, so optimizations could be different.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Astrallite
Hmm, so wait, are you saying SLI and GX2 are exactly the same, and if a game is optimized for SLI (for the sake of discussion, lets say a game before GX2 existed, and we are using drivers that were *only* optimized for SLI) it will run equally as well on GX2--i.e, a 9800GX2 will exhibit identical performance (minimum framerate & average framerate) with SLI 8800GTs?
No, it seems as if the GX2 solutions do not have their SLI profiles applied the same way as standard two-card SLI solutions. The main evidence of this is how a GX2 solution might have scaling issues due to SLI profile/driver problems while all the other two-card SLI solutions scale just fine.

The other issue would be optimizations, like RAM size, which may need to be specified if it's not detected automatically when the profile is applied, or if it's detected improperly. For example, a 9800GX2 has 1GB but only 512MB per GPU, and if 1GB were specified as the VRAM buffer, that could lead to serious scaling issues.
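The VRAM point is worth making concrete: under AFR each GPU mirrors the working set (textures, buffers), so usable VRAM is the per-GPU amount, not the total printed on the box. A small illustrative sketch; the function name is ours, not from any driver API:

```python
# Under AFR each GPU holds its own copy of the working set, so the
# usable VRAM of a dual-GPU card is the per-GPU amount, not the sum.

def usable_vram_mb(per_gpu_mb, gpu_count):
    """Return (advertised total, actually usable per frame) in MB."""
    total_advertised = per_gpu_mb * gpu_count
    return total_advertised, per_gpu_mb

advertised, usable = usable_vram_mb(512, 2)  # 9800GX2: 512 MB per GPU
print(f"Advertised: {advertised} MB, usable per frame: {usable} MB")
# Advertised: 1024 MB, usable per frame: 512 MB
```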

Easiest way to compare this would be to look at the SLI profile codes in nHancer for the different SLI solutions. It's obvious many of the SLI hex strings are the same, but many are different for various games. Perhaps nRollo can compare the same game with different SLI solutions to see if there's a difference in the SLI hex strings listed.

For example: Far Cry 2 with GTX 280 SLI has:

SLI: AFR-4W [0x02406005] (predefined)
DX10-SLI: AFR-4W [0x004802F5] (predefined)

I plugged a few of those hex codes into a translator and couldn't make any direct correlation between the output values and specific optimizations, so there's most likely an additional translation done by the driver.
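Those profile values are 32-bit flag words whose bit layout NVIDIA does not document, so nothing here decodes their meaning; the sketch below only shows how one might inspect which bits differ between the two Far Cry 2 values quoted above:

```python
# The SLI profile values above are 32-bit flag words. Their bit layout
# is NVIDIA-internal and undocumented, so this only shows how to see
# WHICH bits differ between the DX9 and DX10 profiles, not what the
# differing bits mean.

dx9_sli = 0x02406005   # Far Cry 2, GTX 280 SLI (from nHancer)
dx10_sli = 0x004802F5  # same game, DX10 profile

differing = dx9_sli ^ dx10_sli  # XOR isolates the bits that changed
print(f"bits set in only one profile: {differing:#010x}")
print(f"number of differing bits: {bin(differing).count('1')}")
```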

 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
No, it seems as if the GX2 solutions do not have their SLI profiles applied the same way as standard two-card SLI solutions. The main evidence of this is how a GX2 solution might have scaling issues due to SLI profile/driver problems while all the other two-card SLI solutions scale just fine.
That could very well be because of them having a different "model number" and simply not auto-applying the EXISTING profile for that game... does manually creating a profile identical to the "single card equivalent" one solve the issue? If it does, then that's how they are implemented.