Any new news on Quad SLI driver support yet?

m21s

Senior member
Dec 6, 2004
775
0
71
Just curious if there has been anything new with Nvidia releasing driver support for their 7950GX2s to run in SLI mode.

Are they even going to support this?

Quad SLI sounds yummy :)
My 7950GX2 is getting lonely in the case all by itself ;)
 

TheRyuu

Diamond Member
Dec 3, 2005
5,479
14
81
They should eventually, but I think only for the SLI x16 mobos (like the NF4 x16 and the 590), so you might be out of luck.

I'm not sure if they changed their stance on that though.
 

m21s

Senior member
Dec 6, 2004
775
0
71
Originally posted by: wizboy11
They should eventually, but I think only for the SLI x16 mobos (like the NF4 x16 and the 590), so you might be out of luck.

I'm not sure if they changed their stance on that though.

That would suck.

I didn't even realize that would be an issue :(
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: m21s
Just curious if there has been anything new with Nvidia releasing driver support for their 7950GX2s to run in SLI mode.

Are they even going to support this?

Quad SLI sounds yummy :)
My 7950GX2 is getting lonely in the case all by itself ;)

The 91.33 beta drivers support quad sli and people are using it now:

http://www.geforce3d.net/forums/index.php?topic=165.0

http://forums.firingsquad.com/firingsqu...sage?board.id=hardware&thread.id=85670

http://www.rage3d.com/board/showthread.php?t=33859362

Originally posted by: wizboy11
They should eventually, but I think only for the SLI x16 mobos (like the NF4 x16 and the 590), so you might be out of luck.

I'm not sure if they changed their stance on that though.

Nvidia has said it will indeed work on 1st gen dual PCIE 8X SLI boards, albeit at slightly reduced performance. My educated guess is that an 8X slot is enough (for the most part) for even a 7950GX2.
Since the OP has an FX55 :Q I'd suggest buying the 2nd GX2, benching it, and if its abysmally slow (how much ATI stomping power do you need?? ;)) then move to either an NF590 (C2D) or get a dual X16 A64 mobo.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: m21s
Originally posted by: wizboy11
They should eventually, but I think only for the SLI x16 mobos (like the NF4 x16 and the 590), so you might be out of luck.

I'm not sure if they changed their stance on that though.

That would suck.

I didn't even realize that would be an issue :(

If you plan on running quad SLI, surely upgrading your motherboard is trivial... I would be concerned about getting a new PSU, though.

Originally posted by: Crusader
Nvidia has said it will indeed work on 1st gen dual PCIE 8X SLI boards, albeit at slightly reduced performance.

I believe it was you that called the GX2 the card for the man's man... Does a man's man run two of these with "slightly reduced performance"?
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Theoretically there will be a decrease in performance at x4 vs. x8.
In reality, x4 should run about as fast as x8.

You are fine with your current motherboard. The bigger question is your CPU.. most likely it'll be CPU bound with quad-SLI..
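(Side note for anyone curious about the raw numbers behind the x4/x8/x16 lane debate: here is a rough back-of-the-envelope sketch. The 250 MB/s-per-lane figure assumes first-generation PCIe with 8b/10b encoding, and the per-GPU lane split is just my reading of how the GX2 shares a slot.)

```python
# Rough PCIe 1.x bandwidth estimate. Assumption: 2.5 GT/s per lane with
# 8b/10b encoding, i.e. roughly 250 MB/s of usable bandwidth per lane,
# per direction.
PCIE1_MB_PER_LANE = 250

for lanes in (4, 8, 16):
    gb_per_s = lanes * PCIE1_MB_PER_LANE / 1000
    print(f"x{lanes:<2} -> ~{gb_per_s:.1f} GB/s per direction")

# Each 7950GX2 puts two GPUs behind a single slot, so on a dual-x8 board
# each GPU effectively sees roughly an x4 share -- hence the x4 vs x8 comparison.
```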
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: beggerking
The bigger question is your CPU.. most likely it'll be CPU bound with quad-SLI..

True, but doesn't it depend on the game and the settings?

Also, if someone has enough money to fiddle around with quad-SLI, why not just get a good Conroe system ready and wait for G80 and R600?

Originally posted by: Crusader
The 91.33 beta drivers support quad sli and people are using it now:

http://www.geforce3d.net/forums/index.php?topic=165.0

That thread was started by Rollo, the AEG member who got banned here. No surprise he's saying it is awesome....

Also, his average frames when using quad-SLI are greatly different than the average ones your other link shows:

Originally posted by: Rollo at Geforce 3D
Scratching the surface of the quad sli goodness:

FEAR 19X12 4X16X, all maxed: 90fps average, 42fps minimum

Oblivion: No benches, but played a while outdoors with all on max, 19X12, HDR- VERY smooth.

More to come!

An AEG member at a very Pro-Nvidia forum, go figure. His results are nice, but then in your other link the OP didn't even know how to use the FEAR benchmark until other members told him. He even searched for it in Windows, not knowing that it is as plain as daylight to spot in the FEAR options. His results are as follows:

Originally posted by: sqitso at FS
FEAR 7950GX2 QUAD SLI

MIN 76
AVERAGE 123
MAX 297

Let's compare:

Rollo's:
Min: 42 | Avg: 90 | Max: UNK

sqitso's:
Min: 76 | Avg: 123 | Max: 297

That's weird. The guy who didn't even know how to bench FEAR is doing better than Rollo? If Rollo is as good as I've heard, he must be losing his touch. That, or a rich person without a clue is outpacing him by 33-34 frames, and I doubt that going from 16x12 to 19x12 would cause that much of a bump.
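(As a quick sanity check on the resolution difference, assuming "16x12" and "19x12" mean the usual 1600x1200 and 1920x1200, a rough sketch:)

```python
# Pixel-count comparison of the two resolutions under discussion.
# Assumption: "16x12" = 1600x1200 and "19x12" = 1920x1200.
px_16x12 = 1600 * 1200
px_19x12 = 1920 * 1200

extra = px_19x12 / px_16x12 - 1
print(f"19x12 pushes {extra:.0%} more pixels than 16x12")  # prints: 20% more
```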

Besides, without Vsync FEAR looks awful with all of the screen tearing on a decent LCD, and when you're in a room with a flickering light it is absolutely horrible. Then, if you want to fix that, you have to sync four GPUs with one monitor. Yeah, that's fun.

The Rage3D link of yours is the only one that isn't restricted to FEAR scores, and even those results are from a Windows Server 2003 OS with two dual-core processors (Opteron 285s x 2).

Sorry Crusader, with how strongly you think HDR + AA was introduced too early, I'd have to say that quad-SLI just isn't beneficial yet. Especially when you think about the cost of having a system that can do it effectively and when G80 and R600 are almost out.

 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: m21s
Originally posted by: nitromullet
Looks like the OP got his wish...

http://www.slizone.com/object/slizone_quad_beta.html


I scare myself sometimes.
I posted this last night thinking about quad SLI.

And BINGO today a beta driver is released.

Hmmm.....maybe I should start thinking about winning the lottery....

:beer:

So, you gonna do it?

All you need is an A8N32-SLI, 850W or 1KW PCP&C PSU, and another GX2... :)

edit: oops, doesn't look like the PCP&C 850 made the cut for Quad-SLI, but the XClio 700W did... hmmm...

http://www.slizone.com/object/slizone2_build.html#certified_powersupplies
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: josh6079
Originally posted by: beggerking
The bigger question is your CPU.. most likely it'll be CPU bound with quad-SLI..

True, but doesn't it depend on the game and the settings?

Also, if someone has enough money to fiddle around with quad-SLI, why not just get a good Conroe system ready and wait for G80 and R600?

If you have the money, why not get both. Doesnt make sense to just avoid Nvidia products. Unless you dont want people using NV products, Joshua.

Originally posted by: josh6079
Originally posted by: Crusader
The 91.33 beta drivers support quad sli and people are using it now:

http://www.geforce3d.net/forums/index.php?topic=165.0

That thread was started by Rollo, the AEG member who got banned here. No surprise he's saying it is awesome....

Also, his average frames when using quad-SLI are greatly different than the average ones your other link shows:

Originally posted by: Rollo at Geforce 3D
Scratching the surface of the quad sli goodness:

FEAR 19X12 4X16X, all maxed: 90fps average, 42fps minimum

Oblivion: No benches, but played a while outdoors with all on max, 19X12, HDR- VERY smooth.

More to come!

An AEG member at a very Pro-Nvidia forum, go figure. His results are nice, but then in your other link the OP didn't even know how to use the FEAR benchmark until other members told him. He even searched for it in Windows, not knowing that it is as plain as daylight to spot in the FEAR options. His results are as follows:

Originally posted by: sqitso at FS
FEAR 7950GX2 QUAD SLI

MIN 76
AVERAGE 123
MAX 297

Let's compare:

Rollo's:
Min: 42 | Avg: 90 | Max: UNK

sqitso's:
Min: 76 | Avg: 123 | Max: 297

That's weird. The guy who didn't even know how to bench FEAR is doing better than Rollo? If Rollo is as good as I've heard, he must be losing his touch. That, or a rich person without a clue is outpacing him by 33-34 frames, and I doubt that going from 16x12 to 19x12 would cause that much of a bump.

Besides, without Vsync FEAR looks awful with all of the screen tearing on a decent LCD, and when you're in a room with a flickering light it is absolutely horrible. Then, if you want to fix that, you have to sync four GPUs with one monitor. Yeah, that's fun.

The Rage3D link of yours is the only one that isn't restricted to FEAR scores, and even those results are from a Windows Server 2003 OS with two dual-core processors (Opteron 285s x 2).

Sorry Crusader, with how strongly you think HDR + AA was introduced too early, I'd have to say that quad-SLI just isn't beneficial yet. Especially when you think about the cost of having a system that can do it effectively and when G80 and R600 are almost out.

First, I dont know where you get off saying 7950 QuadSLI was introduced too early Josh.
It hasnt been introduced at all to the consumer market, besides a BETA driver. You can save your prejudgements and obvious zeal to downplay 7950 QuadSLI when NV officially pulls the curtain and has WHQLs ready.
I know you want to judge based off of beta drivers and so on, but I give companies fair quarter.
When the drivers are ready, Nvidia will let everyone know. Then you can unleash your obvious bias, k.

Till then, not sure why you are directing that at me. I like Nvidia, but I havent made any calls for QuadSLI. I personally have paid little attention to any Quad benchmarks.
When Nvidia pulls the veil on their product, I'll stand up and take notice.

Secondly, I'm not sure Rollo is "losing his touch". I'm not sure why you are comparing tests taken on different systems, at different resolutions, probably different settings altogether and trying to make conclusions based on that?

Yeah we should assume the guy who had to ask how to do it ran the benches better than Rollo, who's benched FEAR many times and posted his results all over the web.

Besides the fact that the FS guy's processor is at 2.8GHz and Rollo likely doesn't have that, and the fact they used different resolutions, we don't know if the FS guy used AA/AF?

Hothardware got 95fps average with quad SLI at 16X12 using 4X16X like Rollo used, apparently the guy who had to ask how to bench it knows better than them as well?

If you take a look at HotHardwares review with FEAR benches http://www.hothardware.com/viewarticle.aspx?articleid=841&cid=2

Rollos Results
FEAR 19X12 4X16X, all maxed: 90fps average, 42fps minimum
HotHardwares 16x12 4x16X 95FPS avg

Losing his touch? His results look pretty good from here.

I'd say if you consider yourself to have a pair, if you have an issue with Rollo to take it to a forum hes registered on and face him. You could always ask him yourself.
But it is easier to cower and curse his name at a distance. I understand, some people live their whole lives that way.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Crusader
If you have the money, why not get both. Doesnt make sense to just avoid Nvidia products. Unless you dont want people using NV products, Joshua.

I never said they couldn't use an Nvidia card with a Conroe system. You're right, if they have the money they could do both. I just think that updating a complete system to DDR2 and Conroe will prove to be a longer lived benefit than another 7950GX2.

First, I dont know where you get off saying 7950 QuadSLI was introduced too early Josh.
It hasnt been introduced at all to the consumer market, besides a BETA driver. You can save your prejudgements and obvious zeal to downplay 7950 QuadSLI when NV officially pulls the curtain and has WHQLs ready.
I know you want to judge based off of beta drivers and so on, but I give companies fair quarter.
When the drivers are ready, Nvidia will let everyone know. Then you can unleash your obvious bias, k.

Till then, not sure why you are directing that at me. I like Nvidia, but I havent made any calls for QuadSLI. I personally have paid little attention to any Quad benchmarks.
When Nvidia pulls the veil on their product, I'll stand up and take notice.

That just further proves my point. G80 is getting closer and closer, and so are DX10 and DX10 games. If they wait longer and longer to get drivers out for Quad-SLI, the ability will be very short lived: a feature introduced too early only to be replaced in 6 months or less. Not saying that it is a bad idea at all; it is pretty neat that they can link four GPU's together in order to perform better than one or two.

Secondly, I'm not sure Rollo is "losing his touch". I'm not sure why you are comparing tests taken on different systems, at different resolutions, probably different settings altogether and trying to make conclusions based on that?

The settings were labeled to be the same--max settings, 4xAA, 16xAF. The only thing that could be different is the resolution difference between 16x12 and 19x12. Unless you believe that one of those two is running on default Quality settings while the other is running on Nvidia's High Quality?

Yeah we should assume the guy who had to ask how to do it ran the benches better than Rollo, who's benched FEAR many times and posted his results all over the web.

We don't need to assume, he did. ;)

Besides the fact that the FS guy's processor is at 2.8GHz and Rollo likely doesn't have that, and the fact they used different resolutions, we don't know if the FS guy used AA/AF?

C'mon Crusader, who has a quad-SLI rig and does not use AA/AF?

Hothardware got 95fps average with quad SLI at 16X12 using 4X16X like Rollo used

That's interesting since Rollo was gaming at 19x12.

If you take a look at HotHardwares review with FEAR benches http://www.hothardware.com/viewarticle.aspx?articleid=841&cid=2

Rollos Results
FEAR 19X12 4X16X, all maxed: 90fps average, 42fps minimum
HotHardwares 16x12 4x16X 95FPS avg

Losing his touch? His results look pretty good from here.

I'd say if you consider yourself to have a pair, if you have an issue with Rollo to take it to a forum hes registered on and face him. You could always ask him yourself.

But it is easier to cower and curse his name at a distance. I understand, some people live their whole lives that way.

He's AEG trash, why would I try to argue with someone on a different forum about the video card that he probably got for free?

You can take my previous posts as some kind of attack if you want to. Fact is, I just don't see the sense in quad-SLI when you can get very playable frames out of other Nvidia hardware for a lot less. You can be Rollo's dog and parrot all of his scores, but that doesn't change that you're going to be paying around $1000 for more screen tearing, or horrible synchronization, all bundled with the same 7 series feature set that has been around for more than a year.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: josh6079
Originally posted by: Crusader
If you have the money, why not get both. Doesnt make sense to just avoid Nvidia products. Unless you dont want people using NV products, Joshua.

I never said they couldn't use an Nvidia card with a Conroe system. You're right, if they have the money they could do both. I just think that updating a complete system to DDR2 and Conroe will prove to be a longer lived benefit than another 7950GX2.

Not in games.

Originally posted by: josh6079
Originally posted by: Crusader
First, I dont know where you get off saying 7950 QuadSLI was introduced too early Josh.
It hasnt been introduced at all to the consumer market, besides a BETA driver. You can save your prejudgements and obvious zeal to downplay 7950 QuadSLI when NV officially pulls the curtain and has WHQLs ready.
I know you want to judge based off of beta drivers and so on, but I give companies fair quarter.
When the drivers are ready, Nvidia will let everyone know. Then you can unleash your obvious bias, k.

Till then, not sure why you are directing that at me. I like Nvidia, but I havent made any calls for QuadSLI. I personally have paid little attention to any Quad benchmarks.
When Nvidia pulls the veil on their product, I'll stand up and take notice.

That just further proves my point. G80 is getting closer and closer, and so are DX10 and DX10 games. If they wait longer and longer to get drivers out for Quad-SLI, the ability will be very short lived: a feature introduced too early only to be replaced in 6 months or less. Not saying that it is a bad idea at all; it is pretty neat that they can link four GPU's together in order to perform better than one or two.

Fair enough. Its a good choice for those with the money and desire though. Certainly an industry first.


Originally posted by: josh6079
Originally posted by: Crusader
Secondly, I'm not sure Rollo is "losing his touch". I'm not sure why you are comparing tests taken on different systems, at different resolutions, probably different settings altogether and trying to make conclusions based on that?

The settings were labeled to be the same--max settings, 4xAA, 16xAF. The only thing that could be different is the resolution difference between 16x12 and 19x12. Unless you believe that one of those two is running on default Quality settings while the other is running on Nvidia's High Quality?
I dont know, I'm not about to guess at the individual PCs, how they were configured and the game settings. Not to mention all the driver settings left out of the loop.

I'll leave the comparisons for review sites, not some guy in a garage.
Which is why I (rightfully) dont buy your assertions on high res HDR+AA being playable. Ackmed insists its GREAT on a single card, others refute him, and that group of refuting parties includes review sites. And when I call him out on that, he claims the intensive portions of the game are not valid and do not matter.

Guess some people prefer to only look at the Best-Case scenario when recommending hardware?
Thats not how I do things.

Originally posted by: josh6079
Originally posted by: Crusader
Yeah we should assume the guy who had to ask how to do it ran the benches better than Rollo, who's benched FEAR many times and posted his results all over the web.

We don't need to assume, he did. ;)

Erm.. ya, ok. If you think these are proper benchmarks done in controlled environments, or reveal anything about Rollo and his "inability to properly get a system running" (or whatever you are trying to infer).. you're out of your mind josh.

Originally posted by: josh6079
Originally posted by: Crusader
Besides the fact that the FS guy's processor is at 2.8GHz and Rollo likely doesn't have that, and the fact they used different resolutions, we don't know if the FS guy used AA/AF?

C'mon Crusader, who has a quad-SLI rig and does not use AA/AF?

Possibly someone simply benchmarking, Josh. :disgust:

Originally posted by: josh6079
Originally posted by: Crusader
Hothardware got 95fps average with quad SLI at 16X12 using 4X16X like Rollo used

That's interesting since Rollo was gaming at 19x12.
It is, and I realized that when I posted it. You are confused. Its not a big deal though because like I said, trying to construe something out of off-the-cuff benchmarks by DIFFERENT systems/settings/unknown factors affecting them is an absolute waste of your time.

Originally posted by: josh6079
Originally posted by: Crusader
If you take a look at HotHardwares review with FEAR benches http://www.hothardware.com/viewarticle.aspx?articleid=841&cid=2

Rollos Results
FEAR 19X12 4X16X, all maxed: 90fps average, 42fps minimum
HotHardwares 16x12 4x16X 95FPS avg

Losing his touch? His results look pretty good from here.

I'd say if you consider yourself to have a pair, if you have an issue with Rollo to take it to a forum hes registered on and face him. You could always ask him yourself.

But it is easier to cower and curse his name at a distance. I understand, some people live their whole lives that way.

He's AEG trash, why would I try to argue with someone on a different forum about the video card that he probably got for free?

I cant speak for Rollo, but lots of people get free video cards. I dont, but I'm not jealous of those who do. Why mention he gets free stuff unless you resent that fact?
Besides why wouldnt you take him on if you have an issue or complaint? Hes civil.. actually a good guy from what I've seen.
AEG is a focus group, its not a bad thing at all. It means direct connections with NV/ATI. Unless you dont like a line of direct communication within enthusiast forums like this one?

Its clear you dont, as your comrades got him banned. Good going. Now theres no one here from Nvidia to forward driver issues to or relay responses from NV engineering.
Anything for ATI, I suppose. Even if it hurts the community at large.
Great job. You really do as much good around here as he did helping people out, forwarding issues to engineering you come across, running benches for people on request... Oh.. Wait.


Originally posted by: josh6079
You can take my previous posts as some kind of attack if you want to. Fact is, I just don't see the sense in quad-SLI when you can get very playable frames out of other Nvidia hardware for a lot less. You can be Rollo's dog and parrot all of his scores, but that doesn't change that you're going to be paying around $1000 for more screen tearing, or horrible synchronization, all bundled with the same 7 series feature set that has been around for more than a year.
LOL.. and thats great. I dont care if you use Quad, or even like it. I pay no attention to it. There are no WHQL Quad drivers yet. I dont understand why you are so consumed on this topic? Trying to get a rise out of me?
I dont use it, never will probably. But other people do use it, and like it. Its a great idea, and preliminary results show it works fine and blows the doors off anything your gods from Canada can muster? Why does that bother you? I mean, I can live with that fact.
I'm guessing it will be a great product when NV pulls the final curtains off of it officially. You dont like it. Thats phenomenal.

And as far as Rollos "dog" :disgust: .. eat my *** dude. I browse many forums. And you dont even know what you are talking about as far as screen tearing etc.. "same 7 series feature set"?
You mean superior AA, quiet operation, far superior OEM vendors with great service and lifetime warranties (double lifetime on XFX), SM3.. and no useless wasted silicon on a half-baked HDR+AA implementation?
I'd take Quad SLI over an X1900 Xfire rig any day, anyone with a brain would.

Just hold that thought, kid. If that X1950 is released lets hear the same out of your mouth about having the same feature set. You hate Nvidia FAR more than I have a disdain for ATI with these ridiculous comments. That much is clear.

If you have a problem with Rollo, you should take it up with him Josh. We're going off topic? Tell him Crusader sent ya.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Crusader
Not in games
Yes it does. http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2795

Fair enough. Its a good choice for those with the money and desire though.
I was never debating that.
I'll leave the comparisons for review sites, not some guy in a garage.
Which is why I (rightfully) dont buy your assertions on high res HDR+AA being playable. Ackmed insists its GREAT on a single card, others refute him, and that group of refuting parties includes review sites. And when I call him out on that, he claims the intensive portions of the game are not valid and do not matter.

What's the deal with Ackmed? We weren't even talking about him. Did you never see the clips I had of a single X1900XTX playing in foliage with max settings + visual mods? One card is enough, and this is coming from someone who was all over the 7800's in their day of unchallenged performance and quality.
Guess some people prefer to only look at the Best-Case scenario when recommending hardware?
Thats not how I do things.
So do you disregard any maximum or average framerates in a benchmark? Because it seems like you are quick to jump on ATI's minimum frames yet first in line to praise Nvidia's maximum ones.
Erm.. ya, ok. If you think these are proper benchmarks done in controlled environments, or reveal anything about Rollo and his "inability to properly get a system running" (or whatever you are trying to infer).. you're out of your mind josh.
I'm not trying to say that Rollo is inept at making a gaming computer run well. I was simply saying that one of them was distorting the scores, because if both knew how to stress test the GX2's, the scores shouldn't have been that far off. Since one source is a known AEG member and the other is someone clueless who didn't know how to bench FEAR, a game with its own benchmark program implemented in it, all I was saying was that your sources weren't very credible. I'm with you here, saying that they are not directly comparable, yet you placed them together as evidence that qSLI performs well even if one setup didn't optimize everything that could potentially bog it down. I'm with you in that Rollo probably has his head on straighter than the guy who didn't know how to bench FEAR. Why you're so defensive about him is beyond me.
And you dont even know what you are talking about as far as screen tearing etc..
Yes I do. Two 7800GT's under the belt and I've experienced Nvidia's multi-GPU vsync first hand. That's more than what I can say about you, commenting on something you haven't ever owned and downplaying it against something you do own. Talk about cowering and cursing at a distance. I understand, some people live their whole lives that way.
Just hold that thought, kid. If that X1950 is released lets hear the same out of your mouth about having the same feature set. You hate Nvidia FAR more than I have a disdain for ATI with these ridiculous comments. That much is clear.
If ATI still is only offering the same things the X1900 series offer only faster over a year after its launch date, I will be just as upset with them. You think I hate Nvidia? Far from it. I just am not afraid to comment on where they have been lacking, just like I'm not afraid to comment on where ATI's been lacking. Nvidia did a good job with the 7 series GPU, but when comparing GPU to GPU, the X1900 always comes out on top for me, but maybe not for everyone and I understand that. Nvidia did spend time and research in developing different multi-GPU constructions. That might save them some resources in the end and because of that, they probably had a few extra spending dollars to invest in G80. It better be an awesome card.

The fact is, Quad-SLI is here, and while its performance increases at maximum frames are impressive, I can't help but feel that it is nothing but an e-penis checkbox. Around $1000 in just video cards and you'll only get crazy frames without vsync, making your ultra high AA (which I think you'll agree with me is kind of redundant since 4x is enough for you and me) look like crap because everything keeps tearing anyway. I guess my main frustration with all of the multi-GPU trends is that you get tempting performance increases, but more problematic screen fluidity to counter it.

Like I said, it is interesting and something that I'm sure others are very excited about. Although, it is too expensive to use on a massive scale, too late in the life of DX9, and not worth the clunky vsync or horrible tearing you'll get.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: josh6079
Originally posted by: Crusader
Not in games
Yes it does. http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2795

Wrong again.
If you think a Conroe will boost performance in games more than moving to a GX2, you've lost it.

Originally posted by: josh6079
Originally posted by: Crusader
And you dont even know what you are talking about as far as screen tearing etc..
Yes I do. Two 7800GT's under the belt and I've experienced Nvidia's multi-GPU vsync first hand. That's more than what I can say about you, commenting on something you haven't ever owned and downplaying it against something you do own. Talk about cowering and cursing at a distance. I understand, some people live their whole lives that way.

Put your foot in your mouth much? I had SLI far before you, as I bought it on launch. I know what issues did exist far better than you could, and I know what was fixed when.
There are no problems with vsync now.

Originally posted by: josh6079
Originally posted by: Crusader
Just hold that thought, kid. If that X1950 is released lets hear the same out of your mouth about having the same feature set. You hate Nvidia FAR more than I have a disdain for ATI with these ridiculous comments. That much is clear.
If ATI still is only offering the same things the X1900 series offer only faster over a year after its launch date, I will be just as upset with them. You think I hate Nvidia? Far from it. I just am not afraid to comment on where they have been lacking, just like I'm not afraid to comment on where ATI's been lacking. Nvidia did a good job with the 7 series GPU, but when comparing GPU to GPU, the X1900 always comes out on top for me, but maybe not for everyone and I understand that. Nvidia did spend time and research in developing different multi-GPU constructions. That might save them some resources in the end and because of that, they probably had a few extra spending dollars to invest in G80. It better be an awesome card.

The fact is, Quad-SLI is here, and while its performance increases at maximum frames are impressive, I can't help but feel that it is nothing but an e-penis checkbox. Around $1000 in just video cards and you'll only get crazy frames without vsync, making your ultra high AA (which I think you'll agree with me is kind of redundant since 4x is enough for you and me) look like crap because everything keeps tearing anyway. I guess my main frustration with all of the multi-GPU trends is that you get tempting performance increases, but more problematic screen fluidity to counter it.

Like I said, it is interesting and something that I'm sure others are very excited about. Although, it is too expensive to use on a massive scale, too late in the life of DX9, and not worth the clunky vsync or horrible tearing you'll get.

Too late in the life cycle of DX9?
Do you know when Vista will be out? Cuz I'd like to know. So would Bill Gates.

Think all the DX9 games based off all these major DX9 engines like Source are just going to disappear soon after Vistas launch?
Now is a great time to buy; after Vista comes out, who knows (maybe you?) when DX10-Source, DX10-UE3.0 and the "Doom4" engine are released to utilize these cards on a mass scale?
You dont buy hardware in anticipation of games or software, you buy it after the games and software are released, see what works the best and go from there. This lesson has been taught many times.

Since DX9 rules the gaming landscape, and all the major DX9 engines are out (besides UE3).. you couldnt pick a better time to buy.
I'm sure for a select few, now is NOT the time to buy- because Nvidia is ruling the roost.
ATI and its supporters are screaming about HDR+AA, but benchmarks and many user opinions are ringing back that this feature is hollow as its too slow with ATIs implementation, and requires 3rd party hacks to work in a single popular game. Toss in horrible OEM vendors with bad service, no warranty and loud coolers, plus only the 2nd best software support in the industry, and you have yourself a real doozy of a situation for ATI.
Yet with all these DX9 games there is not a better card than the GX2 to play all these great games on. Picking up a GX2, or QuadSLI is certainly not a deplorable idea in the least.

At least there are DX9 games to utilize their power.
Case in point, you want to play DX9 games, now is the time to buy. Everyone knows who the king of DX9 ended up being.

If you dont like the scoreboard, then I'd agree with you.. might as well pull the classic "wait" argument out.

Yet there are many people enjoying their GX2s (and quad SLI) daily, ripping up DX9 games in resolutions that no single card (GX2) from ATI can match. And no multiGPU (quad) solution from ATI can match either.
Its a shame to see ATI go from the short lived glory days of the R300 to this.
But its a reality that you must live in.
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,255
126
Originally posted by: Crusader
Its clear you dont, as your comrades got him banned. Good going. Now theres no one here from Nvidia to forward driver issues to or relay responses from NV engineering.
Anything for ATI, I suppose. Even if it hurts the community at large.
Great job. You really do as much good around here as he did helping people out, forwarding issues to engineering you come across, running benches for people on request... Oh.. Wait.

Rollo wasn't banned specifically for being an AEG member... it was actually triggered by something else. Can't remember what it was, and those posts are gone now anyway. Actually HE got other members banned for accusing him of being an AEG member (before he admitted it himself), so there's your high and mighty Rollo for ya.

I dont use it, never will probably. But other people do use it, and like it. Its a great idea, and preliminary results show it works fine and blows the doors off anything your gods from Canada can muster? Why does that bother you? I mean, I can live with that fact.

I love the way you talk mate...I guess anyone with an ATI card is a Canadian devil??? You can live with a great performing NVidia card but you don't seem to be able to live with the FACT that HDR+AA is very useable and playable. I wouldn't expect anything less from you.

and no useless wasted silicon on a half-baked HDR+AA implementation?

You tell me how it was "half-baked"?? It worked perfectly, and if more developers weren't in bed with NVidia (TWIMTBP) we might see more titles for it (what I find laughable is that Bethesda said HDR+AA wouldn't work). I guess we should call the 6 series half baked cause SM3 was available in ONE game when it came out and took a MASSIVE performance hit when enabled.

I'd take Quad SLI over an X1900 Xfire rig any day, anyone with more money than brains would.

FIXED.

Just hold that thought, kid. If that X1950 is released lets hear the same out of your mouth about having the same feature set. You hate Nvidia FAR more than I have a disdain for ATI with these ridiculous comments. That much is clear.

Have you ever even used any ATI card?? Josh actually had 7800SLI. Anything to say there??

You know I seriously find it hard to believe the lengths people like you would go to defend one COMPANY. It's only a company and you turn it into some North American civil war. Most of the people you accuse of being ATI fanboys (off the top of my head Joker, Ackmed, and Josh) have actually owned or currently own a 7 series NVidia product...can you say you've owned recent products from both companies?? I doubt it.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
I wasn't implying that Conroe would give you better results in games than another 7950GX2. I said that it would be a good investment and it would give some performance increase (more than even the highest AMD FX). That way, when you get either another 7950 or whatever other card is out when you upgrade, you will be less likely to be CPU limited.

I don't care that you had SLI at its launch. I've had it with the 7 series, have you? Others have stated that the vsync isn't so cool. And if they are finally getting it right with just two GPU's, add two more and I bet you're back to square one. Thanks for telling me though that I didn't have any issues with it, even though I was the one playing on my computer.

I see your point. DX10 isn't implemented yet and probably won't be until 2007. However, hardware for it is starting to come out (e.g. HDCP monitors, video cards) and because of that, why not wait until G80? If Nvidia is going to release it before ATI's R600 (like the 78's were released before the X1K series) you can't tell me that that wouldn't be as good of an investment as qSLI and a regular AMD setup.

You can ramble all day about the differences between ATI and Nvidia; we know what they are. I don't care if ATI were to perish from the market nor if Nvidia lost every piece of corporate asset. I've got a good, quality product from ATI that has fit my needs just like you've gotten a good, quality product from Nvidia that has fit yours. They both have made them in an attempt to outsell the other and have therefore brought better products. I don't hold disdain for something that is instigating higher engineering and progression like you do. The fact that you like to see things as Kings and Queens (I thought you were damn proud to be an American?) is kind of sad. Without ATI, there might not have been a qSLI, just like without Nvidia, there might not have been the X1k series.

This thread has gotten into a clarification between you and me and our views of qSLI. Our views are different, and I'm okay with that. Our intentions with gaming are different, and I'm okay with that. Perhaps I was wrong to start carrying on about the near future and the coming of DX10 and such; you are right in that no one really knows. You and I would purchase different things, and fortunately for the both of us the market is diverse enough for that to happen.

Back to the topic, out of the beta drivers is there anything in them that people are hoping gets improved before the official ones hit? I know sometimes the only thing that makes them different is the WHQL certification.

Also, just how heavy of a PSU does one need for it? Some companies exaggerate on the amount required, but what is the lowest PSU that someone can get by with?

EDIT: Changed "in crease" to "increase"
 

Skott

Diamond Member
Oct 4, 2005
5,730
1
76
Conroe + Quad SLI? Mmmmmmm could be very yummy. And here I am planning a new Conroe build next month. Looking forward to August 3rd when the Quad beta ends and seeing actual finished product reviews. Now we just need some mobos that'll run Conroe and SLI. For me the mobo will need to comfortably fit the Quad SLI setup *and* a separate sound card without problems. It also needs to be very good at OCing. Looking forward to August more and more.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: josh6079
I see your point. DX10 isn't implemented yet and probably won't be until 2007. However, hardware for it is starting to come out (e.g. HDCP monitors, video cards) and because of that, why not wait until G80? If Nvidia is going to release it before ATI's R600 (like the 78's were released before the X1K series) you can't tell me that that wouldn't be as good of an investment as qSLI and a regular AMD setup.
If you are waiting for HDCP, what are you waiting for? Every GX2 already has that.
*bonk*

Besides, (and I'm not trying to rip Nvidia here, just telling the truth), whats the point to having HDCP unless you watch movies on your PC? Most people dont, for gaming its pointless.
No reason to wait for G80, esp if you want HDCP. Since not even all the DX9 engines are out yet (UE3 isn't, though most are), it will simply be a very, very long time till theres a UE4 (probably the most popular and best engine overall thats used in the most games on the market, if UE2 is any indication).
DX10 on a mass scale is so far off, its not even worth talking about at all, outside of mere curiosity and dreaming. Thus I'm a big fan of the GTX and GX2, and quad looks like it might be pretty good as well.

You can ramble all day about the differences between ATI and Nvidia; we know what they are. I don't care if ATI were to perish from the market nor if Nvidia lost every piece of corporate asset. I've got a good, quality product from ATI that has fit my needs just like you've gotten a good, quality product from Nvidia that has fit yours. They both have made them in an attempt to outsell the other and have therefore brought better products. I don't hold disdain for something that is instigating higher engineering and progression like you do. The fact that you like to see things as Kings and Queens (I thought you were damn proud to be an american?) is kind of sad. Without ATI, there might not have been a qSLI, just like without Nvidia, there might not have been the X1k series.
I dont consider wasted silicon on features that dont run well on high resolutions to be "higher engineering", nor progression.

Thats called a check-box feature. Looks nice, feels nice. But will be about as useful as PS1.4 on a 8500 is while running BF2 today. Wont mean its "future proof" (to any extent at all). Its a bad implementation, meant to sell cards, nothing more.

This thread has gotten into a clarification between you and me and our views of qSLI. Our views are different, and I'm okay with that. Our intentions with gaming are different, and I'm okay with that. Perhaps I was wrong to start carrying on about the near future and the coming of DX10 and such; you are right in that no one really knows. You and I would purchase different things, and fortunately for the both of us the market is diverse enough for that to happen.
I can live with that.
The DX10 craziness is something I hold in disdain. Useless DX10 features on cards that will spend 90% of their lifespan on DX9 games that are currently out, and might not even surpass current cards in whats important (DX9).
Who knows, who cares. There are great, proven DX9 options available today.

No need to wait for a DX10 card to play on current DX9 games. The ones we have (esp the GX2), perform great.
Originally posted by: josh6079
Back to the topic, out of the beta drivers is there anything in them that people are hoping gets improved before the official ones hit? I know sometimes the only thing that makes them different is the WHQL certification.
I'm not sure, but I dont really like to brag about quad or condone it until Nvidia has what they feel is a market-ready product. I'd give ATI the same benefit if they were in a similar situation.
Some quad results are great, some are not. Its the nature of the beast with multicard in general. Since that is the nature of the tech in use anyway, why not at least give NV time to eliminate most of the driver bugs they want to before officiating the launch.
Heck, if its the same thats fine with me. I wont be using quad.
Though its great to have it open for people who want SLI GF8 performance, or single card Geforce 9 performance today (if thats even close).

Also, just how heavy of a PSU does one need for it? Some companies exaggerate on the amount required, but what is the lowest PSU that someone can get by with?

I'm not sure what kind of wattage quad needs. But seems like 700+W works.
All the GX2s need is a dedicated 12volt rail. 1 GX2 will run on a shared 12V rail. And with quad rail PSUs you can dedicate a 12v rail to each card. Then have 1 rail for the CPU and 1 for the rest of the devices (mobo, dvd, hdds, fans).
A lot of requirements are exaggerated because people tend to buy cheap PSUs. In many cases (in a lot of hardware situations) a low wattage PSU will work fine if its quality built.
Almost any quad rail PSU will work for quad SLI.. most of those are good and heavy duty. Far more than I'll ever need.

Current ATX2.0 spec is being pushed to its limits though by current cards.
I'm expecting the next round of cards to use either a 5 1/4 PSU dedicated to video, or my preferred method- a power brick.
Dont really want to have to start upgrading my 5 1/4 PSU, pulling that out with every different card I buy (midrange/highend/ati/nv/matrox/whatever)... rather just get a laptop style power brick with each card (like the original 7900GX2).

Im sure power supply manufacturers will jump on this opportunity though. Some people will want no power brick.. but I'm suspecting you'll need a 2nd power cable run to the 5 1/4 PSU anyway.
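(For anyone trying to put rough numbers on that, here is a ballpark 12V budget sketch. Every wattage figure in it is an assumption on my part, not an official spec, so treat it as an illustration only.)

```python
# Very rough 12V power-budget sketch for a hypothetical quad-SLI box.
# Every wattage figure below is a ballpark assumption, not an official spec.
loads_watts = {
    "7950GX2 #1": 145,
    "7950GX2 #2": 145,
    "CPU": 110,
    "Motherboard + RAM": 60,
    "Drives, fans, misc": 50,
}

total = sum(loads_watts.values())
margin = 1.3  # ~30% headroom for load spikes and cheap-PSU derating
print(f"Estimated draw: ~{total} W; a quality ~{total * margin:.0f} W+ unit leaves headroom")
```

Which lines up, give or take, with the 700 W class units people report working.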
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,255
126
Originally posted by: Crusader
I dont consider wasted silicon on features that dont run well on high resolutions to be "higher engineering", nor progression.

Thats called a check-box feature. Looks nice, feels nice. But will be about as useful as PS1.4 on a 8500 is while running BF2 today. Wont mean its "future proof" (to any extent at all). Its a bad implementation, meant to sell cards, nothing more.

So what's your opinion on SM3 that was introduced with the 6 series, with ONE game supporting it at the time and a massive performance hit??

That wasn't progression (I actually thought it was)?? By your logic the 6800 series was badly engineered??

I'd be surprised if I get any decent, logical response to that.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: thilan29
Originally posted by: Crusader
I dont consider wasted silicon on features that dont run well on high resolutions to be "higher engineering", nor progression.

Thats called a check-box feature. Looks nice, feels nice. But will be about as useful as PS1.4 on a 8500 is while running BF2 today. Wont mean its "future proof" (to any extent at all). Its a bad implementation, meant to sell cards, nothing more.

So what's your opinion on SM3 that was introduced with the 6 series, with ONE game supporting it at the time and a massive performance hit??

That wasn't progression (I actually thought it was)?? By your logic the 6800 series was badly engineered??

I'd be surprised if I get any decent, logical response to that.

My view on SM3 is that its mostly a performance enhancer, beyond vertex displacement mapping.
So, while there might not have been a ton of games that used it.. theres certainly nothing wrong with the Geforce6's implementation of a feature meant to speed up processes.
Its a bonus, and one that cant hurt because theres no such thing as a "slow SM3 implementation".
Most everything else being equal back in those days, SM3+SLI on the GF6 made it the clear choice for many.

HDR+AA on ATI is simply too slow for high res in a first-person game. Not to mention theres only 1 really popular game that uses it, and it requires a 3rd party hack to work.
So no, I'm not a huge fan of that.
Now that I've said my piece there, let me tell you what good stuff I do have to say about ATI and their current lineup. I do like the X1900XT. I've said this in the past. I dont like its HSF at all though, nor the vendors that sell it with their poor warranty. It also doesnt have my precious 8xS texture AA that I treasure on the NV side.
Other than that, its a good piece of GPU. Though it doesnt have any speed advantage over the competing NV cards, and Crossfire is not very appealing to say the least.
If it did have some sort of perf advantage, I'd buy one even without the good warranty etc and put an aftermarket cooler on it. I wouldnt diss anyone for purchasing one, I respect it for what it is. But its really not the best choice..
I decided I dont want to mess with all that HSF swapping etc myself, and the lifetime warranty means a lot to me as I've used that (along with StepUp) a few times.. "very handy" to say the least!!
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,255
126
Originally posted by: Crusader
Originally posted by: thilan29
Originally posted by: Crusader
I dont consider wasted silicon on features that dont run well on high resolutions to be "higher engineering", nor progression.

Thats called a check-box feature. Looks nice, feels nice. But will be about as useful as PS1.4 on a 8500 is while running BF2 today. Wont mean its "future proof" (to any extent at all). Its a bad implementation, meant to sell cards, nothing more.

So what's your opinion on SM3 that was introduced with the 6 series, with ONE game supporting it at the time and a massive performance hit??

That wasn't progression (I actually thought it was)?? By your logic the 6800 series was badly engineered??

I'd be surprised if I get any decent, logical response to that.

A decent logical response.. do you deserve it?

My view on SM3 is that its mostly a performance enhancer, beyond vertex displacement mapping.
So, while there might not have been a ton of games that used it.. theres certainly nothing wrong with the Geforce6's implementation of a feature meant to speed up processes.
Its a bonus, and one that cant hurt because theres no such thing as a "slow SM3 implementation".

HDR+AA on ATI is simply too slow for high res in a first-person game. Not to mention theres only 1 really popular game that uses it, and it requires a 3rd party hack to work.
So no, I'm not a huge fan of that.

I haven't exactly been UNCIVIL towards you have I??

Alright, my question is specifically about the FP16 HDR implementation that was a highly touted feature, especially for FarCry. It took a massive performance hit...but was still a highly touted feature.

Here is a benchmark of a 6800Ultra running HDR. Even at only 1280x1024, the average fps is below 30, and check the subsequent pages which give the same sort of numbers...and some levels aren't even benchmarked at higher than 1024x768.

Those numbers ARE actually unplayable (especially for an FPS like FarCry), but the X1900 numbers for Oblivion are playable (the exception being 1600x1200 max quality in the foliage area, and even this is debatable since it still reaches about 25fps). So how can you say it is a worthless feature?? By your logic, the HDR implementation in the 6800 series was a worthless feature. I actually thought HDR looked stunning regardless of the performance hit and was very worthwhile. I personally don't think YOU thought it was a worthless feature.
Its a bonus, and one that cant hurt because theres no such thing as a "slow SM3 implementation".
This is your quote, but I've never seen you say such a nice thing about HDR+AA. However, I see you sorta sidestepped my question and focused only on "SM3" when I'm sure you could have guessed I was referring to the performance hit in FarCry (the first game to use the 6 series SM3 features).

So again I'll ask, why is HDR+AA on X1K cards worthless when by your same logic HDR on the 6 series was worse than worthless? I personally don't think YOU thought HDR on 6 series cards was a worthless feature which is why I'm asking you why you think it's a worthless feature on X1K cards.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: thilan29
Alright, my question is specifically about the FP16 HDR implementation that was a highly touted feature of SM3, especially for FarCry. It took a massive performance hit...but was still a highly touted feature.

FP16 HDR was not and is not part of SM3.0...
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Gstanfor
FP16 HDR was not and is not part of SM3.0...

I thought FP16 was the minimum precision for SM3's specular range?

Click

Custom fp16-fp32 shader program...Shader Model 3.0 gives developers full and precise control over specular and fog computations, previously fixed-function

And here a tech reporter asked the CEO and President of Crytek about its FP16 usage in SM3:

Click
I asked Yerli about the image quality problems on GeForce 6800 cards in version 1.1 of Far Cry. He dismissed those as "just rendering issues" and said that the Shader Model 3.0 path cleaned up those problems. The SM3.0 path does include a mix of FP16 and FP32 datatypes on GeForce 6800 cards for the sake of performance, but Yerli said Crytek never sacrificed image quality for performance, a move that's "not an option" for them. Out of curiosity, I also asked whether he knew of any instances where the use of FP24 precision on ATI cards caused visual artifacts, and he said no.

How is it not a part of Shader Model 3.0?