Originally posted by: wizboy11
They should eventually, but I think only for the SLI x16 mobos (like the NF4 x16 and the 590), so you might be out of luck.
I'm not sure if they changed their stance on that though.
Originally posted by: m21s
Just curious if there has been anything new with Nvidia releasing driver support for their 7950GX2's to run in SLI mode.
Are they even going to support this?
Quad SLI sounds yummy
My 7950GX2 is getting lonely in the case all by itself!
Originally posted by: wizboy11
They should eventually, but I think only for the SLI x16 mobos (like the NF4 x16 and the 590), so you might be out of luck.
I'm not sure if they changed their stance on that though.
Originally posted by: m21s
Originally posted by: wizboy11
They should eventually, but I think only for the SLI x16 mobos (like the NF4 x16 and the 590), so you might be out of luck.
I'm not sure if they changed their stance on that though.
That would suck.
I didn't even realize that would be an issue!
Originally posted by: Crusader
Nvidia has said it will indeed work on 1st gen dual PCIE 8X SLI boards, albeit at slightly reduced performance.
Originally posted by: beggerking
The bigger question is your CPU.. most likely it'll be CPU bound with quad-SLI..
Originally posted by: Crusader
The 91.33 beta drivers support quad sli and people are using it now:
http://www.geforce3d.net/forums/index.php?topic=165.0
Originally posted by: Rollo at Geforce 3D
Scratching the surface of the quad sli goodness:
FEAR 19X12 4X16X, all maxed: 90fps average, 42fps minimum
Oblivion: No benches, but played a while outdoors with all on max, 19X12, HDR- VERY smooth.
More to come!
Originally posted by: sqitso at FS
FEAR 7950GX2 QUAD SLI
MIN 76
AVERAGE 123
MAX 297
Originally posted by: nitromullet
Looks like the OP got his wish...
http://www.slizone.com/object/slizone_quad_beta.html
Originally posted by: m21s
Originally posted by: nitromullet
Looks like the OP got his wish...
http://www.slizone.com/object/slizone_quad_beta.html
I scare myself sometimes.
I posted this last night thinking about quad SLI.
And BINGO today a beta driver is released.
Hmmm.....maybe I should start thinking about winning the lottery....
:beer:
Originally posted by: josh6079
Originally posted by: beggerking
The bigger question is your CPU.. most likely it'll be CPU bound with quad-SLI..
True, but doesn't it depend on the game and the settings?
Also, if someone has enough money to fiddle around with quad-SLI, why not just get a good Conroe system ready and wait for G80 and R600?
Originally posted by: josh6079
Originally posted by: Crusader
The 91.33 beta drivers support quad sli and people are using it now:
http://www.geforce3d.net/forums/index.php?topic=165.0
That thread was started by Rollo, the AEG member who got banned here. No surprise he's saying it is awesome....
Also, his average frames when using quad-SLI are quite different from the averages your other link shows:
Originally posted by: Rollo at Geforce 3D
Scratching the surface of the quad sli goodness:
FEAR 19X12 4X16X, all maxed: 90fps average, 42fps minimum
Oblivion: No benches, but played a while outdoors with all on max, 19X12, HDR- VERY smooth.
More to come!
An AEG member at a very Pro-Nvidia forum, go figure. His results are nice, but then in your other link the OP didn't even know how to use the FEAR benchmark until other members told him. He even searched for it in Windows, not knowing that it is as plain as daylight to spot in the FEAR options. His results are as follows:
Originally posted by: sqitso at FS
FEAR 7950GX2 QUAD SLI
MIN 76
AVERAGE 123
MAX 297
Let's compare:

Rollo's:
Min: 42 | Avg: 90 | Max: unknown

sqitso's:
Min: 76 | Avg: 123 | Max: 297
That's weird. The guy who didn't even know how to bench FEAR is doing better than Rollo? If Rollo is as good as I've heard, he must be losing his touch. That, or a rich person without a clue is outpacing him by 33-34 frames, and I doubt that going from 16x12 to 19x12 would cause that much of a bump.
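For what it's worth, here's a rough back-of-the-envelope pixel-count check (my own illustrative math, not numbers pulled from either benchmark):

```python
# Rough pixel-count comparison between the two test resolutions.
# Purely illustrative; assumes performance scales roughly with pixels drawn.
pixels_16x12 = 1600 * 1200   # 1,920,000 pixels
pixels_19x12 = 1920 * 1200   # 2,304,000 pixels

extra_pixels = pixels_19x12 / pixels_16x12 - 1
fps_gap = 123 / 90 - 1       # sqitso's average vs Rollo's average

print(f"19x12 pushes {extra_pixels:.0%} more pixels than 16x12")   # ~20%
print(f"Average fps gap between the two results: {fps_gap:.0%}")   # ~37%
```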
Besides, without vsync FEAR looks horrible with all of the screen tearing on a decent LCD, and when you're in a room with a flickering light it is unbearable. Then, if you want to fix that, you have to sync four GPUs with one monitor. Yeah, that's fun.
The rage3d link of yours is the only one that isn't restricted to FEAR scores, and even those results are on a Windows 2003 Server OS with two dual-core processors (2x Opteron 285).
Sorry Crusader, given how strongly you think HDR+AA was introduced too early, I'd have to say that quad-SLI just isn't beneficial yet, especially when you consider the cost of a system that can run it effectively and that G80 and R600 are almost out.
Originally posted by: Crusader
If you have the money, why not get both? It doesn't make sense to just avoid Nvidia products. Unless you don't want people using NV products, Joshua.
First, I don't know where you get off saying 7950 QuadSLI was introduced too early, Josh.
It hasn't been introduced to the consumer market at all, besides a BETA driver. You can save your prejudgements and obvious zeal to downplay 7950 QuadSLI until NV officially pulls the curtain and has WHQLs ready.
I know you want to judge based on beta drivers and so on, but I give companies fair quarter.
When the drivers are ready, Nvidia will let everyone know. Then you can unleash your obvious bias, k.
Till then, I'm not sure why you are directing that at me. I like Nvidia, but I haven't made any calls for QuadSLI. I personally have paid little attention to any Quad benchmarks.
When Nvidia pulls the veil on their product, I'll stand up and take notice.
Secondly, I'm not sure Rollo is "losing his touch". I'm not sure why you are comparing tests taken on different systems, at different resolutions, and probably different settings altogether, and trying to draw conclusions from that.
Yeah, we should assume the guy who had to ask how to do it ran the benches better than Rollo, who's benched FEAR many times and posted his results all over the web.
Besides the fact that the FS guy's processor is at 2.8GHz and Rollo likely doesn't have that, and the fact that they used different resolutions, we don't know if the FS guy used AA/AF.
HotHardware got a 95fps average with quad SLI at 16X12 using 4X16X like Rollo used.
Take a look at HotHardware's review with FEAR benches: http://www.hothardware.com/viewarticle.aspx?articleid=841&cid=2
Rollo's results:
FEAR 19X12 4X16X, all maxed: 90fps average, 42fps minimum
HotHardware's results, 16x12 4X16X: 95fps avg
Losing his touch? His results look pretty good from here.
I'd say, if you consider yourself to have a pair and you have an issue with Rollo, take it to a forum he's registered on and face him. You could always ask him yourself.
But it is easier to cower and curse his name at a distance. I understand, some people live their whole lives that way.
Originally posted by: josh6079
Originally posted by: Crusader
If you have the money, why not get both? It doesn't make sense to just avoid Nvidia products. Unless you don't want people using NV products, Joshua.
I never said they couldn't use an Nvidia card with a Conroe system. You're right, if they have the money they could do both. I just think that upgrading a complete system to DDR2 and Conroe will prove to be a longer-lived benefit than another 7950GX2.
Originally posted by: josh6079
Originally posted by: Crusader
First, I don't know where you get off saying 7950 QuadSLI was introduced too early, Josh.
It hasn't been introduced to the consumer market at all, besides a BETA driver. You can save your prejudgements and obvious zeal to downplay 7950 QuadSLI until NV officially pulls the curtain and has WHQLs ready.
I know you want to judge based on beta drivers and so on, but I give companies fair quarter.
When the drivers are ready, Nvidia will let everyone know. Then you can unleash your obvious bias, k.
Till then, I'm not sure why you are directing that at me. I like Nvidia, but I haven't made any calls for QuadSLI. I personally have paid little attention to any Quad benchmarks.
When Nvidia pulls the veil on their product, I'll stand up and take notice.
That just further proves my point. G80 is getting closer and closer, and so are DX10 and DX10 games. If they wait longer and longer to get drivers out for Quad-SLI, the capability will be very short-lived, a feature introduced too early only to be replaced six months or less later. I'm not saying it's a bad idea at all; it's pretty neat that they can link four GPUs together to perform better than one or two.
I don't know, I'm not about to guess at the individual PCs, how they were configured, and the game settings. Not to mention all the driver settings left out of the loop.
Originally posted by: josh6079
Originally posted by: Crusader
Secondly, I'm not sure Rollo is "losing his touch". I'm not sure why you are comparing tests taken on different systems, at different resolutions, and probably different settings altogether, and trying to draw conclusions from that.
The settings were labeled to be the same: max settings, 4xAA, 16xAF. The only thing that could be different is the resolution difference between 16x12 and 19x12. Unless you believe that one of those two is running on default Quality settings while the other is running on Nvidia's High Quality?
Originally posted by: josh6079
Originally posted by: Crusader
Yeah, we should assume the guy who had to ask how to do it ran the benches better than Rollo, who's benched FEAR many times and posted his results all over the web.
We don't need to assume, he did.
Originally posted by: josh6079
Originally posted by: Crusader
Besides the fact that the FS guy's processor is at 2.8GHz and Rollo likely doesn't have that, and the fact that they used different resolutions, we don't know if the FS guy used AA/AF.
C'mon Crusader, who has a quad-SLI rig and does not use AA/AF?
It is, and I realized that when I posted it. You are confused. It's not a big deal though, because like I said, trying to construe something out of off-the-cuff benchmarks by DIFFERENT systems/settings/unknown factors affecting them is an absolute waste of your time.
Originally posted by: josh6079
Originally posted by: Crusader
HotHardware got a 95fps average with quad SLI at 16X12 using 4X16X like Rollo used.
That's interesting since Rollo was gaming at 19x12.
Originally posted by: josh6079
Originally posted by: Crusader
Take a look at HotHardware's review with FEAR benches: http://www.hothardware.com/viewarticle.aspx?articleid=841&cid=2
Rollo's results:
FEAR 19X12 4X16X, all maxed: 90fps average, 42fps minimum
HotHardware's results, 16x12 4X16X: 95fps avg
Losing his touch? His results look pretty good from here.
I'd say, if you consider yourself to have a pair and you have an issue with Rollo, take it to a forum he's registered on and face him. You could always ask him yourself.
But it is easier to cower and curse his name at a distance. I understand, some people live their whole lives that way.
He's AEG trash, why would I try to argue with someone on a different forum about the video card that he probably got for free?
LOL.. and that's great. I don't care if you use Quad, or even like it. I pay no attention to it. There are no WHQL Quad drivers yet. I don't understand why you are so consumed with this topic. Trying to get a rise out of me?
Originally posted by: josh6079
You can take my previous posts as some kind of attack if you want to. Fact is, I just don't see the sense in quad-SLI when you can get very playable frames out of other Nvidia hardware for a lot less. You can be Rollo's dog and repeat all of his kinds of scores, but that doesn't change the fact that you're going to be paying around $1000 for more screen tearing, or horrible synchronization, all bundled with the same 7 series feature set that has been around for more than a year.
Yes it does. http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2795
Originally posted by: Crusader
Not in games
I was never debating that.
Fair enough. It's a good choice for those with the money and desire though.
I'll leave the comparisons for review sites, not some guy in a garage.
Which is why I (rightfully) don't buy your assertions about high-res HDR+AA being playable. Ackmed insists it's GREAT on a single card; others refute him, and that group of refuting parties includes review sites. When I call him out on that, he claims intensive portions of the game are not valid and do not matter.
So do you disregard any maximum or average framerates in a benchmark? Because it seems like you are quick to jump on ATI's minimum frames yet first in line to praise Nvidia's maximum ones.
Guess some people prefer to only look at the best-case scenario when recommending hardware?
That's not how I do things.
I'm not trying to say that Rollo is inept at making a gaming computer run well. I was simply saying that one of them was distorting the scores, because if both knew how to stress test the GX2's, the scores shouldn't have been that far off. Since one source is a known AEG member and the other didn't even know how to bench FEAR, a game with its own benchmark program built into it, all I was saying was that your sources weren't very credible. I'm with you here in saying that they are not directly comparable, yet you placed them together as evidence that qSLI performs well even if one setup didn't optimize everything that could potentially bog it down. I'm with you in that Rollo probably has his head on straighter than the guy who didn't know how to bench FEAR. Why you're so defensive about him is beyond me.
Erm.. ya, ok. If you think these are proper benchmarks done in controlled environments, or reveal anything about Rollo and his "inability to properly get a system running" (or whatever you are trying to infer).. you're out of your mind, josh.
Yes I do. Two 7800GT's under the belt, and I've experienced Nvidia's multi-GPU vsync first hand. That's more than I can say about you, commenting on something you haven't ever owned and downplaying it against something you do own. Talk about cowering and cursing at a distance. I understand, some people live their whole lives that way.
Originally posted by: Crusader
And you don't even know what you are talking about as far as screen tearing etc..
If ATI still is only offering the same things the X1900 series offers, only faster, over a year after its launch date, I will be just as upset with them. You think I hate Nvidia? Far from it. I just am not afraid to comment on where they have been lacking, just like I'm not afraid to comment on where ATI's been lacking. Nvidia did a good job with the 7 series GPU, but when comparing GPU to GPU, the X1900 always comes out on top for me, though maybe not for everyone, and I understand that. Nvidia did spend time and research in developing different multi-GPU constructions. That might save them some resources in the end, and because of that, they probably had a few extra spending dollars to invest in G80. It better be an awesome card.
Originally posted by: Crusader
Just hold that thought, kid. If that X1950 is released, let's hear the same out of your mouth about having the same feature set. You hate Nvidia FAR more than I have a disdain for ATI with these ridiculous comments. That much is clear.
Originally posted by: josh6079
Yes it does. http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2795
Originally posted by: Crusader
Not in games
Originally posted by: josh6079
Yes I do. Two 7800GT's under the belt, and I've experienced Nvidia's multi-GPU vsync first hand. That's more than I can say about you, commenting on something you haven't ever owned and downplaying it against something you do own. Talk about cowering and cursing at a distance. I understand, some people live their whole lives that way.
Originally posted by: Crusader
And you don't even know what you are talking about as far as screen tearing etc..
Originally posted by: josh6079
If ATI still is only offering the same things the X1900 series offers, only faster, over a year after its launch date, I will be just as upset with them. You think I hate Nvidia? Far from it. I just am not afraid to comment on where they have been lacking, just like I'm not afraid to comment on where ATI's been lacking. Nvidia did a good job with the 7 series GPU, but when comparing GPU to GPU, the X1900 always comes out on top for me, though maybe not for everyone, and I understand that. Nvidia did spend time and research in developing different multi-GPU constructions. That might save them some resources in the end, and because of that, they probably had a few extra spending dollars to invest in G80. It better be an awesome card.
Originally posted by: Crusader
Just hold that thought, kid. If that X1950 is released, let's hear the same out of your mouth about having the same feature set. You hate Nvidia FAR more than I have a disdain for ATI with these ridiculous comments. That much is clear.
The fact is, Quad-SLI is here, and while its performance increases at maximum frames are impressive, I can't help but feel that it is nothing but an e-penis checkbox. Around $1000 in video cards alone, and you'll only get crazy frames without vsync, making your ultra-high AA (which I think you'll agree with me is kind of redundant, since 4x is enough for you and me) look like crap because everything keeps tearing anyway. I guess my main frustration with all of the multi-GPU trends is that you get tempting performance increases, but more problematic screen fluidity to counter it.
Like I said, it is interesting and something I'm sure others are very excited about, but it is too expensive to use on a massive scale, too late in the life of DX9, and not worth the clunky vsync or horrible tearing you'll get.
Originally posted by: Crusader
It's clear you don't, as your comrades got him banned. Good going. Now there's no one here from Nvidia to forward driver issues to or relay responses from NV engineering.
Anything for ATI, I suppose. Even if it hurts the community at large.
Great job. You really do as much good around here as he did, helping people out, forwarding the issues you come across to engineering, running benches for people on request... Oh.. Wait.
I don't use it, and probably never will. But other people do use it, and like it. It's a great idea, and preliminary results show it works fine and blows the doors off anything your gods from Canada can muster. Why does that bother you? I mean, I can live with that fact.
And no useless wasted silicon on a half-baked HDR+AA implementation?
I'd take Quad SLI over an X1900 Xfire rig any day, anyone with more money than brains would.
Just hold that thought, kid. If that X1950 is released, let's hear the same out of your mouth about having the same feature set. You hate Nvidia FAR more than I have a disdain for ATI with these ridiculous comments. That much is clear.
If you are waiting for HDCP, what are you waiting for? Every GX2 already has that.
Originally posted by: josh6079
I see your point. DX10 isn't implemented yet and probably won't be until 2007. However, hardware for it is starting to come out (e.g. HDCP monitors, video cards), and because of that, why not wait for G80? If Nvidia is going to release it before ATI's R600 (like the 78's were released before the X1K series), you can't tell me that wouldn't be as good an investment as qSLI and a regular AMD setup.
I don't consider wasted silicon on features that don't run well at high resolutions to be "higher engineering", nor progression.
Originally posted by: josh6079
You can ramble all day about the differences between ATI and Nvidia; we know what they are. I don't care if ATI were to perish from the market, nor if Nvidia lost every piece of corporate asset. I've got a good, quality product from ATI that has fit my needs, just like you've gotten a good, quality product from Nvidia that has fit yours. They both have made them in an attempt to outsell the other and have therefore brought better products. I don't hold disdain for something that is instigating higher engineering and progression like you do. The fact that you like to see things as Kings and Queens (I thought you were damn proud to be an American?) is kind of sad. Without ATI, there might not have been a qSLI, just like without Nvidia, there might not have been the X1K series.
I can live with that.
Originally posted by: josh6079
This thread has gotten into a clarification between you and me and our views of qSLI. Our views are different, and I'm okay with that. Our intentions with gaming are different, and I'm okay with that. Perhaps I was wrong to start carrying on about the near future and the coming of DX10 and such; you are right in that no one really knows. You and I would purchase different things, and fortunately for the both of us the market is diverse enough for that to happen.
I'm not sure, but I don't really like to brag about quad or condone it until Nvidia has what they feel is a market-ready product. I'd give ATI the same benefit if they were in a similar situation.
Originally posted by: josh6079
Back to the topic, out of the beta drivers is there anything in them that people are hoping gets improved before the official ones hit? I know sometimes the only thing that makes them different is the WHQL certification.
Also, just how heavy of a PSU does one need for it? Some companies exaggerate the amount required, but what is the lowest-wattage PSU someone can get by with?
Originally posted by: Crusader
I don't consider wasted silicon on features that don't run well at high resolutions to be "higher engineering", nor progression.
That's called a check-box feature. Looks nice, feels nice. But it will be about as useful as PS1.4 on an 8500 is while running BF2 today. It won't mean it's "future proof" (to any extent at all). It's a bad implementation, meant to sell cards, nothing more.
Originally posted by: thilan29
Originally posted by: Crusader
I don't consider wasted silicon on features that don't run well at high resolutions to be "higher engineering", nor progression.
That's called a check-box feature. Looks nice, feels nice. But it will be about as useful as PS1.4 on an 8500 is while running BF2 today. It won't mean it's "future proof" (to any extent at all). It's a bad implementation, meant to sell cards, nothing more.
So what's your opinion on SM3, which was introduced with the 6 series with ONE game supporting it at the time and a massive performance hit??
That wasn't progression (I actually thought it was)?? By your logic the 6800 series was badly engineered??
I'd be surprised if I get any decent, logical response to that.
Originally posted by: Crusader
Originally posted by: thilan29
Originally posted by: Crusader
I don't consider wasted silicon on features that don't run well at high resolutions to be "higher engineering", nor progression.
That's called a check-box feature. Looks nice, feels nice. But it will be about as useful as PS1.4 on an 8500 is while running BF2 today. It won't mean it's "future proof" (to any extent at all). It's a bad implementation, meant to sell cards, nothing more.
So what's your opinion on SM3, which was introduced with the 6 series with ONE game supporting it at the time and a massive performance hit??
That wasn't progression (I actually thought it was)?? By your logic the 6800 series was badly engineered??
I'd be surprised if I get any decent, logical response to that.
A decent logical response.. do you deserve it?
My view on SM3 is that it's mostly a performance enhancer, beyond vertex displacement mapping.
So, while there might not have been a ton of games that used it, there's certainly nothing wrong with the GeForce 6's implementation of a feature meant to speed up processes.
It's a bonus, and one that can't hurt, because there's no such thing as a "slow SM3 implementation".
HDR+AA on ATI is simply too slow for high res in a first-person game. Not to mention there's only one really popular game that uses it, and it requires a 3rd-party hack to work.
So no, I'm not a huge fan of that.
This is your quote, but I've never seen you say such a nice thing about HDR+AA. However, I see you sort of sidestepped my question and focused only on "SM3", when I'm sure you could have guessed I was referring to the performance hit in FarCry (the first game to use the 6 series SM3 features).
Originally posted by: Crusader
It's a bonus, and one that can't hurt, because there's no such thing as a "slow SM3 implementation".
Originally posted by: thilan29
Alright, my question is specifically about the FP16 HDR implementation that was a highly touted feature of SM3, especially for FarCry. It took a massive performance hit... but was still heavily promoted.
Originally posted by: Gstanfor
FP16 HDR was not and is not part of SM3.0...
Custom fp16-fp32 shader program...
Shader Model 3.0 gives developers full and precise control over specular and fog computations, previously fixed-function.
I asked Yerli about the image quality problems on GeForce 6800 cards in version 1.1 of Far Cry. He dismissed those as "just rendering issues" and said that the Shader Model 3.0 path cleaned up those problems. The SM3.0 path does include a mix of FP16 and FP32 datatypes on GeForce 6800 cards for the sake of performance, but Yerli said Crytek never sacrificed image quality for performance, a move that's "not an option" for them. Out of curiosity, I also asked whether he knew of any instances where the use of FP24 precision on ATI cards caused visual artifacts, and he said no.