is there any point to SLI?


PSUstoekl

Member
Jun 20, 2005
137
0
0
Yes, I am an "average" computer user... meaning I have "only" a 1280 resolution and a 3500 CPU. By the way, my GPU is pretty damn ghetto (a 9600, free HL2 baby!).

I'm waiting for the next-gen ATi wonderments.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: zendari
Originally posted by: Gamingphreek
Again in SCCT, now the x850xtpe is 50% faster than the 6800 U.

Of course, there is the small tidbit that the X800 series runs in SM1.1 because it does not support 3.0. I'm sure that is responsible for a VERY large percentage of the performance difference.

-Kevin

Learn to read, Kevin.

Chaos Theory has a Shader Model 3.0 path, and we used it on the GeForce cards, but we didn't turn on any of the extra effects; it was simply a performance enhancement.

How would the 6800 U have done without its performance enhancement, I wonder.

You cannot disable features within a Shader Model; that is why there are separate 1.0, 1.1, 2.0, and 3.0 specs.
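
To make that concrete, here's a minimal sketch (my own, not from any shipping game) of how a Direct3D 9 engine picks its shader-model path from the device caps. The enum and function name are made up for illustration; only the d3d9.h types and macros are real.

// Minimal sketch: a D3D9 engine choosing a shader-model render path.
// RenderPath and PickRenderPath are hypothetical names.
#include <d3d9.h>

enum RenderPath { PATH_SM11, PATH_SM20, PATH_SM30 };

RenderPath PickRenderPath(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // The driver reports one pixel shader version; it cannot expose
    // "3.0 minus a few features", which is why the specs come in the
    // discrete steps 1.1 / 2.0 / 3.0.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
        return PATH_SM30; // the path the GeForce 6 series takes in SC:CT
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return PATH_SM20;
    return PATH_SM11;     // the fallback the X800/X850 gets in SC:CT
}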

Your logic on that entire post is flawed.

-Kevin
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: BouZouki
Can't accept that the Nvidia 6 series gets beaten badly by ATi or something?


Anyone can reply with "SO WHAT?? 6800GT DESTROYS YOU, KTHXBYE"

Another useless Bouzouki post, congrats, you're consistent!

Let me break it down so Bouzouki can understand:

If two competing cards run a game at 1 fps and 2 fps, then while it is true the second card "beats the first badly, twice as fast," there's not much reason to care.

You see Bouzouki, if both are running the game at a level that is below playable, no one will ever see that game at that setting, unless they're really, really stupid, or really, really wasted.

The thread was about SLI; he was referencing single-card performance. I was correct to point that out.

Worse yet, the single-card performance he was referencing was in the teens and twenties for averages: totally unplayable.

Does that help, Bouzouki? PM me if I need to go into more detail about why my post was accurate and appropriate, and yours was just another in a long line of meaningless babble.

I'd be happy to help you.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: zendari
Originally posted by: Gamingphreek
Again in SCCT, now the x850xtpe is 50% faster than the 6800 U.

Of course, there is the small tidbit that the X800 series runs in SM1.1 because it does not support 3.0. I'm sure that is responsible for a VERY large percentage of the performance difference.

-Kevin

Learn to read, Kevin.

Chaos Theory has a Shader Model 3.0 path, and we used it on the GeForce cards, but we didn't turn on any of the extra effects; it was simply a performance enhancement.

How would the 6800 U have done without its performance enhancement, I wonder.

I think Kevin can read; many have reported issues with banding and IQ in SM1.1 in SC:CT. Not to mention the card is doing more work, rendering the scenes more accurately (brighter lights, etc.).

 

fstime

Diamond Member
Jan 18, 2004
4,382
5
81
Originally posted by: Rollo
Originally posted by: BouZouki
Can't accept that the Nvidia 6 series gets beaten badly by ATi or something?


Anyone can reply with "SO WHAT?? 6800GT DESTROYS YOU, KTHXBYE"

Another useless Bouzouki post, congrats, you're consistent!

Let me break it down so Bouzouki can understand:

If two competing cards run a game at 1 fps and 2 fps, then while it is true the second card "beats the first badly, twice as fast," there's not much reason to care.

You see Bouzouki, if both are running the game at a level that is below playable, no one will ever see that game at that setting, unless they're really, really stupid, or really, really wasted.

The thread was about SLI; he was referencing single-card performance. I was correct to point that out.

Worse yet, the single-card performance he was referencing was in the teens and twenties for averages: totally unplayable.

Does that help, Bouzouki? PM me if I need to go into more detail about why my post was accurate and appropriate, and yours was just another in a long line of meaningless babble.

I'd be happy to help you.


Thanks for clearing that up, Rollo!
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: BouZouki
Originally posted by: Rollo
Originally posted by: BouZouki
Can't accept that the Nvidia 6 series gets beaten badly by ATi or something?


Anyone can reply with "SO WHAT?? 6800GT DESTROYS YOU, KTHXBYE"

Another useless Bouzouki post, congrats, you're consistent!

Let me break it down so Bouzouki can understand:

If two competing cards run a game at 1 fps and 2 fps, then while it is true the second card "beats the first badly, twice as fast," there's not much reason to care.

You see Bouzouki, if both are running the game at a level that is below playable, no one will ever see that game at that setting, unless they're really, really stupid, or really, really wasted.

The thread was about SLI; he was referencing single-card performance. I was correct to point that out.

Worse yet, the single-card performance he was referencing was in the teens and twenties for averages: totally unplayable.

Does that help, Bouzouki? PM me if I need to go into more detail about why my post was accurate and appropriate, and yours was just another in a long line of meaningless babble.

I'd be happy to help you.


Thanks for clearing that up, Rollo!


I'm all about the giving, Bouzouki.
 

zendari

Banned
May 27, 2005
6,558
0
0
Originally posted by: Rollo
Originally posted by: BouZouki
Can't accept that the Nvidia 6 series gets beaten badly by ATi or something?


Anyone can reply with "SO WHAT?? 6800GT DESTROYS YOU, KTHXBYE"

Another useless Bouzouki post, congrats, you're consistent!

Let me break it down so Bouzouki can understand:

If two competing cards run a game at 1 fps and 2 fps, then while it is true the second card "beats the first badly, twice as fast," there's not much reason to care.

You see Bouzouki, if both are running the game at a level that is below playable, no one will ever see that game at that setting, unless they're really, really stupid, or really, really wasted.

The thread was about SLI; he was referencing single-card performance. I was correct to point that out.


I'd be happy to help you.

Who defines playable? And if this is the case, why are your super awesome HDR and soft shadows (which run like crap at 16x12 on a 6800 card) of any importance? They run at mediocre FPS.


Oohhh ahhhh yeaaahhh, I can run awesome HDR at 10-year-old 10x7 res, which looks like it's from 2000.

I can certainly see how some people could consider the 32 FPS by the x850 playable in SCCT and the 22 FPS by the 6800 U unplayable.

But continue comparing $1000 Nvidia rigs to $300-400 ATI rigs and proclaiming Nvidia's superiority.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: zendari
Who defines playable?
Me, but not you.

And if this is the case, why are your super awesome HDR and soft shadows (which run like crap at 16x12 on a 6800 card) of any importance? They run at mediocre FPS.
They might not be to you. Some people think it's better to have the option to see them at all than not to have the option at all.

Oohhh ahhhh yeaaahhh, I can run awesome HDR at 10-year-old 10x7 res, which looks like it's from 2000.

I can assure you nothing we had in 2000 looked like 10X7 Far Cry with HDR. Beyond that, not all implementations of HDR are equal: SC:CT's HDR benefits greatly from SLI, whereas Far Cry's barely changes. And while there is no ATI "SLI" solution, to a lot of people that is a very big deal.
Beyond that, my own benches show HDR is a very usable feature on the 7800GTX: 51-68 fps average at 12x10 HDR 0X8X? (http://forums.anandtech.com/messageview.aspx?catid=31&threadid=1622346&enterthread=y)
37-48 fps average at 16X12 HDR 0X8X?

The days of HDR being "questionable" in its use are behind us. Time to get with the program, Zendari; the technology is leaving you behind.
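
For anyone wondering what an HDR path like that actually costs, here's a minimal D3D9 sketch (my own; device, width, and height are assumed variables, the calls and the FP16 format are real):

// Minimal sketch of a D3D9 HDR setup: the scene renders into a 64-bit
// floating-point target (double the bandwidth of a normal 8-bit buffer),
// then a tone-mapping pass writes the backbuffer. NV40-class cards can't
// multisample an FP16 target, which is likely why AA reads 0X above.
IDirect3DTexture9* hdrTarget = NULL;
device->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                      D3DFMT_A16B16G16R16F, // FP16 per channel
                      D3DPOOL_DEFAULT, &hdrTarget, NULL);
// 1) Draw the scene into hdrTarget.
// 2) Run a tone-mapping pixel shader that reads hdrTarget and writes the
//    8-bit backbuffer. All of this per-pixel work splits across two GPUs,
//    which is why some HDR implementations scale well with SLI.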

I can certainly see how some people could consider the 32 FPS by the x850 playable in SCCT and the 22 FPS by the 6800 U unplayable.
You might see 32 fps as "playable"; I consider it a JOKE. If you see stuttering when the framerate drops below 30, Zendari, how often do you think it stutters if your average is 32? :roll: I consider the low 40s to be about the minimum, and the 50s to be smooth.
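
To put numbers on the average-vs-minimum point, a little C++ sketch with made-up frame times:

// Sketch of why an average of 32 fps can still stutter: the average says
// nothing about the slowest frames. The frame times here are hypothetical,
// chosen only to make the arithmetic visible.
#include <algorithm>
#include <cstdio>
#include <vector>

int main()
{
    // Hypothetical per-frame times in milliseconds for a 32 fps average.
    std::vector<double> frameMs = { 25, 26, 27, 28, 30, 45, 26, 43 };

    double totalMs = 0, worstMs = 0;
    for (double ms : frameMs) {
        totalMs += ms;
        worstMs = std::max(worstMs, ms);
    }
    double avgFps = 1000.0 * frameMs.size() / totalMs;
    double minFps = 1000.0 / worstMs;

    // Prints: avg 32.0 fps, min 22.2 fps -- the run "averages 32"
    // yet its slowest frames dip well below the 30 fps stutter line.
    std::printf("avg %.1f fps, min %.1f fps\n", avgFps, minFps);
}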

But continue comparing $1000 Nvidia rigs to $300-400 ATI rigs and proclaiming Nvidia's superiority.
And you keep on apologizing for why ATI can't figure out things like SLI, soft shadows, SM3, HDR, and Linux, trying to play it off like "you don't need this stuff anyway."

You sound EXACTLY like the 3dfx apologists of yore: "You don't need 32-bit color, bump mapping, or hardware transform and lighting! When you do, 3dfx will give it to you!" :roll:

Brand loyalty is a harsh mistress, Zendari. ;)

 

zendari

Banned
May 27, 2005
6,558
0
0
Originally posted by: Rollo
Originally posted by: zendari
Who defines playable?
Me, but not you.

I'm glad you take the initiative to decide everything for other people. Not doing so would conflict with your video card superiority complex.

And if this is the case, why are your super awesome HDR and soft shadows (which run like crap at 16x12 on a 6800 card) of any importance? They run at mediocre FPS.
They might not be to you. Some people think it's better to have the option to see them at all than not to have the option at all.

I loved seeing Far Cry HDR run at 15 FPS at 16x12 when I had my 6800 card. But "not much reason to care," as you aptly put it.

Oohhh ahhhh yeaaahhh, I can run awesome HDR at 10-year-old 10x7 res, which looks like it's from 2000.

I can assure you nothing we had in 2000 looked like 10X7 Far Cry with HDR. Beyond that, not all implementations of HDR are equal: SC:CT's HDR benefits greatly from SLI, whereas Far Cry's barely changes. And while there is no ATI "SLI" solution, to a lot of people that is a very big deal.
Beyond that, my own benches show HDR is a very usable feature on the 7800GTX: 51-68 fps average at 12x10 HDR 0X8X? (http://forums.anandtech.com/messageview.aspx?catid=31&threadid=1622346&enterthread=y)
37-48 fps average at 16X12 HDR 0X8X?

The days of HDR being "questionable" in its use are behind us. Time to get with the program, Zendari; the technology is leaving you behind.

7800 <> 6800. I wonder what nonsense you'll be parroting when the R520 is released.

I can certainly see how some people could consider the 32 FPS by the x850 playable in SCCT and the 22 FPS by the 6800 U unplayable.
You might see 32 fps as "playable"; I consider it a JOKE. If you see stuttering when the framerate drops below 30, Zendari, how often do you think it stutters if your average is 32? :roll: I consider the low 40s to be about the minimum, and the 50s to be smooth.

Good. I take it you agree the 6800 U is a bad card? Your vaunted 6800 U SLI setup doesn't offer those framerates at 20x15 in Far Cry, HL2, Riddick, or SC:CT. I guess the "next-gen ready" 6800 U SLI that you kept parroting about a few months ago is stuck at last-gen resolutions along with a single x850!

But continue comparing $1000 Nvidia rigs to $300-400 ATI rigs and proclaiming Nvidia's superiority.
And you keep on apologizing for why ATI can't figure out things like SLI, soft shadows, SM3, HDR, and Linux, trying to play it off like "you don't need this stuff anyway."

You sound EXACTLY like the 3dfx apologists of yore: "You don't need 32-bit color, bump mapping, or hardware transform and lighting! When you do, 3dfx will give it to you!" :roll:

Brand loyalty is a harsh mistress, Zendari. ;)

It rather sounds like the Nvidia apologists of the R300 days: "You don't need no stinkin' SM2.0."



I'll pose you the question again: why get a 6800 U or GT SLI over a 7800 GTX when it can't compete at high-res gaming?