Originally posted by: zendari
Originally posted by: Gamingphreek
Again in SCCT: now the X850 XT PE is 50% faster than the 6800 U.
Of course there is the small tidbit that the X800/X850 series runs the SM1.1 path because those cards do not support SM3.0. I'm sure that is responsible for a VERY large percentage of the performance difference.
-Kevin
Learn to read, Kevin.
Chaos Theory has a Shader Model 3.0 path, and we used it on the GeForce cards, but we didn't turn on any of the extra effects; it was simply a performance enhancement.
How would the 6800 U have done without its performance enhancement, I wonder?
Originally posted by: BouZouki
Can't accept that the Nvidia 6 series gets beaten badly by ATI or something?
Anyone can reply with "SO WHAT?? 6800GT DESTROYS YOU, KTHXBYE"
Originally posted by: Rollo
Originally posted by: BouZouki
Can't accept that the Nvidia 6 series gets beaten badly by ATI or something?
Anyone can reply with "SO WHAT?? 6800GT DESTROYS YOU, KTHXBYE"
Another useless Bouzouki post, congrats, you're consistent!
Let me break it down so Bouzouki can understand:
If two competing cards run a game at 1 fps and 2 fps, it's true the second card "beats the first badly" at twice the speed, but there's not much reason to care.
You see Bouzouki, if both are running the game at a level that is below playable, no one will ever see that game at that setting, unless they're really, really stupid, or really, really wasted.
The thread was about SLI; he was referencing single-card performance. I was correct to point that out.
Worse yet, the single-card performance he was referencing averaged in the teens and twenties, totally unplayable.
Does that help, Bouzouki? PM me if I need to go into more detail about why my post was accurate and appropriate, and yours was just another in a long line of meaningless babble.
I'd be happy to help you.
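The arithmetic behind Rollo's 1 fps vs. 2 fps point is worth spelling out: a relative speedup only matters once at least one card clears a playability floor. A minimal Python sketch of that logic (the 30 fps threshold and the function name are illustrative, not from any bench in this thread):

# A 2x relative lead means little below the playable floor: what matters
# first is whether each card clears the threshold at all.
PLAYABLE_FPS = 30.0  # rule-of-thumb floor; Rollo argues for low 40s

def verdict(a_fps: float, b_fps: float) -> str:
    speedup = b_fps / a_fps
    if max(a_fps, b_fps) < PLAYABLE_FPS:
        return f"card B is {speedup:.1f}x faster, but neither is playable"
    if min(a_fps, b_fps) >= PLAYABLE_FPS:
        return f"both playable; card B is {speedup:.1f}x faster"
    return f"only one card clears {PLAYABLE_FPS:.0f} fps, and that gap is the real story"

print(verdict(1, 2))    # 2.0x faster, neither playable
print(verdict(22, 32))  # the SCCT averages argued over later in the thread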
Originally posted by: BouZouki
Originally posted by: Rollo
Originally posted by: BouZouki
Can't accept that the Nvidia 6 series gets beaten badly by ATI or something?
Anyone can reply with "SO WHAT?? 6800GT DESTROYS YOU, KTHXBYE"
Another useless Bouzouki post, congrats, you're consistent!
Let me break it down so Bouzouki can understand:
If two competing cards run a game at 1 fps and 2 fps, it's true the second card "beats the first badly" at twice the speed, but there's not much reason to care.
You see Bouzouki, if both are running the game at a level that is below playable, no one will ever see that game at that setting, unless they're really, really stupid, or really, really wasted.
The thread was about SLI; he was referencing single-card performance. I was correct to point that out.
Worse yet, the single-card performance he was referencing averaged in the teens and twenties, totally unplayable.
Does that help, Bouzouki? PM me if I need to go into more detail about why my post was accurate and appropriate, and yours was just another in a long line of meaningless babble.
I'd be happy to help you.
Thanks for clearing that up, Rollo!
Originally posted by: Rollo
Originally posted by: BouZouki
Can't accept that the Nvidia 6 series gets beaten badly by ATI or something?
Anyone can reply with "SO WHAT?? 6800GT DESTROYS YOU, KTHXBYE"
Another useless Bouzouki post, congrats, you're consistent!
Let me break it down so Bouzouki can understand:
If two competing cards run a game at 1 fps and 2 fps, it's true the second card "beats the first badly" at twice the speed, but there's not much reason to care.
You see Bouzouki, if both are running the game at a level that is below playable, no one will ever see that game at that setting, unless they're really, really stupid, or really, really wasted.
The thread was about SLI; he was referencing single-card performance. I was correct to point that out.
I'd be happy to help you.
Originally posted by: zendari
Who defines playable?
Me, but not you.
Originally posted by: zendari
And if this is the case, why are your super awesome HDR and soft shadows (which run like crap at 16x12 on a 6800 card) of any importance? They run at mediocre FPS.
They might not be to you. Some people think it's better to have the option to see them at all than not at all.
Originally posted by: zendari
Oohhh ahhhh yeaaahhh I can run awesome HDR at 10-year-old 10x7 res which looks like it's from 2000.
Originally posted by: zendari
I can certainly see how some people could consider the 32 FPS by the X850 playable in SCCT and the 22 FPS by the 6800 U unplayable.
You might see 32 fps as "playable"- I consider it a JOKE. If you see stuttering when the framerate drops below 30, Zendari, how often do you think it stutters if your average is 32? :roll: I consider low 40s to be about the minimum, 50s to be smooth.
Originally posted by: zendari
But continue comparing $1000 Nvidia rigs to $300-400 ATI rigs and proclaiming Nvidia's superiority.
And you keep on apologizing for why ATI can't figure out things like SLI, soft shadows, SM3, HDR, and Linux, trying to play it off like "you don't need this stuff anyway".
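Rollo's stutter point rests on the gap between average fps and frame-to-frame consistency: a 32 fps average implies plenty of individual frames under 30. A short Python illustration with a synthetic frame-time trace (the numbers are made up for the example, not measured):

# Averages hide frame-time spikes: this hypothetical 10-frame trace
# averages ~32 fps, yet 3 of 10 frames run slower than 30 fps.
frame_times_ms = [24, 26, 45, 25, 55, 26, 28, 40, 22, 24]

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
slow = sum(1 for t in frame_times_ms if t > 1000 / 30)  # frames under 30 fps

print(f"average: {avg_fps:.1f} fps")                            # ~31.7 fps
print(f"frames under 30 fps: {slow} of {len(frame_times_ms)}")  # 3 of 10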
Originally posted by: Rollo
Originally posted by: zendari
Who defines playable?
Me, but not you.
I'm glad you take the initiative to decide everything for other people. Not doing so would conflict with your videocard superiority complex.
Originally posted by: Rollo
Originally posted by: zendari
And if this is the case, why are your super awesome HDR and soft shadows (which run like crap at 16x12 on a 6800 card) of any importance? They run at mediocre FPS.
They might not be to you. Some people think it's better to have the option to see them at all than not at all.
I loved seeing Far Cry HDR run at 15 FPS at 16x12 when I had my 6800 card. But "not much reason to care," as you aptly put it.
Originally posted by: Rollo
Originally posted by: zendari
Oohhh ahhhh yeaaahhh I can run awesome HDR at 10-year-old 10x7 res which looks like it's from 2000.
I can assure you nothing we had in 2000 looked like 10x7 Far Cry with HDR. Beyond that, not all implementations of HDR are equal- SC:CT's HDR benefits much from SLI, whereas Far Cry's barely changes. And while there is no ATI "SLI" solution, to a lot of people that is a very big deal.
Beyond that, my own benches show HDR is a very usable feature on the 7800 GTX- 51-68 fps average at 12x10 with HDR 0X8X: http://forums.anandtech.com/messageview.aspx?catid=31&threadid=1622346&enterthread=y
37-48 fps average at 16x12 with HDR 0X8X.
The days of HDR being "questionable" in its use are behind us. Time to get with the program, Zendari; the technology is leaving you behind.
7800 <> 6800. I wonder what nonsense you'll be parroting when the R520 is released.
Originally posted by: Rollo
Originally posted by: zendari
I can certainly see how some people could consider the 32 FPS by the X850 playable in SCCT and the 22 FPS by the 6800 U unplayable.
You might see 32 fps as "playable"- I consider it a JOKE. If you see stuttering when the framerate drops below 30, Zendari, how often do you think it stutters if your average is 32? :roll: I consider low 40s to be about the minimum, 50s to be smooth.
Good. I take it you agree the 6800 U is a bad card? Your vaunted 6800 U SLI setup doesn't offer those framerates at 20x15 in Far Cry, HL2, Riddick, or SCCT. I guess the "next-gen ready" 6800 U SLI you kept parroting about a few months ago is stuck at last-gen resolutions along with a single X850!
Originally posted by: Rollo
Originally posted by: zendari
But continue comparing $1000 Nvidia rigs to $300-400 ATI rigs and proclaiming Nvidia's superiority.
And you keep on apologizing for why ATI can't figure out things like SLI, soft shadows, SM3, HDR, and Linux, trying to play it off like "you don't need this stuff anyway".
You sound EXACTLY like the 3dfx apologists of yore- "You don't need 32-bit color, bump mapping, or hardware transform and lighting! When you do, 3dfx will give it to you!" :roll:
Brand loyalty is a harsh mistress, Zendari.
It rather sounds like the Nvidia apologists' "you don't need no stinkin' SM2.0" from the R300 days.
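For what it's worth, Rollo's two HDR benches let you sanity-check the scaling. Assuming the usual forum shorthand (12x10 = 1280x1024, 16x12 = 1600x1200), the fps drop tracks the pixel-count increase, which is what you'd expect if the setting is mostly fill-rate bound; a quick Python check using only the numbers quoted above:

# Pixel-count ratio between the two tested resolutions (shorthand assumed:
# 12x10 = 1280x1024, 16x12 = 1600x1200).
pixel_ratio = (1600 * 1200) / (1280 * 1024)   # ~1.46x more pixels

# Observed drop from the quoted 7800 GTX HDR averages.
drop_low  = 51 / 37                           # ~1.38x
drop_high = 68 / 48                           # ~1.42x

print(f"pixel increase: {pixel_ratio:.2f}x")
print(f"fps drop: {drop_low:.2f}x to {drop_high:.2f}x")
# The drop is close to the pixel ratio, consistent with the workload
# being mostly fill-rate limited at these settings.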
