If you could have either the x800 xt pe or 6800 ultra which would you take?


nRollo

Banned
Jan 11, 2002
10,460
0
0
Answer changes as of today, or should:
I'd buy a 6800/6800GT/6800U because I could SLI it and have hands down the fastest, most feature rich, best IQ setup known to man.
I could pay a lot of money now for this as an enthusiast, or wait until the card shows its age and stretch its life if I'm not.
The 6800 is all about options and flexibility, for users and developers alike.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Rollo
Answer changes as of today, or should:
I'd buy a 6800/6800GT/6800U because I could SLI it and have hands down the fastest, most feature rich, best IQ setup known to man.
I could pay a lot of money now for this as an enthusiast, or wait until the card shows its age and stretch its life if I'm not.
The 6800 is all about options and flexibility, for users and developers alike.

well. that certainly is a poor answer.

1) there are no 6800 pci-e cards.
2) there are no dual pci-e mainboards.

you might want to save this answer for this fall, cause if you bought a couple now, you'd be SOL ;)
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: CaiNaM
Originally posted by: Rollo
Answer changes as of today, or should:
I'd buy a 6800/6800GT/6800U because I could SLI it and have hands down the fastest, most feature rich, best IQ setup known to man.
I could pay a lot of money now for this as an enthusiast, or wait until the card shows its age and stretch its life if I'm not.
The 6800 is all about options and flexibility, for users and developers alike.

well. that certainly is a poor answer.

1) there are no 6800 pci-e cards.
2) there are no dual pci-e mainboards.

you might want to save this answer for this fall, cause if you bought a couple now, you'd be SOL ;)

How about I re-phrase Cainam?

"The announcement of the return of SLI-esque technology gives the serious and moderately serious gamer ample reason to consider waiting for a PCIE 6800/6800GT/6800U over any other card currently made. With the promise of 77-90% more actual performance waiting in Sept. for those willing to pay a relatively steep start up cost, or the possibility of having high end performance for at least two generations of cards instead of one, the "advantage" of slightly higher framerates on some games seems much less important than it once did."

Better? ;)
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Rollo
Originally posted by: CaiNaM
Originally posted by: Rollo
Answer changes as of today, or should:
I'd buy a 6800/6800GT/6800U because I could SLI it and have hands down the fastest, most feature rich, best IQ setup known to man.
I could pay a lot of money now for this as an enthusiast, or wait until the card shows its age and stretch its life if I'm not.
The 6800 is all about options and flexibility, for users and developers alike.

well. that certainly is a poor answer.

1) there are no 6800 pci-e cards.
2) there are no dual pci-e mainboards.

you might want to save this answer for this fall, cause if you bought a couple now, you'd be SOL ;)

How about I re-phrase Cainam?

"The announcement of the return of SLI-esque technology gives the serious and moderately serious gamer ample reason to consider waiting for a PCIE 6800/6800GT/6800U over any other card currently made. With the promise of 77-90% more actual performance waiting in Sept. for those willing to pay a relatively steep start up cost, or the possibility of having high end performance for at least two generations of cards instead of one, the "advantage" of slightly higher framerates on some games seems much less important than it once did."

Better? ;)

i certainly could not fault that reasoning :)

for me tho, it might mean i just hang on to my x800 (rather than getting a GT for my second rig as i was planning) until this fall, as i would likely have to upgrade my psu (tt silentpower 480) along with my mainboard.

for me tho what is enticing is the upgrade possibilities. sli really isn't needed currently (nothing out now or in the near future that my card couldn't run at 60fps+), but say 6-12 months from now it could certainly be a very good alternative to having to upgrade to new technology - just add a second, less expensive card.

edit: also... thinking more on it, it occurred to me that right now many games are cpu limited. w/o upgrading my cpu considerably (currently a 3.2 p4), the added expense of mb, psu, and second gfx card would not make much sense. while it would be cool, this is really geared towards the future imo. this certainly would make the 6800 series somewhat "futureproof", which is a big thing, as i really don't believe there is such a thing regarding gfx cards currently.
 

FiberoN

Senior member
Apr 10, 2004
390
0
0
My X800 is running flawlessly. I don't understand why you would want a card that uses more power, is bigger, and runs hotter instead of a card that is faster, has a smaller heatsink, runs on less power, and is just plain kickass :)
 

gururu

Platinum Member
Jul 16, 2002
2,402
0
0
Reading these threads for the past few months, I was convinced that the 6800U was the card to have. In reality, the X800XT smokes the 6800 Ultra, which performs on par with the X800 Pro.
See HardOCP's new article.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: FiberoN
My X800 is running flawlessly. I don't understand why you would want a card that uses more power, is bigger, and runs hotter instead of a card that is faster, has a smaller heatsink, runs on less power, and is just plain kickass :)

actually the r420s put out more heat... could just be that ati's cooling solution is not quite as good tho.

and flawlessly? you obviously don't play SC. other than that my x800 runs near flawlessly as well (is anything ever flawless?), but that doesn't take anything away from the 6800, and doesn't mean nv40 isn't good on its own merits.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: gururu
Reading these threads for the past few months, I was convinced that the 6800U was the card to have. In reality, the X800XT smokes the 6800 Ultra, which performs on par with the X800 Pro.
See HardOCP's new article.

I have a feeling anything ATI is going to "smoke" anything nVidia on Kyle's page, Gururu; he'll run every setting he can till he finds one that makes ATI look better. If that doesn't work, he'll disable the nVidia optimizations, ask ATI how to ratchet the level of theirs, and say it's because to him it looks comparable. If that doesn't work, he'll use GF1 drivers because they're supposed to be unified, right?

I know I'm exaggerating, but H seems to have developed a need to put nVidia in a negative light of late. I don't know if they stopped sending him cards, or if Brian Burke dissed Kyle's mom, but he really seems to go out of his way not to put up standardized reviews, with optimizations either both enabled or both disabled, over the last year or so.

I foresee his review this fall:
X800 Pro 133% faster than 6800U SLI! Save your money!
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: FiberoN
My X800 is running flawlessly. I don't understand why you would want a card that uses more power, is bigger, and runs hotter instead of a card that is faster, has a smaller heatsink, runs on less power, and is just plain kickass :)


I see the heat, size, and power argument a lot from ATI minded folk. What do gamers with gaming boxes care about any of that?
Do you think we have mini towers with only the psu fan? I have an Enlight full tower with 1 intake, 2 exhaust, a Thermaltake Xaser 3 with 7 fans, and a mid tower with 2 exhaust. If a card is big and hot it doesn't matter to me.

The kickass card has no possibility of ever running SM3, or in SLI.
 

gururu

Platinum Member
Jul 16, 2002
2,402
0
0
Rollo, I've noticed a good deal of ATI love by Kyle the last few years. But it hasn't been totally unwarranted; the cards were praised by many a site. It's getting really confusing, and I guess the answer (fastest card) won't be known until the retail solutions are tested in a similar CPU-scaling manner by Anand, whose own results with the strict XT and Ultra comparison showed an equal playing field. ;)


But honestly Rollo,
"Answer changes as of today, or should: I'd buy a 6800/6800GT/6800U because I could SLI it and have hands down the fastest, most feature rich, best IQ setup known to man. I could pay a lot of money now for this as an enthusiast, or wait until the card shows it's age and stretch it's life if I'm not. The 6800 is all about options and flexibility, for users and developers alike. "

you really, really sound like nvidia PR with that quote! Shame on you!

gururu
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Originally posted by: Rollo
Originally posted by: gururu
Reading these threads for the past few months, I was convinced that the 6800U was the card to have. In reality, the X800XT smokes the 6800 Ultra, which performs on par with the X800 Pro.
See HardOCP's new article.

I have a feeling anything ATI is going to "smoke" anything nVidia on Kyle's page, Gururu; he'll run every setting he can till he finds one that makes ATI look better. If that doesn't work, he'll disable the nVidia optimizations, ask ATI how to ratchet the level of theirs, and say it's because to him it looks comparable. If that doesn't work, he'll use GF1 drivers because they're supposed to be unified, right?

I know I'm exaggerating, but H seems to have developed a need to put nVidia in a negative light of late. I don't know if they stopped sending him cards, or if Brian Burke dissed Kyle's mom, but he really seems to go out of his way not to put up standardized reviews, with optimizations either both enabled or both disabled, over the last year or so.

I foresee his review this fall:
X800 Pro 133% faster than 6800U SLI! Save your money!
Kyle = The Anti Rollo! :p
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Old Fart:
Kyle = The Anti Rollo!
LOL literally. Thank you.

Gururu:
you really, really sound like nvidia PR with that quote! Shame on you!
LOL again. Sorry, I was/am pretty hyped over the prospect of SLI. It just throws everything else out the window in terms of what we consider a "high performing" card.
You do realize that the difference we've just seen between last gen and this gen is about to be doubled?!
I loved my sli rig, for some of the same reasons I loved my 5800U. It wasn't elegant by a long shot, but it had the "brute force" factor.

So Kyle can pick a few settings with some drivers where the X800 Pro seems as fast as a 6800U. NOTHING is going to seem fast compared to 6800x SLI. There won't be anything that remotely compares.
 

fstime

Diamond Member
Jan 18, 2004
4,382
5
81
They...are even, seriously; they each have their advantages. I myself will be getting a 6800 series card since it's on sale at CompUSA (if they ever get it in stock).
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: CaiNaM
Originally posted by: Rollo
Answer changes as of today, or should:
I'd buy a 6800/6800GT/6800U because I could SLI it and have hands down the fastest, most feature rich, best IQ setup known to man.
I could pay a lot of money now for this as an enthusiast, or wait until the card shows its age and stretch its life if I'm not.
The 6800 is all about options and flexibility, for users and developers alike.

well. that certainly is a poor answer.

1) there are no 6800 pci-e cards.
2) there are no dual pci-e mainboards.

you might want to save this answer for this fall, cause if you bought a couple now, you'd be SOL ;)

WE KNOW it can't be purchased right now. Don't mean to shout, but I just wanted to make sure you got that. We are thinking out of the box. We have gotten past the fact that we can't buy this now and are now waiting. September is not that far away, and many enthusiasts will gladly wait, me included, for the PCI-E cards and dual PCI-E mobos. Intel has one that can be purchased right now. Though intended to be a server board, it is available for quite a bit of cashola. Yes, you say it's interesting, and it is very interesting indeed.

This announcement of "soon to be" SLI nVidia cards has probably made a good number of people who were intending to go out and buy ATI this weekend pause and say, "Hmm, I just might be interested in that. I think I'll wait a bit." And they turn their cars around and go home, not making their ATI purchase. I am willing to wager ATI will lose many sales over this, not just because of SLI, but because of the scalability and the option to buy one now and buy one later if I need to, even if it is only a small percentage of high-end users compared to the total scope of consumers.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: FiberoN
My X800 is running flawlessly. I don't understand why you would want a card that uses more power, is bigger, and runs hotter instead of a card that is faster, has a smaller heatsink, runs on less power, and is just plain kickass :)

Trust me. No 6800 user is going to open their electric bill at the end of the month and say, "Awww, crap. I knew I should have bought the ATI card."

I swear, people are reacting to the 15 to 25w difference (or whatever it is) as if they are adding central A/C to their houses.
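Just to put a rough number on it, here's a back-of-the-napkin sketch in Python; the wattage, daily hours, and electricity rate are all assumed figures for illustration, not measurements:

# Rough yearly cost of a card that draws an extra 25 W, using assumed numbers.
extra_watts = 25          # assumed worst-case power difference between the cards
hours_per_day = 4         # assumed daily gaming time
price_per_kwh = 0.10      # assumed electricity price in $/kWh

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_cost_per_year = extra_kwh_per_year * price_per_kwh
print(f"{extra_kwh_per_year:.1f} kWh/year, about ${extra_cost_per_year:.2f}/year")
# roughly 36.5 kWh a year, i.e. a few dollars - not exactly central A/C money

Even being generous with those assumptions, it works out to pocket change over a year.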
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: gururu
Rollo, I've noticed a good deal of ATI love by Kyle the last few years. But it hasn't been totally unwarranted; the cards were praised by many a site. It's getting really confusing, and I guess the answer (fastest card) won't be known until the retail solutions are tested in a similar CPU-scaling manner by Anand, whose own results with the strict XT and Ultra comparison showed an equal playing field. ;)


But honestly Rollo,
"Answer changes as of today, or should: I'd buy a 6800/6800GT/6800U because I could SLI it and have hands down the fastest, most feature rich, best IQ setup known to man. I could pay a lot of money now for this as an enthusiast, or wait until the card shows it's age and stretch it's life if I'm not. The 6800 is all about options and flexibility, for users and developers alike. "

you really, really sound like nvidia PR with that quote! Shame on you!

gururu

Regardless of how Rollo sounds here, gururu, he is still correct. Name one part of his post that is incorrect besides the "now" part. We all know he means when it first becomes available.
Nvidia just opened up the floodgates with pure power x2 and respect for them is warranted here. Nvidia is being extremely brutal this go round. I think they may have changed their company motto to "Enough is Enough".. J/K. :)
 

gururu

Platinum Member
Jul 16, 2002
2,402
0
0
Originally posted by: keysplayr2003


Regardless of how Rollo sounds here, gururu, he is still correct. Name one part of his post that is incorrect besides the "now" part. We all know he means when it first becomes available.
Nvidia just opened up the floodgates with pure power x2 and respect for them is warranted here. Nvidia is being extremely brutal this go round. I think they may have changed their company motto to "Enough is Enough".. J/K. :)

incorrect, yea...
"fastest, most feature rich, best IQ setup known to man"

have you ever seen a pixar film?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
I am no more biased today than I was then, which is "not at all"?
I don't know how biased you were in the past but your nVidia bias has been clearly visible ever since your fetish with the 5800U began about a year ago.

You're right, a year ago I said over and over that PS2 was irrelevant because there were no PS2 games.
But there were SM 2.0 games out already. The problem is that you dismissed them because you didn't like them; you don't stop to look at pixels and/or "smack smack smack, oohhh... ahhhh... shiny pipes".

Ironically you used the last "reason" on Far Cry but now you're claiming SM 3.0 matters because Far Cry is a good game and because the SM 3.0 patch is due.

So when ATi had the Far Cry advantage in SM 2.0 it was a crap game and didn't matter. But when nVidia potentially has the Far Cry advantage in SM 3.0 it's not only a good game but the fact that the patch isn't even out yet doesn't matter.

The cards that you can buy now far outperform the first gen PS2 parts.
Likewise the next gen cards will kill the current generation in SM 3.0 performance. Why then do you recommend nVidia?

The difference with SM3 is there are two good games out right now (Far Cry/Painkiller) that will be patched to SM3 in a couple months.
There are no SM 3.0 games at all right now. The only difference here is that your nVidia bias has caused you to do a complete 180 in your position.

There are nine more games in development that will be SM3, some of which will come out in the next year.
And?

At the very high level of performance of the R420 and nV40 cards, I don't see a reason to deny yourself the ability to see what SM3 offers to get a few more frames at a level of performance where you won't be able to tell the difference anyway.
And just how exactly would you know what the performance hit of running SM 3.0 would be? Do you have a link to a single game showing the difference between running SM 2.0 and SM 3.0 paths?

Likewise, why deny yourself the ability to run ATi under SM 2.0 and get both double nVidia's SM 2.0 performance and only a marginal decrease from SM 1.x (I'm referring to the last gen)?

I don't see anything "biased" about that, it seems like common sense.
That's the problem - you don't see anything wrong with your behaviour. You also don't see anything wrong with your 5800U shenanigans and have somehow convinced yourself that what you did is normal behaviour.

I think it's deceptive that ATI advised reviewers to disable it for comparison and, when called on their fraud, stated they consider their brilinear true trilinear.
Yet when nVidia secretly slipped brilinear into all Direct3D games, your answer was that you don't spend your days looking at still screenshots and instead play games.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: keysplayr2003
Originally posted by: CaiNaM
Originally posted by: Rollo
Answer changes as of today, or should:
I'd buy a 6800/6800GT/6800U because I could SLI it and have hands down the fastest, most feature rich, best IQ setup known to man.
I could pay a lot of money now for this as an enthusiast, or wait until the card shows its age and stretch its life if I'm not.
The 6800 is all about options and flexibility, for users and developers alike.

well. that certainly is a poor answer.

1) there are no 6800 pci-e cards.
2) there are no dual pci-e mainboards.

you might want to save this answer for this fall, cause if you bought a couple now, you'd be SOL ;)

WE KNOW it cant be purchased right now. Dont mean to shout, but I just wanted to make sure you got that.

you need to read the comment in the context in which it's said....
 

NewBlackDak

Senior member
Sep 16, 2003
530
0
0
The other day in Sam's I saw a test where they had a regular light bulb and a fluorescent bulb hooked up to a power meter. You could flip back and forth and see which was drawing more current. Of course this can be totally faked, but I'd like to see identically configured machines (barring these 2 cards) put through a battery of graphics benchmarks. Then tell us how much power they used. It'd be cool if they did this with A64 vs. Prescott and other things as well.
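If someone logged the meter readings while the benchmarks ran, the math would be trivial. A minimal Python sketch, assuming one wattage sample per second from the meter (the sample values below are made up):

# Estimate energy used during a benchmark run from logged power-meter readings.
# The readings are hypothetical one-per-second samples in watts.
samples_watts = [310, 325, 340, 338, 331, 322]
interval_s = 1.0                                  # assumed time between samples, in seconds

energy_joules = sum(samples_watts) * interval_s   # treat each sample as holding for one interval
energy_wh = energy_joules / 3600.0
print(f"{energy_wh:.2f} Wh used over {len(samples_watts) * interval_s:.0f} s of benchmarking")

Run the same script against logs from both machines and you'd have an apples-to-apples power comparison.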
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: BFG10K
I am no more biased today than I was then, which is "not at all"?
I don't know how biased you were in the past but your nVidia bias has been clearly visible ever since your fetish with the 5800U began about a year ago.
Err, I've never said anything untrue about 5800Us. All I've ever said is that up to 12x10 4X/8X they are generally comparable to 9700 Pros on non-DX9 games (and that I like the engineering of them). You can believe me, if ATI were still making MAXXs I'd be all over them as well. For that matter, I was much less biased than you.
I bought a 9700P (used 8 months), a 5800 (used 2 months), a 9800 Pro (used 7 months), and a 5800U (used 3 months).
So I actually knew what I was talking about. You used a 9700 Pro for those two years and quoted articles as justification for your skewed exaggerations. :roll:

You're right, a year ago I said over and over that PS2 was irrelevant because there were no PS2 games.
But there were SM 2.0 games out already. The problem is that you dismissed them because you didn't like them; you don't stop to look at pixels and/or "smack smack smack, oohhh... ahhhh... shiny pipes".

Ironically you used the last "reason" on Far Cry but now you're claiming SM 3.0 matters because Far Cry is a good game and because the SM 3.0 patch is due.
No there weren't SM 2 games out last year, other than Wallet Raider: She Won't Go Where I Point Her. Like I said all along, there would be no relevant SM2 games until a year after the nV3X series were launched/available, and by the time there would be, new cards would make R300/nV3X irrelevant. The "shiny pipes" comment you refer to was made this April, a year after nV30 availability.
Rollo=Nostradamus BFG="Well, the games might have come out during the year and a half I needlessly pimped SM2"


So when ATi had the Far Cry advantage in SM 2.0 it was a crap game and didn't matter. But when nVidia potentially has the Far Cry advantage in SM 3.0 it's not only a good game but the fact that the patch isn't even out yet doesn't matter.
The jury's still out on whether I like Far Cry; I just started it. It's a popular game though, and I recognize that. It seems a lot better than my original opinion of the demo. (and I said at the time I only knew the demo was bad)

The cards that you can buy now far outperform the first gen PS2 parts.
Likewise the next gen cards will kill the current generation in SM 3.0 performance. Why then do you recommend nVidia?
There will be actual SM3 games that come out in the year I consider a vid card's life. Rollo=Nostradamus ;)
For sure there will be SM3 games out by the time the useful life of the SLI nV40 is over. Rollo=Not in Denial

The difference with SM3 is there are two good games out right now (Far Cry/Painkiller) that will be patched to SM3 in a couple months.
There are no SM 3.0 games at all right now. The only difference here is that your nVidia bias has caused you to do a complete 180 in your position.
LOL- it doesn't take much to guess the SM3 patches for two games sitting on my desk will probably be done a lot sooner than the year it took for a couple games to show up back when I made those comments.


There are nine more games in development that will be SM3, some of which will come out in the next year.
And?
And I'd like to hear the logic of why someone would deny themselves the ability to see what they're like in SM3, deny themselves the possibility of SLI power and longevity, for a few more frames at levels where a few more don't matter. (and even that only on some games)

At the very high level of performance of the R420 and nV40 cards, I don't see a reason to deny yourself the ability to see what SM3 offers to get a few more frames at a level of performance where you won't be able to tell the difference anyway.
And just how exactly would you know what the performance hit of running SM 3.0 would be? Do you have a link to a single game showing the difference between running SM 2.0 and SM 3.0 paths?
Who knows? I'm giving it a chance at least. If I bought an X800XT I might regret the choice as the games come out.

Likewise, why deny yourself the ability to run ATi under SM 2.0 and get both double nVidia's SM 2.0 performance and only a marginal decrease from SM 1.x (I'm referring to the last gen)?
There weren't games, BFG. Now that there are, my $300 MSRP 6800, which some people are getting for $200, smacks a $400 9800XT around.

I don't see anything "biased" about that, it seems like common sense.
That's the problem - you don't see anything wrong with your behaviour. You also don't see anything wrong with your 5800U shenanigans and have somehow convinced yourself that what you did is normal behaviour.
LOL- my head hangs low for my crimes of liking to try video cards, especially enthusiast setups like SLI, MAXX, 5800U. LOL

I think it's deceptive that ATI advised reviewers to disable it for comparison and, when called on their fraud, stated they consider their brilinear true trilinear.
Yet when nVidia secretly slipped brilinear into all Direct3D games, your answer was that you don't spend your days looking at still screenshots and instead play games.
That is true, and remains true to this day. The only reason I make a big deal about "Trylinear" is the 8,034,266X guys like you shoved brilinear down my throat as if it were some proof nVidia are Nazi war criminals. Damn skippy you're going to hear about it when your beloved, championed, put-on-a-pedestal Canadian Gods not only do the same thing but compound it by a. denying it, b. telling reviewers to turn off nVidia optimizations, and c. refusing to remove it from their drivers.
Sue me for not just chuckling and saying, "What a wacky turn of events".
Yeah, I didn't mind pointing that out at all.

BTW- I got GnR Live Era yesterday, if you don't have it, it's worth your money. Brings back a lot of good memories of my own youth.
Please take no offense at the "heated" debate, it reads more hostile than I mean it to.
 

bcoupland

Senior member
Jun 26, 2004
346
0
76
I'd take a 9800 Pro for $200 and save the remaining money for the next-gen ATI or Nvidia card, R500 or NV50, whichever one performs better. Or, I'd wait five months if I already had at least a 9600XT or a 5900XT, and wait for the prices of the 6800s and X800s to come down.
 

DarkKnight

Golden Member
Apr 21, 2001
1,197
0
0
I'd take 2 6800 Ultras :D
If I had to choose only one, I'd still take the 6800 Ultra, cause then later down the road when my PC is getting older, I can get another one (when the prices are lower) and get a 77% boost in performance.

If it's true that Alienware is gonna let other companies use their video array, then I'd have to see how the 2 setups would compare.
 

UnTech

Member
Mar 25, 2002
169
0
0
Originally posted by: bcoupland
I'd take a 9800 Pro for $200 and save the remaining money for the next-gen ATI or Nvidia card, R500 or NV50, whichever one performs better. Or, I'd wait five months if I already had at least a 9600XT or a 5900XT, and wait for the prices of the 6800s and X800s to come down.


Good point, unless Half-Life 2 and Doom 3 are released in the next few months. But I couldn't wait; I just got my BFG 6800GT today.

BTW-Rollo- GnR Live Era rocks. I just picked up a CD the other day called "Hollywood Rose-the roots of guns n' roses". It's early GnR (before Slash).
 

SmuvMoney

Member
Sep 9, 2002
28
0
0
I'd take a single-slot 6800 Ultra over an X800 XT. However, only two board makers are making single-slot Ultras. I would definitely take an X800 XT over a reference/dual-slot Ultra.