Video Card Problem - X850XT or 6800 Ultra

jenneth

Member
Mar 4, 2005
125
0
76
Hi all,

I have decided to get a new computer soon (probably in about two weeks), and I have narrowed down my choices of video cards to the following:

1.)eVGA nVIDIA GeForce 6800 Ultra - Dual DVI

2.)SAPPHIRE ATI RADEON X850 XT

3.)Gigabyte GV-RX85T256V Radeon X850 XT - Dual DVI

I have a couple of questions:
1.) How's Gigabyte's X850XT? I know it has dual DVI output, but how does it compare to the Sapphire X850XT?
2.) I have a 2405FPW, and I've heard that some 6800 Ultras were having problems with this LCD (when using the DVI connector). Can someone clarify this for me?

Thank you,
jenneth
 

BillyBobJoel71

Platinum Member
Mar 24, 2005
2,610
0
71
Not sure about the LCD thing, but the X850 XT is probably better than the 6800 Ultra. Being an nVidia fan, I would get the 6800 U, but it's up to you. Both are awesome but expensive cards. Research them on the ATI and nVidia websites.
 

imverygifted

Golden Member
Dec 22, 2004
1,368
0
0
Game performance-wise, I'd go for the 6800 Ultra, simply because games on the Doom 3 engine run better on the nVidia card.
 

kini62

Senior member
Jan 31, 2005
254
0
0
Originally posted by: imverygifted
Game performance-wise, I'd go for the 6800 Ultra, simply because games on the Doom 3 engine run better on the nVidia card.

Games? What games? You have Doom III and, well, Doom III. The Doom engine is not so hot. It's a resource hog that dumps everything off on the GPU because id was too lazy to code it right. HL2 looks every bit as good and will run on half the resources. Vivendi is not using it for their new game F.E.A.R., nor is anyone else that I know of.


Not a good reason to buy a video card.

The X850XT beats it in everything else.

Or if you have to get an nVidia, get the GT instead and OC to Ultra levels.

Monarch has them on sale.
 

bjc112

Lifer
Dec 23, 2000
11,460
0
76
I like the X850xt w/ dual DVI.

Even though my 6800GT @ Ultra kicks ass as well.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: kini62
Originally posted by: imverygifted
Game performance-wise, I'd go for the 6800 Ultra, simply because games on the Doom 3 engine run better on the nVidia card.

Games? What games? You have Doom III and, well, Doom III. The Doom engine is not so hot. It's a resource hog that dumps everything off on the GPU because id was too lazy to code it right. HL2 looks every bit as good and will run on half the resources. Vivendi is not using it for their new game F.E.A.R., nor is anyone else that I know of.


Not a good reason to buy a video card.

The X850XT beats it in everything else.

Or if you have to get an nVidia, get the GT instead and OC to Ultra levels.

Monarch has them on sale.

Yeah, no one's gonna play Quake 4 :roll:

And I don't hear anyone bitching about inconsistent engine performance in Doom 3, only in HL2 and CS: Source.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
If you've got the money to consider this stuff, you should be getting the Ultra and a SLI motherboard.
I don't know much about the widescreen lcd issue, but I believe it's only with SLI and could be solved with future driver releases.
Edit: AFAIK this "issue" only applies to widescreen lcds and one game engine (HL2) so I don't even consider it an "issue" given that you can simply opt not to run widescreen and play fine with all SLI benefit.

If you're buying this set to last you a year or more, the 6800U offers SM3 and the ability to dramatically increase performance with SLI.

The X850XT PE offers marginally better performance on some of the games that are out now.

At least one upcoming game, Splinter Cell Chaos Theory, is SM3 only. That means you will have to play that in SM1.1 on an X850.

If you're planning to play Quake 4 (or any other Doom3 engine game) on an X850XT PE, good luck. I played Doom3 on an X800XT PE this month with the latest drivers, it ran like ass. It benched about the same as one 6800GT, but I can tell you it was nowhere near as smooth. I think there were more low spikes.

If you want to see HDR in Far Cry, you will not on an ATI card.

If you want to see soft shadows in Riddick, you will not on an ATI card.

There are upcoming games coded in SM3, it's possible you will not get the displacement mapping with an ATI card.

This may all sound like I'm anti-ATI, but I'm not. They make excellent products. The problem is that the X850XT PE is a stopgap card limited to a 2003 feature set, and it's going to be obsolete when their real 2005 card, the R520, is released. If you want ATI, wait for that. Don't waste $500 on a faster version of a card they put out two years ago that doesn't even meet the MS standard for DX.
 

McArra

Diamond Member
May 21, 2003
3,295
0
0
A 6800GT OC'd to Ultra for me. HDR and Splinter Cell 3 are worth it to me...
 

jenneth

Member
Mar 4, 2005
125
0
76
The problem is that the X850XT PE is a stopgap card limited to a 2003 feature set, and it's going to be obsolete when their real 2005 card, the R520, is released. If you want ATI, wait for that. Don't waste $500 on a faster version of a card they put out two years ago that doesn't even meet the MS standard for DX.

Yeah, that's what I'm afraid of. What I really want to do is wait for the next gen (NV70? and R520), but the problem is that I don't know when they're going to be out. :( Does anyone know?

BTW, thanks for the help, everyone... :)
jenneth
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: jenneth
The problem is that the X850XT PE is a stopgap card limited to a 2003 feature set, and it's going to be obsolete when their real 2005 card, the R520, is released. If you want ATI, wait for that. Don't waste $500 on a faster version of a card they put out two years ago that doesn't even meet the MS standard for DX.

Yeah, that's what I'm afraid of. What I really want to do is wait for the next gen (NV70? and R520), but the problem is that I don't know when they're going to be out. :( Does anyone know?

BTW, thanks for the help, everyone... :)
jenneth

It is rumored the R520 will be out in May, but that's just a rumor. In other words, expect to be able to buy it sometime next year :roll: (OK, I'm exaggerating). It's also rumored to cost $600-700, which is why I advise you to save your money and not buy an X850 this year. It doesn't have SM3 or fast stencil shadows.

I don't know about this gen, but last gen was full of paper launches with cards that didn't show up at MSRP until 6+ months later...

On your WUXGA LCD (1920x1200 native resolution), you're not going to play new games at any decent FPS. If you didn't already know, other resolutions will look chunky/blurry on LCDs because they have a fixed number of pixels. Here are some 2048x1536 benchmarks of the GeForce 6800 Ultra; the ATI X850 series will be the same or slower on these particular OpenGL engines. http://www.nvnews.net/previews/geforce_6800_ultra/page_6.shtml

You may be able to play Wolf: ET and Call of Duty, but don't even try Doom 3 or Far Cry.
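To put some numbers on the native-resolution point: a fixed-pixel panel has to stretch any lower-resolution image, and a non-integer stretch smears each game pixel across fractional physical pixels, which is where the blur comes from. A rough sketch (the panel and game resolutions are just illustrative examples):

```python
# Why non-native resolutions blur on a fixed-pixel LCD:
# the panel must stretch the image, and a non-integer stretch
# smears each game pixel across fractional physical pixels.
PANEL_W, PANEL_H = 1920, 1200  # WUXGA, e.g. the Dell 2405FPW

for w, h in [(800, 600), (1024, 768), (1280, 1024), (1600, 1200)]:
    # An aspect-preserving scaler uses the smaller of the two factors.
    scale = min(PANEL_W / w, PANEL_H / h)
    verdict = "integer scale (stays sharp)" if scale == int(scale) else "fractional scale (interpolated/blurry)"
    print(f"{w}x{h} -> x{scale:.3f}: {verdict}")
```

Note that 800x600 and 1600x1200 happen to land on integer factors against a 1920x1200 panel (with black borders), while the in-between resolutions get interpolated.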

----------------------------------------------------------------------
Some games on the Doom 3 engine (not officially confirmed):

Quake 4
Return to Castle Wolfenstein 2
There may be some other ones that I'm unaware of as well.
----------------------------------------------------------------------

My advice:

Wait for next generation.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
On your WUXGA LCD (1920x1200 native resolution), you're not going to play new games at any decent FPS. If you didn't already know, other resolutions will look chunky/blurry on LCDs because they have a fixed number of pixels. Here are some 2048x1536 benchmarks.

He could play at that resolution fine with a 6800GT or 6800U SLI setup - maybe not at 4xAA/8xAF, but certainly at some setting. That's something no single card will give him, and at 19x12 the need for much AA becomes debatable anyway.
 

fstime

Diamond Member
Jan 18, 2004
4,382
5
81
The X850XT is the better card right now, but with that monitor I would get a 6800GT SLI setup.
 

imported_Noob

Senior member
Dec 4, 2004
812
0
0
If SM 3.0 matters to you, go with the 6800 Ultra; if it doesn't, go with the X850 XT. Don't go with Gigabyte for the X850 - Sapphire has been known for some of the best-quality ATI cards. Go with BFG if you don't plan to OC, and Leadtek if you do. BFG also has a 512MB 6800 Ultra, though I don't know who sells it.
 

OnEMoReTrY

Senior member
Jul 1, 2004
520
0
0
Originally posted by: Acanthus
Originally posted by: kini62
Originally posted by: imverygifted
Game performance-wise, I'd go for the 6800 Ultra, simply because games on the Doom 3 engine run better on the nVidia card.

Games? What games? You have Doom III and, well, Doom III. The Doom engine is not so hot. It's a resource hog that dumps everything off on the GPU because id was too lazy to code it right. HL2 looks every bit as good and will run on half the resources. Vivendi is not using it for their new game F.E.A.R., nor is anyone else that I know of.


Not a good reason to buy a video card.

The X850XT beats it in everything else.

Or if you have to get an nVidia, get the GT instead and OC to Ultra levels.

Monarch has them on sale.

Yeah, no one's gonna play Quake 4 :roll:

And I don't hear anyone bitching about inconsistent engine performance in Doom 3, only in HL2 and CS: Source.

That's because no one plays Doom 3 ^_^
 

kini62

Senior member
Jan 31, 2005
254
0
0
Originally posted by: Rollo
If you've got the money to consider this stuff, you should be getting the Ultra and a SLI motherboard.
I don't know much about the widescreen lcd issue, but I believe it's only with SLI and could be solved with future driver releases.
Edit: AFAIK this "issue" only applies to widescreen lcds and one game engine (HL2) so I don't even consider it an "issue" given that you can simply opt not to run widescreen and play fine with all SLI benefit.

If you're buying this set to last you a year or more, the 6800U offers SM3 and the ability to dramatically increase performance with SLI.

The X850XT PE offers marginally better performance on some of the games that are out now.

At least one upcoming game, Splinter Cell Chaos Theory, is SM3 only. That means you will have to play that in SM1.1 on an X850.

If you're planning to play Quake 4 (or any other Doom3 engine game) on an X850XT PE, good luck. I played Doom3 on an X800XT PE this month with the latest drivers, it ran like ass. It benched about the same as one 6800GT, but I can tell you it was nowhere near as smooth. I think there were more low spikes.

If you want to see HDR in Far Cry, you will not on an ATI card.

If you want to see soft shadows in Riddick, you will not on an ATI card.

There are upcoming games coded in SM3, it's possible you will not get the displacement mapping with an ATI card.

This may all sound like I'm anti-ATI, but I'm not. They make excellent products. The problem is that the X850XT PE is a stopgap card limited to a 2003 feature set, and it's going to be obsolete when their real 2005 card, the R520, is released. If you want ATI, wait for that. Don't waste $500 on a faster version of a card they put out two years ago that doesn't even meet the MS standard for DX.


If you want to see HDR in Far Cry, then you'll need an SLI setup, as it sucks the life out of a single nVidia card. Plus it really doesn't do much for the look; neither does SM3. The numerous threads on this site explain all the hype concerning SM3. Hype is about all it is. Same with HDR.

Any developer going with the Doom III engine is a fool. There are much better alternatives out there that don't require a card with 512MB of memory to run at high resolution. I mean, how pathetic is it that they (id) didn't even bother to support widescreen monitors?
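For what it's worth, the widescreen gap is usually worked around with an FOV tweak: the standard "Hor+" math widens a horizontal FOV designed for 4:3 so a widescreen monitor shows more at the sides instead of stretching. This is generic trigonometry, not anything taken from the Doom 3 engine itself; the function name and the example numbers are just illustrative.

```python
import math

def hor_plus_fov(fov_4x3_deg, aspect_w, aspect_h):
    """Widen a horizontal FOV designed for 4:3 so a widescreen
    display shows more at the sides instead of stretching the image."""
    half = math.radians(fov_4x3_deg) / 2
    ratio = (aspect_w / aspect_h) / (4 / 3)  # how much wider than 4:3
    return math.degrees(2 * math.atan(math.tan(half) * ratio))

# A standard 90-degree 4:3 FOV widens to about 100.4 degrees at 16:10,
# the aspect ratio of a 1920x1200 panel.
print(round(hor_plus_fov(90, 16, 10), 1))  # -> 100.4
```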
 

jenneth

Member
Mar 4, 2005
125
0
76
At this time I'm not thinking about SLI, but since so many people recommended it...

This is what I think:

1.) a single X850XT,
2.) dual 6800GT (now), or
3.) one 6800Ultra (now), another 6800Ultra later.

jenneth
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
If you want to see HDR in Far Cry, then you'll need an SLI setup, as it sucks the life out of a single nVidia card. Plus it really doesn't do much for the look; neither does SM3.
Spoken like someone who's never seen HDR on their aged GPU.

The numerous threads on this site explain all the hype concerning SM3. Hype is about all it is. Same with HDR.
Will it be "hype" when you're playing Splinter Cell 3 with SM1 from the 1990s because the developer doesn't care that ATI can't keep up with the MS standard?
You don't know if upcoming games will use displacement mapping, or whether they'll go to the trouble of retro-coding for your ancient X850 feature set if they do.

Any developer going with the Doom III engine is a fool.
Gee, since you're not a developer, I guess your opinion means... nothing. The fact is that Quake 4 and Wolfenstein 2 (and others) are being developed on the Doom 3 engine. I guess they're "fools". :roll:

There are much better alternatives out there that don't require a card with 512MB of memory to run at high resolution. I mean, how pathetic is it that they (id) didn't even bother to support widescreen monitors?
Well, if you were id, you might consider the 512MB a nod toward the longevity/future of the engine. You see, kini62, id makes a big chunk of their money off of licensing their engines. If they write them to look good on tomorrow's hardware, they sell more licenses.
You're obviously a noob if you think only "fools" will license the Doom 3 engine - Carmack's engines are always licensed more than any other, because no one really disputes they're technically as good as any and better than most. :roll:

As far as the widescreen monitors go, :roll:, yeah, tough break to miss out on that .001% of the monitor market.

All spoken like a man who just spent $500 on an obsolete GPU.
 

imported_Noob

Senior member
Dec 4, 2004
812
0
0
SM 2.0b can support HDR. Valve is implementing it in HL2 and CS:S, and the requirement is at least SM 2.0b. As for SC3 and U3, ATI users are damn out of luck. But I think Ubisoft isn't using SM 2.0b because they are partners with nVidia - why use SM 1.1 and 3.0 but skip 2.0b? Most games need an SLI configuration to run properly on nVidia cards, which is expensive. The X850 XT can match 6800 Ultra SLI very closely in some games. Many future games might be too demanding for the 6800s to run at max resolution anyway - besides SC3, of course.

jenneth, I suggest holding off until the new cards come out. The X850 is too outdated, and the 6800 Ultra needs an SLI configuration, which may be too expensive.

And Rollo, since you have both cards, can you take pics of Far Cry with HDR bloom on and off?
 

kini62

Senior member
Jan 31, 2005
254
0
0
Originally posted by: Rollo
If you want to see HDR in Far Cry, then you'll need an SLI setup, as it sucks the life out of a single nVidia card. Plus it really doesn't do much for the look; neither does SM3.
Spoken like someone who's never seen HDR on their aged GPU.

But I did see it (sort of, because the FPS was so slow on the 6800GT I had before), and it made the game unplayable at 1680x1050. Even with it off, the X850 is MUCH more playable at HIGHER settings.

The numerous threads on this site explain all the hype concerning SM3. Hype is about all it is. Same with HDR.
Will it be "hype" when you're playing Splinter Cell 3 with SM1 from the 1990s because the developer doesn't care that ATI can't keep up with the MS standard?
You don't know if upcoming games will use displacement mapping, or whether they'll go to the trouble of retro-coding for your ancient X850 feature set if they do.

I don't have any intention of playing Splinter Cell 3. And if the developer wants to alienate the 99% of the market that doesn't have SM3, then let them. Who cares? I don't, because the game probably sucks anyway; that's why they have it coded to only look nice on 1% of the video cards.

Any developer going with the Doom III engine is a fool.
Gee, since you're not a developer, I guess your opinion means... nothing. The fact is that Quake 4 and Wolfenstein 2 (and others) are being developed on the Doom 3 engine. I guess they're "fools". :roll:

That they would be. Let's look at Doom III. It has some nice lighting effects, but what else? The character modeling is pretty good, and close-ups of the enemies look good for the few seconds they're actually on screen, but there is no AI to speak of, and there is little physics, i.e. hardly any interaction with the environment. Sure, you can kick a box, and the barrels blow up and move things - wow, impressive. You can't interact with anything. Walls, consoles, and facilities look horrible up close. Even in the one or two areas where there is enough light to see, there is nothing much to see. The game just doesn't look very good given the resources it takes. HL2 looks way better and actually offers a good physics engine and almost complete interaction with the environment. Have you played HL2 from the start with the Super Gravity Gun? It's awesome. Oh, and it has AI in the code already.

There are much better alternatives out there that don't require a card with 512MB of memory to run at high resolution. I mean, how pathetic is it that they (id) didn't even bother to support widescreen monitors?
Well, if you were id, you might consider the 512MB a nod toward the longevity/future of the engine. You see, kini62, id makes a big chunk of their money off of licensing their engines. If they write them to look good on tomorrow's hardware, they sell more licenses.
You're obviously a noob if you think only "fools" will license the Doom 3 engine - Carmack's engines are always licensed more than any other, because no one really disputes they're technically as good as any and better than most. :roll:

Only games like Quake that have no AI or physics - just multiplayer matches. Boring single-player gaming again, because the engine is so one-dimensional: looks (and it's not that good looking) at the expense of decent gameplay. Doom III sucks. It's boring. Same thing over and over: run around in the dark and shoot the same things. Every level looks the same and plays the same. Gee, can't wait for Quake 4.

As far as making them look good for tomorrow's hardware goes: the majority, the vast majority, of buyers don't even have today's hardware, let alone tomorrow's.

As far as the widescreen monitors go, :roll:, yeah, tough break to miss out on that .001% of the monitor market.

Just goes to show how lazy the code really is. Yeah, it makes a lot more sense to put their effort into making buyers fork over $800 for a 512MB video card than to have widescreen support like every other modern game!

All spoken like a man who just spent $500 on an obsolete GPU.

It came with the system, and if it's so obsolete, then how do you explain the fact that it will outperform your 6800 (if you have one) in EVERY game except the pathetic Doom III? I guess that makes the 6800 beyond obsolete. Oh, I guess you'll say they're planning for future games. If that's the case, then there is no need for nVidia to come out with their next-gen GPU, since according to you it's already there.

 

imported_Noob

Senior member
Dec 4, 2004
812
0
0
Originally posted by: kini62
Originally posted by: Rollo

Even with it off, the X850 is MUCH more playable at HIGHER settings.
Especially with AA and AF on. Not just in Far Cry, either.

I don't have any intention of playing Splinter Cell 3. And if the developer wants to alienate the 99% of the market that doesn't have SM3, then let them. Who cares? I don't, because the game probably sucks anyway; that's why they have it coded to only look nice on 1% of the video cards.
Lots of people have 6800s.
 

jenneth

Member
Mar 4, 2005
125
0
76
Okay, I've narrowed down my choices to the following:

1.) a single X850XT, or
2.) one 6800 Ultra (now), another 6800 Ultra later.

I just have a couple of questions regarding the SLI:
1.) Does anyone know how stable SLI is?
2.) Let's say I buy an eVGA 6800 Ultra now, and a year from now I buy another eVGA 6800 Ultra (with a newer BIOS, perhaps). Will I have any compatibility problems between these two video cards? Do video card manufacturers frequently change their designs?

Thanks,
jenneth