
Which is better _overclocked_, X800XL or 6800GT?


Originally posted by: ScrewFace
Definitely go with the Radeon X800XL. It has better drivers that are updated every month, unlike nVidia drivers, which are updated only every six months. I've heard the X800XL can be overclocked to X800XT speeds and beyond. It's also better in Direct3D, which most games are made for, and, of course, it's cheaper! 🙂


Hmm, there's a reason ATI releases monthly drivers... their driver team still has a lot to do to catch up to Nvidia's driver team. Also, there are pretty much weekly beta drivers for NV cards, and I don't think all of ATi's drivers are WHQL certified (not saying all of Nvidia's are) because they release them so fast.

They are working hard to support their hardware and customers, though; you can't ask for more than that.

 
Originally posted by: housecat
Originally posted by: yliu
Hi,

I know that at stock speeds, many consider the X800XL to be slightly better performing than the 6800GT overall. However, from what I read, the overclockability of the 6800GT is a lot better, since many can go to Ultra speeds and more (that's over a 14% gain), while the X800XL is often limited to about a 10-12% gain. So has anyone compared the performance of a 6800GT at 400/1100 and an X800XL at 450/550? I am only asking about performance, so cost is not a factor.
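For what it's worth, the percentage gains quoted above can be sanity-checked with a little arithmetic. The stock clocks below (6800GT at 350/1000, 6800 Ultra at 400/1100, X800XL at 400/980 effective) are assumed reference specs from memory, so treat this as a rough sketch rather than vendor-exact numbers:

```python
# Rough sanity check of the overclocking-headroom percentages quoted above.
# Stock clocks are assumed reference specs (core MHz / effective memory MHz)
# and may differ by board vendor.

def gain(stock, oc):
    """Percentage gain of an overclock over stock."""
    return (oc - stock) / stock * 100

# 6800GT stock -> 6800 Ultra speeds (the 400/1100 mentioned above)
gt_core = gain(350, 400)     # core gain
gt_mem = gain(1000, 1100)    # memory gain

# X800XL stock -> the 450/550 (1100 effective) overclock mentioned above
xl_core = gain(400, 450)     # core gain
xl_mem = gain(490, 550)      # memory gain (490 MHz = 980 effective)

print(f"6800GT: core +{gt_core:.1f}%, mem +{gt_mem:.1f}%")
print(f"X800XL: core +{xl_core:.1f}%, mem +{xl_mem:.1f}%")
```

This lines up with the thread's figures: a GT reaching Ultra speeds gains about 14% on the core, while the XL's headroom sits in the 10-12% range.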

The bottom line answer to your question:

Overall, ATI and NV are about equal in DirectX performance, but in OpenGL NV wins by a landslide; see Doom3.

Well, actually, in most OGL games they're fairly close these days (certainly not a "landslide" in NVidia's favor). GF5/6 cards get a huge performance boost in Doom3 because of the way they handle stencil shadows (essentially, these cards were built around the Doom3 engine). If you want to play Doom3 or games based on that engine (so far, just Quake4), obviously an NV card is for you. ATI has a decent edge in HL2, a smaller one in Far Cry, and is pretty damn close in almost everything else, especially once you turn on AA/AF.

Now, on the X800XL: it's too late to be buying one of these. ATI's new cards are right around the corner, with DX9C support and beyond.

If you buy today, you want DX9C (the current DX standard), and ONLY NV cards have that.

So you can get almost unusably slow HDR in Far Cry, or so you can have almost unusably slow SM3.0 features/HDR in the new Splinter Cell game? Yeah, that's worth $100 more.

This has been talked to death in a dozen threads -- IMO, SM3.0 is not that big a feature for this generation of cards. The advanced IQ features you really want to be able to use with it (such as displacement mapping) are not going to run well on today's cards. And almost all of the performance boost (which is not all that much) is available through SM2.0b on the R4XX hardware (if the developer chooses to support it).

The X800XL does not overclock well. Not nearly as well as the Geforces.

The X800XL has more pipelines (than the X800Pro), and the X800Pro has low-k dielectric insulation, which is why it overclocks well.

So the X800XL is kind of stunted in its ability. If it were built like the X800Pro, you'd have a VERY fast and overclockable card... too much for ATI's taste; it'd cannibalize their X850 sales.

Not everybody wants to overclock their card, and the X800XL was built on a smaller process than the X800Pro -- 110nm versus 130nm (I thought both cards used low-k dielectrics, but I might be wrong). It's a smaller die and cheaper to manufacture, but it doesn't overclock as well. That's the price you pay for a $300 MSRP card with nearly X800XT/6800GT performance.

You can listen to the ATI fanboys who know nothing other than "X800XL is $300 ROCKSORZ!", or you can listen to me. Take your pick, but you'll notice they offer short quips as to why you should get an X800XL... and I give you the WHOLE story.
From top to bottom, the Geforce6 lineup hands down sprays feces all over the ATI lineup. Sorry, it's the truth. The best ATI card is the X800XL, and, as stated, even it isn't so great, all things considered.

Good luck.

Yeah... there's an unbiased opinion for you. :disgust:

Both cards have about the same stock performance. The 6800GT has SM3.0 and can do FP framebuffers (which is NOT an SM3.0 feature, but is what HDR in Far Cry and the new SC game uses). However, it runs games quite slowly with those features enabled, so it's not clear how useful they will really be in the long run. It generally seems that the 6800GT overclocks better (percentage-wise) than the X800XL, although maybe not drastically so (i.e., an X800XL that overclocks well will likely still do better than a 6800GT that overclocks poorly).

However, the X800XL PCIe costs under $300, whereas the 6800GT PCIe costs close to $400. If possibly better overclocking and SM3.0 (even given the performance hits you take with today's games) are worth $100 to you... hey, it's your money. In AGP, they're about the same price, so there the 6800GT is a much better choice.
 
I have my GT running at 410/1110, which is as high as I can get it to run, and I understand that's on the low side, overclock-wise. If you get lucky, you might get an XL that OCs well, but Coolbits is a better OCing utility than anything ATi has. I've been happy with my GT (or rather, Ultra OC 😉), but I'm the kind of guy that always thirsts for more. Shader 3.0 isn't that big a deal; get whichever. The REAL upgrade you need is WGF 2.0, but that's more than a year off. The performance difference is minuscule. Here's my best advice: give 400 dollars to your lawyer, or someone, and tell them, no matter what, not to let you have the money for 6 months. Then, in October, spend it on a video card. Next-gen performance is not to be missed.
 
Originally posted by: slash196
I have my GT running at 410/1110, which is as high as I can get it to run, and I understand that's on the low side, overclock-wise. If you get lucky, you might get an XL that OCs well, but Coolbits is a better OCing utility than anything ATi has. I've been happy with my GT (or rather, Ultra OC 😉), but I'm the kind of guy that always thirsts for more. Shader 3.0 isn't that big a deal; get whichever. The REAL upgrade you need is WGF 2.0, but that's more than a year off. The performance difference is minuscule. Here's my best advice: give 400 dollars to your lawyer, or someone, and tell them, no matter what, not to let you have the money for 6 months. Then, in October, spend it on a video card. Next-gen performance is not to be missed.

Yeah... right, you'll get it back in 6 months, I promise 😕

Anyway, ATI OCs less (the max I've seen on the core is maybe a 65MHz OC; memory does better than that, but not by much, maybe 100MHz max), but the 6800GT is slightly slower in most DX games to start out with. OGL is usually Nvidia's forte, so you really should choose based on what you want to play more: HL2 and the Source engine games, or the Doom3/Quake 4 engine games. Quake4 gets a good boost on NV cards and HL2 gets a good boost on ATI cards.
 
People who say Far Cry HDR and SC:CT SM3.0+HDR are not playable... you should have a look at them on my comp. Best graphics ever!! And very playable 99.9% of the time (Far Cry HDR is sometimes overdone and you don't see crap, but that's once or twice in the whole game and just for a minute). Splinter Cell CT rocks with everything maxed at 1280x1024 8xAF... you have to see it moving to know what I mean...
 
Have you seen benchmarks for SC:CT running in SM3.0??? Without HDR and some other features exclusive to SM3.0 (which do NOT kill your system like you would think) (and I am leaving displacement mapping/parallax mapping in... that's one of the features left on), the 6xxx cards see a good boost in performance... and you know what?? It looks BETTER than 1.1 (or even 2.0, if they had it). Have you seen displacement mapping in use? The rocks in a pathway REALLY (I mean physically) come OUT of the ground... can SM2.0 do that with normal/bump mapping? NO.

Last time I checked, you can get a 6800GT for around $350 (correct me if I am wrong; newegg won't load up)... now, for better OCing, better drivers, and SM3.0 support, I would think you would go for the 6800GT.
 
Originally posted by: MedicalEntropy
ATI didn't have good driver support and still don't have the best OpenGL drivers; I think they're still lacking Windows 64-bit drivers even though the OS is now final.


Haha, NO XP 64-bit drivers!? Are you nuts? The XP 64-bit drivers are great, rock solid; I don't have a single problem with them. And that's considering I've been running Windows XP 64-bit without a 32-bit OS for, like, say... 6 months maybe?
 
I don't know what's up with all the B&M over ATI's drivers. I mean, when I can still find enhanced drivers for a Rage Pro 8MB on their website, I think that's pretty good, especially when they add extra unsupported stuff and it works flawlessly without hogging resources.
 
Originally posted by: crazySOB297
Originally posted by: MedicalEntropy
ATI didn't have good driver support and still don't have the best OpenGL drivers; I think they're still lacking Windows 64-bit drivers even though the OS is now final.


Haha, NO XP 64-bit drivers!? Are you nuts? The XP 64-bit drivers are great, rock solid; I don't have a single problem with them. And that's considering I've been running Windows XP 64-bit without a 32-bit OS for, like, say... 6 months maybe?

I'm sure it's performing great on those 64-bit games you're running.
 
Out of curiosity:

Can someone post an article or whatever on SM 3.0's improvement over SM 2.0 and how it affects performance, and a comparison in games that use SM 3.0 against the same game on an equivalent card that only supports SM 2.0?

Have you seen benchmarks for SC:CT running in SM3.0???
No, I haven't. Could you link?

Sorry, but I don't usually concentrate on SM 3.0 or even pay attention to it.
It's just that it seems loads of people keep "supporting" the SM 3.0 thing without really having looked it up.

Gosh, people are still talking about ATI's driver issues? Those were fixed YEARS ago.

But that is WHY you don't experience the horrors ATI users traditionally have.

What horrors?
I use ATI fine. Why? Because I don't use the bloated CCC piece of crap. In fact, they have driver packages that come with only the driver and the old control panel.
And most wouldn't use the CCC (at least the more informed users) unless it's for OverDrive 3.
 
I find that hard to believe, since I am part of the biggest/best Windows beta IRC channel, and only the last 2 builds at most before final were any good to run as a daily primary OS. I use Nvidia anyhow, but I do see between 200-1000 peeps and their feedback, and ATI had bad beta drivers for 64-bit Windows.
 
Originally posted by: BouZouki
The cards are equal. x800XL is $100 less. The 6800GT offers SM3.

SM3 (including HDR), DirectX 9C hardware compatibility (the newest standard), and SLI if you are looking at PCIE cards.


why all the silence about SLI?

If you are going PCIE X800XL, you would be a retard to pretend that the LESS-THAN-$100 price difference justifies just writing off the 6800GT.

I'd still rather have the GT if I was buying THIS late, this gen. Might as well get your DX9C hardware.


Then you must consider that the Geforce6 will age better, with its DX9C support... as well as its SLI support. People are going to actually want old GF6 cards, because if they're cheap enough, people will upgrade their existing card with another one.

Either way, it makes the card more desirable in the future, and more desirable now as SLI is the fastest gaming configuration possible.




DX9C, equal DirectX performance, SLI capability and VASTLY superior Doom3 engine performance (the most advanced, best-looking engine on the market, which will power QUAKE4, future Call of Duty games and more) make it an easy choice for the educated.

$100 is squat (and that is exaggerated, more like $70 or less difference) when you consider how the GF6 is equal in DX speed, and superior in nearly every other way to the X800s.
 
Originally posted by: housecat
Originally posted by: BouZouki
The cards are equal. x800XL is $100 less. The 6800GT offers SM3.

SM3 (including HDR), DirectX 9C hardware compatibility (the newest standard), and SLI if you are looking at PCIE cards.


why all the silence about SLI?

If you are going PCIE X800XL, you would be a retard to pretend that the LESS-THAN-$100 price difference justifies just writing off the 6800GT.

I'd still rather have the GT if I was buying THIS late, this gen. Might as well get your DX9C hardware.


Then you must consider that the Geforce6 will age better, with its DX9C support... as well as its SLI support. People are going to actually want old GF6 cards, because if they're cheap enough, people will upgrade their existing card with another one.

Either way, it makes the card more desirable in the future, and more desirable now as SLI is the fastest gaming configuration possible.




DX9C, equal DirectX performance, SLI capability and VASTLY superior Doom3 engine performance (the most advanced, best-looking engine on the market, which will power QUAKE4, future Call of Duty games and more) make it an easy choice for the educated.

$100 is squat (and that is exaggerated, more like $70 or less difference) when you consider how the GF6 is equal in DX speed, and superior in nearly every other way to the X800s.


Fanboy post.

$100 might not mean a lot to you. If you looked at things differently, you would realize it means a lot more than you think to some people.

As far as I know the x8xx series is compatible with DX9. DX9 has been on ATI cards since the 9700.

I wouldn't get the 6800GT just for SLI support, because SLI doesn't mean future-proof; it means having the fastest setup right now.

You keep talking about Doom 3; anyone can say the same for the Source engine.

Considering the PCI-E 6800GTs go for $380 minimum, $400 average, and the X800XL goes for $280 minimum and $300 average, I would say the $100 difference is accurate.
 
Originally posted by: housecat
Originally posted by: Lonyo
Originally posted by: hans030390
Don't get the X800... get the 6800. I can't stress how important Shader Model 3.0 is on the 6xxx cards. Even the X850 will probably run late 2005/2006 games like crap because it only has Shader Model 2.0. SM3.0 will be needed if you want to play future games.

So go for the 6800GT.
A better card will be needed to play 2005/2006 games with high settings.
Of course the X850 will run 2006 games pretty poorly if they are demanding games; pretty much any card will (except for the new top-end cards of the time).

Originally posted by: Acanthus
Originally posted by: ScrewFace
Definitely go with the Radeon X800XL. It has better drivers that are updated every month, unlike nVidia drivers, which are updated only every six months. I've heard the X800XL can be overclocked to X800XT speeds and beyond. It's also better in Direct3D, which most games are made for, and, of course, it's cheaper! 🙂

Ati with better drivers... lol.
Dude, welcome to last year (or the year before?)
ATi drivers are absolutely fine, so STFU unless you know what you are talking about.
Although I have only been using an ATi card for a year or so with absolutely no issues, so maybe I am wrong.


Well I DO know what I am talking about.. and yes, you are wrong.

Welcome to RIGHT NOW.
NV drivers are and have ALWAYS been superior.

Since you won't believe me or him, read THIS:
http://www.elitebastards.com/page.php?pageid=9412&head=1&comments=1

It's about the efficiency of NV drivers over ATI's... I could go into the MAJOR differences between ATI and NV, in which NV comes out on top EASILY.

But there are some hints as to the detail that NVIDIA puts into their driver development, and ATI does NOT.

So, as you said to him, STFU if you don't know what you're talking about.
You may have had no issues, but NV drivers are better. Sorry.

And the reason you had no issues is because ATI has had virtually the same graphics card since the 9700 Pro. It's easy to create killer drivers when you never fundamentally change the card; they basically added a single string in those drivers to recognize the X800s.
Nothing wrong with this, but we'll see how they do on a new architecture, won't we?
But that is WHY you don't experience the horrors ATI users traditionally have.

Not because ATI drivers parallel NV's; they don't... read the article if you'd like to know the PLETHORA of other reasons NV drivers blow ATI's out of the game... I'd be happy to oblige.

ATI drivers are so horrible that my 8xxx-9xxx series cards all worked just as flawlessly as my GF2/3/6 cards.

If ATI drivers are killer, what does that make NV drivers? Must be pretty good to blow "killer" drivers out of the water.

:roll:
 
Originally posted by: housecat
Originally posted by: BouZouki
The cards are equal. x800XL is $100 less. The 6800GT offers SM3.

SM3 (including HDR), DirectX 9C hardware compatibility (the newest standard), and SLI if you are looking at PCIE cards.


why all the silence about SLI?

If you are going PCIE X800XL, you would be a retard to pretend that the LESS-THAN-$100 price difference justifies just writing off the 6800GT.

I'd still rather have the GT if I was buying THIS late, this gen. Might as well get your DX9C hardware.


Then you must consider that the Geforce6 will age better, with its DX9C support... as well as its SLI support. People are going to actually want old GF6 cards, because if they're cheap enough, people will upgrade their existing card with another one.

Either way, it makes the card more desirable in the future, and more desirable now as SLI is the fastest gaming configuration possible.




DX9C, equal DirectX performance, SLI capability and VASTLY superior Doom3 engine performance (the most advanced, best-looking engine on the market, which will power QUAKE4, future Call of Duty games and more) make it an easy choice for the educated.

$100 is squat (and that is exaggerated, more like $70 or less difference) when you consider how the GF6 is equal in DX speed, and superior in nearly every other way to the X800s.


Typical fanboy posts from the nvidia camp. They will eat up whatever PR trash they can.

SLI = WASTE OF MONEY. If the OP had the financial ability to spend as much as he/she wants, we wouldn't even have this topic; he would just pick a 6800 Ultra or X850XT PE. SLI costs a lot more money than just another video card. You have to get an SLI mobo that costs $50+ more than a typical non-SLI NF4. You have to get a PSU that feeds enough power for two hungry 6800GT/Ultras. I think there have been 100+ threads about the pros/cons of an SLI setup. Just think about this: in about 6-8 months, we will have the R520 and a new NV chip. Will that single-card solution trump your SLI rig? Think about that. Is it worth investing $800+ in video cards, plus an extra $100 for an upgraded mobo/PSU, when you can buy an X800XL/6800GT for $300/$400 and save up the other $600/$500 (plus the sale of the old card) for the next-generation card? Please don't tell me people will actually shell out money for another 6800GT for SLI when next-gen cards come out, because that would be foolish. Why spend another $250-300 (new-card value) on a 6800GT PCI-E when you can sell your old card for $200-$250 (old-card value) and add $250-300 for a next-gen card? Wouldn't that be more "future-proofing" by your logic?

DX9C on the 6800GT is SLOW. Even if it runs acceptably in the current batch of games, what about the next batch? Who's to say the 6800GT will be able to run those games perfectly? Remember, Far Cry (HDR) and SC (DX9C) are the first-gen games using the new standard. As game developers get used to the new shaders and tools, they will create more complex graphics and details. You cannot assume that your 6800GT will work fine with the rest of the crop of games.

And the $100 issue. I don't know about you, but $100 is A LOT of money for me. With $100 in savings, you can invest in your CPU (instead of getting a 3000+ you can go for a 3500+), add another 1GB of RAM, go from value RAM to TCCD RAM, buy a better case... ENDLESS POSSIBILITIES. For me, I would rather spend that $100 on hardware with features I can be sure to use rather than on "future-proofing". BTW, it is still about a $100 difference. There were lots of deals where you could get an X800XL for below MSRP; I got mine at $245 shipped from ATI. There are plenty of deals where you can get an X800XL for $280-$290. The cheapest 6800GT PCI-E right now on Pricewatch is $388. I don't think there are any deals on the PCI-E version of the 6800GT.

X800XL > 6800GT in DX speed. The difference is minor, but the X800XL does win against the 6800GT in a lot of DX games. I can easily counter Doom3/NV with HL2/ATI, if you want to go that route.

Here is the REAL ISSUE: what games will the OP play this year? If the OP wants to play mainly OpenGL games, he can go for the 6800GT. If the OP wants to play mainly DX9 games, he should go X800XL.
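The cost argument in the post above is easy to make concrete. A quick sketch using only the figures quoted in the thread (two ~$400 6800GTs plus ~$100 of SLI mobo/PSU premium, versus a ~$300 X800XL resold later for $200-250 with $250-300 added toward a next-gen card); the midpoints are my assumption:

```python
# Back-of-the-envelope comparison of the two upgrade paths argued above.
# All dollar figures come from the post itself (2005 street prices) and are
# rough midpoints, not quotes from any retailer.

# Path A: two 6800GTs in SLI now, plus the SLI mobo/PSU premium.
sli_total = 2 * 400 + 100         # up-front cost of a last-gen SLI rig

# Path B: one X800XL now; later sell it and top up for a next-gen card.
x800xl = 300                      # X800XL street price
resale = 225                      # midpoint of the $200-250 resale estimate
top_up = 275                      # midpoint of the $250-300 added later
next_gen_price = resale + top_up  # what the next-gen card would cost
path_b_net = x800xl + top_up      # net cash out of pocket over both buys

print(f"SLI path, up front:      ${sli_total}")
print(f"Single-card path, net:   ${path_b_net} (buys a ${next_gen_price} next-gen card)")
```

On these assumed numbers the SLI route costs roughly $900 for last-generation hardware, while the buy-sell-upgrade route nets out around $575 and ends with a next-generation card, which is the post's point.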

 
People who even attempt to diss ATI drivers make me go wow. I own both ATI and Nvidia cards and neither has any problems. I don't see any advantage of one over the other. We all know ATI is well over their driver problems, which were fixed years ago. The only advantage I see from ATI is their monthly updates. The only disadvantage I see with Nvidia is the many beta drivers that come out so frequently. Neither bothers me much...


Fanboys :disgust:

 
I would like ATi drivers much better if there weren't the XG nvidia drivers. The XGs are great, but stock nvidia drivers don't seem to be that good. In officially released drivers, ATi has the lead, IMO.
 
Originally posted by: McArra
People who say Far Cry HDR and SC:CT SM3.0+HDR are not playable... you should have a look at them on my comp. Best graphics ever!! And very playable 99.9% of the time (Far Cry HDR is sometimes overdone and you don't see crap, but that's once or twice in the whole game and just for a minute). Splinter Cell CT rocks with everything maxed at 1280x1024 8xAF... you have to see it moving to know what I mean...

Me and McArra are on the same wavelength.

I played the entire Far Cry, beginning to end, with HDR on (12x10 res and 4xAF), all other settings maxed out.

I've just finished Splinter Cell: Chaos Theory. 12x10, 16xAF, SM3, HDR, parallax mapping, soft shadows.

OK, FC slowed to a few nasty FPS on odd occasions, but most of the time it was, for me, completely playable. I've got screens with FRAPS to show you.

SC: well, after turning FRAPS off, I enjoyed FLUID gameplay throughout the game. Amazing visuals.
 
Leave it to nVidiot fanbois to ignore the OP's question by shoving SM3.0 and HDR down our throats... once again! You don't need either of them to get amazing visuals; 2.0++ gives you enough, if you ask me. And those who say any 2.0 card will run future games like crap are just fooling themselves: if the X800XL can't run game XYZ well, then I'll bet you neither will a 6800GT. I like both ATi and nVidia; in fact, NV was my favorite maker for years until I learned to broaden my horizons. And being an owner of both cards, I can say that I usually got more OC out of an nVidia, but ATi has been very good for me at nice high clock speeds so far.
 
Originally posted by: BrokenVisage
Leave it to nVidiot fanbois to ignore the OP's question by shoving SM3.0 and HDR down our throats... once again! You don't need either of them to get amazing visuals; 2.0++ gives you enough, if you ask me. And those who say any 2.0 card will run future games like crap are just fooling themselves: if the X800XL can't run game XYZ well, then I'll bet you neither will a 6800GT. I like both ATi and nVidia; in fact, NV was my favorite maker for years until I learned to broaden my horizons. And being an owner of both cards, I can say that I usually got more OC out of an nVidia, but ATi has been very good for me at nice high clock speeds so far.


BrokenVisage has it down right. Here's the main problem with a lot of gamers today: they're trying to buy hardware in anticipation of next-gen games. You DO NOT EVER buy hardware in anticipation of next-generation games/software. It just doesn't make sense. If you're buying a current-gen video card to be "future-proof" against games coming out in 1-2 years, you're doing yourself a disservice. No one can be 100% sure how current hardware will run that software. What you SHOULD base your hardware purchase on is how well a current-gen video card runs current video games. Even if games 1-2 years later use the same engine as Doom3/HL2, it doesn't guarantee that your card will run them perfectly. Like I said before, when developers get more comfortable with the tools they use, they will create much more massive and complex graphics compared to the ones showcased by Doom3/HL2.
 
BrokenVisage, YOU'RE missing the point: who is mad enough today to buy an old-tech DX9b card with no PS 3.0 support? Stop making excuses, because even Ubisoft fooked ATI by only allowing a menu choice of PS 1.0 or 3.0 in the game, not any 2.0.

It's April 2005, nvidia's tech is 1 year old as of launch, and it's still more future-proof than ATI's X800 or even the new rip-off X850 (which you're said to have to pay a $50 premium for if you want AGP).

Anyone saying DX9C or PS 3.0 doesn't matter is a noob, because it's starting to show its head and will more and more in 2005.

It's a bit late in the day for old-tech purchases, so go on E-GAY and get a 2nd-hand card and save your money for a next-gen card due approximately April-May, or keep your money till then and get one of the current-gen cards at a dropped-down price.
 