X800 XL


masshass81

Senior member
Sep 4, 2004
627
0
0
Err, got my retail ATI X800 XL from shop.ati.com for $283 shipped with no tax charged (ships from TX). Use the coupon code 1010.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
That passively cooled X800 XL from Gigabyte looks awesome. If I were in the market for a new card right now it would be awfully tempting.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
First of all, please learn to quote, dfloyd; this is very hard to read.

Originally posted by: dfloyd
You posted a link in reply to me stating there was no banding. I read the link and I do not see it saying anything about banding. It does claim that his cheekbones are slightly more defined and 3.0 is a little brighter, but nowhere do I see anything about banding, so may I ask what the link is supposed to show? At least in the context of what you quoted and replied to.
I was replying to your assertion that SC is the best-looking game ever in SM1.1, and showing it's better in SM3.
Here's your banding link:
http://www.beyond3d.com/misc/benchguide/index.php?p=sc3
visually, some of the banding seen with SM1.1 disappears for SM3.0 due to the higher precision used in the shaders. However, the latter also offers better performance and the option to apply additional graphical effects.
Tough call there: no banding and more effects in fewer passes.
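For what it's worth, the banding itself is just a precision effect. Here's a tiny sketch, in Python with a made-up level count rather than anything from the actual SC3 shaders, of how the same smooth lighting ramp turns into visible steps when the shader math runs at low precision, which is roughly the difference between the SM1.1 and SM3.0 paths:

# Hypothetical illustration: quantize a smooth 0..1 lighting ramp the way a
# low-precision shader path might. SM1.1-era pixel shaders ran at very low
# precision, so smooth gradients collapse into a few discrete levels, which
# shows up on screen as banding; SM3.0 mandates much higher precision, so
# the ramp stays smooth. The level count below is made up purely for clarity
# (real SM1.1 hardware had more levels, but the effect is the same).
LOW_LEVELS = 4  # stand-in for a very low-precision shader format

for i in range(11):
    x = i / 10.0                                        # smooth input gradient
    low = int(x * (LOW_LEVELS - 1)) / (LOW_LEVELS - 1)  # quantized path
    print(f"input {x:.2f} -> low precision {low:.3f}, high precision {x:.3f}")

Run it and you'll see several adjacent inputs that should differ slightly all print the same low-precision value; on screen, that flat run of identical values is exactly what a band is.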

The above quote I find most amusing of all. Obviously you have not read your box. My box (and that of most everyone I have talked to who has read theirs) claims that my 6800 NU has WMV9 hardware acceleration.
Everyone on this board does, or should, know that nVidia made some mistakes with the video acceleration feature advertising of the nV40. The fact that it never worked with WMV9, and that they state on their website that it never will, tells me they made a mistake with advertising, not that it's "broken". For something to be "broken" it had to have once "worked". I can link you to definitions of "broken" if it will help.

"I've tested those videos on what probably amounts to more cards than you've owned in your life over the last eight months or so. (ten different cards if you're wondering)"

This quote I find highly amusing as well. You know what happens when you make assumptions? I do. :p
What happens when I make assumptions? You use a 6800NU as your primary card; the odds that you have tested more cards on WMV9 than me in the last 8 months are low. I own 2 PCIE 6600GTs, 2 PCIE 6800GTs, and an AGP 6800NU. In that time frame I've also owned/tested an X800XTPE, an AGP 6800GT, a GF3, and 2 PCIE 6800NUs. Sorry, the guy that's using a 6800NU just isn't likely to have owned many video cards in the last 8 months; it's a midrange card. (no offense)

I have been in the computer field since 1985 and have been in tech support for the past twelve years.
How is that relevant? What sort of tech support do you do where you're doing hands-on testing of WMV9 clips on modern video cards?

I do believe I have worked with or owned quite a few video cards in that time (I would guesstimate in the hundreds), so your ten somehow seems very small by comparison and quite pointless to the argument.
Really? Please list the cards that you've owned in the last eight months that make my ten seem like a small number, like I did. Also, what kind of tech support do you do where you are testing WMV9 hands-on like I have?

Are you trying to make yourself seem like some kind of expert? Owning and handling a video card does not make one an expert. Heck, I am far from even dreaming of being an expert on video cards. I am just stating what I see and don't see.
Dude, you can't even get your own computer to run the video smoothly; from my experience all you have to do is set up the computer properly. BTW, I do primarily online tech support as well. Big deal. Me logging on to networks, fixing database problems, writing reports, etc. doesn't have much to do with WMV9, does it?


"SLI has nothing to do with this;however, my dual 6800GTs run SIL1080 smooth as glass. (thanks to my 3800+ no doubt)"

It may not have anything to do with it. I never said it did.
You said maybe it's my SLI. SLI only works on video games AFAIK.


"My computers run it in the high 70s-low 80s."

That's very impressive; I wish I knew how, as a 1080i HD DX9 video on my 6800NU never dreamed of coming close.
That's about average from what I've seen.

"Sorry- but this is wrong as well. While the 6600GTs have the working hardware WMV acceleration, they are waiting on MS like the PCIE 6800s. The only people who've seen this working are at review sites that nVidia gave updated beta versions of WMP10, AFAIK.

You shouldn't post so much misinformation about this stuff, people might believe you?"

Ok, how is my statement false? In your first sentence you agree with me by saying "While the 6600 GTs have the working hardware WMV acceleration..." I stop at that point because that is what I said. You agreed with me and told me I was wrong in the same sentence. What are you talking about? You're not making any sense. You go on to say the only people who have seen it working are reviewers. So how does that make my statement false? Reviewers are real people who have seen and used this feature. So again, how is my statement false? I never said every person on earth can use it now; I said the feature is working in the 6600 GT cards, and apparently, because reviewers can use it, it must be working. And even then I still don't think you are correct. I have seen people post who have 6600 GTs and claim that they had less CPU utilization with the HW acceleration turned on. So I am sorry, but I will believe the many people I have seen claim that.
This is all spin. Neither you nor I can see it working; we have no access to the version of WMP10 it works with. You stated it's working on 6600GTs; it's not, anywhere outside NDA'd review sites.
 

dfloyd

Senior member
Nov 7, 2000
978
0
0
Rollo,

Your selective quoting and leaving only partial info is amusing. For instance, when you quote me as saying I never said SLI was an advantage, you left out the part where I said "maybe" that was what was helping (maybe does not mean it is, or it is not). If you read the whole statement you will see what I said. Also, I went on to say that SLI should not make any difference according to Nvidia's website. So again, you disagree with me by agreeing with me, a very amusing tactic.

My primary video card is an X800XL. In the last six months I have used and tested a 9700 Pro, 9800 Pro, GeForce 3, 9600 XT, X800 XL, 6800 NU, 6600 GT, and Voodoo III (gotta love UltraHLE), plus several onboard cards, and installed and set up quite a few more. So another assumption I see, another incorrect one though.

My statement about my computer history was based on the fact that you claimed you had used and tested more cards in the last eight months than I had probably used in my lifetime. The post about my history had nothing to do with WMV9; it had to do with showing that you were severely incorrect in assuming I had very little experience with video cards.

And assuming I do not know how to set up my PC correctly is another silly assumption. I obviously do, as every other game I run runs perfectly, smoothly, and fast. Every other program I use runs perfectly, smoothly, and fast. I get similar test scores to the majority of online testing sites, including Anand's. You assuming that the WMV9 video running poorly on my system has anything to do with my ability to set up and configure my system is entirely incorrect.

I guess your next statement sums up this discussion perfectly.

"Everyone on this board does, or should, know that nVidia made some mistakes with the video acceleration feature advertising of the nV40. The fact that it never worked with WMV9, and they state on their website that it never will, tells me they made a mistake with advertising, not that it's "broken". For something to be "broken" it had to have once "worked". I can link you to definitions of "broken" if it will help?"

You claim everyone knows that Nvidia made some mistakes with their advertising. This was not true, friend. For a long time Nvidia stood by their claim that their 6800 cards had the acceleration that was printed on the boxes. With that in mind one would expect the feature to work, yet when one tried it, it did not. So since Nvidia claimed it was there, but it did not work, the logical deduction would be that the feature is broken. Nvidia even tried to hide the fact by planning to add something totally different in drivers to make it look like the lie was correct (PureVideo, Nvidia's current claim of acceleration; they are still claiming it). And it was a lie: if you tell someone something that is not true, it is called a lie. Even if it's done in marketing, it is still a lie. So no, friend, I do not need to see the definition of broken.

And the fact that the ability is working or not working on a 6600GT is not spin. I have seen several claims that people notice a difference in CPU utilization with their 6600 GT cards when running these videos. They have made this claim now, not as in beta, not as in reviewer. And even if they are incorrect, the fact that it works for reviewers does mean that it works. Your argument could just as easily be called spin to justify your stance on the topic (which I still can't figure out what it is). All I have said is that the feature does not work with 6800 cards, be it broken, be it a lied-about feature, be it marketing BS. The fact is my claim is correct and you have done nothing but substantiate this. So again I ask you, what are you trying to discuss?

Edit:

This is the video that I used to do the testing on the 6800 NU, for reference purposes. And I do not know the actual FPS; I just know it was not smooth at all.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Dfloyd:

The fact of the matter remains: my computers run the video you link to, flawlessly on one, and with 2 split-second hitches near the end on the other.
The fact that they have worked the same for me on multiple computers with all variants of the nV4X GPU doesn't fill me with hope for your PCs.

Good luck getting them to work properly. You might want to get another person's perspective on your system?

Your 3200+ has enough power to run SIL without hardware acceleration, so you must have other issues. Perhaps if you made a thread, others could help you with it?
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
I'd get the cheapest XL you can find, like the $249 Connect3D, and find out if the noise is acceptable to you first. Then, if not, purchase a nearly silent solution like Arctic Cooling's or Zalman's stuff for $20. You still come out $60 ahead this way and just as quiet.

No way should you even think about a GT. Hot, hot, hot, power-sucking!!! It requires massive cooling, e.g. loud fans or water, to be stable.
 

dfloyd

Senior member
Nov 7, 2000
978
0
0
Actually, Rollo, the video runs perfectly smooth with my X800XL card. May I ask what system you were running that had the slight hiccups?

Oh, and thanks for the advice on having friends come over and help me out; many do come over and play around with my system, but most ask me for advice after doing so. I am pretty diligent about tweaking and cleaning up my system.

Also here are the facts:

An A64 3200+, 1GB Corsair XMS Xtreme CL2 memory, and a BFG 6800OC ran the video poorly.

An A64 3200+, 1GB GeIL CL2.5 memory, and an X800XL runs the video flawlessly.

So to the point of this thread, I still suggest the X800XL over the 6800 GT, not just for video but for gaming as well. I have played on a 6800 GT and it did not feel as smooth as my X800XL, even if the benchmarks in some games were faster. (I played Far Cry and Riddick quite a bit and my Radeon X800XL felt much smoother.)
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: dfloyd
Actually, Rollo, the video runs perfectly smooth with my X800XL card. May I ask what system you were running that had the slight hiccups?


:laugh: This may earn you a petition.
 

dfloyd

Senior member
Nov 7, 2000
978
0
0
A petition?

Did I do something wrong?

Edit In Reply to the Guy Above and Below:


Yes yes a petition, down with the Ati fanboy... Yes ban me.

Wait what am I saying ;)
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Just a poor joke, as this very subject has gotten quite heated, leading to at least one holiday as I remember. :beer:
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: dfloyd
Actually, Rollo, the video runs perfectly smooth with my X800XL card. May I ask what system you were running that had the slight hiccups?

Oh, and thanks for the advice on having friends come over and help me out; many do come over and play around with my system, but most ask me for advice after doing so. I am pretty diligent about tweaking and cleaning up my system.

Also here are the facts:

An A64 3200+, 1GB Corsair XMS Xtreme CL2 memory, and a BFG 6800OC ran the video poorly.

An A64 3200+, 1GB GeIL CL2.5 memory, and an X800XL runs the video flawlessly.

So to the point of this thread, I still suggest the X800XL over the 6800 GT, not just for video but for gaming as well. I have played on a 6800 GT and it did not feel as smooth as my X800XL, even if the benchmarks in some games were faster. (I played Far Cry and Riddick quite a bit and my Radeon X800XL felt much smoother.)


The system with the two hiccups is an A64 3000+/1 GB Corsair Value Select/reference model 6800NU. There are 2 spots near the end where it hitches; the rest is like watching TV, totally smooth.

Your X800XL has no WMV9 acceleration either, last I heard; their method is a workaround using the shaders that requires an unreleased MS .dll as well. So there should be no advantage of one over the other.

There's no way your X800XL could have felt smoother at Riddick, the card is 33% slower at that game, a huge deficit:
The truth about X800XLs and Riddick

X800s basically suck in OGL. I used to have an X800XTPE until I sold it to Keysplayr2003 back in March; although it benched about the same in Doom3 as a 6800GT, it was a lot jerkier. When Doom3 came out, the 6800NU I was using was as fast as an X800XT PE, which tells you a bit about just how much they had to "optimize" the drivers for D3.

Why should anyone care? Quake4, Prey, and Return To Wolfenstein2 (three HUGE games) are Doom3 engine based and will likely also run like ass if people take your X800XL advice.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: Rollo
Why should anyone care? Quake4, Prey, and Return To Wolfenstein2 (three HUGE games) are Doom3 engine based and will likely also run like ass if people take your X800XL advice.

Run like ass? I don't think so. While not nearly as fast as a 6800GT in Doom3, its performance is still fairly close. And in OVERALL benchmarks, the X800XL is just as fast as the 6800GT.

So unless your gaming world revolves around nothing but Doom3 based games, get the X800XL and save yourself a good chunk of $.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Okay, I just ran that Step Into Liquid video. On my system, equipped with a 6800 GT, there was lots of chop throughout. I think Rollo is suffering from nvidism again. I'm not sure how it would run on the X-series since I no longer have my X800 Pro, but I sure can't say much for the performance I experienced. Check out my system; it's fairly powerful and I don't think there is much left to optimize.
 

wesman6

Senior member
Jan 5, 2001
541
0
0
I noticed a huge drop in frame rate if I had Fast Writes disabled on my rig playing that movie. Once Fast Writes was re-enabled, I got smooth as glass frames.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Creig
Originally posted by: Rollo
Why should anyone care? Quake4, Prey, and Return To Wolfenstein2 (three HUGE games) are Doom3 engine based and will likely also run like ass if people take your X800XL advice.

Run like ass? I don't think so. While not nearly as fast as a 6800GT in Doom3, its performance is still fairly close. And in OVERALL benchmarks, the X800XL is just as fast as the 6800GT.

So unless your gaming world revolves around nothing but Doom3 based games, get the X800XL and save yourself a good chunk of $.

The thing is, even the benchmarks don't tell the story on Doom3 games and X800 cards.

By "run like ass" I mean "run like ass". Like I said, by the time ATI had gone through a few driver tweaks to get the Doom3 performance benchmark as high as a 6800GT on an X800XT PE, the actual gameplay had gotten "jerky". All I can think is their "optimizations" made for higher highs but some lower lows, with the result being a higher average but more pauses.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: lavaheadache
Okay, I just ran that Step Into Liquid video. On my system, equipped with a 6800 GT, there was lots of chop throughout. I think Rollo is suffering from nvidism again. I'm not sure how it would run on the X-series since I no longer have my X800 Pro, but I sure can't say much for the performance I experienced. Check out my system; it's fairly powerful and I don't think there is much left to optimize.

Meh. If you live in WI come over, watch it, and hang your head in shame.
 

dfloyd

Senior member
Nov 7, 2000
978
0
0
Rollo, please do me one favor.

Stop assuming you know what games run or feel like on my system. You can't know. You may base it off of numbers you have seen elsewhere or off of your own experiences, but that does not change the fact that those experiences are not with my system or setup.

If you want to come to my house in North Carolina, though, I will show you the games and show you that they run very smoothly.

With that said, you are completely wrong, friend. My Radeon X800XL feels WAY smoother in Riddick than my 6800NU did. Maybe you did not have your system configured properly, and that is why it felt less smooth to you. OpenGL games do not run like crap on ATI cards. Maybe they do not run as fast as some Nvidia cards in some games, but they still don't run like crap. (Please pay attention to the benchmarks below and you will learn something very interesting.)

You Claimed:

"There's no way your X800XL could have felt smoother at Riddick, the card is 33% slower at that game, a huge deficit"

Notice that part about YOUR? As in my card. Remember the part where you're talking about my 6800 NU? Notice that part? I don't think you did, as you did not send me benchmarks that compared my X800XL to a 6800 NU. You sent me benchmarks comparing my X800XL to a 6800 Ultra or GT. Guess what, I never said my X800XL ran smoother than either of those cards in that game. I said it ran way smoother than my 6800 NU. Want to know something funny? According to the exact same website that you quoted benchmarks from (you know, the benchmarks that had nothing to do with the card I was comparing my X800XL to?), I was right and you are wrong.

These Benchmarks! clearly show that my ATI Radeon X800XL is running faster than an even better Nvidia 6800 than the one I had. (The one I had only had 128 MB of RAM.)

So if ATI is running it like crap, then I guess the 6800 on that site is running it like a bad case of diarrhea. (Sorry for the colorful analogy, but it appeared to fit the heart of the discussion.) These benchmarks prove that I was telling the truth and that my X800XL does feel a heck of a lot smoother than my 6800 NU did. Imagine that :p

"I noticed a huge drop in frame rate if I had Fast Writes disabled on my rig playing that movie. Once Fast Writes was re-enabled, I got smooth as glass frames."

wesman6, hmm, that may explain why my Nvidia ran so badly on that video, as I know I had fast writes turned off. I do not have the system anymore, so I cannot verify this. And that would also explain why my X800XL is running it smoothly; it's a PCI-E card. (Whether the video card is helping any or not, if it's a Fast Writes issue then that would explain the slowdown.)

One way to verify if this is the case: Lavaheadache, can you please turn on fast writes (or let us know if you had them on when it was running slowly) and let us know what your performance is like? Would be much appreciated, friend.
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
Just to let you know, SM3 has a huge visual improvement... all next-gen games use SM3 and wouldn't look nearly as good with SM2... if that says anything... oh, and displacement mapping looks to be really awesome (only SM3).

And SC:CT has more than just cheekbone upgrades with SM3... someone had comparisons somewhere... I dunno where...

 

Pr0d1gy

Diamond Member
Jan 30, 2005
7,774
0
76
Originally posted by: baddog121390
For $320 I would get a 6800GT instead. A bit more performance + SM3.0.

You really shouldn't just flat-out lie like this. The only game a 6800GT beats the X800XL on is Doom3. Pixel Shader 3.0 (or SM3 or whatever) is overrated.

Unless you plan on going SLI or playing nothing but Doom3, get an X800XL.
 

dfloyd

Senior member
Nov 7, 2000
978
0
0
Hmm,

"Just to let you know, SM3 has a huge visual improvement"

Sorry, friend, I don't believe you. It may have a huge visual improvement some day, but I have seen nothing at all in any current game to prove this is a valid statement at this moment or even in the near future. In fact, in the game most likely to show the differences right now, Splinter Cell: Chaos Theory, the actual difference is very slight.

Image Quality Difference SM3 vs SM1.1

As you notice, the difference in the above comparison is anything but huge. In fact, in some cases the difference is not even noticeable at all.

Quote from FiringSquad:

"Looking at the images, it's hard to spot any differences in the water between the two shader modes."

Hard to spot any difference at all is not a HUGE difference, friend.

And as far as your claim that it does make a huge difference, could you please find that site? I have looked through quite a few review sites myself, and on every one of them I can hardly find any difference, in current games, in image quality with SM 3.0 vs 2.0. Yes, there are a few slight differences, like more defined cheekbones and greater intensity of the lighting, but these are very slight.

The biggest improvement I have personally seen was with parallax mapping, which can be seen at this site. But even then, in a decently paced game it's still hardly noticeable.

Even SM3 soft shadows are not all that impressive, as they're mostly only noticeable at very low resolution; sort of pointless unless you want to take your high-end video card and run it at very low res.

So please show me how there is a HUGE improvement over SM2. I am sure there is an improvement, but huge is definitely not the correct adjective to describe it.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Pr0d1gy
Originally posted by: baddog121390
For $320 I would get a 6800GT instead. A bit more performance + SM3.0.

You really shouldn't just flat-out lie like this. The only game a 6800GT beats the X800XL on is Doom3. Pixel Shader 3.0 (or SM3 or whatever) is overrated.

Unless you plan on going SLI or playing nothing but Doom3, get an X800XL.

Dude, you are the one who's "flat-out lying".

Anyone who can read can see the X800XL loses a lot more benches than it wins, at LOTS of games, not just Doom3. :roll:

The truth

is out there

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: dfloyd
Rollo, please do me one favor.

Stop assuming you know what games run or feel like on my system. You can't know. You may base it off of numbers you have seen elsewhere or off of your own experiences, but that does not change the fact that those experiences are not with my system or setup.
Oh. So you have the magic system where games running 33% slower seem faster?

With that said, you are completely wrong, friend. My Radeon X800XL feels WAY smoother in Riddick than my 6800NU did.
You said in your quote above the X800XL is faster than a GT, not an NU? Now you're switching to the NU?

Maybe you did not have your system configured properly, and that is why it felt less smooth to you.
It's easy enough to see my system is configured correctly; compare my benches to Tom's, Anand's, or FiringSquad's, they're similar.

OpenGL games do not run like crap on ATI cards. Maybe they do not run as fast as some Nvidia cards in some games, but they still don't run like crap. (Please pay attention to the benchmarks below and you will learn something very interesting.)
Doom3 ran poorly on my XTPE.

You Claimed:

"There's no way your X800XL could have felt smoother at Riddick, the card is 33% slower at that game, a huge deficit"

Notice that part about YOUR? As in my card. Remember the part where you're talking about my 6800 NU? Notice that part? I don't think you did, as you did not send me benchmarks that compared my X800XL to a 6800 NU. You sent me benchmarks comparing my X800XL to a 6800 Ultra or GT. Guess what, I never said my X800XL ran smoother than either of those cards in that game.
WTH? Here's what you said:
Originally posted by: dfloyd
So to the point of this thread, I still suggest the X800XL over the 6800 GT, not just for video but for gaming as well. I have played on a 6800 GT and it did not feel as smooth as my X800XL, even if the benchmarks in some games were faster. (I played Far Cry and Riddick quite a bit and my Radeon X800XL felt much smoother.)


I said it ran way smoother than my 6800 NU. Want to know something funny? According to the exact same website that you quoted benchmarks from (you know, the benchmarks that had nothing to do with the card I was comparing my X800XL to?), I was right and you are wrong.
Look at what you said again, chief: you don't mention 6800NUs, only GTs?

These Benchmarks! clearly show that my ATI Radeon X800XL is running faster than an even better Nvidia 6800 than the one I had. (The one I had only had 128 MB of RAM.)

"I noticed a huge drop in frame rate if I had Fast Writes disabled on my rig playing that movie. Once Fast Writes was re-enabled, I got smooth as glass frames."

wesman6, hmm, that may explain why my Nvidia ran so badly on that video, as I know I had fast writes turned off. I do not have the system anymore, so I cannot verify this. And that would also explain why my X800XL is running it smoothly; it's a PCI-E card. (Whether the video card is helping any or not, if it's a Fast Writes issue then that would explain the slowdown.)

One way to verify if this is the case: Lavaheadache, can you please turn on fast writes (or let us know if you had them on when it was running slowly) and let us know what your performance is like? Would be much appreciated, friend.

It would be nice if a few other 6800 owners with 3000+ or better CPUs could post on this. I doubt my system is abnormal in any way, and it runs SIL fine (albeit with 80% CPU usage).
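If anyone wants to post numbers that are actually comparable, here's a quick sketch of how you could log CPU usage while the clip plays. It's Python with the third-party psutil package; it just samples system-wide CPU load rather than hooking the player, and the 30-second window is an arbitrary choice:

import psutil  # third-party package: pip install psutil

# Sample system-wide CPU usage once a second while the clip is playing,
# then report the average and the peak so different rigs can be compared.
samples = []
for _ in range(30):  # roughly 30 seconds of playback
    samples.append(psutil.cpu_percent(interval=1.0))

print(f"avg CPU: {sum(samples) / len(samples):.1f}%  peak: {max(samples):.1f}%")

Start the video, run the script, and post the two numbers along with your card and driver version.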
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Originally posted by: dfloyd

One way to verify if this is the case: Lavaheadache, can you please turn on fast writes (or let us know if you had them on when it was running slowly) and let us know what your performance is like? Would be much appreciated, friend.

Yup, I have fast writes enabled. Any other suggestions as to why I get chop?

**edit: I just ran it with fast writes disabled and the chop greatly increased; very undesirable to watch.
 

dfloyd

Senior member
Nov 7, 2000
978
0
0
"You said in your quote above the X800XL is faster than a GT, not a NU? Now you're switching to NU?"

Not switching to the NU; I did not realize which of my posts above you were replying to. In one I am speaking about the 6800 NU and in another about the 6800 GT. I did not own the GT; I played it on a friend's computer (he brought it over for a LAN party) and yes, my X800XL felt smoother at 1280x1024 w/ 4x AA and 8x AF than his 6800 GT card. We went back and forth testing it on that and several other games.

Why do you claim I am switching to the NU? I have spoken of it many times above. You in fact quoted me speaking about it when calling it a midrange card. Remember?

"There's no way your X800XL could have felt smoother at Riddick, the card is 33% slower at that game, a huge deficit"

No way, huh? At the resolution and settings I was playing, the 6800 GT gets ten more fps on the benchmark that is being run from that website. If you call ten fps a huge amount, then I have an uber Voodoo I card that I will sell you. This thing pulls twenty fps in GLQuake; imagine that, if ten fps is huge, then twenty fps is absolutely mind-blowing. But back to the point, friend: you are incorrect. There are many reasons the 6800 GT could have felt less smooth than my X800XL; you're just not considering any of them.
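To put rough numbers on why "33% slower" and "ten more fps" can both be technically true: the same fps gap is a different percentage depending on the baseline. A quick sketch with purely hypothetical figures, not taken from any actual Riddick bench:

# Purely hypothetical fps figures to show how the baseline changes the story;
# none of these numbers come from the actual Riddick benchmarks.
def deficit(fast_fps: float, slow_fps: float) -> float:
    """Percentage deficit of the slower card relative to the faster one."""
    return (fast_fps - slow_fps) / fast_fps * 100

print(f"{deficit(30, 20):.1f}%")   # a 10 fps gap at low fps  -> 33.3% deficit
print(f"{deficit(40, 30):.1f}%")   # the same 10 fps gap      -> 25.0% deficit
print(f"{deficit(100, 90):.1f}%")  # 10 fps gap at high fps   -> 10.0% deficit

So a fixed ten-fps difference only reads as "huge" when the absolute frame rates are low to begin with.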

"Look at what you said again, chief- you don't mention 6800NUs? Only GTs?"

Again you are wrong, friend. I mention, and you even quote me mentioning, a 6800 NU. Please read what I posted. Heck, don't read what I posted, read what you posted. Sound good, Hoss?

And to the point of this discussion, I still say choose the X800XL over the 6800 GT, for many reasons. (Oh, and me stating that is not saying that I never discussed the 6800 NU. I am stating that as that is what the person asked about. I only brought in my experience with my 6800 NU as it was the 6800 series card I had the most experience with.)

You ask why I would choose the X800XL? Price, features, the games I play, sharper 2D text than my BFG 6800OC had, etc. Right now my X800XL runs more of the games that I prefer to play faster than the 6800 GT does. If Doom III is your game, right now, as Quake IV and others are quite a ways off, then by all means choose the 6800 GT. But if you like games like World of Warcraft (so much better on my X800XL it's not even funny; my 6800 NU was glitching so badly in D3D that I had to switch it to OpenGL, which ran much slower overall and did not look as good. In fact it was so bad it felt like my 9600 XT ran this game far better than my 6800 NU did), Far Cry, UT 2004, Half-Life 2, or Counter-Strike: Source, then the X800XL is the better card for those choices.

But anyways, I feel you and I have butted heads long enough; we have definitely pushed the limits of what this discussion was originally about. Can we call a truce and sum up our thoughts in one line? Mine is below:

I suggest the X800XL

Sound good, Rollo?

Edit: Hmm, well Lava, I don't have my 6800 anymore so I can't test things out for you. I just know when I had it, the video was horribly choppy and there did not appear to be anything I could do to fix it. I did not try to enable fast writes at the time. The most confusing thing is you should not have to do anything for it to run smoothly. The fact that it does not almost appears as if the 6800 is somehow causing it to be worse, at least on some 6800s. (The reason I say this is that the card, according to Nvidia's website now, is not supposed to be accelerating anything, so it honestly should be the CPU making the difference.) One thing I did notice is that your 6800 is a BFG. My 6800 was a BFG. Maybe there is a problem running these videos with the BFG cards. I will keep reading and looking around to see if I can help find a fix for your problem, friend.


Edit Edit:

May have found a culprit. I am using an nForce4 motherboard. I just now updated my drivers using the nForce_6.53_WinXP2K_english.exe package. Guess what, the video is now running like a slide show again. Something that the package installed went from running it perfectly smooth to very poor fps again. I have not changed anything else; the only reason I decided to update was so I could use the onboard sound for my headphone mic and my SoundBlaster Audigy 2 ZS for gaming. Now either it's sound-related, as in the nForce package overwrote one of my SoundBlaster files, or it's related to something else. But it was running perfectly smooth before. Anyone seen this problem before or know a fix offhand? Going to play with it some myself, so hopefully I will figure it out.