masshass81
Senior member
- Sep 4, 2004
Err, I got my retail ATI X800 XL from shop.ati.com for $283 shipped and no tax charged (ships from TX). Use the coupon code 1010
Originally posted by: dfloyd
I was replying to your assertion that SC is the best-looking game ever in SM1.1 and showing that it's better in SM3. You posted a link in reply to me stating there was no banding. I read the link and I do not see it saying anything about banding; it does claim that his cheekbones are slightly more defined and that 3.0 is a little brighter, but nowhere do I see anything about banding. So may I ask what the link is supposed to show, at least in the context of what you quoted and replied to?
Here's your banding link:
http://www.beyond3d.com/misc/benchguide/index.php?p=sc3
Tough call there- no banding and more effects in fewer passes. From the article: "Visually, some of the banding seen with SM1.1 disappears for SM3.0 due to the higher precision used in the shaders. However, the latter also offers better performance and the option to apply additional graphical effects."
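The precision point from that article can be shown with a toy model (my own sketch, not from the thread or Beyond3D): banding is just quantization of a smooth gradient into too few representable shades, and higher shader precision shrinks the steps until the eye can't see them. The `quantize` and `distinct_shades` helpers below are made-up names for illustration.

```python
# Illustrative sketch of why lower shader precision produces visible
# banding: a smooth 0..1 gradient snapped to a small number of
# representable levels turns into discrete steps.

def quantize(value, levels):
    """Snap a 0..1 value to the nearest of `levels` representable steps."""
    return round(value * (levels - 1)) / (levels - 1)

def distinct_shades(width, levels):
    """Count how many distinct output shades a `width`-pixel gradient keeps."""
    gradient = [x / (width - 1) for x in range(width)]
    return len({quantize(v, levels) for v in gradient})

# A 256-pixel gradient run through coarse-precision math collapses into
# far fewer shades (visible bands); with fine precision every pixel
# keeps its own shade and the gradient looks smooth.
low = distinct_shades(256, 64)     # coarse precision: 64 bands
high = distinct_shades(256, 4096)  # fine precision: 256 distinct shades
print(low, high)
```

With only 64 representable levels, neighboring pixels share shades and the gradient shows hard edges; at 4096 levels every one of the 256 pixels gets its own value, which is the "banding disappears" effect the article describes.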
Everyone on this board does, or should, know that nVidia made some mistakes with the advertising of the nV40's video acceleration feature. The fact that it never worked with WMV9, and that they state on their website that it never will, tells me they made a mistake with advertising, not that it's "broken". For something to be "broken" it had to have once "worked". I can link you to definitions of "broken" if it will help?

This above quote I find most amusing of all. Obviously you have not read your box. My box (and most everyone I have talked to who has read theirs) claims that my 6800 NU has WMV9 hardware acceleration.
What happens when I make assumptions? You use a 6800NU as your primary card. The odds that you have tested more cards on WMV9 than me in the last 8 months are low. I own 2 PCIE 6600GTs, 2 PCIE 6800GTs, and an AGP 6800NU. In that time frame I've also owned/tested an X800XTPE, an AGP 6800GT, a GF3, and 2 PCIE 6800NUs. Sorry, the guy that's using a 6800NU just isn't likely to have owned many video cards in the last 8 months- it's a midrange card. (no offense)

"I've tested those videos on what probably amounts to more cards than you've owned in your life over the last eight months or so. (ten different cards if you're wondering)"
This quote I find highly amusing as well. You know what happens when you make assumptions? I do.
How is that relevant? What sort of tech support do you do where you're doing hands-on testing of WMV9 clips on modern video cards?

I have been in the computer field since 1985 and have been in tech support for the past twelve years.
Really? Please list the cards that you've owned in the last eight months that make my ten seem like a small number, like I did. Also, what do you tech support that has you testing WMV9 hands-on like I have?

I do believe I have worked with or owned quite a few video cards in that time (I would guesstimate in the hundreds). So your ten somehow seems very small by comparison and quite pointless to the argument.
Dude, you can't even get your own computer to run the video smoothly- from my experience all you have to do is set up the computer properly. BTW- I do primarily online tech support as well. Big deal. Me logging on to networks, fixing database problems, writing reports, etc. doesn't have much to do with WMV9, does it?

Are you trying to make yourself seem like some kind of expert? Owning and handling a video card does not make one an expert. Heck, I am far from even dreaming of being an expert on video cards. I am just stating what I see and don't see.
"SLI has nothing to do with this; however, my dual 6800GTs run SIL1080 smooth as glass. (thanks to my 3800+ no doubt)"
You said maybe it's my SLI- SLI only works on video games AFAIK.

It may not have anything to do with it. I never said it did.
That's about average from what I've seen.

"My computers run it in the high 70s-low 80s."
That's very impressive. I wish I knew how, as a 1080i HD WMV9 video on my 6800NU never dreamed of coming close.
This is all spin. You or I can't see it working; we have no access to the version of WMP10 it works with. You stated it's working on 6600GTs, and it's not working anywhere outside NDA review sites.

"Sorry- but this is wrong as well. While the 6600GTs have the working hardware WMV acceleration, they are waiting on MS like the PCIE 6800s. The only people who've seen this working are at review sites that nVidia gave updated beta versions of WMP10, AFAIK. You shouldn't post so much misinformation about this stuff, people might believe you?"
OK, how is my statement false? In your first sentence you agree with me by saying "While the 6600GTs have the working hardware WMV acceleration....." I stop at that point because that is what I said. You agreed with me and told me I was wrong in the same sentence. What are you talking about? You're not making any sense.

You go on to say the only people who have seen it working are reviewers. So how does that make my statement false? Reviewers are real people who have seen and used this feature. So again, how is my statement false? I never said every person on earth can use it now; I said the feature is working in the 6600GT cards, and since reviewers can use it, it must be working.

And even then I still don't think you are correct. I have seen people post who have 6600GTs and claim that they had less CPU utilization with the HW acceleration turned on. So I am sorry, but I will believe the many people I have seen claim that.
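The CPU-utilization comparisons mentioned in this thread were presumably done by watching Task Manager during playback. As a rough sketch of the same measurement on Linux (my own illustration, not what the posters used; the helper names are made up), one can sample `/proc/stat` twice and compute average load over the interval:

```python
# Rough sketch: measure average CPU utilization over an interval by
# reading /proc/stat (Linux), the way one might compare CPU load with
# WMV9 hardware acceleration on vs. off while a clip plays.
import time

def read_cpu_times():
    """Return (idle_time, total_time) in jiffies from the 'cpu' line of /proc/stat."""
    with open("/proc/stat") as f:
        fields = [int(x) for x in f.readline().split()[1:]]
    idle = fields[3] + fields[4]  # idle + iowait
    return idle, sum(fields)

def cpu_percent(interval=1.0):
    """Average CPU utilization (0-100) over `interval` seconds."""
    idle1, total1 = read_cpu_times()
    time.sleep(interval)
    idle2, total2 = read_cpu_times()
    busy = (total2 - total1) - (idle2 - idle1)
    return 100.0 * busy / max(total2 - total1, 1)

if __name__ == "__main__":
    # Start the video in your player, then run this to sample the load.
    print(f"CPU: {cpu_percent(1.0):.1f}%")
```

Sampling once with hardware acceleration enabled and once with it disabled, while the same clip plays, gives a less eyeball-dependent comparison than glancing at a utilization graph.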
Originally posted by: dfloyd
Actually, Rollo, the video runs perfectly smooth with my X800XL card. May I ask what system you were running when you had the slight hiccups?
Oh, and thanks for the advice on having friends come over and help me out; many do come over and play around with my system, but most ask me for advice after doing so. I am pretty diligent about tweaking and cleaning up my system.
Also here are the facts:
An A64 3200+, 1GB Corsair XMS Xtreme CL2 memory, and a BFG 6800 OC ran the video poorly.
An A64 3200+, 1GB GeIL CL2.5 memory, and an X800XL runs the video flawlessly.
So, to the point of this thread: I still suggest the X800XL over the 6800 GT. Not just for video, but for gaming as well. I have played on a 6800 GT and it did not feel as smooth as my X800XL, even if the benchmarks in some games were faster. (I played Far Cry and Riddick quite a bit, and my Radeon X800XL felt much smoother.)
Originally posted by: Rollo
Why should anyone care? Quake4, Prey, and Return To Wolfenstein2 (three HUGE games) are Doom3 engine based and will likely also run like ass if people take your X800XL advice.
Originally posted by: Creig
Originally posted by: Rollo
Why should anyone care? Quake4, Prey, and Return To Wolfenstein2 (three HUGE games) are Doom3 engine based and will likely also run like ass if people take your X800XL advice.
Run like ass? I don't think so. While not nearly as fast as a 6800GT in Doom3, its performance is still fairly close. And in OVERALL benchmarks, the X800XL is just as fast as the 6800GT.
So unless your gaming world revolves around nothing but Doom3 based games, get the X800XL and save yourself a good chunk of $.
Originally posted by: lavaheadache
Okay, I just ran that Step Into Liquid video. On my system, equipped with a 6800 GT, there was lots of chop throughout. I think Rollo is suffering from nvidism again. I'm not sure how it would run on the X-series since I no longer have my X800 Pro, but I sure can't say much for the performance I experienced. Check out my system; it's fairly powerful and I don't think there is much left to optimize.
Originally posted by: baddog121390
For $320 I would get a 6800GT instead. A bit more performance + SM3.0.
Originally posted by: Pr0d1gy
Originally posted by: baddog121390
For $320 I would get a 6800GT instead. A bit more performance + SM3.0.
You really shouldn't just flat-out lie like this. The only game a 6800GT beats the X800XL in is Doom3. Pixel Shader 3.0 (or SM3.0, or whatever) is overrated.
Unless you plan on going SLI or playing nothing but Doom3, get an X800XL.
Originally posted by: dfloyd
Oh. So you have the magic system where games running 33% slower seem faster?

Rollo, please do me one favor. Stop assuming you know what games run or feel like on my system. You can't know. You may base it off of numbers you have seen elsewhere, or off of your own experiences, but that does not change the fact that those experiences are not with my system or setup.
You said in your quote above the X800XL is faster than a GT, not an NU? Now you're switching to NUs?

With that said, you are completely wrong, friend. My Radeon X800XL feels WAY smoother in Riddick than my 6800NU did.
It's easy enough to see my system is configured correctly; compare my benches to Tom's, Anand's, or Firing Squad's- they're similar.

Maybe you did not have your system configured properly, and that is why it felt less smooth to you.
Doom3 ran poorly on my XTPE.

OpenGL games do not run like crap on ATI cards. Maybe they do not run as fast as some Nvidia cards in some games, but they still don't run like crap. (Please pay attention to the benchmarks below and you will learn something very interesting.)
WTH? Here's what you said:

You claimed:
"There's no way your X800XL could have felt smoother at Riddick, the card is 33% slower at that game, a huge deficit"
Notice that part about YOUR? As in my card. Remember the part where you're talking about my 6800 NU? Notice that part? I don't think you did, as you did not send me benchmarks that compared my X800XL to a 6800 NU. You sent me benchmarks comparing my X800XL to a 6800 Ultra or GT. Guess what: I never said my X800XL ran smoother than either of those cards in that game.
Originally posted by: dfloyd
So, to the point of this thread: I still suggest the X800XL over the 6800 GT. Not just for video, but for gaming as well. I have played on a 6800 GT and it did not feel as smooth as my X800XL, even if the benchmarks in some games were faster. (I played Far Cry and Riddick quite a bit, and my Radeon X800XL felt much smoother.)
Look at what you said again, chief- you don't mention 6800NUs, only GTs?

I said it ran way smoother than my 6800 NU. Want to know something funny? According to the exact same website that you quoted benchmarks from (you know, the benchmarks that had nothing to do with the card I was comparing my X800XL to?), I was right and you are wrong.
These benchmarks clearly show that my ATI Radeon X800XL is running faster than an even better Nvidia 6800 than the one I had. (The one I had only had 128 MB of RAM.)
"I noticed a huge drop in frame rate if I had Fast Writes disabled on my rig playing that movie. Once Fast Writes was re-enabled, I got smooth as glass frames."
Wiseman6, hmm, that may explain why my Nvidia ran so badly on that video, as I know I had Fast Writes turned off. I do not have the system anymore, so I cannot verify this. And that would also explain why my X800XL is running it smoothly: it's a PCI-E card. (Whether the video card is helping any or not, if it's a Fast Writes issue then that would explain the slowdown.)
One way to verify whether this is the case: Lavaheadache, can you please turn on Fast Writes (or let us know if you had them on when it was running slowly) and tell us what your performance is like? It would be much appreciated, friend.
It would be nice if a few other 6800 owners with 3000+ or faster CPUs could post on this- I doubt my system is abnormal in any way, and it runs SIL fine. (albeit with 80% CPU usage)
Originally posted by: dfloyd
One way to verify whether this is the case: Lavaheadache, can you please turn on Fast Writes (or let us know if you had them on when it was running slowly) and tell us what your performance is like? It would be much appreciated, friend.