A quick eye opener...

Apr 17, 2003
Originally posted by: Rollo
Originally posted by: GeneralGrievous
Most if not all have said the X800XT is the winner by a slim margin. Then I hear the GT is fully "unlocked", meaning above 6850 levels for us industrious overclockers.
The XT beats the GT by more than a slim margin.


You can make a GF4 beat a X800XT by looking at lower IQ, so I'd say it doesn't really matter what any X800 is doing now.
As you can turn off brilinear on nVidia cards and get the best possible IQ, and you have to look at the shimmering and distortion with ATI, I guess it comes down to what you prefer:
Perfect IQ at great framerates, or flawed IQ at better framerates.
(unless you consider the impact of the Far Cry and Painkiller patches to SM3, and the other games upcoming, and Doom3, and linux, and mpeg encoding)
Oh well. The X800XT is starting to seem like a latter-day V5-6K to me. Old tech at a new price.


I'm more concerned with current performance, not with when a patch is released to support SM3.0, whose merits are still yet to be proven.

All I know is the X800XT owns in Far Cry, and I can't notice the IQ difference.
 

fsstrike

Senior member
Feb 5, 2004
Whether nVidia has better IQ or not, it doesn't really matter. No one will notice any quality differences in any games unless they are seriously and closely analysed. Until we see PS3.0 and SM3.0 come into play, you will not see the true benefits of the 6800. You should also take into account what games you think you might be playing in the future. If you're like me, you'll be waiting for HL2 and Counter-Strike 2. If so, you'll probably want an X800. If you're not like me and you're waiting for Doom3, then you should probably get a 6800. Either way, it's too early to tell, as none of the games mentioned are available for benchmarking.
 
Apr 14, 2004
You can make a GF4 beat a X800XT by looking at lower IQ, so I'd say it doesn't really matter what any X800 is doing now.
As you can turn off brilinear on nVidia cards and get the best possible IQ, and you have to look at the shimmering and distortion with ATI, I guess it comes down to what you prefer:
Perfect IQ at great framerates, or flawed IQ at better framerates.
Wow. Bit of an exaggeration here.

I guess you are right though. If trilinear filtering is worth $100 and a 5% or so drop in performance, then I guess that's what you prefer.
 

SickBeast

Lifer
Jul 21, 2000
Originally posted by: BugsBunny1078
They are like the Lamborghinis of video cards. One per day is all they can make. I'm sure there will be twice as many 6800GTs, though, so do not worry.
Actually, the latest reports have said nVidia is going to sell two 6800U chips for every 6800GT chip. So, that doesn't say "widespread availability" to me.

All of you guys waiting for the 6800GT are foolhardy, IMO. I mean, sure, the thing will more than likely be the best value for the money in the high-end sector when it's released. But you can be guaranteed that ATi will come up with something to compete if it starts to get out of hand for them.

In any event, there is absolutely no sign of the 6800GT, yet its competitor, the X800PRO, can be found if you look hard enough. By the time the 6800GT is trickling out and you're beating people to get your hands on one, the X800PRO will be in widespread availability. On top of that, I have a feeling the X800PRO will be cheaper due to supply/demand and price gouging.

As for Rollo, I'm quite used to his far-from-objective comments concerning ATi. As soon as ATi releases something more powerful than what nVidia can put on store shelves, he gets his panties in a bunch and starts posting FUD.

I would like to see a GF4 score over 12,000 in 3DMark03, by the way. The thing doesn't even support DirectX 9!!!

And as for image quality, Rollo, why did nobody notice it on the 9600 cards for over a year??? It's interesting to hear the nVidia people suddenly complaining about image quality. AFAIK, Far Cry still doesn't run on the correct PS2.0 paths on an NV30 card, and maybe not even on the 6800U.
 

sodcha0s

Golden Member
Jan 7, 2001
Originally posted by: Rollo
Originally posted by: GeneralGrievous
Most if not all have said the X800XT is the winner by a slim margin. Then I hear the GT is fully "unlocked", meaning above 6850 levels for us industrious overclockers.
The XT beats the GT by more than a slim margin.


You can make a GF4 beat a X800XT by looking at lower IQ, so I'd say it doesn't really matter what any X800 is doing now.
As you can turn off brilinear on nVidia cards and get the best possible IQ, and you have to look at the shimmering and distortion with ATI, I guess it comes down to what you prefer:
Perfect IQ at great framerates, or flawed IQ at better framerates.
(unless you consider the impact of the Far Cry and Painkiller patches to SM3, and the other games upcoming, and Doom3, and linux, and mpeg encoding)
Oh well. The X800XT is starting to seem like a latter-day V5-6K to me. Old tech at a new price.

So, I guess you actually own an X800 and have seen this for yourself? Or are you like me and have just seen various articles on the subject? I assume the latter. Why don't you take some of your own advice here, Rollo, and keep your mouth shut until you know firsthand what you are talking about. From what I have seen, the IQ loss is very minimal, hardly the "shimmering, distortion and flawed IQ" you speak of. ATI should have been straightforward about their use of brilinear instead of claiming full trilinear support. It was wrong and stupid of them to do so. They would have been caught a lot sooner had they not done such a good job implementing it, however.

Latter-day V5-6K, huh? Sounds more like the 6800. It was "released" over a month ago. How many people own one? Where can I buy one right now? By the time these things hit the market en masse, ATI may be releasing a refresh.
 

nRollo

Banned
Jan 11, 2002
LOL, truth hurts, eh Sodchaos? Why don't you check out Cainam's first-hand experiences and his posts about how he can see the moving mip bands?

Last I checked, owning a card has never been a criterion for debating it.

I have a 5800 Ultra, you wouldn't believe how many non owners tell me about it.

No, the X800XT is the better comparison. See, some people have bought 6800Us, and many more will this month.
The comparison is that the V5 6K was a brute-force old-tech card, just like the X800s: two-year-old tech with more pipes, re-sold. Time will prove me right that they're selling them at fire-sale prices at first release, because they know they're done once the 6800s hit the market and people have a choice between old and new tech, brilinear or not.
 

Compddd

Golden Member
Jul 5, 2000
Originally posted by: Ackmed
Rollo and his ignorance is always good for a few laughs.

Yeah, I think he might be worse than Gstanford, if that is even possible. All that comes out of this guy's mouth is mindless garbage :(
 

ZombieJesus

Member
Feb 12, 2004
Unless you play games with a giant 4x magnifying glass in front of your monitor, you cannot see the adaptive trilinear filtering ATI or nVidia use; it's impossible. So I prefer these adaptive algorithms, as they offer better performance without impacting IQ. At any rate, trilinear filtering is nothing; people should focus more on anisotropic filtering. nVidia and ATI use unique AF algorithms: ATI transitions between mip levels much more smoothly than nVidia, while nVidia keeps the mips truer for longer than ATI and then makes a slightly more abrupt change to the lower-level mip. Which is superior? I cannot say.
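For anyone unsure what "adaptive trilinear" (a.k.a. brilinear) actually means here, a toy Python sketch of the idea: full trilinear blends between two adjacent mip levels across the whole fractional-LOD range, while a brilinear scheme stays on the nearer mip for most of that range and only blends inside a narrow window around the transition. The 0.25 window width is an illustrative assumption, not either vendor's actual value, and real hardware does this per-texel in fixed function.

```python
def trilinear_weight(lod_fraction: float) -> float:
    """Full trilinear: blend linearly between mip levels across the
    entire fractional LOD range [0, 1)."""
    return lod_fraction

def brilinear_weight(lod_fraction: float, window: float = 0.25) -> float:
    """'Brilinear': use the nearer mip alone (weight 0 or 1) for most of
    the range, blending only inside a narrow window around the boundary.
    Fewer blended samples means fewer texture fetches, at the cost of a
    more abrupt mip transition."""
    lo = 0.5 - window / 2
    hi = 0.5 + window / 2
    if lod_fraction < lo:
        return 0.0  # higher-detail mip only
    if lod_fraction > hi:
        return 1.0  # lower-detail mip only
    return (lod_fraction - lo) / window  # blend only near the boundary

# The final texel in either scheme would be:
#   texel = (1 - w) * sample(mip_n) + w * sample(mip_n + 1)
```

With a weight of exactly 0 or 1 over most of the range, the second texture fetch can be skipped entirely, which is where the performance gain (and the visible mip band, if the window is too narrow) comes from.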
 

SickBeast

Lifer
Jul 21, 2000
Originally posted by: ZombieJesus
Unless you play games with a giant 4x magnifying glass in front of your monitor, you cannot see the adaptive trilinear filtering ATI or nVidia use; it's impossible.
Actually, the nVidia one is very easy to point out without magnification. The screenshots I've seen make it clear as day when it's using brilinear; the AF doesn't "clear up" distant textures along the ground nearly as well and leaves a lot of stuff blurry.

ATi's method, OTOH, looks identical to a trilinear image, to my eyes anyway. Apparently someone has found a way to exploit their method, but it appears nearly flawless. As I said, it went unnoticed for over a year; how bad can it be?

For example, if you had a mole on your face for over a year, don't you think you would have noticed it?
 
Apr 17, 2003
Originally posted by: Rollo
LOL truth hurts, eh Sodchaos? Why don't you check out Cainam's first hand experiences and his posts of how he can see the moving mip bands?

Last I checked, owning a card has never been a criterion for debating it.

I have a 5800 Ultra, you wouldn't believe how many non owners tell me about it.

No, the X800XT is the better comparison. See, some people have bought 6800Us, and many more will this month.
The comparison is that the V5 6K was a brute-force old-tech card, just like the X800s: two-year-old tech with more pipes, re-sold. Time will prove me right that they're selling them at fire-sale prices at first release, because they know they're done once the 6800s hit the market and people have a choice between old and new tech, brilinear or not.

If anything, I think the old-tech argument works against nVidia. I mean, R300 has been around for a while, and its souped-up version is still managing to edge out a victory over the super-duper new-tech NV40.
 

sodcha0s

Golden Member
Jan 7, 2001
LOL truth hurts, eh Sodchaos?

Never hurt me, but I was thinking maybe it's hurting you, seeing how this turbocharged old-tech 12-pipe card is right there with your glorious 6800. See, it used to be that nVidia had new cards that came out sooner and were actually much more powerful than the competition, although they always lacked in IQ, and as I recall their drivers were absolutely horrible, especially in D3D (see Half-Life). You are the one making false accusations about IQ and ATI, thereby showing your true nVidia fanboi status. Yes, I know you've owned almost every card produced by both companies; who cares.

BTW, I am not a banner-carrying crusader for ATI, or any card maker for that matter. I own ATI simply because for the past year or so they have been the better choice, IMO. When I'm ready to upgrade again, I'll buy what I think is the best choice, no matter who produces it.
 

James3shin

Diamond Member
Apr 5, 2004
No "opinion": the ATI cards have in fact been better the last couple of years. But anyway, this gen, things seem to be a lot more competitive.
 

nRollo

Banned
Jan 11, 2002
Originally posted by: Compddd
Originally posted by: Ackmed
Rollo and his ignorance is always good for a few laughs.

Yeah, I think he might be worse than Gstanford, if that is even possible. All that comes out of this guy's mouth is mindless garbage :(


That's a pretty strong slur. Can you quote any of this "mindless garbage"? Or are you just angry that I've been disappointed in ATI's latest GPU because they added no new features and implemented a filtering technique that lowers IQ in some situations, can't be disabled, and was hidden?
 

nRollo

Banned
Jan 11, 2002
Originally posted by: fsstrike
Originally posted by: Compddd
Don't pay attention to Rollo shady. He's just a troll :/

I second that.

I think we saw your true colors here, Fsstrike:
http://forums.anandtech.com/messageview.cfm?catid=31&threadid=1314558&highlight_key=y&keyword1=Rollo
Actually you're wrong, buddy. When the GeForce3 first came out I paid about $800 Canadian for it. I bought all the best sh!t that was available, and my PC still runs great considering it's 3 years old. GeForce3, P4 1.5, 512MB PC-800. I bet when I got all that stuff, you were running a TNT2, with a P4 450 and 128MB SD 133. My next PC will also be top of the line, and will last me another 3 years. Someone who buys computers wisely will have something that will last them a long time. I bet your f-cking sh!t computers don't even last 8 months before you need to buy a major upgrade.

I love the way you pointed out that the X800 only beats the 6800 slightly. Well, beating it slightly still means it's better. Whether it's by an inch or a mile, winning is winning. Are you too stupid to understand that concept?

Go do some math homework or something; I bet you're some stupid retard with glasses who thinks they can out-nerd other people. You're just a moron who believes marketing gimmicks! Have fun with lower resolutions, lower AA, lower AF, and lower FPS, LMAO! Then try to tell me that your 6800 is better! lol.

You're that kid who trolled so badly everyone told me to ignore you and wait for you to go away. You tried to call me poor and say your 3-year-old POS computer is teh roxor.

Yeah, I'm a troll, alright. BTW, how are your games running on your secretary computer? LOL
 

James3shin

Diamond Member
Apr 5, 2004
Let's either get things back on track (discussion concerning the 6800GT) or just let this one die...
 

imported_obsidian

Senior member
May 4, 2004
Originally posted by: Rollo
Originally posted by: Zebo
It's gonna be real hard for me to buy this card unless it overclocks massively. Every reviewer out there is telling me the X800XT is the best video card, and now it can be had for less than the mid-range GT. :(

Those willing to actually wait a month should have no problem buying GTs for less than X800XTs. Beyond that, I don't think I could even buy an X800 anything now with the fine cheater drivers enabling the texture shimmer and distortion feature.

If ATI would figure out how to disable their brilinear, they'd be a lot easier to recommend.
Someone already figured it out:
http://www.theinquirer.net/?article=16473

If you don't mind taking a 20% performance hit, go ahead and do it.

BTW, basing what cards will sell for on pre-sale prices is always a bad idea. Wait until both companies' cards are actually out, THEN compare prices.