
Firingsquad's NV40 vs R420 final evaluation

Page 4
Originally posted by: keysplayr2003
Originally posted by: RussianSensation
Originally posted by: Jeff7181
It's funny how when the 6800 outperforms the x800 in HL2... the difference in performance is insignificant... but in cases where the x800 outperforms the 6800, it "wins."

What's even MORE funny is that the X800 beats the 6800 in HL2.
And if you think ATI's advantage in Far Cry, and Nature Test in 3dmark03 are not significant wins...then....
Also, Nvidia has only won benchmarks in old OpenGL games, or old games period.

The fact of the matter is the ATI X800 XT is faster in Splinter Cell, Halo, Far Cry, the HL2 demo, STALKER, UT2K3/UT2K4, the Nature test in 3DMark03, and every other shader-intensive game out there. A win is a win by a mm or a mile; it doesn't matter. If you can afford a 6800 Ultra or X800 XT, I would never call either owner a loser, as both are magnificent cards. The reality is that even the X800 Pro and 6800NU/Pro or whatever else is out there (and possibly even the X800 SE) are still faster than the fastest previous-generation product. In some cases the performance advantage is up to 2x+ in current games. If that is not impressive, I don't know what is. Back in the day you were lucky if the GeForce 4 gave a 40% improvement over the GeForce 3 Ti 500, and no one complained about it. Even if Nvidia loses by 5-10% in some games, it still smokes anything "old" out there. But for pure performance, right now, it seems the X800 XT has a slight edge by simply being faster in shader-intensive games. But why even argue about cards that most people (even on this forum) will not own?

Link for the X800 whooping the 6800 in HL2, please?

http://www.xbitlabs.com/articles/video/display/r420-2_12.html

It doesn't whoop it, and neither does the 6800 whoop the X800. Of course, one card might whoop the other once AA/AF is added.
 
And RussianSensation's "true colors" show again:

A win is a win by a mm or a mile; it doesn't matter.

A win is not a win by a mm or a mile. Only a zealot would think so when evaluating video cards, cars, or anything else where a win with no discernible difference is pretty meaningless.

If the ATI part ran every single benchmark 1fps higher at min/ave/max on every single game with identical IQ, would it be the better card?
Apparently to Russian Sensation, it would.

I would say it's a draw, because you wouldn't know which card was in your computer without the counter on and the benchmark running.

Most of all, you're apparently conveniently forgetting that the drivers for the ATI part are mature; it's the same part they've been making for a year and a half.
The NV40 is an entirely new architecture, significantly more complex than ATI's part, that hasn't even seen the light of day yet.
If you think nVidia has never produced significant performance gains on new parts with driver revisions without loss of IQ, "you must be smoking something hallucinogenic".
I think the nVidia parts are close enough at the benches they lose to regain that ground with drivers, with the possible exception of "Tomb Raider: Angel of Sloppy Monotony".
 
Originally posted by: Acanthus
X800SE $300
X800P $400
X800XT $500
X800XT PE Unannounced

From what I've read, the X800 XT = X800 XT PE; there will be no "plain" X800 XT. Which I think is fine, cuz I don't think the performance will be any different...
 
Originally posted by: Rollo
And RussianSensation's "true colors" show again:

A win is a win by a mm or a mile; it doesn't matter.

Most of all, you're apparently conveniently forgetting that the drivers for the ATI part are mature; it's the same part they've been making for a year and a half.
The NV40 is an entirely new architecture, significantly more complex than ATI's part, that hasn't even seen the light of day yet.
If you think nVidia has never produced significant performance gains on new parts with driver revisions without loss of IQ, "you must be smoking something hallucinogenic".
I think the nVidia parts are close enough at the benches they lose to regain that ground with drivers, with the possible exception of "Tomb Raider: Angel of Sloppy Monotony".

And how did Nvidia make the cards faster with newer drivers? Was it new AF optimizations, something that's already included in the drivers now? Was it reduced-precision shaders, which already exist in the drivers from the NV3x optimizations and are also running on the NV40? If NV can increase driver speed while doing the same shader work as the R420 (which it doesn't do now), beyond the levels it reaches with all their optimizations in (including shader work), then that is an improvement.
 
Originally posted by: reever
Originally posted by: Rollo
And RussianSensation's "true colors" show again:

A win is a win by a mm or a mile; it doesn't matter.

Most of all, you're apparently conveniently forgetting that the drivers for the ATI part are mature; it's the same part they've been making for a year and a half.
The NV40 is an entirely new architecture, significantly more complex than ATI's part, that hasn't even seen the light of day yet.
If you think nVidia has never produced significant performance gains on new parts with driver revisions without loss of IQ, "you must be smoking something hallucinogenic".
I think the nVidia parts are close enough at the benches they lose to regain that ground with drivers, with the possible exception of "Tomb Raider: Angel of Sloppy Monotony".

And how did Nvidia make the cards faster with newer drivers? Was it new AF optimizations, something that's already included in the drivers now? Was it reduced-precision shaders, which already exist in the drivers from the NV3x optimizations and are also running on the NV40? If NV can increase driver speed while doing the same shader work as the R420 (which it doesn't do now), beyond the levels it reaches with all their optimizations in (including shader work), then that is an improvement.

Actually, many respectable sites preferred NV's IQ over ATI's. Others said it was close (majority of games considered). Just look at their 60.xx vs. their 61.11 drivers. It turns the tide in some settings/games.

On a side note, many of these games benchmarked at max settings run above what people would desire to play them at, on both cards. Shouldn't people be benching these things at higher resolutions if this comes up? I mean, what is the difference between 60 vs. 68 fps? I think we need to see the resolution pushed up to 2048 in many situations. Why? Because these cards already outdo themselves in many games at maxed settings, and in the future we will want higher resolutions. (Longevity.)
 
Originally posted by: keysplayr2003

Link for the X800 whooping the 6800 in HL2, please?

XbitLabs - the X800 XT comes out on top at 1600x1200, but too bad they didn't test with AA/AF enabled.
Digit-Life - the X800 XT beats the 6800U at all resolutions.
In this one you can all see the X800 Pro is so close to the 6800 Ultra that one can easily suspect it will beat the 6800GT as well. Of course, playing at 40 FPS is hardly a "victory," as that is rather slow anyway at the highest detail setting.

Of course Nvidia will play better at Doom 3 so it's 1:1 between the 2 for top games right now.
 
Originally posted by: Rollo
And RussianSensation's "true colors" show again:

A win is a win by a mm or a mile; it doesn't matter.

I said many times before that both ATI's and Nvidia's new cards are excellent. I'd take either card if I had the $$$. However, looking at HL2 beta performance, Far Cry, and the shader-intensive Nature test, ATI does not win by 1 FPS, but by a lot more, especially at high resolutions where you would have smoother gaming. Even look at the HardOCP review: it shows the levels of gameplay they were able to reach at reasonable framerates with various IQ settings, and they also chose the X800 XT as the most playable card. Of all the reviews on the net, I've seen at least five claim ATI's card is the better card, some that do not proclaim a victor because it is too close, and not one yet that gave the honour to Nvidia. So obviously I am not the only one with this opinion.
 
Originally posted by: Marsumane

On a side note, many of these games benchmarked at max settings run above what people would desire to play them at, on both cards. Shouldn't people be benching these things at higher resolutions if this comes up? I mean, what is the difference between 60 vs. 68 fps? I think we need to see the resolution pushed up to 2048 in many situations. Why? Because these cards already outdo themselves in many games at maxed settings, and in the future we will want higher resolutions. (Longevity.)

Yeah, you are right: when one video card displays 200 FPS vs. another that does 100 FPS at the highest detail settings, it's a moot point. I agree with you completely, even with the 60 vs. 68 comparison. Again, this round is too close to call a winner in the true sense of one card delivering much better performance. Of course the cards should be very closely matched; after all, they are competitors. But in some games ATI's card does beat Nvidia's slightly (and those just happen to be the more intensive games - that's all I was saying). Of course Nvidia also has its share of wins, but they come when ATI already gets 100 FPS at 1600x1200 4AA/8AF.
 
It's funny how when the 6800 outperforms the x800 in HL2... the difference in performance is insignificant... but in cases where the x800 outperforms the 6800, it "wins."

You're going to be very disappointed when the full version is released. Gabe Newell has already stated that the x800 line outperforms the 6800 by 40%.
 
If you believe that line from Gabe Newell represents actual performance numbers (using an internal benchmark to get those figures) rather than a pathetic marketing gimmick to lure the 'ignorant' into buying a product (since both Gabe and ATi share the same bed), then I have a nice bridge to sell you with a great view of Lady Liberty. I also believe his statement, clarified by Snowman on page 4, indicated a 40% gain on current-gen video cards like the 5950 and 9800XT, not those based on the NV40 or R420.

And besides, I highly doubt anyone willing to drop 500 dollars on a videocard is only concerned about one game.
 
So you're saying your purchase revolves around one title? Not only is that sad, but lame.

However, if that were the case, and considering that I have played both (which could have changed for better or worse since then), my money solely rests with id Software and Doom as being much more immersive than HL2.
 
Originally posted by: ChkSix
So you're saying your purchase revolves around one title? Not only is that sad, but lame.

Looking through your posts, I don't see you mention any title other than Doom 3 when you're talking about games you want to play that you would buy an Nvidia card for.
 
Of all the reviews on the net, I've seen at least five claim ATI's card is the better card, some that do not proclaim a victor because it is too close, and not one yet that gave the honour to Nvidia.
I agree. Every site I've been to, including this one, has given ATI an edge, whether slight or huge.
 
Looking at your posts, and considering that everyone else has gone back to the reviews (read Anandtech's for instance) and beat them to a pulp, it is moot. But if you must know:

STALKER (which in my opinion looks better than HL2)
Far Cry (once the patch and drivers get better, Nvidia will own this..but that is only my speculation)
Doom III (which will completely rewrite how FPS games are programmed; just take a look at id's history and its influence on the genre since Wolfenstein over 10 years ago)
etc

Couple that with the fact that when ATi wins, it wins by little, and that Nvidia now shares the better IQ and rendering (funny how true ATi fanboys won't acknowledge this), as even confirmed by Microsoft themselves (go read the THG article on both cards), and my investment is pretty much set in concrete.

And should I mention the superior technology and the anticipation of better drivers, all in Nvidia's favor? How much further do you think ATi's Catalysts can go with the R420, considering that they are already quite mature and haven't had to be completely rewritten since the 9700 Pro (as it shares the same architecture as the R300), other than implementing 3Dc (which, as of right now, only one game has confirmed it will use, and that game, to no surprise, is Valve's)? Should I go on?

If the performance right now is neck and neck, and the drivers from Nvidia still have to be overhauled because of the NV40, what do you think will happen once Nvidia's programmers feel comfortable with the 6800 series? A significant improvement, or something along the lines of what we have seen through early testing and beta drivers (which, ironically, even though they are beta, offer better IQ than the more mature Catalysts)? Just look at Nvidia's recent history with the NV3x alone and how far they were able to push it with drivers only, and you can answer that yourself.

Should I also mention overclocking? Since most do not use water, phase change, or dry ice for that matter in their home rigs, and seeing the less-than-significant MHz gains so far when attempting to overclock the ATi on air, how much headroom do you think most average people will have with the ATi cards? From the early examples and reviews done so far, not much at all.

What else would you like me to address? The fact that ATi performs equally, slightly better, or even slightly worse, all with over a 100 MHz speed advantage on its core and memory? The fact that some sites have given the crown or edge to ATi because it didn't have to do much to achieve this, like Nvidia did? I don't know, but please try to educate me.

For me in the end, I refuse to lay down 500 dollars on a card that one: is built on something already available with no support for what may come in the future, no matter how fast it may perform (unless that speed gain is completely killing the competitors offering), and two: is so far extremely stubborn in the overclocking department using conventional cooling methods.

Whether PS/VS 3.0 makes its debut today or next year, just knowing that my current purchase will be able to handle those enhancements when they do come (and they will, maybe next month or maybe next year), while possibly offering performance boosts, greatly solidifies my purchase. If I am going to spend that much, I would hope to skip a generation and not have to chase down the NV50/R500 models, as longevity is something I take quite seriously.
 
now Nvidia shares the better IQ and rendering
Can you show one example of this? The only thing I've read about superior nV IQ is Lars' fleeting comment that the 6800 offered better texture filtering. Otherwise, ATi still seems to have the better shader IQ in next-gen games (Far Cry and Lock On). This may and hopefully will change, of course, but I don't see how you can say nV has the IQ edge in anything but marketing specs ATM. (Yes, I'm still less than optimistic WRT nV, but I think the 6800 will change my mind.)

(Sorry for nitpicking, but I prefer evidence over assumption. We got burned enough assuming with the last gen, no need to relive it. I hope that one month after each card is in retail, we can see drivers and games patched enough to make a relatively conclusive final analysis on current game performance.)
 
Hey, guess what guys, it's a sales strategy called Marchitecture. It involves two companies optimizing their cards for two extremely popular games, one game per side. Translation: ATi has HL2 stickers all over its face, which means HL2 will run faster on ATi cards. NVIDIA has Doom 3 stickers all over its face, which means Doom 3 will run faster on NVIDIA cards. MARCHITECTURE. Learn the word well, because you'll be seeing it A LOT in the future.
 
Originally posted by: SilverTrine
It's funny how when the 6800 outperforms the x800 in HL2... the difference in performance is insignificant... but in cases where the x800 outperforms the 6800, it "wins."

You're going to be very disappointed when the full version is released. Gabe Newell has already stated that the x800 line outperforms the 6800 by 40%.

where did he state that?
last I saw he had stated this

"the developer of one of the most eagerly awaited 3D games confirmed that the X800 is the fastest chip on the market, at least in running its own code. "In terms of performance, it's pretty fast," said Gabe Newell, president of Valve Software, the developer of Half-Life II. "When I say 'pretty fast' I mean that it's 40 percent faster in internal testing, faster than any next-generation parts that are coming out. That's a huge advantage in developing a game.""

which was quoted before in this thread. And here's a link for you

He never directly stated "6800"... that's just you assuming... and you know what assuming does, don't you?
And besides... it's the fastest chip on the market RUNNING ITS OWN CODE... think about it...
 
Originally posted by: SilverTrine
It's funny how when the 6800 outperforms the x800 in HL2... the difference in performance is insignificant... but in cases where the x800 outperforms the 6800, it "wins."

You're going to be very disappointed when the full version is released. Gabe Newell has already stated that the x800 line outperforms the 6800 by 40%.



no he didn't!

read my post above.
 
Originally posted by: RussianSensation
Originally posted by: keysplayr2003

Link for the X800 whooping the 6800 in HL2, please?

XbitLabs - the X800 XT comes out on top at 1600x1200, but too bad they didn't test with AA/AF enabled.
Digit-Life - the X800 XT beats the 6800U at all resolutions.
In this one you can all see the X800 Pro is so close to the 6800 Ultra that one can easily suspect it will beat the 6800GT as well. Of course, playing at 40 FPS is hardly a "victory," as that is rather slow anyway at the highest detail setting.

Of course Nvidia will play better at Doom 3 so it's 1:1 between the 2 for top games right now.

Ok, as far as the Xbit graphs go, does less equal better? Because the 6800 is ahead in 4 out of 6 graphs, loses 1, and ties in another. What exactly am I looking at here? As far as I can see, the 6800U is ahead the majority of the time. Tell me if I am reading it wrong.

Your second link does not work.
 
Originally posted by: keysplayr2003
Well, here's a question for you guys. Out of these two cards, the X800 Pro and the 6800GT, assuming both were available right now for $400.00 each and performance were dead even in every game out there, would you get the 16-pipe card or the 12-pipe card? Performance isn't actually dead even, but if it were, I think this is a no-brainer.

Firing Squad found many reasons besides just pipes to favor the 6800GT. I agree with them.

Well, is there anyone here who would answer this question without debating every syllable?
Straightforward question. Which one?


Well, it's too early yet, because the prices haven't matured, but from the first indications I would go for the 6800GT, due to its advanced features and the pipelines.
I do have a contra-question though:

Where will that leave 6800U sales?
 