
I'm glad I got a 6800 GT instead of an X800 XL/XT after all

Hmmmm, look at that huge lead Nv has in Doom 3
http://www.techreport.com/reviews/2005q2/agp-radeons/index.x?pg=4
It's a whole 4 fps higher, lol.
It's worth noting that both of those are custom demos rather than the standard demo1.

I find it rather interesting that nVidia's massive lead vanishes when custom demos are used in Doom 3.

It'd be interesting to go off the rails in demo1 to see if nVidia are cheating like they were in 3DMark with Z buffer cheats and static clip planes.
 
Originally posted by: BFG10K
Hmmmm, look at that huge lead Nv has in Doom 3
http://www.techreport.com/reviews/2005q2/agp-radeons/index.x?pg=4
It's a whole 4 fps higher, lol.
It's worth noting that both of those are custom demos rather than the standard demo1.

I find it rather interesting that nVidia's massive lead vanishes when custom demos are used in Doom 3.

It'd be interesting to go off the rails in demo1 to see if nVidia are cheating like they were in 3DMark with Z buffer cheats and static clip planes.


yea, gotta watch ati.. they're pretty sneaky.. this would not be the first time they cheated in an id game to try and downplay how badly they performed compared to nvidia.. oh wait.. you meant nvidia? wtf... :roll:

 
Originally posted by: BFG10K
Hmmmm, look at that huge lead Nv has in Doom 3
http://www.techreport.com/reviews/2005q2/agp-radeons/index.x?pg=4
It's a whole 4 fps higher, lol.
It's worth noting that both of those are custom demos rather than the standard demo1.

I find it rather interesting that nVidia's massive lead vanishes when custom demos are used in Doom 3.

It'd be interesting to go off the rails in demo1 to see if nVidia are cheating like they were in 3DMark with Z buffer cheats and static clip planes.

That's more probably because Doom 3's timedemos do not reflect the game's real-world performance. The timedemos do not use physics or AI, and only one thread is executed during a timedemo; while playing the game, the physics and AI are executed on a 2nd thread, which makes for a large performance difference (especially on HT, dual-CPU, or dual-core systems).
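To put rough numbers on that (everything below is invented for illustration, not measured from the game):

```python
# Back-of-envelope sketch of why a timedemo can read higher than real
# gameplay: a timedemo replays recorded frames (render work only), while
# real gameplay also runs physics/AI each frame. All costs are made up.
render_ms = 10.0  # hypothetical per-frame render cost
logic_ms = 4.0    # hypothetical per-frame physics/AI cost

timedemo_fps = 1000 / render_ms                       # logic skipped
gameplay_1core_fps = 1000 / (render_ms + logic_ms)    # one core: serialized
gameplay_2core_fps = 1000 / max(render_ms, logic_ms)  # HT/dual: overlapped

print(timedemo_fps, gameplay_1core_fps, gameplay_2core_fps)
# -> 100.0 ~71.4 100.0: the timedemo overstates single-core gameplay fps
```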
 
I seek redemption for starting this thread... everyone please just get the damn card you want. Enjoy whatever games you want. Enjoy the sun, play some sports, live life.

Uzair out!
(couldn't resist saying that 😛)
 
Originally posted by: munky ....handpicked benchmarks

If you want to play the hand-picked benchmark game, you could also do this:

http://www.xbitlabs.com/articles/video/display/radeon-x800xl-512mb_9.html
http://www.xbitlabs.com/articles/video/display/radeon-x800xl-512mb_11.html
http://www.xbitlabs.com/articles/video/display/radeon-x800xl-512mb_10.html
http://www.xbitlabs.com/articles/video/display/radeon-x800xl-512mb_14.html

You can always find benchmarks to support a point this generation because things are so close. The point is, only the fanb0ys are out there exaggerating each IHV's advantage - the numbers speak for themselves.

 
Originally posted by: Acanthus


That's more probably because Doom 3's timedemos do not reflect the game's real-world performance. The timedemos do not use physics or AI, and only one thread is executed during a timedemo; while playing the game, the physics and AI are executed on a 2nd thread, which makes for a large performance difference (especially on HT, dual-CPU, or dual-core systems).

Which is why I like Hard's reviews: they usually don't rely on timedemos because, as you say, they do not represent real gameplay numbers.
 
Originally posted by: Ackmed
Originally posted by: Acanthus


That's more probably because Doom 3's timedemos do not reflect the game's real-world performance. The timedemos do not use physics or AI, and only one thread is executed during a timedemo; while playing the game, the physics and AI are executed on a 2nd thread, which makes for a large performance difference (especially on HT, dual-CPU, or dual-core systems).

Which is why I like Hard's reviews: they usually don't rely on timedemos because, as you say, they do not represent real gameplay numbers.


I like Hard's reviews too - notice how the X850XT has a 20% lower minimum fps (28 vs. 35) at the same settings as a 6800GT? And all the spots where the blue X850XT line is way below the yellow 6800GT line?

And X850XTs are much better cards than X800XLs - uh oh.........
 
Originally posted by: Rollo
I like Hard's reviews too - notice how the X850XT has a 20% lower minimum fps (28 vs. 35) at the same settings as a 6800GT? And all the spots where the blue X850XT line is way below the yellow 6800GT line?

And X850XTs are much better cards than X800XLs - uh oh.........

Yeah, but explain why both the XT and the GT are faster than the Ultra :roll::laugh:
 
Originally posted by: Rollo
Originally posted by: Ackmed
Originally posted by: Acanthus


That's more probably because Doom 3's timedemos do not reflect the game's real-world performance. The timedemos do not use physics or AI, and only one thread is executed during a timedemo; while playing the game, the physics and AI are executed on a 2nd thread, which makes for a large performance difference (especially on HT, dual-CPU, or dual-core systems).

Which is why I like Hard's reviews: they usually don't rely on timedemos because, as you say, they do not represent real gameplay numbers.


I like Hard's reviews too - notice how the X850XT has a 20% lower minimum fps (28 vs. 35) at the same settings as a 6800GT? And all the spots where the blue X850XT line is way below the yellow 6800GT line?

And X850XTs are much better cards than X800XLs - uh oh.........


Also notice that the frames from the Ultra dip much more frequently and dramatically than the X850XT's... I'm not sure about you, but I much prefer to play a game that has fewer dips than one that has an advantage in minimum framerates. It's what I call "smoothness" 😉 As far as the 6800GT is concerned, I don't see enough yellow on the graph to draw a conclusion.
 
Originally posted by: Rollo
Originally posted by: Ackmed
Originally posted by: Acanthus


That's more probably because Doom 3's timedemos do not reflect the game's real-world performance. The timedemos do not use physics or AI, and only one thread is executed during a timedemo; while playing the game, the physics and AI are executed on a 2nd thread, which makes for a large performance difference (especially on HT, dual-CPU, or dual-core systems).

Which is why I like Hard's reviews: they usually don't rely on timedemos because, as you say, they do not represent real gameplay numbers.


I like Hard's reviews too - notice how the X850XT has a 20% lower minimum fps (28 vs. 35) at the same settings as a 6800GT? And all the spots where the blue X850XT line is way below the yellow 6800GT line?

And X850XTs are much better cards than X800XLs - uh oh.........


Good job pointing out the obvious, that NV cards are a little faster than ATI's in Doom 3. This is news!

Triniboy, the Ultra has AA.
 
Don't you all ever get tired of trying to impress each other about which is the better card? Any of the 6800-series or X800-series cards are really nice cards. Some do things better than the others, and vice versa.

A thread like this comes up every week and the same thing happens: flames, with lots of people stating the obvious, and nothing ever gets cleared up... the Nvidia folks love their cards and the ATI folks love theirs... let it go.

 
After seeing next-gen games, who cares about GPUs? We need to look into getting a Cell CPU for desktops...

Still, those with SM3 ARE more likely to be prepared for next-gen games... but now that you think about it, I'm really doubting our single-core processors can keep up with the physics...

I'll just get a PS3...
 
Originally posted by: CaiNaM
yea, gotta watch ati.. they're pretty sneaky.. this would not be the first time they cheated in an id game to try and downplay how badly they performed compared to nvidia.. oh wait.. you meant nvidia? wtf... :roll:
I thought that Q3 "cheating" was cleared up by the fact that ATI released a similarly-performing driver a few releases later that fixed the IQ problem but kept the speed? I don't think nV did the same with the FX's DX9 performance. No idea about D3, though.
 
Seeing as this darn thread I started just won't die, let me add that the card I am really looking forward to is a 32-pixel/12-vertex-pipeline monster running at 600 MHz with the same or better IPC as the 6800 series. It should be around 3.5 times as fast as a 6800 GT, thus making it worthwhile to upgrade.
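For what it's worth, the raw fill-rate arithmetic behind that guess (the 32-pipe/600 MHz part is just my wishlist above; equal per-pipe efficiency is assumed):

```python
# Rough pixel-fill comparison; assumes identical per-pipe throughput.
gt_fill = 16 * 350    # 6800 GT: 16 pipes x 350 MHz = 5600 Mpix/s
wish_fill = 32 * 600  # wishlist card: 32 pipes x 600 MHz = 19200 Mpix/s
print(wish_fill / gt_fill)  # ~3.43x in raw fill, hence "around 3.5x"
```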
 
this would not be the first time they cheated in an id game to try and downplay how badly they performed compared to nvidia..
Are you still referring to Quack? That myth was squashed a long time ago.

That's more probably because Doom 3's timedemos do not reflect the game's real-world performance.
How does that comment relate to the fact that the built-in demo has nVidia consistently much higher than the competition, while custom demos show no such advantage?

Basically you appear to be answering a question that was never asked.
 
Originally posted by: BFG10K
this would not be the first time they cheated in an id game to try and downplay how badly they performed compared to nvidia..
Are you still referring to Quack? That myth was squashed a long time ago.
Quack was not a myth; it was an actual cheat in the drivers ATI gave reviewers to bench their 8500 that reduced IQ and increased performance. The fact that they fixed it when called on it is irrelevant.

That's more probably because Doom 3's timedemos do not reflect the game's real-world performance.
How does that comment relate to the fact that the built-in demo has nVidia consistently much higher than the competition, while custom demos show no such advantage?

Basically you appear to be answering a question that was never asked.

Don't you think that if nVidia were doing a "Quack-like" cheat, review sites and/or ATI would be all over it? ATI's OGL has always suxored. There are no built-in timedemos for Riddick, and nVidia performs 33% higher.

 
Quack was not a myth; it was an actual cheat in the drivers ATI gave reviewers to bench their 8500 that reduced IQ and increased performance.
No it wasn't; it was a bug left over from earlier builds of beta drivers. After ATi fixed it, performance went up. Seems like a silly cheat if it degrades performance, wouldn't you say?

Don't you think that if nVidia were doing a "Quack-like" cheat, review sites and/or ATI would be all over it?
Like 3DMark's comprehensive whitepaper publishing nVidia's cheats? Surely you remember your antics when I pointed out the findings to you back then?

There are no built-in timedemos for Riddick, and nVidia performs 33% higher.
If you look carefully, I never once mentioned that game in the post you are quoting. I personally chalk that one up to nVidia's superior OpenGL implementation.
 
Originally posted by: Rollo
Quack was not a myth; it was an actual cheat in the drivers ATI gave reviewers to bench their 8500 that reduced IQ and increased performance. The fact that they fixed it when called on it is irrelevant.
Um, that's the whole point. Don't you think that if ATI could in fact improve performance without cheating, they would have preferred to do so from the start? The fact is that a later driver set brought performance up to "cheat" levels and fixed IQ. nV was caught with 3DM, but benchmarks of more complex DX9 games show that their DX9 performance is as deficient as 3DM showed with earlier drivers.

Don't you think that if nVidia were doing a "Quack-like" cheat, review sites and/or ATI would be all over it? ATI's OGL has always suxored. There are no built-in timedemos for Riddick, and nVidia performs 33% higher.
Rare is the review site that has the balls or the time/competence to point out cheating. HOCP was fed the Quack story by nVidia. I think ET dug up the 3DM story themselves--the huge jump nV took from one driver release to another was a huge clue--but it wouldn't surprise me if ATI helped. 3DVelocity published that FX review that pointed out its IQ deficiencies, and now they're apparently blacklisted. I guess HOCP and ET are big enough that they won't be blacklisted.

Anyway, Riddick was designed for the Xbox, which probably has more than a few similarities with subsequent nV architecture. It may just be a case of the devs not optimizing enough for ATI (a la BioWare and KOTOR).

Yeah, yeah, the simplest answer is usually the right one, but I'm not sure it's that simple to say that nV is simply "better at OGL." They may well be better at stencil shadowing, though, which appears to be the crux of D3 and Riddick rendering performance. But it's interesting to look at Riddick performance and see that the PCX 5900 is slower than an X600P and even the 6200, that a 6800 is only slightly faster than an X800 when you account for clock-speed differences, but that the 6600GT and 6800GT/U are just stupidly faster than comparable ATI cards.

(Nice to see THG still thinks the 5900 has 8 pixel pipes, too. :roll:)
 