did someone actually mention that a card being superior in Aquamark3 means anything??? frankly, benchmark tools like 3DMark03 and Aquamark are just useless, use a freaking game...
Originally posted by: Genx87
Showing the results to back up my claim of the GT smoking the X800 does show I am quite concerned about the topic at hand. If I wasn't, then there would be no reason for me to calculate the performance difference.
Yes, I know, I compared the benchmarks in the link provided. /gasp, what lengths
Do the math and it is correct.
Originally posted by: GeneralGrievous
Good luck with that. The GT is unplayable at 1600x1200 4x/16x in several games already. Note how that card only gets 40 fps at 1600x1200 with only 8xAF and both optimizations on.
My overclocked XT, otoh, handles Farcry at the above setting quite nicely.
Without them it's pretty much invalid to test such high performance cards.

Personally I have never been interested in any of the AF or AA settings.
reading comprehension... the 'length' is the need to exaggerate a small advantage into something much larger (which i stated in my original reply), which, as i stated above, screams the need for attention/superiority.
again, reading comprehension: it's a good thing, try it. this isn't about the math (i never mentioned math, nor said any figures were incorrect; why you'd bring that up i have no clue).
the only thing which concerns me is that none of the issues nv40 has with far cry were mentioned - fixed, not fixed? is it still running a lower precision path than the ati cards? from previous comments i was under the impression fixing many of these issues would require a patch from crytek, not a driver from nv....
Originally posted by: Genx87
reading comprehension... the 'length' is the need to exaggerate a small advantage into something much larger

When you start tossing insults like "reading comprehension" around, your argument is obviously lost.
that wasn't an insult, it was a statement of fact. my problem was the exaggeration used; your response was to ignore the original subject and talk about something irrelevant to the reply.
even if 10% is accurate, 55 fps is 'smoking', but 50 fps is not? and i love overclocking one but not the other in order to showcase its alleged 'superiority' lol....
To which I replied "do the math". I think taking a little of your own medicine might be a good thing.
again, the assumption of accuracy was already given. the point was giving undue significance to 55 fps vs 50 fps, which is the major point in my original reply, and one which you have continued to ignore while focusing on matters irrelevant to the point.
so again, the burden is on you to follow the point, not me, if you choose to participate. if not, then simply don't reply.
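For what it's worth, here is a minimal sketch of the arithmetic being argued over; the 55 fps and 50 fps figures are just the example numbers used in the posts above, not measured benchmark results:

```python
# Minimal illustration of the relative-difference arithmetic in dispute.
# The 55 fps and 50 fps values are the example figures from the thread,
# not actual benchmark data.

def relative_difference(a: float, b: float) -> float:
    """Return how much larger `a` is than `b`, as a fraction of `b`."""
    return (a - b) / b

if __name__ == "__main__":
    gt_fps, x800_fps = 55.0, 50.0  # illustrative figures only
    diff = relative_difference(gt_fps, x800_fps)
    print(f"{gt_fps} fps vs {x800_fps} fps -> {diff:.0%} relative difference")  # ~10%
```

The disagreement is not over the calculation itself but over whether a difference of that size justifies calling one card "smoking" the other.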
This is straight from the article
For Far Cry we did things a bit differently than normal. First off, the game has been patched to version 1.1 build 1256. Secondly, we made sure that the graphics cards are forced to run Shader Model 2 as a fair base of comparison. Next to that, we are using our own Guru3D.com constructed timedemo to prevent driver cheats.
What were you saying about reading comprehension again?
umm.. again, the problem is not on this end.
while you can certainly attempt to 'force' ps2 usage, the fact is that far cry uses specifically optimized shaders in the nv ps2 path, running at lower precision; it has also been documented from multiple sources that the nv ps2 path contains far fewer actual ps2 shaders than the generic path.
the second part of my comment which you completely ignored is the visual anomalies associated with nv cards. there was no mention of whether these remained or were resolved. again, it was my understanding this required a patch from crytek, which was delayed.
Originally posted by: michaelpatrick33
I have seen over the time I have gone to Guru3d that they are very Nvidia biased in their news section and reviews. Doesn't mean they are wrong though. ATI is slower than Nvidia when the eye candy is turned off and is definitely slower in OpenGL games, so if a review uses that as a criterion then ATI will come out looking poor. If the eye candy is turned on and up in DirectX games (especially the anisotropic filtering, where ATI just crushes Nvidia), then the tables turn and the benchmarks favor ATI significantly. HardOCP did a review of the 6800ultra and the x800pro and x800xt and found that the x800pro beat the 6800ultra more than occasionally with all the eye candy turned up to the highest level, and the x800xt beat the ultra every time except once, where there was a tie. HardOCP, however, uses a 30 frames per second playable rate, apples-to-apples testing that some people may not like. I think the GT will be a great card, but look at how the reviewer does the tests.
OpenGL tests go to Nvidia hands down
Pure speed tests go to Nvidia hands down
Eye candy turned on at high frame rates goes to ATI
Originally posted by: GeneralGrievous
Hmm. Is that with max AA/AF as well? (6x/16x)

so does an overclocked PRO:

Originally posted by: GeneralGrievous
Na, the 6800s don't really run Farcry well, for one reason or another. I read here somewhere that someone with a 6800 U was getting average framerates in the 40s.

given that however, i'm pretty sure the GT would play it well also. from the results we've seen in the preview/reviews, it's pretty comparable with the PRO, tho of course, as always YMMV with overclocking.

Originally posted by: GeneralGrievous
They were using beta catalysts as well. On another note, when will reviewers start using the 3.1s again? The 4.6s suck for non x800 cards.

one thing to keep in mind tho, is that the reviews you are comparing used older nv drivers. the drivers used to run these comparisons on g3d and fs are only a couple weeks old.
Originally posted by: michaelpatrick33
I have some benchmarks showing the new Nvidia drivers being slower, so who the hell knows. I purchased my ATI x800xt Platinum for $435.00 to my door to upgrade my 9600pro, so if the GT is somehow faster than its faster brother the 6800ultra while being clocked slower, then I would say congratulations to those people. I simply went with a good deal from ATI.
Let us stand hand in hand and pray to the Computer God Hardwaricus that Her high priests Nvidia and ATI peacefully co-exist, innovate, decry their greed, and give us lowly mortals great game and pron chips. MMMMMMMM chips
Originally posted by: GeneralGrievous
Hmm. Is that with max AA/AF as well? (6x/16x)
If so, I am surprised. At 1600x1200 the Pro usually starts falling well behind the XT. My minimum FPS, though, was 51 and average was 63, so I guess it does make sense.
Na, the 6800s don't really run Farcry well, for one reason or another. I read here somewhere that someone with a 6800 U was getting average framerates in the 40s.
They were using beta catalysts as well. On another note, when will reviewers start using the 3.1s again? The 4.6s suck for non x800 cards.
