6800GT preview up at guru3d, link inside


James3shin

Diamond Member
Apr 5, 2004
4,426
0
76
did someone actually suggest that a card being superior in AquaMark3 means anything??? frankly, synthetic benchmark tools like 3DMark03 and AquaMark are useless; use a freaking game...
 

Safeway

Lifer
Jun 22, 2004
12,075
11
81
That 6800 GT benchmark review is a load of horsesh!t. I own ATi and nVidia cards. They say the GT default beats the XT PE default? I am fvcking laughing.

x800XT 16xAF 54 54 53
x800XT default 55 54 54
6800 GT default 58 56 55

It looks like they pulled this sh!t out of their asses. Look at ANY other review, Tom's Hardware, HardOCP, etc., and you will find XT PE fps in the 70s and 80s!

Digital Life X800XT 75 73 64
Anandtech 1280*1024 no AA/AF 113.2fps
1600*1200 no AA/AF 85fps
1280*1024 4xAA/8xAF 66.4fps
Tomshardware default 80-90fps

They pulled 54 out of their ass! Although the GT may beat the PRO, it doesn't beat the XT.

So are you saying Anandtech is full of sh!t? This whole thread is based on guru3d nvidiots. Enough said! Like other people say, wait until the cards ship, people have them, and ATI releases new drivers - Nvidia has done that what, 3 or 4 times now? Each one had a different graphical bug ... niiiiiiiiice.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Genx87
Showing the results to back up my claim of the GT smoking the X800 does show I am quite concerned about the topic at hand. If I wasn't, then there would be no reason for me to calculate the performance difference.

lol.. but the results you've shown don't show anything getting "smoked".. like i stated earlier, 55fps vs 50fps is a 10% difference, and saying 55fps "smokes" 50fps is nothing more than a sign someone is desperately seeking acknowledgement.
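
(for the record, a quick sketch of that arithmetic - the 55/50 figures are just the round numbers from this exchange, not measurements:)

def relative_diff(a_fps, b_fps):
    # advantage of a_fps over b_fps, as a percentage of the slower result
    return (a_fps - b_fps) / b_fps * 100

print(relative_diff(55.0, 50.0))  # 10.0, i.e. a 10% lead, hardly "smoking"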

Yes, I know comparing the benchmarks in the link provided. /gasp, what lengths

reading comprehension... the 'length' is the need to exaggerate a small advantage into something much larger (as i stated in my original reply), which screams the need for attention/superiority.

the GT wins a few benchmarks. so what? in the FS review it does 'smoke' the PRO in CoD, but again, that's only one title (and only since the use of an unofficial/beta driver), and the PRO still plays it at >50fps at 2048 res (not 1600 or 1280) - hardly what you would call unplayable.

Do the math and it is correct.

again, reading comprehension: it's a good thing, try it. this isn't about the math (i never mentioned math, nor claimed any figures were incorrect; why you'd bring that up i have no clue).

Originally posted by: GeneralGrievous
Good luck with that. The GT is unplayable at 1600x1200 4x/16x in several games already. Note how that card only gets 40 fps at 1600x1200 with only 8xAF and both optimizations on.

My overclocked XT, otoh, handles Far Cry at the above setting quite nicely.

so does an overclocked PRO:

Benchmark session result
Operating System: Windows XP Professional (5.1, Build 2600) Service Pack 1 (2600.xpsp2.030422-1633)
Processor: Intel(R) Pentium(R) 4 CPU 3.20GHz (2 CPUs)
Memory: 1024MB RAM
DirectX Version: DirectX 9.0b (4.09.0000.0902)
Card name: RADEON X800 PRO
Driver Version: 6.14.0010.6451 (English)

-----------------------------------------------------------
Far Cry

Using Max settings.
Map: Fort Demo: BenchemallDefaultDemo

640x480
run# 0: Average FPS: 59.07

800x600
run# 0: Average FPS: 59.13

1024x768
run# 0: Average FPS: 60.22

1280x1024
run# 0: Average FPS: 58.56

1600x1200
run# 0: Average FPS: 53.09
(Min FPS: 40.40 at frame 565, Max FPS: 63.74 at frame 891)
Average Tri/Sec: 7651293, Tri/Frame: 144120
Recorded/Played Tris ratio: 0.44
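
(side note: the triangle counters in that dump are internally consistent with the reported average - dividing triangles per second by triangles per frame recovers the framerate, assuming both are averaged over the same run:)

tri_per_sec = 7651293    # Average Tri/Sec from the 1600x1200 run above
tri_per_frame = 144120   # Tri/Frame from the same run
print(tri_per_sec / tri_per_frame)  # ~53.09, matching the reported average FPS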


given that however, i'm pretty sure the GT would play it well also. from the results we've seen in the preview/reviews, it's pretty comparable with the PRO, tho of course, as always YMMV with overclocking.

the only thing that concerns me is that the issues nv40 has with far cry weren't mentioned - fixed, not fixed? is it still running a lower precision path than the ati cards? from previous comments i was under the impression fixing many of these issues would require a patch from crytek, not a driver from nv....
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
Personally I have never been interested in any of the af or aa settings
Without them, it's pretty much invalid to test such high-performance cards.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
reading comprehension... the 'length' is the need to exaggerate a small advantage into something much larger (as i stated in my original reply), which screams the need for attention/superiority.

When you start tossing insults like "reading comprehension" around, your argument is obviously lost.

again, reading comprehension: it's a good thing, try it. this isn't about the math (i never mentioned math, nor claimed any figures were incorrect; why you'd bring that up i have no clue).

Yes, I know that's what your reply was.

even if 10% is accurate, 55 fps is 'smoking', but 50 fps is not? and i love overclocking one but not the other in order to showcase its alleged 'superiority' lol....


To which I replied "do the math". I think taking a little of your own medicine might be a good thing.

the only thing which concerns me is that all the issues nv40 has with far cry wasn't mentioned - fixed, not fixed? is it still running a lower precision path than the ati cards? from previous comments i was under the impression fixing many of these issues would require a patch from crytek, not a driver from nv....

This is straight from the article:

For Far Cry we did things a bit differently than normal. First off, the game has been patched to version 1.1 build 1256. Secondly, we made sure that the graphics cards are forced to run Shader Model 2 as a fair basis of comparison. Next to that, we are using our own Guru3D.com-constructed timedemo to prevent driver cheats.



What were you saying about reading comprehension again?
 

imported_michaelpatrick33

Platinum Member
Jun 19, 2004
2,364
0
0
I have seen over the time I have gone to Guru3d that they are very Nvidia-biased in their news section and reviews. That doesn't mean they are wrong, though. ATI is slower than Nvidia when the eye candy is turned off and is definitely slower in OpenGL games, so if a review uses that as its criteria, ATI will come out looking poor. If the eye candy is turned on and up in DirectX games (especially anisotropic filtering, where ATI just crushes Nvidia), the tables turn and the benchmarks favor ATI significantly.

HardOCP did a review of the 6800 Ultra, the X800 Pro, and the X800 XT and found that the X800 Pro beat the 6800 Ultra more than occasionally with all the eye candy turned up to the highest level, and the X800 XT beat the Ultra every time except once, where there was a tie. HardOCP, however, uses a 30-frames-per-second playable-rate, apples-to-apples testing method that some people may not like. I think the GT will be a great card, but look at how the reviewer does the tests.

OpenGL tests go to Nvidia hands down
Pure speed tests go to Nvidia hands down
Eye candy turned on at high frame rates goes to ATI

In the HardOCP review, the ATI X800 XT was always playable one resolution higher in games (except OpenGL):
http://www.hardocp.com/article.html?art=NjEx
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Genx87
reading comprehension... the 'length' is the need to exaggerate a small advantage into something much larger...

When you start tossing insults like "reading comprehension" around, your argument is obviously lost.

that wasn't an insult, it was a statement of fact. my problem was the exaggeration used; your response was to ignore the original subject and talk about something irrelevant to the reply.

even if 10% is accurate, 55 fps is 'smoking', but 50 fps is not? and i love overclocking one but not the other in order to showcase its alleged 'superiority' lol....


To which I replied "do the math". I think taking a little of your own medicine might be a good thing.

again, the assumption of accuracy was already given. the point was giving undue significance to 55fps vs 50fps, which is the major point in my original reply, and one which you have continued to ignore while focusing on matters irrelevant to the point.

so again, the burden is on you to follow the point, not me, if you choose to participate. if not, then simply don't reply.

This is straight from the article:

For Far Cry we did things a bit differently than normal. First off, the game has been patched to version 1.1 build 1256. Secondly, we made sure that the graphics cards are forced to run Shader Model 2 as a fair basis of comparison. Next to that, we are using our own Guru3D.com-constructed timedemo to prevent driver cheats.

What were you saying about reading comprehension again?

umm.. again, the problem is not on this end.

while you can certainly attempt to 'force' ps2 usage, the fact is that far cry uses specifically optimized shaders in the nv ps2 path, running at lower precision; it has also been documented by multiple sources that the nv ps2 path contains far fewer actual ps2 shaders than the generic path.

the second part of my comment, which you completely ignored, is the visual anomalies associated with nv cards. there was no mention of whether these remain or have been resolved. again, it was my understanding this required a patch from crytek, which was delayed.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: michaelpatrick33
I have seen over the time I have gone to Guru3d that they are very Nvidia-biased in their news section and reviews. That doesn't mean they are wrong, though. ATI is slower than Nvidia when the eye candy is turned off and is definitely slower in OpenGL games, so if a review uses that as its criteria, ATI will come out looking poor. If the eye candy is turned on and up in DirectX games (especially anisotropic filtering, where ATI just crushes Nvidia), the tables turn and the benchmarks favor ATI significantly.

HardOCP did a review of the 6800 Ultra, the X800 Pro, and the X800 XT and found that the X800 Pro beat the 6800 Ultra more than occasionally with all the eye candy turned up to the highest level, and the X800 XT beat the Ultra every time except once, where there was a tie. HardOCP, however, uses a 30-frames-per-second playable-rate, apples-to-apples testing method that some people may not like. I think the GT will be a great card, but look at how the reviewer does the tests.

OpenGL tests go to Nvidia hands down
Pure speed tests go to Nvidia hands down
Eye candy turned on at high frame rates goes to ATI

one thing to keep in mind tho is that the reviews you are comparing used older nv drivers; the drivers used to run these comparisons on g3d and fs are only a couple weeks old.

the drivers are also 'unofficial'; the WHQL driver distributed on nv's site is still a 57 series.
 

imported_michaelpatrick33

Platinum Member
Jun 19, 2004
2,364
0
0
I have some benchmarks showing the new Nvidia drivers being slower, so who the hell knows. I purchased my ATI X800 XT Platinum for $435.00 shipped to upgrade my 9600 Pro, so if the GT is somehow faster than its bigger brother the 6800 Ultra while being clocked slower, then I would say congratulations to those people. I simply went with a good deal from ATI (I seldom upgrade, so for once I decided not to let common sense get in the way).

Let us stand hand in hand and pray to the Computer God Hardwaricus that Her high priests Nvidia and ATI peacefully co-exist, innovate, renounce their greed, and give us lowly mortals great game and pron chips. MMMMMMMM chips
 
GeneralGrievous

Apr 14, 2004
1,599
0
0
so does an overclocked PRO:
Hmm. Is that with max AA/AF as well? (6x/16x)

If so, I am surprised. At 1600x1200 the Pro usually starts falling well behind the XT. My minimum FPS, though, was 51 and the average was 63, so I guess it does make sense.

given that however, i'm pretty sure the GT would play it well also. from the results we've seen in the preview/reviews, it's pretty comparable with the PRO, tho of course, as always YMMV with overclocking.
Na, the 6800s don't really run Far Cry well, for one reason or another. I read here somewhere that someone with a 6800 Ultra was getting average framerates in the 40s.

one thing to keep in mind tho is that the reviews you are comparing used older nv drivers; the drivers used to run these comparisons on g3d and fs are only a couple weeks old.
They were using beta Catalysts as well. On another note, when will reviewers start using the 3.1s again? The 4.6s suck for non-X800 cards.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: michaelpatrick33
I have some benchmarks showing the new Nvidia drivers being slower, so who the hell knows. I purchased my ATI X800 XT Platinum for $435.00 shipped to upgrade my 9600 Pro, so if the GT is somehow faster than its bigger brother the 6800 Ultra while being clocked slower, then I would say congratulations to those people. I simply went with a good deal from ATI.

Let us stand hand in hand and pray to the Computer God Hardwaricus that Her high priests Nvidia and ATI peacefully co-exist, innovate, renounce their greed, and give us lowly mortals great game and pron chips. MMMMMMMM chips

honestly i don't think you can really go wrong with either card. overall the performance differences are minimal, and i'm sure you'll see the lead change several times (albeit minimally) with future driver performance increases from both sides. don't think there's no room for improvement in the "old" r420 architecture - ati's memory controller was changed quite a bit, and according to their engineers it is only running at about 70% efficiency, which should leave headroom. of course, how much is only speculation until they actually deliver on this 'potential' ;)
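
(to put a rough number on that headroom, a back-of-the-envelope sketch - it assumes the 70% figure is right, that effective bandwidth scales linearly with controller efficiency, that the workload is bandwidth-bound, and an entirely hypothetical improvement to 85%:)

peak_gbs = 35.8       # approx. X800 XT theoretical peak: 256-bit bus at 1.12 GHz effective GDDR3
current_eff = 0.70    # efficiency figure quoted from ati's engineers above
improved_eff = 0.85   # hypothetical driver-tuned efficiency

print(peak_gbs * current_eff)                  # ~25.1 GB/s effective today
print(peak_gbs * improved_eff)                 # ~30.4 GB/s effective if tuned
print((improved_eff / current_eff - 1) * 100)  # ~21.4% potential gain, in theory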

i do like the fact nv40 supports sm3, but i have my doubts as to whether this will be of any significance in this generation. regardless, i've never seen more features as a bad thing.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: GeneralGrievous
Hmm. Is that with max AA/AF as well? (6x/16x)

If so, I am surprised. At 1600x1200 the Pro usually starts falling well behind the XT. My minimum FPS, though, was 51 and the average was 63, so I guess it does make sense.

no.. tho you'd be hard-pressed to convince me that 6xAA is required or shows any significant benefit at 1600x1200. ;)

Na, the 6800s don't really run Far Cry well, for one reason or another. I read here somewhere that someone with a 6800 Ultra was getting average framerates in the 40s.

well, a couple of reviews using the latest drivers contradict you. i would like to know, however, if the iq anomalies have been fixed (neither review mentioned that), or if a patch from crytek is required. also, if the new patch has an optimized path for nv40 (currently it does not), the GT may show an even bigger performance increase. then again, it's all speculation at this point, which is why i think it's a bit too early to crown a "performance king", especially with relatively few parts available from both sides.

the XT is certainly beginning to trickle out, but it may still be mid-July or later before they start shipping in quantity, and by then i'm sure we'll see if cat 4.7 lives up to the early hype some have been giving it.

They were using beta Catalysts as well. On another note, when will reviewers start using the 3.1s again? The 4.6s suck for non-X800 cards.

yea, but "non-X800 cards" are outside the scope of this discussion ;)