GeForce FX 5800 Ultra preview

OatMan

Senior member
Aug 2, 2001
OK, I'm no rocket scientist.

Did I understand HardOCP's article correctly? It seemed to paint a significantly different picture than any of the other reviews: it shows the GFX as marginally ahead of the R9700 generally, as opposed to marginally behind in the other articles I've read.

It seems like a fair shake, though, and pretty much comes to the same conclusions as the others. I'm just wondering if this points to inconsistent hardware, as these cards are reference samples and not shipping silicon. The test beds and methodologies also differ from site to site: AMD hardware at HardOCP and P4 hardware at AnandTech. Also, I didn't compare the driver versions, so it might not be apples to apples...

Anyway, I think HardOCP is bang on in calling it a Preview and not a Review.

Good stuff!
 
May 15, 2002
nVIDIA is in deep trouble if this preview is indicative of real performance. It takes up two slots, pumps out 60 dB of fan noise, and barely (if at all) beats ATI's offering. I smell "failure" here...
 

Xentropy

Senior member
Sep 6, 2002
Originally posted by: OatMan
OK, I'm no rocket scientist.

Did I understand HardOCP's article correctly? It seemed to paint a significantly different picture than any of the other reviews: it shows the GFX as marginally ahead of the R9700 generally, as opposed to marginally behind in the other articles I've read.

The issue is that most reviews--including, to my dismay, Anandtech's--are comparing the 5800 Ultra's HQ aniso settings to the 9700 Pro's LQ settings. Anand uses the excuse that, for now, the 9700 Pro's LQ setting looks "almost as good" as both cards' HQ settings. The problem is that, while the HQ settings look similar, the FX's HQ still beats the 9700's LQ, and my personal opinion is that it's simply a driver issue and nVidia will fix the LQ to look better. So, comparing apples to apples, HQ to HQ, something Anand didn't do in this case, the FX is 20% faster, which is about what everyone expected.
 

Glitchny

Diamond Member
Sep 4, 2002
It shouldn't be HQ to HQ if ATI's LQ looks as good as the FX's HQ, which it does. They compared for looks, which is why people turn those settings on in the first place. If the 9700 had been at HQ it would have looked even better than the FX and probably run about as fast or slower, but the point was looks to looks, not setting to setting.
 
Aug 27, 2002
Makes me glad I shelled out the $300 for my 9700 Pro at the end of November. I had contemplated waiting for the FX to come out.
 

cmdrdredd

Lifer
Dec 12, 2001
Originally posted by: Xentropy
Originally posted by: OatMan
OK, I'm no rocket scientist.

Did I understand HardOCP's article correctly? It seemed to paint a significantly different picture than any of the other reviews: it shows the GFX as marginally ahead of the R9700 generally, as opposed to marginally behind in the other articles I've read.

The issue is that most reviews--including, to my dismay, Anandtech's--are comparing the 5800 Ultra's HQ aniso settings to the 9700 Pro's LQ settings. Anand uses the excuse that, for now, the 9700 Pro's LQ setting looks "almost as good" as both cards' HQ settings. The problem is that, while the HQ settings look similar, the FX's HQ still beats the 9700's LQ, and my personal opinion is that it's simply a driver issue and nVidia will fix the LQ to look better. So, comparing apples to apples, HQ to HQ, something Anand didn't do in this case, the FX is 20% faster, which is about what everyone expected.

He was comparing the quality of the image, not the performance of the best against the best. If ATI's LQ looks better than Nvidia's HQ, why compare the HQ?
 

Xentropy

Senior member
Sep 6, 2002
Originally posted by: cmdrdredd
He was comparing the quality of the image, not the performance of the best against the best. If ATI's LQ looks better than Nvidia's HQ, why compare the HQ?

Some facts, with a few very easy syllogisms that even someone ultra-biased could understand:
Fact 1) The 9700's LQ wasn't *the same as* its HQ, just damned close.

Fact 2) The FX's HQ matched the 9700's HQ basically exactly, quality-wise.

Conclusion 1) Based on the above, the 9700's LQ is not the same as the FX's HQ, just damned close.

Fact 3) The FX's LQ looks just like no anisotropy at all, and only drops framerates by 1%.

Conclusion 2) The FX's LQ is completely nonfunctional, and such a problem is almost guaranteed to be fixed in a patch (or at WORST a recall/new stepping of hardware, similar to ATi's recall of the 9700 due to AGP 8X issues).

Conclusion 3) With the 9700's LQ *working* and the FX's LQ *broken*, any comparison between the 9700's LQ and *anything* will be biased toward the 9700. Such benchmarks cannot be compared until a driver fix for the FX's LQ is released. If such a fix is not immediately forthcoming, then the benchmarks as they stand now can count, since nVidia would be letting ATi win due to DRIVERS of all things (what a topsy-turvy world that would be). Therefore, the only comparisons that can be made at this time (at least unless and until nVidia *ships* with the same buggy drivers and thereby validates the current benchmarks) without destroying the FX over that single driver bug are HQ vs. HQ and none vs. none. In both of those cases, the FX is ~20% faster than the 9700, which is basically the speed improvement everyone expected all along.

Addendum 1) The IQ comparisons may be completely invalid anyway if HardOCP's information is true that there is a final filter applied *after* the frame buffer and before display, which makes capturing screenshots of the finalized IQ difficult (if not impossible). If that is the case, the FX's LQ may actually be working properly ON THE SCREEN; it just doesn't look as good in screenshots. (Since this is merely an "if" at this time and not a fact like the above, I made it an addendum instead of Fact 4. IF it is the case, however, then the FX's LQ may be just as good as the 9700's LQ, which would mean we'd need to compare LQ to LQ, and Anand, REALIZING THAT THE FX'S LQ WAS BROKEN ENTIRELY, didn't even provide us with LQ benchmarks, so that comparison cannot be made yet.)

Turn off your blinders and you'll see the above conclusions are perfectly valid. Only the addendum is conjecture based on information that is not yet confirmed by multiple sites.

Edit: Just to put this out there, I will not be buying an FX 5800 Ultra unless a solution is found for the noise issue, anyway. If ATi fixes their drivers before nVidia fixes their noise, I'll be getting a 9700 Pro or R350 or whatever. If none of the above occurs in a reasonable period of time, I'll fall back on a Ti4600 and skip this pathetic excuse for a generation entirely, hoping the next generation offers performance WITHOUT the fatal flaws.