GeForce FX Benchmarks


JackBurton

Lifer
Jul 18, 2000
15,993
14
81
The 9700 Pro should look REALLY good once the GeForce FX benchmarks come out. ATi is going to be selling the 9700 Pro as a package: you get a fast as hell video card PLUS an extra PCI slot. :)
 

OldSpooky

Senior member
Nov 28, 2002
356
0
0
The initial reports from the Deutschlanders look bad, I'll admit, but I'm going to wait for Anand's review before I judge the technology. And who knows... the Pentium 4 was pretty awful when it launched over 2 years ago, and now it's doing quite well.

Hopefully the FX won't be all that bad. It'll whack my Radeon DDR, that's for sure! :)
 

Snoop

Golden Member
Oct 11, 1999
1,424
0
76
Anyone else find it odd that this website, which has a forum with a grand total of 3,000 posts, also claims to have tested a prototype Hammer CPU? To me it seems kind of odd that such a seemingly small site would have access to this type of hardware. :confused:
 

OldSpooky

Senior member
Nov 28, 2002
356
0
0
Anyone else find it odd that this website, which has a forum with a grand total of 3,000 posts, also claims to have tested a prototype Hammer CPU?

Seeing as AMD's main fab is in Germany, it's not out of the question that local sources, er, "loaned" a sample or two :)
 

bluemax

Diamond Member
Apr 28, 2000
7,182
0
0
According to some reports, the Dustbuster (tm) is actually so large that it cuts off the SECOND PCI slot!! :Q So unless you feel like pushin' yer cards a little off their proper angle and restin' 'em against a hot ol' blower thing, you're likely down TWO slots.

Not good.... not good. Waiter! I'd like a 9700 please... non-Pro, or Pro if it's not too much more. :p
 

GTaudiophile

Lifer
Oct 24, 2000
29,767
33
81

ATI has certainly raised the stakes with Radeon 9700 Pro, and if GeForceFX was coming into this brawl looking for a knockout, it didn't get one. Actually, what we saw is that the memory sub-system of GeForceFX hits a pretty hard wall when you combine a high resolution and bandwidth-hungry rendering features like FSAA and AF...

For nVidia, GeForceFX represents a return to at the very least performance parity with ATI. For ATI however, Radeon 9700 Pro looks strong versus GeForceFX, and these results show just how much performance ground nVidia had lost to ATI when Radeon 9700 Pro first shipped.
 

Snoop

Golden Member
Oct 11, 1999
1,424
0
76
WOW, I hate to say this as I am an Nvidia fan, but they just laid an egg.
WTH were those jackarses thinking by not using a 256-bit interface? 3Dlabs, ATI, and even the jabronis at Matrox figured it out. Hell, I think BitBoys had a 256-bit interface. WTF was Nvidia thinking?
:rolleyes:


I knew it, Nvidia should never have bought out 3dfx; they're cursed :D
 

RaynorWolfcastle

Diamond Member
Feb 8, 2001
8,968
16
81
The FX slams the Radeon in a couple of benchmarks, but most of the time they're neck and neck. Once AF/AA get turned on, ATI lays the smack down in a lot of cases. I'm not impressed by this showing; nVidia took 6 extra months to get this product out the door and it doesn't look all that strong beside the 9700 Pro. I certainly wouldn't pay an extra ~$100 or so to get that kind of boost in performance, and I'm not sure too many people would.

Barring any spectacular boosts in performance in the production drivers, booooo nVidia.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: GTaudiophile

ATI has certainly raised the stakes with Radeon 9700 Pro, and if GeForceFX was coming into this brawl looking for a knockout, it didn't get one. Actually, what we saw is that the memory sub-system of GeForceFX hits a pretty hard wall when you combine a high resolution and bandwidth-hungry rendering features like FSAA and AF...

For nVidia, GeForceFX represents a return to at the very least performance parity with ATI. For ATI however, Radeon 9700 Pro looks strong versus GeForceFX, and these results show just how much performance ground nVidia had lost to ATI when Radeon 9700 Pro first shipped.
I guess this site is getting HAMMERED (as it's the only review up I can find) with some pages not loading.

But it looks like exactly what I expected . . . too bad . . . I was hoping for a 9700 price drop. :p

3DMark 2001SE Overall Results

What we generally found is that in the baseline tests, the GeForceFX's 54% engine clock advantage gave it a clear advantage on those tests that stressed the GPU more than the graphics memory. On the other hand, when we piled on both FSAA and AF, the GeForceFX's memory interface showed that it has the potential to bottleneck this GPU's many and powerful processing engines.
Baseline: GeForceFX ahead by 6%
With FSAA & AF: Radeon 9700 Pro ahead by 30%
Fill Rate Performance:

Baseline Single-Texture: Radeon 9700 Pro ahead by 15%
Single-Texture with FSAA & AF: Radeon 9700 Pro ahead by 33%

Baseline Multi-Texture: GeForceFX ahead by 39%
Multi-Texture with FSAA & AF: Radeon 9700 Pro ahead by 21%


The only place the NV30 excels is in geometry. But generally, if you like hi-res AND max AA and AF, the Radeon clearly shines.
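(As a quick sanity check on that "54% engine clock advantage" line, here's a two-line calculation using the commonly quoted launch core clocks; treat the 500 MHz / 325 MHz figures as assumptions, not numbers from the review itself:)

```python
# Rough sanity check of the review's "54% engine clock advantage" claim,
# using the commonly quoted launch core clocks (assumed figures).
geforcefx_core_mhz = 500       # GeForce FX 5800 Ultra
radeon_9700pro_core_mhz = 325  # Radeon 9700 Pro

advantage_pct = (geforcefx_core_mhz / radeon_9700pro_core_mhz - 1) * 100
print(f"GeForceFX core clock advantage: {advantage_pct:.0f}%")  # -> 54%
```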



Nvidia laid a BIG egg. :Q
 

OldSpooky

Senior member
Nov 28, 2002
356
0
0
WTH were those jackarses thinking by not using a 256-bit interface? 3Dlabs, ATI, and even the jabronis at Matrox figured it out. Hell, I think BitBoys had a 256-bit interface. WTF was Nvidia thinking?

Bus width means nothing by itself; how efficiently a GPU uses its bandwidth is an important factor as well. Obviously the GeForce FX wasn't efficient enough, though.
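To put rough numbers on that, here's a back-of-the-envelope sketch using the commonly quoted bus widths and memory clocks; the results are theoretical peak bandwidth, not measured throughput, and the clock figures are assumptions:

```python
# Peak theoretical memory bandwidth = bus width (bytes) x effective data rate.
# Bus widths and clocks below are the commonly quoted launch specs (assumed).

def peak_bandwidth_gb_s(bus_width_bits: int, effective_mhz: int) -> float:
    """Peak bandwidth in GB/s (1 GB = 10^9 bytes)."""
    return (bus_width_bits / 8) * (effective_mhz * 1e6) / 1e9

# GeForce FX 5800 Ultra: 128-bit bus, ~500 MHz DDR-II (1000 MHz effective)
fx = peak_bandwidth_gb_s(128, 1000)
# Radeon 9700 Pro: 256-bit bus, ~310 MHz DDR (620 MHz effective)
r9700 = peak_bandwidth_gb_s(256, 620)

print(f"GeForce FX 5800 Ultra: {fx:.1f} GB/s")     # ~16.0 GB/s
print(f"Radeon 9700 Pro:       {r9700:.1f} GB/s")  # ~19.8 GB/s
```

Even before efficiency tricks like color compression come into it, the narrower bus leaves the FX with less raw bandwidth despite its much faster memory clock.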
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Jumping the gun again folks...read the fine print:

We would normally have put the GeForceFX through our usual battery of tests at the two standard test resolutions, 1024x768x32 and 1600x1200x32. But this being the clash of the titans, we wanted to really sock it to 'em. So, with that in mind, and because of the severely limited time window in which we had to test, we tested at 1600x1200x32, and gathered baseline values without either FSAA or Anisotropic Filtering (AF) enabled. We then added 4X FSAA and 8X AF to both the GeForceFX and the Radeon 9700 Pro to see who it would hit harder.

How many of you actually run games at 1600x1200x32??? Maybe I should re-phrase and ask how many of you actually have monitors that support 1600x1200 w/out the use of a magnifying glass. ;) :p I'll wait till a thorough review is done by a reputable site.

Chiz

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
How many of you actually run games at 1600x1200x32???
Heaps of people do, including myself. My standard gaming resolution is 1600 x 1200 x 32 with 16x anisotropic, but I'll go to 1792 x 1344 if the game is fast enough.
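(For a rough sense of the pixel load at those settings, here's a simplified sketch; with multisampling the card doesn't shade every sample, but it does have to store and resolve them, so the sample counts are only an approximation of the extra work:)

```python
# Pixels per frame at common resolutions, and the sample count with 4x FSAA.
# Simplified: MSAA doesn't shade every sample, but it stores/resolves them.
resolutions = [(1024, 768), (1280, 1024), (1600, 1200), (1792, 1344)]

for w, h in resolutions:
    pixels = w * h
    samples_4x = pixels * 4
    print(f"{w}x{h}: {pixels / 1e6:.2f} Mpixels/frame, "
          f"{samples_4x / 1e6:.2f} Msamples/frame with 4x FSAA")
```

1600x1200 pushes roughly 2.4x the pixels of 1024x768 every frame, which is why the FSAA/AF results separate the two cards so sharply.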
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: chizow
Jumping the gun again folks...read the fine print:

We would normally have put the GeForceFX through our usual battery of tests at the two standard test resolutions, 1024x768x32 and 1600x1200x32. But this being the clash of the titans, we wanted to really sock it to 'em. So, with that in mind, and because of the severely limited time window in which we had to test, we tested at 1600x1200x32, and gathered baseline values without either FSAA or Anisotropic Filtering (AF) enabled. We then added 4X FSAA and 8X AF to both the GeForceFX and the Radeon 9700 Pro to see who it would hit harder.

How many of you actually run games at 1600x1200x32??? Maybe I should re-phrase and ask how many of you actually have monitors that support 1600x1200 w/out the use of a magnifying glass. ;) :p I'll wait till a thorough review is done by a reputable site.

Chiz
It must really bug you that the NV30 is such a big DISAPPOINTMENT - at least for those expecting a "Radeon Killer".


NHL 2002 gave the GeForceFX fits, and ultimately wouldn't run. The error we repeatedly got is one we've seen testing on ATI hardware as well, so we don't believe it to be specific to the GeForceFX. But, as a result of our inability to get a complete test run in, we dropped NHL 2002 from this round of testing for both cards.
Looks like their drivers might not be up to par either. :p

Oh, wait . . . we're waiting for a "reputable" review . . . :p

:rolleyes:
 

Snoop

Golden Member
Oct 11, 1999
1,424
0
76
Originally posted by: chizow
Jumping the gun again folks...read the fine print:

We would normally have put the GeForceFX through our usual battery of tests at the two standard test resolutions, 1024x768x32 and 1600x1200x32. But this being the clash of the titans, we wanted to really sock it to 'em. So, with that in mind, and because of the severely limited time window in which we had to test, we tested at 1600x1200x32, and gathered baseline values without either FSAA or Anisotropic Filtering (AF) enabled. We then added 4X FSAA and 8X AF to both the GeForceFX and the Radeon 9700 Pro to see who it would hit harder.

How many of you actually run games at 1600x1200x32??? Maybe I should re-phrase and ask how many of you actually have monitors that support 1600x1200 w/out the use of a magnifying glass. ;) :p I'll wait till a thorough review is done by a reputable site.

Chiz
I would tend to agree, Chiz, but it really is amazing that a company with the size and reputation of Nvidia cannot decisively beat an aged product 6 months after its release. Nvidia has never been in this position before; it will be interesting to see what they do with the market positioning/pricing of this card, as it surely is not dominant and will not command a premium price over the Radeons.
IMO, Nvidia has lost its edge (which I do not think they will get back soon).
I'm gonna be selling my stock ASAP. :(

 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: BFG10K
How many of you actually run games at 1600x1200x32???
Heaps of people do, including myself. My standard gaming resolution is 1600 x 1200 x 32 with 16x anisotropic, but I'll go to 1792 x 1344 if the game is fast enough.

What games do you play? Pong? There's not a single game I've purchased in the last year that runs acceptably at 1600x1200x32 with 16-tap AF and even 2x AA.

It must really bug you that the NV30 is such a big DISAPPOINTMENT - at least for those expecting a "Radeon Killer".

No, it doesn't bother me at all; I'll just keep buying the highest-performing compatible part available. And the rest of the fanboys will continue to buy two-generation-old value parts from their favorite team and then talk about the latest product as if they know something about it.
:rolleyes:


Originally posted by: Snoop
I would tend to agree, Chiz, but it really is amazing that a company with the size and reputation of Nvidia cannot decisively beat an aged product 6 months after its release. Nvidia has never been in this position before; it will be interesting to see what they do with the market positioning/pricing of this card, as it surely is not dominant and will not command a premium price over the Radeons.
IMO, Nvidia has lost its edge (which I do not think they will get back soon).
I'm gonna be selling my stock ASAP. :(


Snoop, don't forget, NV30's specs were finalized almost a year ago. Nvidia and TSMC had a rough time getting the .13 micron process to return high enough yields for NV30's production. Nvidia hasn't sat around waiting for TSMC to reach high enough yields to launch NV30; they've already planned for NV31, NV34 and NV35. I wouldn't sell that stock yet, as Nvidia has maintained its market share in the segments that count (sub-$200 and OEM/system integrator sales), as well as making substantial gains in the chipset market (Nforce2).

Chiz
 

Snoop

Golden Member
Oct 11, 1999
1,424
0
76
Snoop, don't forget, NV30's specs were finalized almost a year ago.
This is what bothers me: they had PLENTY of time to get this thing right, yet it can't beat a previous-generation product in the majority of cases. Why didn't Nvidia continue to tweak this design while getting the .13 micron process down? SOMEONE at Nvidia needs to explain why they could not come up with a memory sub-system that can match the available bandwidth of all the other major video card manufacturers.
 

ultimatebob

Lifer
Jul 1, 2001
25,134
2,450
126
Now, the real question is:

How well will this card play Doom 3?

Most existing games out on the market don't need half this much graphics processing power to perform decently.
 

Bovinicus

Diamond Member
Aug 8, 2001
3,145
0
0
Most existing games out on the market don't need half this much graphics processing power to perform decently.
Good point. The fact that these cards are able to play games at high framerates with all the IQ settings maxed out is a testament to that. The only problem is that there's no way to test for future games without early builds released for benchmarking purposes.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: apoppin

It must really bug you that the NV30 is such a big DISAPPOINTMENT - at least for those expecting a "Radeon Killer".

Posted by: chizow
No, it doesn't bother me at all; I'll just keep buying the highest-performing compatible part available. And the rest of the fanboys will continue to buy two-generation-old value parts from their favorite team and then talk about the latest product as if they know something about it.
:rolleyes:

Aren't you the guy who can't even get his 9700 Pro to coexist peacefully in his system without bitterly complaining in most threads here?

:p

:rolleyes:



:D