GeForce FX Benchmarks


GTaudiophile

Lifer
Oct 24, 2000
29,767
33
81
My guess is ze Germans are as competent as anyone to test these cards. Isn't Tom in Germany? Why doesn't he beat everyone else to the punch by posting his review early?
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: UlricT

you better wait... don't know how much trust we can put in "ze germans," huh? :p


I don't doubt their results; it's just that they're a bit on the "skimpy" side. We're used to benches at 3 or 4 different resolutions with 5 or 6 cards in comparison, which lets you decide what the best solution would be for each individual user. Most people will "Ooooohhh" and "Aaaaahhh" at 1600x1200x32 results but in reality will never use them. The more practical range of 1024 to 1280 will be the area of emphasis for most users.

Chiz
 

bgeh

Platinum Member
Nov 16, 2001
2,946
0
0
Originally posted by: GTaudiophile
Originally posted by: bgeh
if this is true, nvidia is in big trouble
it still loses to ATI in the FSAA department

I think most people thought ATi would win with the 256-bit pipe.

True, but I expected more from DDR2 at 1 GHz
:)
 

GTaudiophile

Lifer
Oct 24, 2000
29,767
33
81
Originally posted by: chizow
Originally posted by: UlricT

you better wait... don't know how much trust we can put in "ze germans," huh? :p


I don't doubt their results; it's just that they're a bit on the "skimpy" side. We're used to benches at 3 or 4 different resolutions with 5 or 6 cards in comparison, which lets you decide what the best solution would be for each individual user. Most people will "Ooooohhh" and "Aaaaahhh" at 1600x1200x32 results but in reality will never use them. The more practical range of 1024 to 1280 will be the area of emphasis for most users.

Chiz

Well, maybe they just wanted to be first out of the gates, generate some hits for their site. Doing more benches would have postponed the article some.
 

UlricT

Golden Member
Jul 21, 2002
1,966
0
0
Hey... have you guys noticed that they use DDR266? Could this be holding back performance any? Just wondering...
 

GTaudiophile

Lifer
Oct 24, 2000
29,767
33
81
Ratchet over at Rage3D has this to say:

Yeah, for sure. nVidia spin doctors will be working overtime to make sure the NV30 looks good no matter how bad it really is.

I just hope that Anand in particular mentions the R350 as many times in his NV30 review as he mentioned the NV30 in his R300 review... I'll have lost all faith in him if not... though I wouldn't be one bit surprised if he talks more about the NV35...


 

HendrixFan

Diamond Member
Oct 18, 2001
4,646
0
71
Originally posted by: bgeh
Originally posted by: GTaudiophile
Originally posted by: bgeh
if this is true, nvidia is in big trouble
it still loses to ATI in the FSAA department

I think most people thought ATi would win with the 256-bit pipe.

True, but I expected more from DDR2 at 1 GHz
:)

Remember that 128-bit DDR2 at 1 GHz is equal in bandwidth to 256-bit DDR at 500 MHz, and the ATI runs at an effective 620 MHz, or the equivalent of DDR2 at 1.24 GHz. That is why ATI has the memory bandwidth lead (and a lead in AA). The fill rate of the GF FX blows the 9700 away, though.
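The arithmetic above is easy to sanity-check: peak bandwidth is just bus width (in bytes) times effective clock. A quick sketch, assuming the bus widths implied by the post (128-bit for the GF FX, 256-bit for the 9700 Pro):

```python
def bandwidth_gb_s(bus_bits, effective_mhz):
    """Peak memory bandwidth in GB/s: bus width in bytes x effective clock."""
    return (bus_bits / 8) * effective_mhz / 1000

gffx = bandwidth_gb_s(128, 1000)   # GF FX: 128-bit DDR2 at 1 GHz effective -> 16.0 GB/s
r9700 = bandwidth_gb_s(256, 620)   # 9700 Pro: 256-bit DDR at 620 MHz effective -> 19.84 GB/s

# The 9700's bus is 1.24x wider in effective terms, i.e. "DDR2 at 1.24 GHz"
print(gffx, r9700, r9700 / gffx)
```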
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: GTaudiophile
Originally posted by: chizow
Originally posted by: UlricT

you better wait... don't know how much trust we can put in "ze germans," huh? :p


I don't doubt their results; it's just that they're a bit on the "skimpy" side. We're used to benches at 3 or 4 different resolutions with 5 or 6 cards in comparison, which lets you decide what the best solution would be for each individual user. Most people will "Ooooohhh" and "Aaaaahhh" at 1600x1200x32 results but in reality will never use them. The more practical range of 1024 to 1280 will be the area of emphasis for most users.

Chiz

Well, maybe they just wanted to be first out of the gates, generate some hits for their site. Doing more benches would have postponed the article some.

Not really; it just shows a lack of thorough testing, unless they got their card later than the more reputable sites. Reviewers have had these cards in their hands for weeks, as evidenced by their comments over the last few weeks.

Chiz
 

ed21x

Diamond Member
Oct 12, 2001
5,411
8
81
You guys, the GeForce3 (original) came out several months before the Radeon 8500, and in the end ATi prospered accordingly. All it takes is a slight advantage (as with the R8500) to completely outlast the competition. It's all a PR thing...
 

BentValve

Diamond Member
Dec 26, 2001
4,190
0
0
Fvck Nvidia, they deserve to go down since they tried to play it off. And I even gave them the benefit of the doubt. We are not friends here; this is business... and they did not deliver the goods.

What is really sad is that they made it worse for themselves by the way they have handled the situation; their marketing/PR people should all be fired for this.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
nVidia still have time to optimise drivers, which should boost the performance of the GF FX some and give it a lead over the 9700. But by the time they have, the R350 will probably be out and ATi will in all likelihood be back at the level of the GF FX. It's been swings and roundabouts: drivers from nVidia, an updated card from ATi.
 

HendrixFan

Diamond Member
Oct 18, 2001
4,646
0
71
The marketing/PR people aren't to blame. Their job was to sell a product that exists in a 6-month product cycle and was released 5 months late. It's not their fault it didn't get released, but they had to keep people interested nevertheless.
 

UlricT

Golden Member
Jul 21, 2002
1,966
0
0
heh... look at what tecchannel.de has up on their main page now....

Starting Monday, 9:00: the first GeForce FX in review
We are currently working on a review of the first GeForce FX graphics card. NVIDIA's new generation of graphics chip received plenty of advance praise in the run-up. Unheard-of performance and innovative technology were supposed to set it apart. But the reality could look different.


Looks like nvidia invoked the NDA on them. Desperation makes them strike fast... though I do not know how keeping it away from the public for a day is going to help at all...
 

GTaudiophile

Lifer
Oct 24, 2000
29,767
33
81
Originally posted by: Lonyo
nVidia still have time to optimise drivers, which should boost the performance of the GF FX some and give it a lead over the 9700. But by the time they have, the R350 will probably be out and ATi will in all likelihood be back at the level of the GF FX. It's been swings and roundabouts: drivers from nVidia, an updated card from ATi.

I dunno. Turn FSAA on, and I just don't see the FX winning every bench...optimized drivers or not.
 

pillage2001

Lifer
Sep 18, 2000
14,038
1
81
LOL, I was shot down over at the video section when some members told me that AA and AF are not important as long as they game at high res............
 

First

Lifer
Jun 3, 2002
10,518
271
136
Originally posted by: chizow
Originally posted by: GTaudiophile
Originally posted by: chizow
Originally posted by: UlricT

you better wait... don't know how much trust we can put in "ze germans," huh? :p


I don't doubt their results; it's just that they're a bit on the "skimpy" side. We're used to benches at 3 or 4 different resolutions with 5 or 6 cards in comparison, which lets you decide what the best solution would be for each individual user. Most people will "Ooooohhh" and "Aaaaahhh" at 1600x1200x32 results but in reality will never use them. The more practical range of 1024 to 1280 will be the area of emphasis for most users.

Chiz

Well, maybe they just wanted to be first out of the gates, generate some hits for their site. Doing more benches would have postponed the article some.

Not really; it just shows a lack of thorough testing, unless they got their card later than the more reputable sites. Reviewers have had these cards in their hands for weeks, as evidenced by their comments over the last few weeks.

Chiz

Actually no, Anand got his GeForceFX a couple days ago.

Btw, AT review will be up Monday. :)
 

GTaudiophile

Lifer
Oct 24, 2000
29,767
33
81
Originally posted by: UlricT
heh... look at what tecchannel.de has up on their main page now....

Starting Monday, 9:00: the first GeForce FX in review
We are currently working on a review of the first GeForce FX graphics card. NVIDIA's new generation of graphics chip received plenty of advance praise in the run-up. Unheard-of performance and innovative technology were supposed to set it apart. But the reality could look different.


Looks like nvidia invoked the NDA on them. Desperation makes them strike fast... though I do not know how keeping it away from the public for a day is going to help at all...

That's why I posted the benches...see the third post in this thread.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: GTaudiophile
Originally posted by: Lonyo
nVidia still have time to optimise drivers, which should boost the performance of the GF FX some and give it a lead over the 9700. But by the time they have, the R350 will probably be out and ATi will in all likelihood be back at the level of the GF FX. It's been swings and roundabouts: drivers from nVidia, an updated card from ATi.

I dunno. Turn FSAA on, and I just don't see the FX winning every bench...optimized drivers or not.

I never said they would win every bench, just get a lead of sorts. You don't have to be faster on every count to be ahead.
 

GTaudiophile

Lifer
Oct 24, 2000
29,767
33
81
Originally posted by: Evan Lieb
Originally posted by: chizow
Originally posted by: GTaudiophile
Originally posted by: chizow
Originally posted by: UlricT

you better wait... don't know how much trust we can put in "ze germans," huh? :p


I don't doubt their results; it's just that they're a bit on the "skimpy" side. We're used to benches at 3 or 4 different resolutions with 5 or 6 cards in comparison, which lets you decide what the best solution would be for each individual user. Most people will "Ooooohhh" and "Aaaaahhh" at 1600x1200x32 results but in reality will never use them. The more practical range of 1024 to 1280 will be the area of emphasis for most users.

Chiz

Well, maybe they just wanted to be first out of the gates, generate some hits for their site. Doing more benches would have postponed the article some.

Not really; it just shows a lack of thorough testing, unless they got their card later than the more reputable sites. Reviewers have had these cards in their hands for weeks, as evidenced by their comments over the last few weeks.

Chiz

Actually no, Anand got his GeForceFX a couple days ago.

Btw, AT review will be up Monday. :)

Any comments, Evan? Are the Germans close?

 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Evan Lieb


Actually no, Anand got his GeForceFX a couple days ago.

Btw, AT review will be up Monday. :)

What time?!?!?!? :D 12:01am EST??? Anand is on the EC, so.......... Sneak peek for subscribers??? ;) A couple days is plenty of time for Anand to crunch out a few hundred benches :D

Chiz
 

BentValve

Diamond Member
Dec 26, 2001
4,190
0
0
Originally posted by: chizow
Originally posted by: Evan Lieb


Actually no, Anand got his GeForceFX a couple days ago.

Btw, AT review will be up Monday. :)

What time?!?!?!? :D 12:01am EST??? Anand is on the EC, so.......... Sneak peek for subscribers??? ;) A couple days is plenty of time for Anand to crunch out a few hundred benches :D

Chiz



Yep, 9am Silicon Valley time is when things usually go down.
 

RaynorWolfcastle

Diamond Member
Feb 8, 2001
8,968
16
81
Originally posted by: pillage2001
LOL, I was shot down over at the video section when some members told me that AA and AF are not important as long as they game at high res............

Realistically, few people run games at 1600x1200 or more, as few people have screens that can display that resolution at a decent refresh rate. I think the emphasis should be on running at 1024x768 with FSAA and AF enabled.

The problem for nVidia will be that the people who buy a card like this will be looking to turn on all the goodies such as AA and AF at high res, and they just cannot match ATI in this department. I mean, realistically, who the hell can tell the difference between 200fps and 250fps without AA & AF? No one. nVidia will be winning most of the benchmarks that hold little relevance; ATI will be near nVidia, and perhaps ahead, in the benchmarks where it will matter to gamers (i.e. high res with AA & AF enabled), which also happen to be the ones in which current GPUs struggle to maintain high FPS numbers.

On a side note, this is a bad start for Cg. Seeing results like these may cause programmers to just wait for the MS version of Cg to come out. Why would they use an nVidia-centric programming language if ATI is neck and neck with nVidia and a standard language is just around the corner?