Supposed NV40 numbers (it's packin' some serious heat).

Flyermax2k3

Diamond Member
Mar 1, 2003
3,204
0
0
I'm not going to argue with you, you have your opinion, and I have mine. We'll have to wait until NV4X actually comes out to know who's right ;)
Until then :):beer:
 

chsh1ca

Golden Member
Feb 17, 2003
1,179
0
0
Originally posted by: Pete
If the benchmark xbit was using were CPU-limited, it'd show 40fps across the board for the 9800XT. It doesn't--the XT goes from 95fps @ 10x7 to 40fps @ 16x12 AA+AF. The main point of new cards is to increase framerate at taxing settings, and it don't get more taxin' than 16x12 AA+AF. And the fact that the XT scored 95fps shows that NV40 may indeed be scoring 80fps. Thus my puzzlement over the suggestion that Xbit's benchmarks are CPU-limited. CPU-limited would mean the same framerate from 10x7 to 16x12, and that doesn't seem to be the case there (or in any other graph that DW posted, AFAICT).
No, all I was pointing out was that the benchmarks shown by xbit tend to fall in line, numbers-wise, with what they are arguing in the thread. It would seem that botmatch (which I had expected to be largely unaffected by AA/AF or higher res) is NOT CPU limited at those settings.
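For illustration only, here is a minimal Python sketch of the scaling check Pete describes above: on a fixed system, if the framerate barely moves between a low and a high resolution the CPU is the likely bottleneck, and if it drops sharply the GPU is. The helper name and the 0.85 threshold are arbitrary choices, not anything from the thread; the 95/40 figures are the 9800XT numbers quoted above.

# Scaling check: flat framerate across resolutions -> CPU-bound,
# steep drop at high res with AA/AF -> GPU-bound.
def looks_cpu_limited(fps_low_res, fps_high_res, threshold=0.85):
    """Return True if the framerate barely changes between resolutions."""
    return fps_high_res / fps_low_res >= threshold

print(looks_cpu_limited(95.0, 40.0))  # False -> GPU-limited at 16x12 AA+AF
print(looks_cpu_limited(41.0, 40.0))  # True  -> flat scaling, CPU-limited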
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Flyermax2k3
I'm not going to argue with you, you have your opinion, and I have mine. We'll have to wait until NV4X actually comes out to know who's right ;)
Until then :):beer:

Sounds fair, let's wait. But you can't ignore the fact that the graphics card matters more than CPU speed at 1600x1200 with AA/AF, and that is not an opinion; it is a fact, unless you are running a 33MHz P1. Just listen to what chsh1ca has to say, he is right on the matter.
 

Zephyr106

Banned
Jul 2, 2003
1,309
0
0
Originally posted by: BenSkywalker
Actually, there are 9 pages to that thread that question the validity of the benchmarks. Specifically the botmatch scores being DOUBLE those of the R9800Pro/XT which are currently CPU bound.

If that thread were not about the NV40, and it was an honest person who thought the bench was CPU limited, the rabid loyalists would be pointing out that the 9800 is wiping the floor with the 5950 at those settings and that all the boards are showing a linear increase throughout when the vid cards are overclocked. Not saying anything about the validity, but you have to take into account the level of fanaticism on that forum and the fact that, through the first eight pages at least, no one was willing to do this right here.

Wait, what's that.... is that the R9800 pulling 100FPS in the very bench they are saying is CPU limited at less than 40? Why yes, yes it is. Bunch of half-wits flinging insults cuz they don't like what the numbers show, not because they have an active left-lobe brain cell working on it, for d@mn sure.

Edit-

Just made it to page ten of the thread- props to Pete for showing some reason by taking the minute to track down the results :D

Uhhhhhhh are those XBit benchmarks from a botmatch? I don't think so....

Zephyr
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
I agree with the comments about the CPU limited nature of those benchmarks, especially the UT2004 results. Unless the NV40 has found some magical way to do sound and AI, it's not going to be doubling current cards.

I'll believe these results when I see them at a proper review site.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I agree with the comments about the CPU limited nature of those benchmarks, especially the UT2004 results.

Check the link out, there is no way that bench is close to CPU limited at 16x12 w/AA+AF.

And you're calling THEM fanboys for attacking the validity of the benches?

I've made no comment at all regarding the validity; it is their comments and the insults flung at the guy, because they have deemed that a bench that isn't remotely close to CPU limited must be, since otherwise the R9800 gets throttled. There has been considerable evidence shown to them that this bench is quite vid-card limited at the settings they were using, yet instead they choose to call the guy an idiot and other such names because they are too ignorant to simply look up just how CPU intensive the bench is (which it really isn't). I would tend not to believe these are real numbers, but mainly due to the nature of any pre-release numbers that circulate around; however, their reasoning for why the numbers are wrong is easily disproven by anyone who puts even a slight amount of effort into checking on the bench.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
at 800x600 it's probably CPU limited :) anything higher, especially with all the settings cranked, will be pushing the current gen cards pretty hard.

UT2K3 is most definitely a very demanding engine.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
I don't see any system specs, or how the game and driver options were set.

Those numbers are worthless to me.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
NV40 officially launches April 13th at an NVIDIA release LAN party :)

Got the email in my nV newsletter.



You're Invited to the Launch of Our Next-Generation GPU and the First Ever GeForce LAN Party in San Francisco!

Are you thirsty for a new level of graphics technology? Tuesday, April 13th, NVIDIA will be hosting an unprecedented all-day LAN party in conjunction with the introduction of our next-generation GeForce GPU. Play Unreal Tournament 2004 and Battlefield Vietnam the way it's meant to be played, on custom NVIDIA gaming machines powered by our next-generation GeForce GPU! And, join NVIDIA executives and industry luminaries for the launch event that evening. Whether you want to play in the LAN party or just want to attend the NVIDIA launch event, register now for your chance to win cash and prizes!

Which means they have working release silicon available on non-WHQL-certified drivers :)
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Ackmed
I don't see any system specs, or how the game and driver options were set.

Those numbers are worthless to me.

Quotes from B3D forum from the guys who got their hands on the benches claiming a close family friend tested it:

"His system was, Athlon 64 3200+, 1GB Ram Pc-4000, HDD 160gb Sata, Gigabyte GA-K8VT800 (VIA KT880)."
"he has his cpu o/c to 2420mhz ( 11x220 )" - So around Athlon FX53 speed.
"Stuff I know. NV40 Ultra is clocked at 475/600, GDDR3, revision A2, will probably be an A3 at some point for power consumption, 16x1, very significant pixel shader improvements (guessing that for register-intensive FP32 shaders, we're looking at a 4-6x performance improvement versus NV35 clock-for-clock...), RGMS, adaptive AF (don't know if there will still be non-adaptive AF, we'll see), will ship with Forceware 60, review boards will be out end of March.

I'm still kind of curious about A3 and retail availability--I would imagine that it is coming to improve yields and reduce heat output (and the possible double molex connectors), but given the April 13 date, I'm forced to wonder about when it's going to be available.

(PS--it's 16x1. Period. There is no doubt.)"

Unreal Botmatch CPU or GPU limited?
Finally, proof once and for all that Unreal Botmatch is ONLY CPU LIMITED AT <= 800X600
and proof that Unreal Botmatch is ALSO GPU LIMITED at 1600x1024
HERE

Insights
2. When highest ("MAX") quality settings are used, the graphics card speed and memory does make a difference.

Take a look at the "1600x1024" graph where the G5/2.0MP with Radeon 9800 Pro "outruns" the same model with a Radeon 9600 Pro. Note also the G4/1.42MP Power Mac beats the G5/1.8MP Power Mac at 1600x1024, thanks to its Radeon 9700 Pro.

Chances of NV40 being double the speed of current solutions? If the 9800Pro is 38% faster than the same-generation 9600Pro at 1600x1024 -- Definitely possible.

End of Story.
 

chsh1ca

Golden Member
Feb 17, 2003
1,179
0
0
Originally posted by: RussianSensation
Quotes from B3D forum from the guys who got their hands on the benches claiming a close family friend tested it
You omitted the post where the guy says the UT2K4 tests were done on a dual Opteron system.

Unreal Botmatch CPU or GPU limited?
Finally, proof once and for all that Unreal Botmatch is ONLY CPU LIMITED AT <= 800X600
and proof that Unreal Botmatch is ALSO GPU LIMITED at 1600x1024
HERE
I'm going to just let the absurdity of those 'benchmarks' (if you can even call them that, given that they compare entirely different system combinations) speak for themselves.

Insights
2. When highest ("MAX") quality settings are used, the graphics card speed and memory does make a difference.
Is that even an insight?

Take a look at the "1600x1024" graph where the G5/2.0MP with Radeon 9800 Pro "outruns" the same model with a Radeon 9600 Pro. Note also the G4/1.42MP Power Mac beats the G5/1.8MP Power Mac at 1600x1024, thanks to its Radeon 9700 Pro.
Uhh yeah, because the "benchmarks" are completely inane and useless. I will say one thing: they exemplify why you shouldn't game on a Mac, comparing the top-end G5 system to the AFX system xbit tested.

Chances of NV40 being double the speed of current solutions? If the 9800Pro is 38% faster than the same-generation 9600Pro at 1600x1024 -- Definitely possible.
If the mid-range card can perform at about 2/3 of the high-end card, then it is possible that an entirely separate card and architecture from another company could be double the performance? Judging by that logic, the RV420 will be up to triple the performance of the FX5950, since the 5950 is about 50% or more faster than the FX5700.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Is anyone else surprised at how bitterly this ostensibly fun discovery has been contested? I mean, if you don't believe the numbers, fine, but there are way too many seemingly angry posts here and at B3D.

My take is some numbers are believable, and some not, but I take the optimistic view that the game numbers appear plausible, Uttar and Baron aren't likely to try to trick us, and Vegetto doesn't seem to be acting *too* suspiciously. Others may choose to take the pessimistic view that the fillrate and 3DM2K1 numbers are absurd, Vegetto's changing systems and no longer having the card is questionable, and the numbers seem too convenient.

In the end we're left with nothing but a bunch of pretty screenshots, and even they're under dispute. I have to say that even if they're just 4xOGSS with a blur filter, they look damn good and are something next-gen cards should strive for.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Is anyone else surprised at how bitterly this ostensibly fun discovery has been contested?

I'm not surprised myself. It seems like everyone is so polarized about everything anymore (just read P&N a little): "You're either for us, or against us." The numbers seem plausible to me, or at least entertaining while we sit and wait for the goods :)
 

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
Originally posted by: Pete
Is anyone else surprised at how bitterly this ostensibly fun discovery has been contested? I mean, if you don't believe the numbers, fine, but there are way too many seemingly angry posts here and at B3D.

The B3D forums, while home to some very informed and informative chaps, are an anti-nVidia altar these days. While some may justify their ATI slant with in-depth technical details, most are so extremely anti-nVidia due to "questionable marketing/business practices" that they disregard anything that isn't ATI. It's annoying to have to read through 17 pages of posts to get about 5 posts' worth of decent analysis.

The B3D site itself, however, is still by far the most technically astute of all sites wrt 3D technology and reviews.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
And I thought AT was ATi biased. Wow.

They don't even take into account that not only is he using an OCed A-64, we are dealing with a 16-pipe card, clocked HIGHER than the NV35, with much faster memory.

They then go on to claim that the NV40's AA+AF scores in botmatch are impossible. On a card with much higher fillrate, on an A-64 @ 2400MHz. Then they go on to talk about SSAA and MSAA, NEITHER of which the NV40 uses (they use a new "superior" AA implementation). Even if the guy is lying, his numbers are very possible.

Edit: they also claim the R9800 is CPU limited at 1600x1200 4xAA 8xAniso in UT2004 botmatch. LMAO.
 

Flyermax2k3

Diamond Member
Mar 1, 2003
3,204
0
0
Originally posted by: RussianSensation
Originally posted by: Ackmed
I don't see any system specs, or how the game and driver options were set.

Those numbers are worthless to me.

Quotes from B3D forum from the guys who got their hands on the benches claiming a close family friend tested it:

"His system was, Athlon 64 3200+, 1GB Ram Pc-4000, HDD 160gb Sata, Gigabyte GA-K8VT800 (VIA KT880)."
"he has his cpu o/c to 2420mhz ( 11x220 )" - So around Athlon FX53 speed.
"Stuff I know. NV40 Ultra is clocked at 475/600, GDDR3, revision A2, will probably be an A3 at some point for power consumption, 16x1, very significant pixel shader improvements (guessing that for register-intensive FP32 shaders, we're looking at a 4-6x performance improvement versus NV35 clock-for-clock...), RGMS, adaptive AF (don't know if there will still be non-adaptive AF, we'll see), will ship with Forceware 60, review boards will be out end of March.

I'm still kind of curious about A3 and retail availability--I would imagine that it is coming to improve yields and reduce heat output (and the possible double molex connectors), but given the April 13 date, I'm forced to wonder about when it's going to be available.

(PS--it's 16x1. Period. There is no doubt.)"

Unreal Botmatch CPU or GPU limited?
Finally, proof once and for all that Unreal Botmatch is ONLY CPU LIMITED AT <= 800X600
and proof that Unreal Botmatch is ALSO GPU LIMITED at 1600x1024
HERE

Insights
2. When highest ("MAX") quality settings are used, the graphics card speed and memory does make a difference.

Take a look at the "1600x1024" graph where the G5/2.0MP with Radeon 9800 Pro "outruns" the same model with a Radeon 9600 Pro. Note also the G4/1.42MP Power Mac beats the G5/1.8MP Power Mac at 1600x1024, thanks to its Radeon 9700 Pro.

Chances of NV40 being double the speed of current solutions? If the 9800Pro is 38% faster than the same-generation 9600Pro at 1600x1024 -- Definitely possible.

End of Story.

Thanks for the laughs :) If that's your idea of proving a point, I'd hate to see you argue with lesser resources ;)
I don't mean this as a slight towards you, but your methods leave much to be desired.
The NV40's not out yet and we have no official specs. Until we get either of those things, *everything* related to NV40 is pure speculation. Some speculation may end up being correct, but Nvidia's track record of *NEVER* delivering on a single performance promise they've made doesn't make me any more likely to believe any of the rumors floating around about NV40. It started with the original TNT and has held up through NV3x; I don't see any reason to start believing Nvidia's marketing department now.
 

chsh1ca

Golden Member
Feb 17, 2003
1,179
0
0
Originally posted by: Acanthus
And I thought AT was ATi biased. Wow.
Hahaha. AT is one of the saner places around IMO. :)

They don't even take into account that not only is he using an OCed A-64, we are dealing with a 16-pipe card, clocked HIGHER than the NV35, with much faster memory.
Actually, no, it wasn't; the game tests were run on a dual Opteron box, as mentioned in this quote (Page 5, The Baron):
Okay, I got some clarification on the CPU speed/botmatch controversy. The system specs were not accurate for Painkiller and UT2004--for these, a dual Opteron machine was used (workstation, probably). The rest of the scores were done on the more reasonable machine posted before. Remember, the card was only in the tester's possession for two days, so I imagine he took it between home and work.

They then go on to claim that the NV40's AA+AF scores in botmatch are impossible. On a card with much higher fillrate, on an A-64 @ 2400MHz. Then they go on to talk about SSAA and MSAA, NEITHER of which the NV40 uses (they use a new "superior" AA implementation). Even if the guy is lying, his numbers are very possible.
With AA/AF turned off, botmatch appears to scale with the CPU more than the GPU, though it is definitely capable of being limited by either at any given time.

In the end, we're going to have to wait until early April before we see some benches. If they are launching with a huge lanparty on the 13th, cards should be sent out to reviewers about a week before, right?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I wasn't trying to prove anything about the NV40. It was simple speculation. Clearly the benchmarks show GPU matters for botmatch whether you want to believe it or not. It doesn't matter if Xbit ran a different system or not. I was only trying to show the GPU limitation imposed at high quality settings and disprove the belief that Botmatch is only CPU limited.

Since high-end graphics cards pound mainstream cards by significant margins, why is it a bad assumption to think that a new generation will be 2x faster than the current high generation? Radeon 9700Pro was at least 2x faster than 8500 and Geforce 4 4600 2x faster than geforce 3 with high quality settings. You don't have to agree with my speculation. The only time Nvidia did not live up to the hype was with the FX series. Remember, ATI wasn't even on the map until the 9700Pro, so did Nvidia present a better product for 10 years prior to the 9700Pro? Yes, so does it matter if they did not deliver on their promises ONCE when they were #1 for so long? And you are saying Nvidia never delivers, especially considering the performance difference between the 5950U and 9800XT is slim in non-DX9 games, where both cards cannot attain playable framerates anyway?
 

Flyermax2k3

Diamond Member
Mar 1, 2003
3,204
0
0
Originally posted by: RussianSensation
I wasn't trying to prove anything. It was speculation. Clearly the benchmarks show GPU matters for botmatch whether you want to believe it or not. It doesn't matter if Xbit ran a different system or not. I was only trying to show the GPU limitation imposed at high quality settings and disprove the belief that Botmatch is only CPU limited, period.

Since high-end graphics cards pound mainstream cards by significant margins, why is it a bad assumption to think that a new generation will be 2x faster than the current high generation? Radeon 9700Pro was at least 2x faster than 8500 and Geforce 4 4600 2x faster than geforce 3 with high quality settings. You don't have to agree with my speculation.

Did you actually read the link you posted? You can speculate all you like, that's your right. My only quarrel is with the link you provided.
Those are just about the most worthless benchmarks I've ever seen. Pairing different video cards with different systems and putting all the numbers in a graph does *not* a valid benchmark make.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Sure the benchmarks are valid. They show the difference in performance when changing various CPUs and video cards.

For one, at 1600x1200 G5/2.0 9800 beats G5/2.0 9600 => by keeping the cpu constant it shows GPU dependency at high quality settings => GPU-limited .......

Secondly, at 1600x1200 G4/1.4 9700 beats G5/1.8 5200 => by changing the CPU to a slower model it helps to show that the GPU matters more at 1600x1200 than the CPU does => again proving a FAST processor matters less at high res => highlighting GPU + CPU limitation together, especially when compared to the G5/2.0 with 9600, which beats the G4/1.4 9700, but only barely.

Thirdly, at 800x600, the scores between G5/2.0 9800 and G5/2.0 9600 are almost identical => Why? Shows CPU dependency at lower resolution => CPU-limited

Now you are saying these scores do not make sense? Well, I just made sense of them for you.
 

Flyermax2k3

Diamond Member
Mar 1, 2003
3,204
0
0
Originally posted by: RussianSensation
Sure the benchmarks are valid. They show the difference in performance when changing various CPUs and video cards.

For one, at 1600x1200 G5/2.0 9800 beats G5/2.0 9600 => by keeping the cpu constant it shows GPU dependency at high quality settings => GPU-limited .......

Secondly, at 1600x1200 G4/1.4 9700 beats G5/1.8 5200 => by changing the CPU to a slower model it helps to show that the GPU matters more at 1600x1200 than the CPU does => again proving a FAST processor matters less at high res => highlighting GPU + CPU limitation together, especially when compared to the G5/2.0 with 9600, which beats the G4/1.4 9700, but only barely.

Thirdly, at 800x600, the scores between G5/2.0 9800 and G5/2.0 9600 are almost identical => Why? Shows CPU dependency at lower resolution.

Now you are saying these scores do not make sense? Well, I just made sense of them for you.

LOL, if you think those benchmarks are valid then this discussion is over. There's no point arguing with a fool.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Alright bud, maybe you can contact the reviewer and tell him that the numbers he got when he benchmarked the systems are all wrong... You do not make any sense by saying the results are not valid. That would only be true if they were made up out of his ass.
 

chsh1ca

Golden Member
Feb 17, 2003
1,179
0
0
Originally posted by: RussianSensation
I wasn't trying to prove anything. It was speculation. Clearly the benchmarks show GPU matters for botmatch whether you want to believe it or not. It doesn't matter if Xbit ran a different system or not. I was only trying to show the GPU limitation imposed at high quality settings and disprove the belief that Botmatch is only CPU limited, period.
Except the fatal flaw in your argument is the difference in renderers used. Apple boxes HAVE to run the OpenGL renderer, which is slower than its D3D counterpart in UT2K4. Sure, on a Mac it is GPU limited. Try to make an Apples to Apples comparison. By the way, statements like "Clearly the benchmarks show GPU matters for botmatch whether you want to believe it or not" are not actually speculation.
The xbit benchmark -- which is actually relevant to the discussion given that it was run on the same architecture, hardware, and software -- already shows that Botmatch isn't CPU limited when you crank up the res and turn on AA/AF. Those Apple benchmarks, which are run without AA/AF on, only serve to show that the OpenGL renderer in UT2K4 is far slower than the D3D renderer is.

Since high-end graphics cards pound mainstream cards by significant margins, why is it a bad assumption to think that a new generation will be 2x faster than the current high generation?
Because that is all it is, an assumption. Why not wait and see instead of assuming what is to come? I hope that the next gen cards really do mop the floor with the current high-end cards, but I want to wait and see before I assume they will.

Radeon 9700Pro was at least 2x faster than 8500 and Geforce 4 4600 2x faster than geforce 3 with high quality settings. You don't have to agree with my speculation.
In my experience, speculation rarely sounds like an argument that one thing IS better than another. ;)

Alright bud, maybe you can contact the reviewer and tell him that the numbers he got when he benchmarked the systems are all wrong... You do not make any sense by saying the results are not valid. That would only be true if they were made up out of his ass.
The only two lines of that 'benchmark' that are valid are the ones showing the R9800Pro vs the R9600XT, both running in a G5/2.0MP system. The numbers themselves may be right, but they have little to do with one another. He makes perfect sense by saying the results are not valid. There are two things you would want to test for a graphics card: card-to-card performance, and system scaling. In one, you take several graphics cards and benchmark them in the same setup, changing only the graphics card; in the other, you take one graphics card and benchmark it in the same setup, changing only the processor.
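As a minimal Python sketch of the two test matrices described above (the helper names are hypothetical and the benchmark stub returns no real results): hold the system constant and swap cards to measure card-to-card performance, then hold the card constant and swap CPUs to measure system scaling.

def run_benchmark(cpu, gpu):
    """Placeholder: run the actual game benchmark and return average fps."""
    raise NotImplementedError

def gpu_sweep(fixed_cpu, gpus):
    # Card-to-card comparison: one system, many cards.
    return {gpu: run_benchmark(fixed_cpu, gpu) for gpu in gpus}

def cpu_sweep(fixed_gpu, cpus):
    # System scaling: one card, many CPUs.
    return {cpu: run_benchmark(cpu, fixed_gpu) for cpu in cpus}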
 

lordtyranus2

Banned
Oct 3, 2003
300
0
0
Judging by this benchmark

http://www.xbitlabs.com/articles/video/display/ati-nvidia-roundup_13.html

We see that an overclock of the 9800 Pro from the standard 380/340 to a 420/360 setup yields a 3 FPS increase in speed, or a 9% speed increase.

So clearly, even at those settings, overclocking the GPU does increase performance, so it is conceivable that putting in an even faster GPU could yield even better numbers. In other words, the game is clearly not CPU limited at those settings.
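A back-of-the-envelope check of those overclocking numbers, assuming only the quoted figures (the ~33 fps baseline is inferred from the "3 FPS / 9%" claim, not taken from the article): a roughly 10.5% core overclock producing a roughly 9% fps gain is close to linear scaling with GPU clock, which is what you would expect when the GPU, not the CPU, is the bottleneck.

core_gain = (420 - 380) / 380   # ~0.105 -> 10.5% core overclock
mem_gain  = (360 - 340) / 340   # ~0.059 -> 5.9% memory overclock
fps_gain  = 0.09                # 9% fps increase quoted above
baseline  = 3 / fps_gain        # ~33 fps implied baseline
print(f"core +{core_gain:.1%}, mem +{mem_gain:.1%}, fps +{fps_gain:.0%}")
print(f"implied baseline ~{baseline:.0f} fps")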

But as for the 72 FPS figure, that seems to be complete BS. If nothing else, I can NOT believe for a second that this card is 100% better than the current best on the market. 30%? Great. 50-60%? Stretching it but conceivable. But 100%? No way.


In any case, this card ships in April, I hear, so we will know for sure then.