Flyermax2k3
Diamond Member
- Mar 1, 2003
I'm not going to argue with you, you have your opinion, and I have mine. We'll have to wait until NV4X actually comes out to know who's right 
Until then
:beer:
No, all I was pointing out was that the benchmarks shown by Xbit tend to fall in line numbers-wise with what they are arguing in the thread. It would seem to me that botmatch (which should largely be unaffected by AA/AF or higher res) is NOT CPU limited.

Originally posted by: Pete
If the benchmark xbit was using were CPU-limited, it'd show 40fps across the board for the 9800XT. It doesn't--the XT goes from 95fps @ 10x7 to 40fps @ 16x12 AA+AF. The main point of new cards is to increase framerate at taxing settings, and it don't get more taxin' than 16x12 AA+AF. And the fact that the XT scored 95fps shows that NV40 may indeed be scoring 80fps. Thus my puzzlement over the suggestion that Xbit's benchmarks are CPU-limited. CPU-limited would mean the same framerate from 10x7 to 16x12, and that doesn't seem to be the case there (or in any other graph that DW posted, AFAICT).
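Pete's test is mechanical enough to write down: a CPU-limited run shows roughly flat fps as GPU settings rise, while a GPU-limited run falls off steeply. A minimal sketch of that rule of thumb; the helper name, the 10% flatness tolerance, and the 60 fps midpoint are illustrative assumptions, while 95 and 40 fps are the 9800XT figures from the post:

```python
# Sketch of the CPU- vs GPU-limited test described above. The function
# name, the 10% tolerance, and the 60 fps midpoint are assumptions for
# illustration; 95 and 40 fps come from the post.
def bottleneck(fps_by_setting, flat_tolerance=0.10):
    """Classify a run as CPU- or GPU-limited from fps values ordered
    from the lightest GPU setting (10x7) to the heaviest (16x12 AA+AF)."""
    lo, hi = min(fps_by_setting), max(fps_by_setting)
    # Flat curve: GPU settings barely matter, so the CPU is the ceiling.
    if (hi - lo) / hi <= flat_tolerance:
        return "CPU-limited"
    return "GPU-limited"

# 9800XT in the Xbit bench: 95 fps at 10x7 dropping to 40 fps at 16x12 AA+AF.
print(bottleneck([95, 60, 40]))   # -> GPU-limited
# A truly CPU-limited card would look like this instead:
print(bottleneck([41, 40, 40]))   # -> CPU-limited
```

By this check, the 9800XT's 95-to-40 drop is the opposite of CPU-limited behavior, which is exactly Pete's point.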
Originally posted by: Flyermax2k3
I'm not going to argue with you, you have your opinion, and I have mine. We'll have to wait until NV4X actually comes out to know who's right
Until then :beer:
Originally posted by: BenSkywalker
Actually, there are 9 pages to that thread that question the validity of the benchmarks. Specifically the botmatch scores being DOUBLE those of the R9800Pro/XT which are currently CPU bound.
If that thread were not about the NV40, and it was an honest person who thought the bench was CPU limited, the rabid loyalists would be pointing out that the 9800 is wiping the floor with the 5950 at those settings, and that all the boards are showing a linear increase throughout when the video cards are overclocked. Not saying anything about the validity, but you have to take into context the level of fanaticism on that forum and the fact that through the first eight pages at least no one was willing to do this right here.
Wait, what's that... is that the R9800 pulling 100FPS in the very bench they are saying is CPU limited at less than 40? Why yes, yes it is. Bunch of half-wits flinging insults cuz they don't like what the numbers show, not because they have an active left lobe brain cell working on it for d@mn sure.
Edit-
Just made it to page ten of the thread- props to Pete for showing some reason by taking the minute to track down the results!
I agree with the comments about the CPU limited nature of those benchmarks, especially the UT2004 results.
And you're calling THEM fanboys for attacking the validity of the benches?
Originally posted by: Ackmed
I don't see any system specs, or how the game and driver options were set.
Those numbers are worthless to me.
You omitted the post where the guy says the UT2K4 tests were done on a dual Opteron system.

Originally posted by: RussianSensation
Quotes from B3D forum from the guys who got their hands on the benches claiming a close family friend tested it
I'm going to just let the absurdity of those 'benchmarks' (if you can even call them that, given that they compare entirely different system combinations) speak for themselves.

Unreal Botmatch CPU or GPU limited?
Finally, proof once and for all that Unreal Botmatch is ONLY CPU LIMITED AT <= 800x600,
and proof that Unreal Botmatch is ALSO GPU LIMITED at 1600x1024:
HERE
Is that even an insight?

Insights
2. When highest ("MAX") quality settings are used, the graphics card speed and memory does make a difference.
Uhh yeah, because the "benchmarks" are completely inane and useless. I will say one thing, they exemplify why you shouldn't game on a Mac, comparing the top end G5 system to the AFX system Xbit tested.

Take a look at the "1600x1024" graph where the G5/2.0MP with Radeon 9800 Pro "outruns" the same model with a Radeon 9600 Pro. Note also the G4/1.42MP Power Mac beats the G5/1.8MP Power Mac at 1600x1024, thanks to its Radeon 9700 Pro.
If the mid-range card can perform at about 2/3 of the high-end card, is it possible that an entirely separate card and architecture from another company could be double the performance? Judging by that, the RV420 will be up to triple the performance of the FX5950, since the 5950 is about 50% or more faster than the FX5700.

Chances of NV40 being double the speed of current solutions? If 9800Pro is 38% faster than same generation 9600Pro at 1600x1024 -- Definitely possible.
Originally posted by: Pete
Is anyone else surprised at how bitter this ostensibly fun discovery has been contested? I mean, if you don't believe the numbers, fine, but there are way too many seemingly angry posts here and at B3D.
Originally posted by: RussianSensation
Originally posted by: Ackmed
I don't see any system specs, or how the game and driver options were set.
Those numbers are worthless to me.
Quotes from B3D forum from the guys who got their hands on the benches claiming a close family friend tested it:
"His system was, Athlon 64 3200+, 1GB Ram Pc-4000, HDD 160gb Sata, Gigabyte GA-K8VT800 (VIA KT880)."
"he has his cpu o/c to 2420mhz ( 11x220 )" - So around Athlon FX53 speed.
"Stuff I know. NV40 Ultra is clocked at 475/600, GDDR3, revision A2, will probably be an A3 at some point for power consumption, 16x1, very significant pixel shader improvements (guessing that for register-intensive FP32 shaders, we're looking at a 4-6x performance improvement versus NV35 clock-for-clock...), RGMS, adaptive AF (don't know if there will still be non-adaptive AF, we'll see), will ship with Forceware 60, review boards will be out end of March.
I'm still kind of curious about A3 and retail availability--I would imagine that it is coming to improve yields and reduce heat output (and the possible double molex connectors), but given the April 13 date, I'm forced to wonder about when it's going to be available.
(PS--it's 16x1. Period. There is no doubt.)"
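Taking the rumored 16x1 @ 475MHz at face value, the raw fillrate jump over the current NV35-based 5950 Ultra is easy to compute. A back-of-envelope sketch; the 4-pipe NV35 configuration and both clocks are treated here as assumptions from the thread's rumors, not measurements:

```python
# Back-of-envelope peak pixel fillrate from the rumored specs above.
# Pipe counts and clocks are assumptions as cited in the thread.
def pixel_fillrate_mpix(pipes, core_mhz):
    """Peak pixel fillrate in megapixels/s: pipes x core clock."""
    return pipes * core_mhz

nv40 = pixel_fillrate_mpix(16, 475)  # rumored 16x1 @ 475 MHz
nv35 = pixel_fillrate_mpix(4, 475)   # 5950 Ultra: 4 pixel pipes @ 475 MHz
print(nv40, nv35, nv40 / nv35)       # -> 7600 1900 4.0
```

If the 16x1 rumor holds, that is a 4x raw pixel fillrate advantage at the same clock, which is why the "much higher fillrate" argument keeps coming up later in the thread.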
Unreal Botmatch CPU or GPU limited?
Finally, proof once and for all that Unreal Botmatch is ONLY CPU LIMITED AT <= 800x600,
and proof that Unreal Botmatch is ALSO GPU LIMITED at 1600x1024:
HERE
Insights
2. When highest ("MAX") quality settings are used, the graphics card speed and memory does make a difference.
Take a look at the "1600x1024" graph where the G5/2.0MP with Radeon 9800 Pro "outruns" the same model with a Radeon 9600 Pro. Note also the G4/1.42MP Power Mac beats the G5/1.8MP Power Mac at 1600x1024, thanks to its Radeon 9700 Pro.
Chances of NV40 being double the speed of current solutions? If 9800Pro is 38% faster than same generation 9600Pro at 1600x1024 -- Definitely possible.
End of Story.
Hahaha. AT is one of the saner places around IMO.

Originally posted by: Acanthus
And i thought AT was ATi biased. Wow.
Actually, no, it wasn't; the game tests were run on a dual Opteron box, as mentioned in this quote (Page 5, The Baron):

They don't even take into account that not only is he using an OCed A-64, we are dealing with a 16-pipe card, clocked HIGHER than the NV35, with much faster memory.
Okay, I got some clarification on the CPU speed/botmatch controversy. The system specs were not accurate for Painkiller and UT2004--for these, a dual Opteron machine was used (workstation, probably). The rest of the scores were done on the more reasonable machine posted before. Remember, the card was only in the tester's possession for two days, so I imagine he took it between home and work.
With AA/AF turned off, botmatch appears to scale with the CPU more than the GPU, though it is definitely capable of being limited by either at any given time.

They then go on to claim that the NV40's AA+AF scores in botmatch are impossible. On a card with much higher fillrate, on an A-64 @ 2400MHz. Then they go on to talk about SSAA and MSAA, neither of which the NV40 uses (it uses a new "superior" AA implementation). Even if the guy is lying, his numbers are very possible.
Originally posted by: RussianSensation
I wasn't trying to prove anything. It was speculation. Clearly the benchmarks show GPU matters for botmatch whether you want to believe it or not. It doesn't matter if Xbit ran a different system or not. I was only trying to show the GPU limitation imposed at high quality settings and disprove the belief that Botmatch is only CPU limited, period.
Since high-end graphics cards pound mainstream cards by significant margins, why is it a bad assumption to think that a new generation will be 2x faster than the current high-end generation? The Radeon 9700Pro was at least 2x faster than the 8500, and the GeForce4 4600 was 2x faster than the GeForce3 with high quality settings. You don't have to agree with my speculation.
Originally posted by: RussianSensation
Sure the benchmarks are valid. They show the difference in performance between changing various CPUs and Videocards.
For one, at 1600x1024 the G5/2.0 with the 9800 beats the G5/2.0 with the 9600 => by keeping the CPU constant, it shows GPU dependency at high quality settings => GPU-limited.
Secondly, at 1600x1024 the G4/1.4 with the 9700 beats the G5/1.8 with the 5200 => by changing the CPU to a slower model, it helps to show the GPU matters more at 1600x1024 than the CPU does => again proving a FAST processor matters less at high res => highlighting GPU and CPU limitation together, especially when compared to the G5/2.0 with the 9600, which beats the G4/1.4 with the 9700, but barely.
Thirdly, at 800x600, the scores between the G5/2.0 with the 9800 and the G5/2.0 with the 9600 are almost identical => which shows CPU dependency at lower resolution.
Now you are saying these scores do not make sense? Well, I just made sense of them for you.
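The three comparisons above amount to holding one variable constant and varying the other. A small sketch of that logic; the fps values below are invented placeholders that merely follow the ordering the post describes, and only the machine/GPU pairings come from the thread:

```python
# Controlled-comparison sketch of the argument above. All fps values are
# made-up placeholders matching the described ordering, not chart data.
results_1600 = {
    ("G5/2.0", "9800"): 40,   # same CPU, faster GPU wins => GPU-limited
    ("G5/2.0", "9600"): 29,
    ("G4/1.4", "9700"): 27,   # slower CPU + fast GPU still competitive
    ("G5/1.8", "5200"): 12,   # faster CPU + weak GPU loses badly
}
results_800 = {
    ("G5/2.0", "9800"): 71,   # near-identical at 800x600 => CPU-limited
    ("G5/2.0", "9600"): 70,
}

def gpu_gap_same_cpu(results, cpu, gpu_a, gpu_b):
    """Percent fps gap between two GPUs with the CPU held constant."""
    a, b = results[(cpu, gpu_a)], results[(cpu, gpu_b)]
    return round(100 * (a - b) / b)

# Big gap at high res, tiny gap at low res:
# GPU-bound at 1600x1024, CPU-bound at 800x600.
print(gpu_gap_same_cpu(results_1600, "G5/2.0", "9800", "9600"))  # -> 38
print(gpu_gap_same_cpu(results_800, "G5/2.0", "9800", "9600"))   # -> 1
```

Holding the CPU fixed isolates the GPU's contribution, which is the entire structure of the three-point argument, whatever one thinks of the Mac numbers themselves.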
Except the fatal flaw in your argument is the difference in renderers used. Apple boxes HAVE to run the OpenGL renderer, which is slower than its D3D counterpart in UT2K4. Sure, on a Mac it is GPU limited. Try to make an Apples to Apples comparison. By the way, statements like "Clearly the benchmarks show GPU matters for botmatch whether you want to believe it or not" are not actually speculation.

Originally posted by: RussianSensation
I wasn't trying to prove anything. It was speculation. Clearly the benchmarks show GPU matters for botmatch whether you want to believe it or not. It doesn't matter if Xbit ran a different system or not. I was only trying to show the GPU limitation imposed at high quality settings and disprove the belief that Botmatch is only CPU limited, period.
Because that is all it is, an assumption. Why not wait and see instead of assuming what is to come? I hope that the next gen cards really do mop the floor with the current high-end cards, but I want to wait and see before I assume they will.

Since high-end graphics cards pound mainstream cards by significant margins, why is it a bad assumption to think that a new generation will be 2x faster than the current high generation?
In my experience, speculation rarely sounds like an argument that one thing IS better than another.

Radeon 9700Pro was at least 2x faster than the 8500, and the GeForce4 4600 was 2x faster than the GeForce3 with high quality settings. You don't have to agree with my speculation.
The only two lines of that 'benchmark' that are valid are the ones showing the R9800Pro vs the R9600XT, both running in a G5/2.0MP system. The numbers themselves may be right, but they have little to do with one another. He makes perfect sense by saying the results are not valid. There are two things you would want to test for a graphics card: card-to-card performance, and system scaling. In one, you take several graphics cards and benchmark them in the same setup, changing only the graphics card; in the other, you take one graphics card and benchmark it in the same setup, changing the processor.

Alright bud, maybe you can contact the reviewer and tell him when he benchmarked the systems the numbers are all wrong... you do not make any sense by saying the results are not valid. That would only be true if they were made up out of his ass.