
Post your AquaMark3 CPU scores here!


THUGSROOK

Elite Member
Feb 3, 2001
11,847
0
0
highly programmable = they can turn "quality textures mode" into "high performance textures mode" at will.

;) LOL
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,056
32,579
146
Originally posted by: THUGSROOK
highly programmable = they can turn "quality textures mode" into "high performance textures mode" at will.

;) LOL
:Q:brokenheart:;)
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Anyway... here's my scores... I don't think it's the detonator drivers responsible for doubling your CPU score =)

Linkage

45.23 Detonators
XP2500 @ 2.2 Ghz
Ti4200 128MB @ 320/650

Aquamark Score: 19,476
CPU: 7586
GFX: 2235
FPS: 19.48

When my FX5900 comes I'll let ya know how that does =)
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Originally posted by: DAPUNISHER
Your working relationship with nVidia does not negate the fact that the drivers leaked at precisely the perfect moment to influence scores in Aquamark3; if anything it undermines your position by making it appear as though you have an agenda the average gamer does not ;) And just how do you come to the determination that the rep is not spewing whatever propaganda he is being told/paid to? Besides, you are under an NDA, so you couldn't say either way if it was an intentional leak even if you knew.
In the end, the timing, the influence on the benchmarks, and the recent tactics make the situation highly suspect, and if most will not believe the official statements made by nVidia themselves, then why would they believe you when you offer 2nd-hand data from the same source?

Now that's not to say nVidia is the one behind the leak, but the performance increase at the expense of IQ, combined with all the other previous points, could certainly influence people to believe they are, so calling my speculation the stupidest thing you've heard in a long time just makes you an elitist asshole because you have inside info others do not but seem to expect them to have :disgust:

Jesus, you really are that ignorant. What you just did is verify to everyone else that rather than playing devil's advocate, you would rather pull out a dictionary to put down words you don't even understand. What's funny is that it would appear that ATi has a stranglehold on your penis, and you rather enjoy it. It's funny how people always look for the coincidence in a release of something. Heaven forbid that accidents actually happen, and that ignorance wouldn't be so blatant on forums.

Oh, perhaps when your jealousy isn't so apparent, you can come back and post with a level head and not appear to be such an elitist asshole yourself, because then you'd be one of those people I'd listen to, and then promptly laugh at your ignorance in your face.

Oh, one more thing. Try to make sure that you flow with one thought if you're going to try and make a point, because you tried to validate your position with things that didn't even correlate to the original post. ;)
 

THUGSROOK

Elite Member
Feb 3, 2001
11,847
0
0
the problem is.....

we thought NV was on the right track with the 45.23 "quality" drivers (quality, at least, compared to anything released before them as far back as the 30.82s). now these horchit 51.75s pop up and OHNO! NV is back in the "cheating" game again!


so is this what it's gonna be like when real DX9 games are on our boxes?
the game will suck till NV decides to tweak the drivers?
then we'll need to scrutinize them to make sure NV didn't "cheat"?

this doesn't sound like fun :|

the 5900 is a very nice card, but this driver BS is trying my patience with them
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
This is the purpose of a "highly programmable" GPU: so they can optimize the hardware for the software. In my opinion, this is a good thing. However, it may not be smart on nVidia's part since they're just creating more work for their programmers.

Either way... I highly doubt anybody here is in a position to tell the programmers and engineers at nVidia how to do their job. If you are, maybe you should apply for a job with nVidia and show them how it's done.

Same goes for ATI... you can bitch and moan about their "bad drivers" all you want... but until you can write better drivers, you should cut them some slack; they've come a long way.
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
It has nothing to do with being programmable. It has everything to do with not following standards. nVidia did not follow DX9 standards, and they have the same issue with OpenGL. Games written for DX9 just "work" on an ATi card using the standard DX9 codepath. nVidia needs a special codepath for their hardware since it does not conform to DX9. Even Doom 3 in OpenGL needs a special nVidia codepath while ATi does not.
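The extra work described above can be sketched roughly like this. This is a minimal illustration, not real engine code; the function and path names are made up for the example.

```python
# Illustrative sketch of why a non-conforming GPU costs developers extra work:
# a standards-following card takes the one path everyone writes anyway, while
# the non-conforming card needs its own hand-written path. All names here
# (select_path, "nv3x_special", etc.) are hypothetical.

def select_path(vendor: str, supports_standard_dx9: bool) -> str:
    """Pick which shader codepath to run for the detected GPU."""
    if supports_standard_dx9:
        return "dx9_standard"    # one path, written once, runs everywhere
    if vendor == "nvidia_nv3x":
        return "nv3x_special"    # extra path the developer must write and maintain
    return "dx8_fallback"        # older hardware drops to the previous API level

print(select_path("ati_r300", True))      # -> dx9_standard
print(select_path("nvidia_nv3x", False))  # -> nv3x_special
```

Every branch past the first is code somebody has to write, test, and debug separately, which is the cost the posts below complain about.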

What do game developers think of this?

Anandtech:
The developer of Half-Life 2, Valve, is the first developer to voice their displeasure with the NV3x architecture with such intensity, because it has forced them to write additional codepaths particularly for NVIDIA hardware, thus costing them time, money, and extra resources. This was not needed to run on ATI hardware, which is why they entered into an agreement with ATI. The agreement proceeded from already-existing hardware benchmark scores to a marketing agreement, not the other way around as some have speculated.
Beyond3D:
Note: It's an interesting fact that for the two most eagerly anticipated games of the next year, the developers have had to expend time and effort in creating special, optimised paths for GeForce FX hardware, whereas ATI has more than acceptable performance (and with higher IQ) in both of these titles utilising the default path. The release dates for both of these titles have been a constant source of interest - has creating more special rendering paths actually hindered the release of these titles?

Yesterday I went to Mojo Reloaded (ATI, MS and Intel game developer day) at Guildford. These events are always a combination of interesting presentations and catching up with the industry gossip. This one for me was far more gossip than presentation, not that the presentations weren't interesting; my favs were the PRT/SH and non-linear post-processing streaks (I'll give a better description in a minute). As this will be up on the web for some time it's worth stating what day yesterday was: the day after the HL2 benchmarks were released.
The gossip about the benchmarks, and the comments putting NVIDIA in a bad light at DX9 shaders, was fairly dominant throughout the day.

Obviously ATI weren't that upset to see their hardware coming out ahead so well, but what wasn't perhaps so expected was how glad everybody else was that the HL2 results matched the results most of us had already seen ourselves. Somebody on the forums (sorry, can't remember who) asked why developers seemed quite shy about stating our results. I obviously can't speak for everybody, but the answer is probably a simple case of somebody having to go first, and whoever that person/company was, they'd better be able to handle the heat it would produce.

Valve are fairly lucky. HL2 is probably the most eagerly awaited title in the business, and everybody has been doing everything to get the best results for this title. Everybody knows these guys are sh!t hot: they know what they're doing, and if they can't get good results, something is wrong and it's likely not to be them. Smaller developers (and that's probably everybody except iD in the PC arena) don't have that luxury. If I had produced a similar performance table, the response of a lot of people would simply be that the developers (i.e. me) don't know what they're doing and obviously can't program properly. And why not? I don't have the reputation for quality that Valve or iD has; they've earned the right to be trusted that they know what they're doing.

For Valve to do this shows they were really annoyed; the fact that Microsoft issued a press release stating HL2 was the DirectX 9 benchmark also shows how annoyed they were. To get these two massively important PC games companies to make such a public condemnation means you had to do something bad; just having bad performance wouldn't have been enough.

The basic problem NVIDIA has caused is the amount of extra work they've been requiring everybody else to do. Whether it's benchmarks having to get smart and try to stop application-specific optimisations, or developers having to write extra pipelines to get even half-decent performance at the high-tech things it's meant to be good at, or MS having to upgrade the WHQL test to find spec violations, everybody has been forced to pay for NVIDIA's mistakes, and that is what has caused the anger.

But in some ways it has had good consequences; quality should go up as loopholes are closed:
Future DX specs should now be much tighter.
WHQL testing to require pixel-comparison tests.
Hardware must produce an almost exact rendering of the same frame as the REFRAST.
Self-certification of WHQL, so that WHQL drivers will have bug fixes applied quicker without bypassing the quality checks.
Reviewers should be less quick to use 'special' drivers provided by the IHVs or test only under 'special' conditions.

Long term, the biggest change this year-long fiasco has caused will be to Microsoft and PC game developers. Microsoft has had to learn to protect its baby Direct3D; before, it largely left quality and stability issues up to individual IHVs, but now it knows that its reputation is also damaged when IHVs play dodgy quality games. And us humble game developers have learnt we have to shout sometimes to protect our games from bad decisions made by IHVs; we can't just mop up the sh!t when it hits us. We have to be willing and able to communicate that certain things are NOT acceptable to our customers, so don't bother doing it. If your card is crap at something, at least be honest about it earlier on; don't make us find out when our game runs like a dog on your hardware, even though we're using the techniques you've been suggesting for the last year.
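The pixel-comparison testing mentioned in the quote above (comparing driver output against the reference rasterizer) can be sketched as follows. This is a toy illustration of the idea only: the frames are flat lists of 8-bit channel values, and the tolerance numbers are made up, not from any WHQL spec.

```python
# Toy sketch of a REFRAST-style pixel comparison: a driver-rendered frame
# passes if nearly every channel value is within a small tolerance of the
# reference rasterizer's output. Tolerances here are illustrative only.

def frames_match(rendered, reference, per_channel_tol=3, max_bad_fraction=0.001):
    """Return True if `rendered` is close enough to `reference`."""
    if len(rendered) != len(reference):
        return False  # mismatched frame sizes can never pass
    bad = sum(1 for a, b in zip(rendered, reference)
              if abs(a - b) > per_channel_tol)
    return bad / len(reference) <= max_bad_fraction

ref = [128] * 1000
ok  = [128] * 999 + [130]        # tiny deviation, within tolerance
bad = [128] * 500 + [200] * 500  # half the values way off

print(frames_match(ok, ref))   # -> True
print(frames_match(bad, ref))  # -> False
```

The point of such a test is exactly what the quote argues: an application-specific "optimisation" that changes what gets drawn would fail the comparison against the reference image.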
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,056
32,579
146
RoninCS, I don't even own an ATi card *only the 320m integrated GPU* - I run a 5800 Ultra, a Ti4200, and an eVGA Personal Cinema, along with 2 nF2 boards ;) Next time check the sig before implying someone is a fanATIc
I also beta-test nForce drivers and utilities for nVidia and have also signed an NDA. Does that mean I will exclude them from criticism the way you do because you are a kiss-ass? No! Btw, all you proved is you can launch ad hominem attacks with the best of them while avoiding any substance in your posts, plus parrot nVidia rhetoric, now go hump somebody else's leg for awhile you nVidiot fanboi :)
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Originally posted by: DAPUNISHER
RoninCS, I don't even own an ATi card *only the 320m integrated GPU* - I run a 5800 Ultra, a Ti4200, and an eVGA Personal Cinema, along with 2 nF2 boards ;) Next time check the sig before implying someone is a fanATIc
I also beta-test nForce drivers and utilities for nVidia and have also signed an NDA. Does that mean I will exclude them from criticism the way you do because you are a kiss-ass? No! Btw, all you proved is you can launch ad hominem attacks with the best of them while avoiding any substance in your posts, plus parrot nVidia rhetoric, now go hump somebody else's leg for awhile you nVidiot fanboi :)

Thank you for proving my point. Case closed. Pity you didn't even realize it. :cool:
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,056
32,579
146
Originally posted by: RoninCS
Originally posted by: DAPUNISHER
RoninCS, I don't even own an ATi card *only the 320m integrated GPU* - I run a 5800 Ultra, a Ti4200, and an eVGA Personal Cinema, along with 2 nF2 boards ;) Next time check the sig before implying someone is a fanATIc
I also beta-test nForce drivers and utilities for nVidia and have also signed an NDA. Does that mean I will exclude them from criticism the way you do because you are a kiss-ass? No! Btw, all you proved is you can launch ad hominem attacks with the best of them while avoiding any substance in your posts, plus parrot nVidia rhetoric, now go hump somebody else's leg for awhile you nVidiot fanboi :)

Thank you for proving my point. Case closed. Pity you didn't even realize it. :cool:
Bad dog! Don't make me hit you with a rolled-up newspaper! Now get! ;) BTW, I may indeed be quite off the mark with the comment I made *willing to admit I'm not the sharpest tool in the shed quite often :)*, but your defense of the betas, and the attempt to deflect attention away from how obvious it is that they were designed for a very specific purpose, doesn't seem to be winning anyone over to your way of thinking on the subject ;)
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
The other point here is that it's your assumption, and perhaps a few others', that that's exactly why the drivers got leaked. I'm afraid Aquamark doesn't hold that much pull with most of the public (and perhaps that's not a good thing), and I'm fairly certain that nVidia gives little credence to their benchmark.

My humble, and perhaps way off base, 2 cents.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,056
32,579
146
Originally posted by: RoninCS
The other point here is that it's your assumption, and perhaps a few others', that that's exactly why the drivers got leaked. I'm afraid Aquamark doesn't hold that much pull with most of the public (and perhaps that's not a good thing), and I'm fairly certain that nVidia gives little credence to their benchmark.

My humble, and perhaps way off base, 2 cents.
You'll get no argument from me on the point of me perhaps ASSuming too much and being influenced by what others are speculating on the matter. I do believe that, based on the coverage Aquamark is receiving at the moment, it must be a cause for concern for nVidia. BTW, I don't take any of it personally, and the occasional heated debate is a nice change from my normally mundane posting habits, so no hard feelings regardless :)
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Ditto. :)

I wonder if I could get an actual honest answer from my folks at nVidia about how they view Aquamark.

/me makes a task note in Outlook to make a call in the morning.
 

KoolHonda

Senior member
Sep 24, 2002
331
0
0
Anyone know the sales figures on DX9 cards? Game developers bending over backwards for NV over such a small segment of the market has me :confused:
 

Noid

Platinum Member
Sep 20, 2000
2,390
193
106
Man... talk about some serious thread creep... (flamethrowers away!)

42,169

AquaMark CPU Score: 8013
AquaMark GFX Score: 5721
Average FramesPS: 42.170
Average TrianglesPS: 12694 K

more benches

Looks like the FX 5900 Ultra running around 500 core and 1000MHz memory speeds will beat an ATi 9800.
Also, the top CPU scores in the 2200-2300 range are held by model 10 processors, except for 2 Opteron scores...

the opteron numbers are interesting

Aquamark needs to include motherboard info in the compare results too... anyone know if that comes with the licensed Pro version?
 

Dustswirl

Senior member
May 30, 2002
282
0
0
CPU: 7469 <<< 2500XP @ 2305MHz, NOT 3200 (2200MHz) as my sig says
GFX: 5706 <<< 9800 Pro at defaults

How did other 2500 owners get higher scores??? OCed or not!
 

Noid

Platinum Member
Sep 20, 2000
2,390
193
106
It's a matter of finding the sweet spot, I think...

I was pushing 220+ FSB stable, but found that I get better scores at 215 FSB and a half-multiplier bump...
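The FSB/multiplier trade-off described above comes down to simple arithmetic: core clock is FSB times multiplier, so a small FSB drop can be more than recovered by half a multiplier. The multiplier values below are illustrative; the post doesn't say which ones were actually used.

```python
# Core clock = FSB x multiplier. Multipliers here are assumed for the
# example, not taken from the post.

def core_clock_mhz(fsb: float, multiplier: float) -> float:
    """Effective CPU core clock in MHz."""
    return fsb * multiplier

print(core_clock_mhz(220, 10.0))  # -> 2200.0
print(core_clock_mhz(215, 10.5))  # -> 2257.5
```

So with these assumed multipliers, 215 FSB with a half-multi bump actually runs the core faster than 220 FSB, on top of any stability gained from the lower bus speed.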