NVIDIA stands proud in real-user benchmarks!

TravisBickle

Platinum Member
Dec 3, 2000
2,037
0
0
Whee! As if you didn't know...
Look at the Athlon 1000, for example, in madonion.com's database of over 1 million users, at 1024x768 16-bit color (I have used the middle result of each chipset's first page of results).

GTS/Ultra: 8600
GTS/Pro: 8225
GeForce MX: 6100
Radeon DDR: 5800
Radeon SDR: 4240
Voodoo5 5500: 5000
The performance choice is clear! ;) The GTS is the card to apply the features of today at the resolutions of choice, and this is not the word of some reviewer paid by advertisers!
The GTS seems to overclock to the point of blurring the distinction between it and the Ultra. :cool:
The Voodoo was always known to suck at T&L benchmarks. :p
The real competition for all the Radeon cards is the MX, and ATi has seen sense and priced accordingly. So the DDR creeps closer to the GTS at 32-bit, but try applying the funky features and FSAA with that! :(
(I don't include 32-bit results because the Radeon ones especially look like they were culled from drunkards; they are all over the place, from 3300 to 5700, which looks very unreliable.) :frown:
Apologies to all who already know the good (and bad) news!
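
In case anyone wants to check the method, by "the middle of each first page" I just mean taking the median score from a chipset's first page of results. Here is a minimal Python sketch of that; the example scores in it are made-up placeholders, not real MadOnion numbers:

# Rough sketch of the "middle of the first page" pick described above.
# The example scores are invented placeholders, NOT actual MadOnion results.
def middle_of_page(scores):
    """Return the median score from one page of results."""
    ordered = sorted(scores)
    mid = len(ordered) // 2
    if len(ordered) % 2:
        return ordered[mid]                       # odd count: the true middle value
    return (ordered[mid - 1] + ordered[mid]) / 2  # even count: average the two middle values

# A hypothetical first page of GTS results at 1024x768x16
page = [8900, 8610, 8595, 8600, 8480, 8350, 8700, 8550, 8620, 8660]
print(middle_of_page(page))  # prints 8605.0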
 

slag

Lifer
Dec 14, 2000
10,473
81
101
Yeah... because benchmarks mean so much in real-world tests...

ROFL..

How about this one.

Quake III, demo 127

My machine: Thunderbird @ 824 MHz with a Radeon 64 VIVO
1600x1200x32: 37 fps

My friend's machine: Thunderbird @ 1 GHz with a GeForce 2 GTS 64 MB
1600x1200x32: 37.1 fps

Wow, that GeForce sure is a lot faster.

And what's this?
"(I don't include 32-bit results because the Radeon ones especially look like they were culled from drunkards; they are all over the place, from 3300 to 5700, which looks very unreliable.)"

Total and utter BS. The 32-bit Radeon looks every bit as good as (and most would say better than) the GeForce at 32-bit. As for being "all over the place", well, I don't know where you get your information, but even the 32 MB DDR Radeon benchmarks very close to the 64 MB card. But then again, benchmarks really don't mean squat.

Joe
 

bluemax

Diamond Member
Apr 28, 2000
7,182
0
0
Are you SUUUuure you're not Hardware? ;) The only thing missing is a big ****CONFIRMED!!!****
I don't think you're giving the Radeon the credit it deserves.
 

Taz4158

Banned
Oct 16, 2000
4,501
0
0
Guys, just ignore him. He trolls for attention and everyone knows it, or he posts off-colour and ridiculous threads in OT and they get locked. People like him NEED attention, so ignoring wasteful posts like the one he just made is the only way to handle him. I'm sure NOBODY is surprised at the 3DMARK results. As usual, Radeon owners are too busy PLAYING to waste time BENCHING.
 

TravisBickle

Platinum Member
Dec 3, 2000
2,037
0
0
No, seriously, if you search for results for an Athlon 1000 at 1024x768, 16-bit, Radeon DDR chipset, look at the variability on that page! Amazing!
I have to admit it is a bit harder to find Radeon results on MadOnion than one would expect. Do they shy away from benchmarks?
 

ahfung

Golden Member
Oct 20, 1999
1,418
0
0
I'd still like to see the scores at 1024x768 and 1280x1024 in 32-bit. 32-bit is the norm now.
 

ahfung

Golden Member
Oct 20, 1999
1,418
0
0
OK I did it myself.

For Athlon 1 GHz, 1280x1024x32, result chosen from the middle of the first page for each chipset.

In descending order:

NVIDIA GeForce 2 Ultra: 4698
NVIDIA GeForce 2 GTS/Pro: 4274
ATI Radeon DDR: 3567
NVIDIA GeForce DDR: 2667
NVIDIA GeForce 2 MX: 2551
3dfx V5 5500: 2420 (only 1 result found, LOL)
3dfx V4 4500: Result not found
ATI Radeon SDR: Result not found
NVIDIA GeForce SDR: Result not found

Have to say that NVIDIA still rules even at 32-bit high resolutions, where the Radeon's HyperZ is supposed to excel.




 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
"3dMark2000 for dummies"?

why are we here?

anyone figure out how to steer that darn helicopter in the Treemark portion of the benchmark yet? My aim is impeccable, but I can't seem to steer where I want.

Doggone it.

 
Feb 9, 2001
61
0
0
For the record: Those scores are based only on 16-bit, not 32-bit. You obviously are missing the whole point of the Radeon. Perhaps you should attend a "Video 101" class and try to improve your knowledge.

For the record: MadOnion posts a disclaimer stating very clearly that user-submitted scores can NOT be proven; they are taken and posted without any verification.

For the record: Even though I am currently a happy Geforce owner, I would be proud to own a Radeon.

For the record: This is a weak attempt to start trouble, and it looks like Mr Taz was correct about you. You are nothing but a troll.

Hah! I laugh in your face and pee in your salad!
 

rickn

Diamond Member
Oct 15, 1999
7,064
0
0
Too bad my vision got impaired staring at the crappy 2D of my former card, a Leadtek GeForce DDR. I don't think anyone has ever disputed that the Ultra is faster than the Radeon DDR; it is, after all, clocked significantly higher. Also, scores on MadOnion don't differentiate overclocked from default clocks, and a lot more people overclock GeForce cards, and by a much higher margin, than Radeons. Those results are certainly nothing scientific.
 

Deeko

Lifer
Jun 16, 2000
30,213
12
81
HAHAHAHAHAHAHHA, I'm sorry, this is a good thread. This goes to prove my point quite well about 3DMark sucking. I love the way the MX beats the Radeon DDR, SDR, and the V5. But anyway, this isn't about that... it's a 16-bit benchmark, boy, and 16-bit doesn't mean sh!t today.



<< (I don't include 32-bit results because the Radeon ones especially look like they were culled from drunkards; they are all over the place, from 3300 to 5700, which looks very unreliable.) >>


Oh, so it's unreliable? That's not the Radeon's fault.

Oh yeah, a thought: where's the regular GTS, hmm? Seems mighty strange that the MX can beat all these high-end cards, yet the GTS can't?
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Travis a troll? Why would you say that? A troll is someone who posts things like:


<< << Radeon not on Madonion top 10 user list!

I very nearly ended up with a radeon, but decided that the drivers had a reputation too bad to risk, and that ati never really took fsaa seriously and it doesn't effect in direct 3d. having seen them over the years, I don't like their attitude

I'm puzzled, i would have said it was to do with little memory on your card at a high resolution. another Radeon problem, methinks. what resolution is it at? what's it like at a lower one?

after prowling around the forums, no-one is ever going to convince me that radeon doesn't have big problems, with the drivers. it's common knowledge. it's quoted in website reviews. why deny it?

Taz, your radeon is working well. whoop-de-doo. have you ever considered *shutting up* a bit because you aren't representative? no offence, flameboy.

at college there is a display skeleton from Canada
I'm not sure if he died waiting for ATi support or he actually works for their tech support.

I've seen reviews saying the card wasn't stable at anything over its factory clock >>
>>

 

Taz4158

Banned
Oct 16, 2000
4,501
0
0
Wow guys, STOP giving Travis all this support, he'll get a swelled head. Hope he's cognizant enough to recognize sarcasm. ;)
 

Actaeon

Diamond Member
Dec 28, 2000
8,657
20
76
:p I don't see a difference between 16- and 32-bit color. Green is f'n green. Blue is f'n blue. The only thing I see is a performance hit.
 

TGCid

Platinum Member
Oct 9, 1999
2,201
0
0
What do you mean by colors? You mean there are more than black and white??
 

TravisBickle

Platinum Member
Dec 3, 2000
2,037
0
0
Robotech, that was a good one; I've never found a way to steer my heli in the trees either.
I don't trust a lot of those results either. For sure, the top results are from overclocking, but there are always results so far ahead of the rest that they look fake.
Put it this way: I'm not worried by how badly the Voodoo5 behaves in those benchmarks. The Voodoo is, for instance, a case where 16-bit color looks better than anybody else's. I am also sure the Radeon DDR can run its funky features at commonplace resolutions at good frame rates for the foreseeable future, but only the GTS/Ultra has the headroom to apply FSAA on top and stay fast.
Because I don't know the GeForces very well, I was surprised at just how fast it really is. And it's still surprising how hard it can be to find Radeon benchmarks on that site. That crazy page of Radeon results I noted suggests that people are indeed lying about results, both up and even down.

It seems to me that 3DMark2000 (now 2001) was a benchmark that incorporated all the latest features of video card hardware, so it would show off the latest hardware to its best advantage. The problem is that the software development cycle is so much slower. Honestly, I think MadOnion would not want to publish a more relevant benchmark because it wouldn't look as pretty or seem as current. It's targeted at people who will buy a performance 3D card and will naturally be interested in how it performs a year down the line with the software that will likely be written to really take advantage of it. It looks like MadOnion got their predictions wrong, but in their position neither would I want to ignore features like T&L in the benchmark; they have to keep it right up to date. I wonder if they also include Radeon-specific features in the 2001 edition.
I have to say it's a pretty benchmark, though! Does anybody own 3DMark99, and is that more relevant?
 

Smbu

Platinum Member
Jul 13, 2000
2,403
0
0
"Oh yeah, a thought: where's the regular GTS, hmm? Seems mighty strange that the MX can beat all these high-end cards, yet the GTS can't?"

it's listed right there in his post and in ahfung's.

in 1024x768x16bpp
GTS/Pro: 8225

in 1280x1024x32bpp
NVIDIA GeForce 2 GTS/Pro: 4274

The only thing that differentiates the GTS from the Pro is higher-clocked RAM, although I do have my GTS clocked at 200/400 like a Pro card. :)

 

RobsTV

Platinum Member
Feb 11, 2000
2,520
0
0
As much as I like and regularly use 3DMark for testing, this thread is pure crap, and threads like this will only continue to give anti-3DMark zealots reasons to bitch. At least ahfung was kind enough to put up comparison scores using 32-bit.

It is NOT a real-user benchmark, as it only tests D3D. When a game allows you to use OpenGL, in most cases that will work better and give much better performance. Is 1024x768x16 the norm on a 1 GHz system? Not a chance. It sure does look like the nVidia cards have all the bells and whistles in place to score high "in this benchmark", while other cards may be missing some features that prevent them from scoring as high "in this benchmark".

If you don't bother comparing your video card and system to different video cards and systems, and only compare what "YOU" have to identical systems, then the scores do work, and yours should be right up there if your system is as optimised as the other identical system. That is what 3DMark does best: accurately comparing identical systems.

It does surprise me that, since 3DMark and nVidia gel so well, the other card makers haven't found a benchmark that shows their card whipping an nVidia card in the same way. If it is as some here think and MadOnion likes nVidia so much more, doesn't anyone else like ATI just as much, so they could show that card favorably? Doesn't make sense...