Are nVidia cards REALLY that fast?

Greeboog

Junior Member
Sep 15, 2000
11
0
0
This is a question for the people who have seen cards like GeForce whateverz versus other cards from ATI or Matrox or 3dfx...

When you look at any benchmark or fps score in any game, you can see that every GeForce card scores extremely well at a given resolution/bpp versus the result of any other card at that same resolution/bpp. But for those who have seen the image quality of GeForce cards next to cards like the G400 on a good monitor, you know that 800x600 at 32 bits per pixel is NOT the same quality on both. At home I only have a Voodoo2 card, but I've had the chance to use a G400 all summer long, and its 16-bit quality is about the same as the 32-bit quality of a GeForce card.

So... my question: at the same image quality, and NOT the same resolution/bpp, are GeForce cards really that much faster than other cards? Those who have experience with both GeForce and non-GeForce cards, please reply :) I know that image quality is a very subjective thing to compare, but please do your best! Thanks!
 

Charles

Platinum Member
Nov 4, 1999
2,115
0
0
I upgraded my Matrox G200 to an NVidia TNT2 Ultra, then a GeForce DDR, and now a GeForce2 GTS. Don't blame me if I say that NVidia cards are really fast, because the G200 was f***ing slow!
 

Greeboog

Junior Member
Sep 15, 2000
11
0
0
Yeah, of course, I mean a G200 vs. a TNT2 or a GeForce!!! :) And I'm NOT saying they're not fast; I'm just wondering how much faster (or slower!!) they are compared to other cards at the same image quality settings.
 

YU22

Golden Member
Mar 18, 2000
1,156
0
0

I certainly can't complain about my GF DDR, in either picture quality or speed.

All this talk about nVidia's chips/cards having terrible image quality is largely a myth, created and kept alive by 3dfx fans/zealots, because it was one of the few arguments they had against nVidia over the last year or so.

In 2D nVidia's chips have perfectly fine image quality, and in 3D they have very good quality.

 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
In 3D image quality, the GF-based boards are quite a bit better overall than the G400.

Much as Matrox users point to high-end 2D to justify their claims, run ViewPerf or the like on a Matrox board and compare the visual quality between them. The G400 is clearly outclassed, and badly at that.

For 3D, the Matrox G400 is both significantly slower and fails to compare on an image quality basis. All the "bashing" of nVidia's 3D lately has been over two issues: texture compression in one game (Quake3), and the difference in quality between the V5's FSAA and nVidia's own (minimal, comparatively speaking; try 16x at 1024x768 and you will see the V5's FSAA is far from "great"). Running high-end 3D applications, the GF boards thoroughly outclass the offerings of any of the other consumer companies (Matrox, ATi or 3dfx) in both speed and visual quality.
 

Rectalfier

Golden Member
Nov 21, 1999
1,589
0
0
I used to have a G400 MAX, but it froze up all the time, so I switched it out for a TNT2 Ultra. The TNT2 Ultra is faster, but it doesn't even come close to the image quality of the G400 in either 2D (desktop) or 3D (games).
 

Noriaki

Lifer
Jun 3, 2000
13,640
1
71
Hmm... well, about the only thing the G400 has is a faster RAMDAC, which helps image quality at really high resolutions/refresh rates. I don't notice anything wrong with the image quality on a Voodoo3, TNT2 or GeForce.

EMBM looks really damn cool... Matrox has that going for them... but that's about it. If you're running at really high res/refresh rates, the speedy 360MHz RAMDAC would probably help your image quality. But in 2D at 800x600? I think you should check again... any decently made GeForce2 board should look fine. Maybe the GF card you used was a really cheaply manufactured POS, but my friend's got an Asus V6800 and it looks dandy to me. (I have also seen the G400 MAX.)

But seriously, how do you compare something like this? Maybe you decide that the "per-pixel detail", let's call it, is about the same with a G400 at 800x600 as with a GeForce2 at 1024x768. Well, you still get many more pixels and much more screen area with the GF2 anyway... I don't really see how you can compare something like this...

My only gripe with nVidia cards is that they have a longer list of incompatibilities than other video cards, but once you have a GF up and running, I can't see anyone complaining (unless you are a zealot for video company X).
 

Greeboog

Junior Member
Sep 15, 2000
11
0
0
Thank you to all you nVidia card owners for your replies, but the question really wasn't for people who only have an nVidia card. I know people who have nVidia cards will defend their cards and say they're the best. I know people who have 3dfx cards will do the same. I know people who have an ATI or Matrox card will also do the same. You can't compare image quality with another card if you don't have it. But for those who can: are nVidia cards really that much faster than other cards at the same image quality? If you feel a card at 16bpp gives image quality equal to another card at 32bpp, then compare the fps for those two settings. Thanks.
 

StanTheMan

Senior member
Jun 16, 2000
510
0
0
I'm using a TNT2 Pro at the moment. I believe you when you say that Matrox's 16-bit rendering is almost as good as the TNT's 32-bit rendering. Why? Because Matrox has VCQ (Vibrant Color Quality). What it does is render the whole image in 32-bit colour, then convert it down to 16-bit at the end, which improves the 16-bit image quality. But the TNT(2) and GeForce(2) render 16-bit as 16-bit and 32-bit as 32-bit. I think nVIDIA's approach is better, since you can choose between 16-bit (speed) and 32-bit (quality). The only drawback of that technique is that old games which only support 16-bit will look worse on an nVidia card.
 

YU22

Golden Member
Mar 18, 2000
1,156
0
0

StanTheMan, actually nVidia's chips do the same thing: they always render internally at 32-bit, and when displaying in 16-bit they dither the image down to 16-bit.
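
For anyone curious what "render at 32-bit, dither down to 16-bit" actually means, here's a minimal Python sketch (my own illustration, not actual driver code from Matrox or nVidia) of converting RGB888 pixels to RGB565. Plain truncation just throws away the low bits and produces banding; ordered dithering adds a small position-dependent bias first, spreading the rounding error across neighbouring pixels, which is roughly what a VCQ-style down-conversion buys you:

```python
def rgb888_to_rgb565_truncate(r, g, b):
    """Plain truncation: drop the low bits (causes visible banding)."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

# 2x2 Bayer matrix for ordered dithering; the threshold depends on the
# pixel's screen position, so the rounding error alternates between
# neighbouring pixels instead of forming solid bands.
BAYER_2X2 = [[0, 2],
             [3, 1]]

def rgb888_to_rgb565_dithered(r, g, b, x, y):
    """Bias each channel by a small position-dependent amount before
    truncating, so mid-range values sometimes round up, sometimes down."""
    t = BAYER_2X2[y % 2][x % 2]
    r = min(255, r + t * 2)  # red/blue lose 3 bits, so a larger bias
    g = min(255, g + t)      # green only loses 2 bits
    b = min(255, b + t * 2)
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)
```

Truncation maps every value in an 8-step band (e.g. 128..135) to the same 16-bit result, while the dithered version breaks those bands up spatially — which is why a "16-bit" image produced this way looks closer to the 32-bit original.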

 

Greeboog

Junior Member
Sep 15, 2000
11
0
0
Yeah, but there's more to it than the precision of the calculations... I don't think nVidia cards' filtering is done the proper way. By the way, I'm not saying all of this to claim that nVidia cards are bad! I mean, I would rather have 35 fps with a so-so image than 10 fps with a great-looking one... I'd just like to get a good speed comparison at roughly the same image quality.