2D quality differences among different cards: is it obvious?

littlegohan

Senior member
I think I'm not doing my FE950+ justice because I'm using a really crappy Forsa GeForce 2 MX. Considering that the RAM on it is 133MHz, the 2D filter on that card is probably junk too.

If I switch to a Radeon, or a Ti4200 card with high 2D quality (need recommendations, btw), will I be able to see a difference?
 
To me, nowadays 2D quality depends on the monitor you use rather than the video card... unless you run very high resolutions, in which case you can tell a difference between some video cards. And that resolution is usually above 1600x1200.
 
nvidia has had some problems with its third-party card makers. This is outlined in AnandTech's DVI article, which I'm too lazy to look up. Everything GeForce 3 and down SUCKS SUCKS SUCKS for image quality, because nvidia had really loose requirements for card makers and they used cheap DACs, etc. With the GF4 line, and hopefully above, they have much stricter requirements, and the image quality of GF4 cards is GREAT. It easily matches Matrox and ATI now, based on all sorts of image quality comparisons I've read. I run my GF3 Ti200 at 1600x1200 and the image quality SUCKS SUCKS SUCKS. I can't wait to get a GF4. I wouldn't recommend a Radeon: not as fast, texture reproduction problems, and generally buggy and incompatible. If you run at 1600x1200 you will definitely see a big difference going to a GF4. Oh yeah...
 
Originally posted by: Chadder007
To me, nowadays 2D quality depends on the monitor you use rather than the video card... unless you run very high resolutions, in which case you can tell a difference between some video cards. And that resolution is usually above 1600x1200.
I can tell a difference at resolutions below 1600x1200 between two video cards on the same monitor. I have tried a Matrox Millennium G400 MAX, an Abit GeForce2 MX400 64MB, and a Hercules Prophet 4500 Kyro2 at 1280x1024 with 32-bit color on the same monitor, and the 2D quality difference is noticeable. The G400 MAX is of course out in front, then the Kyro2 (it's decent), and the GeForce2 MX400 lagged behind. I really wish Matrox would release a 64MB, scaled-down version of the Parhelia...
 
If you read the DVI article I mentioned earlier, it will explain it all for you. Basically, with analogue signals, the less accurate the signal is, the more you get little jitters and incorrect data, i.e. grey showing up as a lighter shade, so at high resolutions with fine detail you'll see blurry text, some jitteriness, and blurred edges around everything.
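To make that concrete, here's a quick toy sketch (my own illustration, not from the article; the filter and jitter numbers are made up) of what a weak analog output stage does to single-pixel detail:

```python
# Toy model: one scanline of 1-pixel black/white stripes (worst case,
# like small text at 1600x1200), pushed through a crude low-pass
# filter standing in for a cheap analog output stage, plus a little
# noise standing in for jitter on the cable.
import numpy as np

rng = np.random.default_rng(0)

row = np.tile([0.0, 1.0], 16)          # ideal 0/1 stripes

kernel = np.array([0.25, 0.5, 0.25])   # each pixel bleeds into its neighbours
blurred = np.convolve(row, kernel, mode="same")

noisy = np.clip(blurred + rng.normal(0, 0.03, row.size), 0.0, 1.0)

print("ideal :", row[:8])
print("actual:", noisy[:8].round(2))
# The crisp 0/1 stripes come out as mid-greys around 0.5: white-on-black
# detail bleeding into grey is exactly the blur you see on screen.
```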
 
I think I'm not doing my FE950+ justice because I'm using a really crappy Forsa GeForce 2 MX.

The GeForce 2 MX doesn't do anything justice. Although people always talk about how different cards have highly different 2D quality, I tend to disagree. I just upgraded from a GeForce2 GTS to an Abit Siluro GeForce4 Ti4400, and I can't really notice any quality difference in 2D. Maybe I would if I went over it with a fine-tooth comb, but in general it looks the same to me.
 
Um yeah, every model of card is different, but in general GF-series cards have been behind the curve. I have my GF2 Ti (Gainward) out at the moment (see other thread for boring details) and have my Voodoo3 2000 back in temporarily.

Wow

The difference is like the difference between a filthy monitor and a freshly cleaned one, except I didn't clean it. Or how much sharper a dusty TV looks after you wipe the tube off. And I'm only at 1280x1024. I kinda hate to put the GF2 back in now, but then, it's worthless to me elsewhere (see other thread for boring details).

--Mc
 
A friend of mine has the same monitor I do (NEC XE17), and even at 1024x768 his Matrox G400 looks a ton better than my VisionTek GeForce3 Ti200. There isn't really anything you can put your finger on; it just looks a lot better.
 
You can just walk around computer stores and look at some nice flat-panel monitors. The morons who work in these stores usually plug them into GeForce MX cards to demonstrate them, and the text and images just look HORRIBLE. My Matrox Millennium II on an old crappy 17-inch monitor looks much better!
 
My Matrox Millennium II looks better than my old GeForce 2. That is how far behind the times those cards were in 2D quality.

Differences? The colors aren't as vibrant; it looks pretty bland. When I first popped in my ATI 7500 (and later upgraded to an 8500, which looks the same), I IMMEDIATELY noticed a HUGE difference in color and clarity on my desktop and in 2D applications.

I had a GeForce3 for a brief period of time, and it looked as bad as my GF2. Those GeForces look bad even at resolutions below 1600x1200; I'd say even 1024x768 may show some noticeable differences, depending on how crappy the GF2 in question is.
 
Originally posted by: 7757524
nvidia has had some problems with its third-party card makers. This is outlined in AnandTech's DVI article, which I'm too lazy to look up. Everything GeForce 3 and down SUCKS SUCKS SUCKS for image quality, because nvidia had really loose requirements for card makers and they used cheap DACs, etc.

Isn't the DAC integrated into the chipset? I think you are referring to the analog output circuitry on the earlier GF reference designs. Most manufacturers implemented cheapo analog output stages.

(For really sharp 2D output, I haven't seen anything better yet than my Matrox Millennium 1 cards, even up to 1920x1280.)
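For a sense of scale, here's a rough back-of-the-envelope (my own arithmetic; the ~1.3x blanking overhead is just an approximation) of the pixel clocks that analog output stage has to handle:

```python
# Approximate pixel clock for a video mode: visible pixels per second
# times a ~1.3x overhead for horizontal/vertical blanking (assumed).
def pixel_clock_mhz(width, height, refresh_hz, blanking=1.30):
    return width * height * refresh_hz * blanking / 1e6

for w, h, r in [(1024, 768, 85), (1280, 1024, 85), (1600, 1200, 85)]:
    print(f"{w}x{h}@{r}Hz -> ~{pixel_clock_mhz(w, h, r):.0f} MHz")

# 1600x1200@85 is over 200 MHz. Single-pixel detail toggles at half the
# pixel clock, so the output filter has to pass ~100 MHz cleanly; a
# cheapo filter that rolls off early smears exactly that fine detail.
```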


 
Originally posted by: 7757524
BTW if you REALLY want perfect reproduction, get a digital DVI LCD (DVI also carries analogue signals)

So get a digital digital video interface, instead of just a digital video interface? Is that to make sure it is fully digital? BTW, DVI interfaces are digital; they don't carry analog signals. The DVI-to-VGA converters include a RAMDAC inside them, AFAIK.

Whoops. Did some more research; there are actually four different "DVI" connector types. http://www.networktechinc.com/dvi.html has a list of them. What you meant, I guess, was the "DVI-D" flavor.

I guess the DVI-I connectors do include analog video, but the adapters for DVI-D to VGA have the RAMDACs inside, as I mentioned.
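If it helps, here's my quick cheat sheet of the electrical flavors (my own summary, ignoring the single-link/dual-link split; see the NTI link above for the actual connector list and pinouts):

```python
# Which DVI flavors carry which signals (as I understand them).
DVI_FLAVORS = {
    "DVI-D": {"digital": True,  "analog": False},  # digital pins only
    "DVI-A": {"digital": False, "analog": True},   # analog pins only (rare)
    "DVI-I": {"digital": True,  "analog": True},   # "integrated": both
}

def needs_ramdac_in_adapter(connector):
    """A passive DVI-to-VGA plug just reroutes the analog pins, so it
    only works if the connector carries analog; otherwise the adapter
    has to do the digital-to-analog conversion itself (its own RAMDAC)."""
    return not DVI_FLAVORS[connector]["analog"]

for name in DVI_FLAVORS:
    if needs_ramdac_in_adapter(name):
        print(name, "-> VGA adapter needs its own RAMDAC (active)")
    else:
        print(name, "-> passive VGA adapter works (analog pins present)")
```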
 
I've never noticed any difference in 2D between any of the video cards I've owned (some built-in Rage card, Voodoo Banshee, GF2 MX, GF3 Ti200, GF4 Ti4200).

I have noticed a difference going from two crappy 15-inch monitors to a fairly decent 17-inch one.
 
I'm using two different GF2 cards on my monitor, and the 2D image quality was slightly different between them. My Win98 PC had a Sparkle GF2 MX card, and the image was crisp at 1152x864 (my native res). The Leadtek GF2 Ti in my WinXP PC was very slightly inferior until the 29.42 drivers came out; those sharpened it up, to my surprise.

I've also had an ATi card hooked up to my monitor while doing a repair job on a friend's PC. Anyway, the image quality was about the same at 1152x864; I never really got the chance to do in-depth testing.
Running at 1600x1200 normally separates the good cards from the bad; too bad I never use that res.

Image quality does differ between cards, and even drivers can affect it, but probably the most important thing is what resolution you run your card at. Yes, the monitor can affect it as well (I've had three monitors of the same make and model, and image quality was different on all three), so there are many factors to consider.
In the end, it's what you're happy with that counts.
 
DVI interfaces are digital; they don't carry analog signals.


This is the link that I was referring to:
"On the left you'll notice 3 rows of 8 pins each; these 24 pins are the only pins required to transmit the three digital channels and one clock signal. The crosshair arrangement on the right is actually a total of 5 pins that can transmit an analog video signal.

This is where the specification divides itself in two; the DVI-D connector features only the 24-pins necessary for purely digital operation while a DVI-I connector features both the 24 digital pins and the 5 analog pins...By far, the vast majority of graphics cards with DVI support feature DVI-I connectors.

The idea behind the universal nature of this connector is that it could eventually replace the 15-pin VGA connector we're all used to as it can support both analog and digital monitors."
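Just to put numbers on those 24 digital pins (my own back-of-the-envelope: single-link TMDS is specced to a 165 MHz pixel clock, and the ~1.3x blanking factor is an approximation):

```python
# 3 TMDS data channels + 1 clock ride on the 24 digital pins; a single
# link is good for a 165 MHz pixel clock at 8 bits per channel per pixel.
SINGLE_LINK_MAX_MHZ = 165

def fits_single_link(width, height, refresh_hz, blanking=1.30):
    """Rough check whether a mode fits single-link DVI bandwidth."""
    clock_mhz = width * height * refresh_hz * blanking / 1e6
    return clock_mhz, clock_mhz <= SINGLE_LINK_MAX_MHZ

for w, h, r in [(1280, 1024, 60), (1600, 1200, 60), (1600, 1200, 85)]:
    mhz, ok = fits_single_link(w, h, r)
    print(f"{w}x{h}@{r}Hz ~{mhz:.0f} MHz:",
          "fits single link" if ok else "needs dual link (or analog)")
```

The digital side doesn't blur the way analog does; it just runs out of bandwidth, which is why the really high modes need a dual link or fall back to the analog pins.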
 
I remember Sharkeeper mentioning that, in the case of NVIDIA cards, there has been great variation among examples of the same card. So one person's Hercules GF2 GTS might look blurry and another's might look decent.

Tom's Hardware's 2D quality tests indicated the same was true even across different GeForce4 manufacturers: they tested their GF4 cards, and I remember Gainward had one of the best scores. When I got my Gainward Ti4200, its image quality was pretty good, on par with my 2D-quality-modded GeForce2 MX. (Using a Sony CPD-G500, .24mm dot pitch, at 1600x1200x32 @ 85Hz.)

But Tom's test makes one thing clear: even GeForce4 cards don't consistently exhibit high 2D quality.
 
I have the same monitor, the FE950+, but with a Radeon 8500, and yes, the image quality is sweet! 2D is really sharp compared to the same monitor with a GeForce 2 at some LAN places I go to. I can't stand them at LAN places because it just makes the monitor look like crap, when really it's great paired with a good video card!

Oh, and upgrading to the new Catalyst drivers, I could tell that print got sharper. Get a Radeon! I've never tried a GF4 though, so I can't comment on that 🙂.
 