High Res VS FSAA


Alkali

Senior member
Aug 14, 2002
483
0
0
These people who say FSAA is rubbish, I wonder if they have a graphics card worthy of it?

My old GeForce 2 GTS could do 4xFSAA and it looked HORRIBLE, at very low fps, but it was an old card that wasn't equipped to give good FSAA. What do you expect from a card that only gains 2fps with FSAA turned off?
 

eaadams

Senior member
Mar 2, 2001
345
0
0
Why is it that graphics cards can't push the limits of resolution like they can with FSAA and AF? Is it because resolution requires brute force while the others are more driver-oriented?
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
This whole discussion is moot to the poster's needs, b/c with onboard video, he'll have trouble running Pong at 1600x1200 ;)

The only card that can really achieve anything you guys are talking about at 1600x1200 without massive performance hits is the R9700pro; everything else will just run so poorly that any possible benefits will be overshadowed by the onscreen slideshow.

As for the comments about LCD vs CRT, this has been argued over and over. I would tend to disagree that CRTs provide better colors (with DVI there's no comparison), and LCDs don't need high refresh rates since they don't "refresh" like CRTs do. Picture sharpness is no contest either; a good LCD provides much better IQ than a CRT. The price difference has decreased significantly as well, and what we tend to forget is that a large CRT often has the same VIS as an LCD rated 1-2'' smaller. No distortion, warping, or geometry issues, and the screen is 100% flat. My 1900FP is rated at 19'', and it actually is 19''. You'd have to get a 21'' CRT, which isn't that much cheaper (~$200), but is about 8x the size and weighs 5x as much (my LCD weighs 13lbs compared to ~60lbs :Q :Q :Q).

That being said, you'll need a better video card to take full advantage of all that real estate.

Chiz
 

glugglug

Diamond Member
Jun 9, 2002
5,340
1
81
Your 19" FP only does 1280x1024. That's PATHETIC! Hence the reason for using CRTs instead....

Also, a 21" CRT is not just $200 cheaper. That flat panel costs $700 according to Pricewatch. You can get a 21" capable of 2048x1536 at a reasonable refresh rate for under $300. It really is a no-brainer.

As for needing a new video card to take advantage of the real estate: a GeForce 2 MX can do 2048x1440 stably on your Windows desktop (or 2048x1536, but not as stable), and I wouldn't exactly call it a high-end card. You'll have a much harder time finding a card that can do that resolution at a reasonable framerate in games, though.
 

JellyBaby

Diamond Member
Apr 21, 2000
9,159
1
81
I would tend to disagree that CRTs provide better colors
According to something Carmack once wrote, CRTs beat LCDs in color reproduction. IIRC he said the best LCDs can't even reproduce 24-bit color, while CRTs can all bring 32-bit color to you in all its glory. Whether or not your eyes notice the difference is another matter, but technically LCDs lag behind.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: glugglug
Your 19" FP only does 1280x1024. That's PATHETIC! Hence the reason for using CRTs instead....

Also, a 21" CRT is not just $200 cheaper. That flat panel costs $700 according to Pricewatch. You can get a 21" capable of 2048x1536 at a reasonable refresh rate for under $300. It really is a no-brainer.

As for needing a new video card to take advantage of the real estate: a GeForce 2 MX can do 2048x1440 stably on your Windows desktop (or 2048x1536, but not as stable), and I wouldn't exactly call it a high-end card. You'll have a much harder time finding a card that can do that resolution at a reasonable framerate in games, though.

This is the type of ignorant post I'm used to seeing in CRT vs LCD threads. "Only" does 1280x1024? I never ran anything higher on my G420, b/c anything higher was so small and blurry it was useless. There's nothing more fun than playing "Pixel Hunter" on your Windows desktop.
There's not a single game I play at higher than 1280x1024 b/c of performance issues (on a Ti4200 Turbo @ 310/650), and I don't own a 9700pro, so it's never an issue. The fact is, the majority of gamers run games at 1024 or 1280 and turn up detail, AF, or AA rather than maxing out their resolution. The only game I ever ran higher than 1280 (or felt benefited from the increased resolution) was Warcraft III, and at "only" 1280x1024 it still looks better than my G420 did at 1600x1200.

Let's quote Pricewatch prices for 3-year-old monitors from www.refurbedoffleasecrappymonitors.com and forget to include the $50 shipping costs.
If you are serious about your monitor, you are going to want to inspect it and have an easy method of returning it if defective. Sending it halfway across the US only to find it's broken, and then having to re-ship the 60lb behemoth, kinda makes that a problem. Refresh rate is a non-factor with LCDs, as they don't need to refresh their pixels like CRTs do. Not only that, I'd be more concerned with dot pitch than refresh rate anyhow. 2048x1440? Pfft...there's never a time you would need that resolution on anything 21'' or smaller. Your comments make me think you've never seen 2048x1440 in real life, you haven't seen a large LCD in real use (no, those in-store displays don't do them justice), you're bitter b/c you can't afford a large LCD, or you simply don't know what you're talking about. Again, the ONLY video card that could attempt that resolution in games is a Radeon 9700, which still only gets around 30 FPS on average in newer games. Taking min and max framerates into consideration, it'd probably be enough to make me nauseous.

Oh yah, and back to the original poster's thread, none of your comments have any relevance to what he was asking.

Chiz

 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: JellyBaby
I would tend to disagree that CRTs provide better colors
According to something Carmack once wrote, CRTs beat LCDs in color reproduction. IIRC he said the best LCDs can't even reproduce 24-bit color, while CRTs can all bring 32-bit color to you in all its glory. Whether or not your eyes notice the difference is another matter, but technically LCDs lag behind.

That's gotta be dated, especially with DVI. LCDs use true RGB color reproduction from a digital source. Color in all its glory doesn't get much better than that.

Chiz
 

glugglug

Diamond Member
Jun 9, 2002
5,340
1
81
Originally posted by: chizow
That's gotta be dated, especially with DVI. LCDs use true RGB color reproduction from a digital source. Color in all its glory doesn't get much better than that.

No, it's not. The LCD may take a 24-bit DVI signal, but most can only do either 15-bit or 18-bit color (the lowest 2 or 3 bits of the R, G, and B values are clipped off before display). Also, very few if any LCDs produce a true red.
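A minimal sketch (not from the thread) of the clipping described above, assuming an 18-bit panel that drops the low 2 bits of each 8-bit channel: each channel falls from 256 levels to 64, and the total palette from ~16.7M colors to 262,144, which is where banding comes from.

```python
def truncate_channel(value, kept_bits=6):
    """Drop the low (8 - kept_bits) bits of an 8-bit channel value,
    as an 18-bit panel is assumed to do before display."""
    shift = 8 - kept_bits
    return (value >> shift) << shift

# Total colors: 24-bit source vs. 18-bit panel.
print((2 ** 8) ** 3)  # 16777216 colors (24-bit)
print((2 ** 6) ** 3)  # 262144 colors (18-bit)

# Two nearby shades a CRT can distinguish collapse to the same value:
print(truncate_channel(200), truncate_channel(203))  # 200 200
```

The names and the exact clipping scheme here are illustrative; actual panels may dither instead of truncating outright.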

2048x1440 pfft...there's never a time you would need that resolution on anything 21'' or smaller. Your comments make me think you've never seen 2048x1440 in real life, you haven't seen a large LCD in real-use (no those displays in-store don't do them justice), you're bitter b/c you can't afford a large LCD, or you simply don't know what you're talking about.

I use 2048x1536 on a 21" CRT every day and have absolutely zero trouble reading it. (Until recently; now I use 1440x1920 with NVRotate until there's a new driver that supports custom or higher rotated modes...) I used to use 2048x1440 on a 19". The picture was actually noticeably better on the 19", but it couldn't do 1536 rows at a decent refresh. The extra screen real estate is indispensable for looking at source code, plus I often have multiple Terminal Services windows up, and if those windows need to be at least 1280x1024 for practical use, having the main desktop bigger is simply better than having to put them in full-screen mode. My 7-year-old 17" monitor is fine at 1920x1200.

Again, the ONLY video card that could attempt that resolution in games is a Radeon 9700, which still only gets around 30 average FPS in newer games. Taking min and max framerates into consideration, it'd probably be enough to make me nauseous.
Warcraft III has no problem with 1920x1440 on my GeForce 2 GTS. Plenty of other games are at least playable at 1600x1200 (although I admit I usually use 1280x1024 for games due to framerate considerations).
 

eaadams

Senior member
Mar 2, 2001
345
0
0
This was not supposed to be an LCD vs CRT thread. I remember a while ago seeing pictures of the inside of id and saw that Carmack used dual LCDs that were HUGE! Looked so cool. Also, if I remember right, there was once a .plan from someone else at id talking about what the resolution of the human eye was. What I find interesting is how we seem to have given up on pushing up resolution (how fast did we go from a norm of 640 to a norm of 1024?), and now we seem to be adding all these other features instead, and I don't understand why. I wonder if it's a cheaper way to better graphics, or if they are doing it for a good reason.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
well BFG10K, in older games not using fsaa would be the waste.
I agree with you there, specifically with games that don't support high resolutions, and I have absolutely no qualms about using the Radeon 9700's awesome 6x FSAA in those situations. But if it's a choice between high resolution and FSAA, high resolution always takes precedence, even with the Radeon's great implementation. I never enable both together unless I'm getting ridiculous performance levels even with both of them on.

These people who say how rubbish FSAA is, I wonder if they have a Graphics card worthy of it?
I've tried almost every single FSAA implementation ever made (on retail cards) and none of them comes close to true high resolution. No matter what implementation you choose, you still have the fundamental problem that all of your great sampling algorithms have to generate pixels that ultimately fit back down into your limited screen size.
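The point above can be sketched in a few lines (a toy 1D model, not from the thread): supersampling FSAA renders extra samples and averages them back down, so an edge gets softened into gray coverage values, but the number of screen pixels never changes.

```python
def coverage_row(width, edge_x):
    """Hard rasterization of a vertical edge: a pixel is fully on
    if its center lies left of the edge, else fully off."""
    return [1.0 if (i + 0.5) / width < edge_x else 0.0 for i in range(width)]

def fsaa_row(width, edge_x, factor=4):
    """Toy supersampling FSAA: sample at factor*width, then average
    groups of `factor` samples back into `width` screen pixels."""
    fine = coverage_row(width * factor, edge_x)
    return [sum(fine[i * factor:(i + 1) * factor]) / factor
            for i in range(width)]

# An edge at x = 0.33 on an 8-pixel row:
print(coverage_row(8, 0.33))  # hard 1 -> 0 step (jaggy)
print(fsaa_row(8, 0.33))      # boundary pixel becomes 0.75 (smoothed)
```

Either way the output row is still only 8 pixels wide, which is the "fit back down into your limited screen size" problem.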

High resolution is the only true fix because it physically gives you more pixels to use. This increases the rendering accuracy of the image (especially at long range), reduces the artifacts caused by the conversion of world co-ordinates (infinite) to screen co-ordinates (finite), reduces pixel popping, reduces texture shimmering, reduces jagged edges and reduces edge crawling. It also sharpens the whole image at the same time, which is something FSAA doesn't (and can't ever) do. In addition, very large textures will look much better at a straight high resolution than they ever will at low resolution + FSAA.
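A quick way to see the accuracy argument (an illustrative sketch, with an arbitrary slope I chose): snap a straight edge to pixel rows and measure the worst-case deviation of the stair-stepped result. The error is bounded by half a pixel, so it shrinks in direct proportion to the resolution.

```python
def max_edge_error(rows, cols=1000, slope=0.37):
    """Worst-case distance (in fractions of the screen height) between
    the true edge y = slope * x and its pixel-snapped approximation."""
    worst = 0.0
    for c in range(cols):
        x = c / cols
        true_y = slope * x
        snapped_y = round(true_y * rows) / rows  # snap to nearest pixel row
        worst = max(worst, abs(true_y - snapped_y))
    return worst

low = max_edge_error(rows=480)    # bounded by 0.5/480 of screen height
high = max_edge_error(rows=1200)  # bounded by 0.5/1200
assert high < low  # more pixels -> smaller jaggies, closer to the true edge
```

Double the rows and the worst-case stair-step error halves, which is the "infinite resolution" limit argument in miniature.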

If you had an infinite resolution you could draw anything you liked at any angle and at any distance and you'd never have any of the problems mentioned above. Of course that's not possible, but we can certainly keep increasing the resolution and things will improve each time. Eventually (probably when each pixel is invisible to the naked eye) you'll come very close to reaching this "infinite resolution goal", if not effectively reaching it already.

Getting back to your comment, some FSAA implementations are better than others, and the Radeon 9700 Pro's 6x FSAA is the best I've ever seen, especially since it handles the image quite smartly and doesn't cause excessive blurring. I will gladly use its 6x FSAA when high resolutions are not available. OTOH, something like Quincunx just looks like a monkey's rear end and I'd never use it.

I would tend to disagree that CRTs provide better colors (with DVI there's no comparison),
Disagree all you like, no LCD can match the colours, saturation and contrast of a good CRT. How many graphics pros do you think use LCDs when they require precise colour matching? Not too many, I'd imagine, if any at all. Also, CRTs don't have the issue of blurriness, can run at any resolution you like without distortion (assuming a 4:3 ratio, of course), and looking at the screen from an angle other than dead straight on doesn't screw up the image you see. Have you ever tried looking at an LCD screen from a 45-degree angle? It's a totally different world.

and they don't need high refresh rates since LCDs don't "refresh" like CRTs do.
For normal desktop viewing, perhaps not, but for games, yeah, it makes a huge difference. While I agree that there's a difference between 120 FPS and 60 FPS on a 60 Hz monitor, it's far better to have 120 FPS on a 120 Hz monitor. Framerate is simply the number of images drawn per second, and LCDs are inferior in that respect because their effective refresh rate is much lower than a good quality CRT's.
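The refresh-rate argument reduces to a simple cap (a rough sketch, ignoring tearing and frame pacing): the display can only present as many unique images per second as it refreshes, so rendered frames beyond that are never seen.

```python
def displayed_fps(render_fps, refresh_hz):
    """Unique frames per second the user actually sees, assuming the
    display shows at most one new image per refresh (tearing ignored)."""
    return min(render_fps, refresh_hz)

print(displayed_fps(120, 60))   # 60: half the rendered frames are wasted
print(displayed_fps(120, 120))  # 120: a 120 Hz CRT shows them all
print(displayed_fps(45, 60))    # 45: below the cap, the renderer is the limit
```

So a card pushing 120 FPS only pays off visually on a display that can refresh that fast, which is the advantage being claimed for high-refresh CRTs here.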

No distortion, warping, geometric issues, and the screen is 100% flat.
That sounds like something any good quality CRT can also provide - a CRT that'll be half the price of the LCD, much bigger and run at far higher refresh rates and resolutions.

not a single game I play at higher than 1280x1024 b/c of performance issues (on a Ti4200p Turbo @ 310/650),
I play games at 1600 x 1200 or higher. Also, why should I buy an LCD that limits me to one magic resolution when I could buy a far cheaper CRT that goes higher and runs at any setting I like?

Also, what happens when your Ti4200 can't handle 1280 x 1024? Either you suck it up, upgrade your card, or lower the resolution and get greeted with a distorted picture. OTOH, a CRT owner can just run at 1152 x 864 or whatever 4:3 resolution he/she likes.

CRTs are cheaper, have more options, are bigger, have higher refresh rates and have much better colours.