
External antialiasing?

cirthix

Diamond Member
Is there some sort of way to do antialiasing on a complete signal? I'm talking about some sort of box that solely does antialiasing, nothing more. Kind of like a box that goes between your monitor and vid card.
 
Originally posted by: cirthix
Is there some sort of way to do antialiasing on a complete signal? I'm talking about some sort of box that solely does antialiasing, nothing more. Kind of like a box that goes between your monitor and vid card.

It would be possible to do this. However, there is nothing inherent in the signal that would allow the box to distinguish between lines/edges and surfaces, so the "antialiasing" would end up looking more like a blur. You could build a system that does line detection and the like, but that would introduce a real, unavoidable delay between the time the signal leaves the computer, passes through the box, and reaches the monitor. You'd pretty much be plugging your computer into ANOTHER computer just to get AA.

There is no reason for such a system to exist separately from the computer that is the source of the original signal.

Of course, if you don't need real time, you can just capture the video and use a video editing program to do the work.
 
k, i was thinking it could help cards make the image a little smoother while keeping the same framerate. i guess it's less work just to get a better card or oc it. i just always wondered about it. thanks for the quick reply
 
It is not possible to do this.

A display of 1000 pixels across can only represent a signal with a maximum frequency of 500 cycles (0.5 cycles/pixel) across. When a model is rendered, it is equivalent to sampling it, and it may have features that go beyond 500 cycles. That high frequency energy gets mapped into noise (like moire patterns or jaggies on edges), but the noise is below 500 cycles. That is, the high frequency is "aliased" to a low frequency, that's what the term means.

There is no way to tell which low frequencies are legitimate and which are aliases of some high frequency.
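A quick sketch of why the aliases are indistinguishable (not from the thread, just an illustration): on a 1000-pixel display, a 700-cycle signal produces exactly the same pixel samples as a legitimate 300-cycle signal, since 700 folds back to 1000 - 700 = 300. The frequencies chosen here are arbitrary examples.

```python
import math

PIXELS = 1000               # display width; Nyquist limit is 500 cycles
f_high, f_alias = 700, 300  # 700 cycles aliases to 1000 - 700 = 300

# Sample each signal once per pixel, as rendering effectively does.
high  = [math.cos(2 * math.pi * f_high  * n / PIXELS) for n in range(PIXELS)]
alias = [math.cos(2 * math.pi * f_alias * n / PIXELS) for n in range(PIXELS)]

# Once sampled, the two signals are identical: no box downstream of the
# samples can tell which one was actually rendered.
print(max(abs(a - b) for a, b in zip(high, alias)))  # a value near 0
```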

Graphics cards antialias by sampling at a higher frequency and then applying a digital low-pass filter to remove the high frequencies before displaying the pixels.
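A minimal 1D sketch of that idea (my illustration, with arbitrary numbers): render a hard edge at 4x the display resolution, then average each group of 4 subsamples, a crude box low-pass filter. The pixel the edge falls inside gets an in-between value instead of a jagged hard step.

```python
FACTOR = 4        # supersampling factor
WIDTH  = 8        # display pixels (supersample space is 32 wide)
edge_at = 18      # edge position in supersample space, chosen arbitrarily

# "Render" a hard edge at 4x resolution: 1.0 left of the edge, 0.0 right.
supersampled = [1.0 if i < edge_at else 0.0 for i in range(WIDTH * FACTOR)]

# Box-filter down to display resolution: average each group of 4 subsamples.
display = [sum(supersampled[p * FACTOR:(p + 1) * FACTOR]) / FACTOR
           for p in range(WIDTH)]

print(display)  # the pixel containing the edge gets the in-between value 0.5
```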
 
Like Don said, once the data has reached the monitor cable, you've already thrown away all the data you would need for AA. The entire point of AA is that you render the relevant parts of the image at a higher resolution to get rid of jaggies. The very earliest AA cards did little more than internally render a higher-resolution scene, and it showed in the performance. If you want a "poor man's AA", take a look at LCD interpolation. It does something like what you propose, but it's not a replacement for AA.
 
You need more samples than the display has pixels to antialias the image. If a box downsampled the signal it receives by 2x in each direction, you'd only get a quarter of the original resolution. And if you think shifting the image by a few pixels and blending would do it, you'd only get a blurry mess.
 