I assume you mean bilinear, trilinear and anisotropic filtering. As far as I'm aware, they are broadly similar in spirit to full-screen anti-aliasing, but applied to the textures rather than to the edges of polygons.
To see the effect, try turning off whatever filtering you can and looking at a textured surface that recedes into the distance. As it gets further away, the pattern is quickly lost - particularly fine features such as the gaps between stones on a floor or wall. With the filters turned on, you'll see those features survive much further into the distance. The downside is that you have to do more work per pixel, sampling the texture several times to get some sort of average of what each pixel in the image should show.
Take a 256x256 texture applied to a surface covering 64x64 pixels in the 3D image. With no filtering, I assume the renderer would just pick one single texel from the texture for each pixel in the image. A better result would come from averaging a 4x4 block of the original texture (since 256/64 = 4) and painting that onto the image. However, doing that naively for every pixel would take a lot of computation, so I'd assume a much more efficient method is used in practice. On the other hand, a texture might fit into cache, allowing very fast computation, so perhaps it really is done that way.
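To make the difference concrete, here's a minimal Python sketch of that 256-to-64 case. The texture pattern and function names are made up for illustration; real hardware uses precomputed mipmaps rather than averaging texels on the fly like this:

```python
# Sketch: shrinking a 256x256 texture onto 64x64 pixels, comparing
# "no filtering" (pick one texel) with a 4x4 box average.
# The grout-line texture below is a made-up example.

SRC, DST = 256, 64
SCALE = SRC // DST  # 4 texels per output pixel in each axis

def texel(x, y):
    # High-frequency test texture: dark grout lines every 8 texels,
    # light "stone" elsewhere (brightness 0..255).
    return 0 if (x % 8 == 0 or y % 8 == 0) else 255

def nearest(px, py):
    # No filtering: one texel per output pixel. Whether a pixel lands
    # on a grout line or not is essentially luck, so the pattern aliases.
    return texel(px * SCALE, py * SCALE)

def box_average(px, py):
    # Average the whole 4x4 block of texels covered by this output pixel,
    # so grout lines contribute a proportionate amount of darkness.
    total = sum(texel(px * SCALE + dx, py * SCALE + dy)
                for dy in range(SCALE) for dx in range(SCALE))
    return total // (SCALE * SCALE)
```

With `nearest`, every output pixel is either full grout (0) or full stone (255), so the fine lines either vanish or blow up into thick stripes. With `box_average`, a pixel that partly covers a grout line comes out as an intermediate grey, which is why the pattern stays recognisable further into the distance.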
Anyway, the Radeon seemed to do a better job at this than a GeForce 2. In my subjective opinion, a Radeon looked better than a GeForce running a whole resolution higher, which is why I chose a Radeon for my computer.