So much bad information in this thread and I only got to page 2.
The premise is sound: a higher PPI (more pixels per inch) is better than any form of anti-aliasing, and with a high enough PPI your eyes would no longer be able to discern aliasing in a scene.
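To make that concrete, here's a minimal sketch (plain Python, with a made-up diagonal edge standing in for the scene) of brute-force supersampling, which is effectively what higher PPI buys you at a fixed viewing size: more samples per pixel, and the jaggies fade without any post-processing.

```python
# Minimal sketch: point sampling vs. averaging more samples per pixel.
# The "scene" is just a hypothetical hard geometric edge, not a real renderer.

def scene(x, y):
    # 1.0 on one side of the line y = 0.6 * x, 0.0 on the other.
    return 1.0 if y < 0.6 * x else 0.0

def render(width, height, samples_per_axis):
    """Average samples_per_axis^2 sub-samples per pixel (brute-force SSAA)."""
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            total = 0.0
            for sy in range(samples_per_axis):
                for sx in range(samples_per_axis):
                    # Regular grid of sub-sample positions inside the pixel.
                    x = px + (sx + 0.5) / samples_per_axis
                    y = py + (sy + 0.5) / samples_per_axis
                    total += scene(x, y)
            row.append(total / samples_per_axis ** 2)
        image.append(row)
    return image

def show(image):
    shades = " .:-=+*#%@"  # darker character = more coverage
    for row in image:
        print("".join(shades[int(v * (len(shades) - 1))] for v in row))

show(render(32, 12, 1))   # 1 sample/pixel: hard stair-stepped edge
print()
show(render(32, 12, 4))   # 16 samples/pixel: edge pixels take in-between values
```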
However, the reality is that in the short term we're unlikely to be able to increase PPI enough to eliminate the need for AA entirely.
As a performance trade-off, MSAA and other optimized forms of AA can deliver results faster than simply increasing the PPI, but then you'd expect that: increasing the PPI leaves you with a perfect picture, whereas AA, especially shader-based AA, leaves you with a blurred scene and an imperfect result.
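And to illustrate why the shader-based approaches blur: a post-process pass only ever sees the finished pixels, so when it detects contrast, all it can really do is average neighbours together. Rough sketch below (my own simplified stand-in, not the actual FXAA/MLAA algorithms), reusing the render/show helpers from the sketch above:

```python
def postprocess_aa(image, threshold=0.25):
    """Crude FXAA-style pass (simplified, not the real algorithm): blend any
    pixel with its neighbours when local contrast exceeds a threshold."""
    height, width = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(1, height - 1):
        for x in range(1, width - 1):
            neighbours = [image[y - 1][x], image[y + 1][x],
                          image[y][x - 1], image[y][x + 1]]
            contrast = max(neighbours + [image[y][x]]) - min(neighbours + [image[y][x]])
            if contrast > threshold:
                # No sub-pixel information is available at this stage, so the
                # only option is to average -- i.e. blur. Texture detail and
                # thin features get softened along with geometric edges.
                out[y][x] = (image[y][x] + sum(neighbours)) / 5.0
    return out

# e.g. show(postprocess_aa(render(32, 12, 1)))
```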
We have anti-aliasing to stop aliasing, and we have aliasing because we have imperfect displays. Our ultimate goal should always be to create perfect displays, that is to say, displays so good our eyes cannot distinguish between the display and reality.
In the meantime we need to get past this idea of maintaining ~100 PPI or less for monitors. It would be really nice to see something like 2560x1600 shrunk down to maybe 22-24". The PPI on the 2560x1440@27" and 2560x1600@30" displays is already an improvement over the smaller monitors (somewhat counter-intuitively, since those larger panels happen to ship with proportionally higher resolutions), and on my 2560x1600@30" display I can already see diminishing returns, with 8xAA not giving a noticeable benefit over 4xAA.
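For what it's worth, PPI is just the diagonal pixel count divided by the diagonal size in inches. Quick numbers for the displays I mentioned, plus a typical 24" 1920x1200 panel for comparison and a hypothetical 2560x1600 shrunk to 24" (Python sketch, the figures are just arithmetic):

```python
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch = diagonal resolution / diagonal size in inches."""
    return hypot(width_px, height_px) / diagonal_in

# Displays mentioned above, plus a common 24" panel and a hypothetical
# 2560x1600 shrunk down to 24".
for name, w, h, d in [
    ('1920x1200 @ 24"', 1920, 1200, 24),
    ('2560x1440 @ 27"', 2560, 1440, 27),
    ('2560x1600 @ 30"', 2560, 1600, 30),
    ('2560x1600 @ 24"', 2560, 1600, 24),
]:
    print(f'{name}: {ppi(w, h, d):.0f} PPI')

# Roughly 94, 109, 101 and 126 PPI respectively -- all still a long way
# from pixels being indistinguishable at desktop viewing distances.
```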
One thing is for sure: with the gaming scene shifting towards a console-dominated market, we're going to stop seeing a steady increase in graphical fidelity over time and instead only see major jumps every 6-7 years. Meanwhile, as graphics power continues to increase, we need something to "spend" it on, and higher-PPI monitors are a great way to put that extra horsepower to use.