Poly edges are sub-pixel in accuracy. To make edge aliasing invisible to the human eye, the best approach is to make the pixels smaller than the eye can discern individually. Do that and you eliminate visible poly-edge aliasing, unless of course you can't display the image at the native resolution. If you are seeing edge aliasing, the game isn't running at the native resolution.
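To put a rough number on "smaller than the eye can discern": a common rule of thumb is that the eye resolves about 1 arcminute. A quick sketch, where the ~264 PPI density and the 15-inch viewing distance are assumed illustrative figures, not measurements:

```python
import math

# Hedged sketch: is a single pixel below the eye's ~1 arcminute
# resolving limit? PPI and viewing distance are assumed figures.
def pixel_arcminutes(ppi: float, distance_in: float) -> float:
    pixel_size_in = 1.0 / ppi  # physical pixel pitch in inches
    return math.degrees(math.atan2(pixel_size_in, distance_in)) * 60.0

# A ~264 PPI panel viewed from ~15 inches:
angle = pixel_arcminutes(264.0, 15.0)
print(f"{angle:.2f} arcmin")  # under 1 arcmin -> individual pixels indiscernible
```

At those assumed numbers the pixel subtends under an arcminute, which is the whole point: at native resolution the individual stair-steps are too small to see.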
Which metric would you like me to use? Compute power, where desktop GPUs have been in the TFLOP range for years? Geometric throughput? Which metric? No matter which you decide on, the A5X is going to be utterly humiliated to an absurd degree. Having monster specs for an SoC doesn't mean it is in the league of even the weakest current desktop GPU.
Those benches won't change, because they measure primitive throughput. Perhaps I should say those benches shouldn't change, because they measure primitive throughput.
I was quite clear, long before we had any of the details, on what matters in a quality display. Contrast was at the top of my list, well before we had any reviews of the new iPad. The reviews are in: the new iPad loses to the dollar-store-quality Kindle Fire in the most important metric. Or you can take the word of someone who can't tell the difference between bilinear and trilinear filtering at a glance; based on your comments, you wouldn't be able to either.
As far as performance, the benchmarks back me up, across the board actually. You can keep your devotion going, but the performance data spells out exactly what I said: with both running at their native resolution, the new iPad has inferior performance to the iPad 2 in anything GPU intensive.
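The arithmetic behind the native-resolution claim is simple. The pixel counts below are the actual native resolutions; the GPU-scaling factor is an assumed figure for illustration, not a benchmark result:

```python
# Hedged sketch of the native-resolution argument. Pixel counts are the
# real panel resolutions; the GPU scaling factor is an assumption.
new_ipad_px = 2048 * 1536  # new iPad (Retina) native resolution
ipad2_px = 1024 * 768      # iPad 2 native resolution

pixel_ratio = new_ipad_px / ipad2_px
print(pixel_ratio)  # 4.0 -> four times the pixels to fill every frame

# If the A5X delivered, say, 2x the A5's effective fill rate (assumed
# figure), the per-pixel budget at native resolution is halved:
assumed_gpu_scaling = 2.0
per_pixel_budget = assumed_gpu_scaling / pixel_ratio
print(per_pixel_budget)  # 0.5 -> half the iPad 2's per-pixel headroom
```

Unless the GPU scales by the full 4x in the workload that matters, per-pixel performance at native resolution goes down, which is what the GPU-intensive benches show.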
Umm... you're comparing graphics cards that run in a 300W envelope to an SoC that runs in a 4W envelope. That is a 75x difference in TDP. Any performance comparison is absurd at that point...