Never mind the fact that 99% of users don't have a GPU capable of supporting 2560x1600.
Yes they do; the HD 4000 can drive 2560 x 1600 fairly easily for 2D applications. Using Word or checking your email doesn't require much GPU power. Demanding 3D games are obviously another story, but on the whole, very few people play demanding games on their computers.
Note: yes, the rMBP does seem to have some problems with the HD 4000 displaying smooth images on the screen, but there are two things to be aware of. First, OS X renders at a higher resolution and then scales down; for the "looks like 1080p" setting, the entire image is rendered at 3840 x 2160 and then scaled down to the panel. Secondly, especially in the sub-60-fps web-browsing scenarios, some of the problem is with how the software is written: the CPU process displaying the webpage uses only a single core, which is often maxed out and bottlenecks the computer, creating stuttering. A software fix should eliminate this.
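To put numbers on that scaling step, here is a minimal sketch of the arithmetic, assuming a 2x backing scale factor; the function name is just for illustration, not any real API:

```python
# Minimal sketch of the HiDPI scaling arithmetic described above,
# assuming a 2x backing scale factor (names are illustrative, not a real API).

def backing_resolution(looks_like_w, looks_like_h, scale=2):
    """Resolution actually rendered before the image is downscaled to the panel."""
    return looks_like_w * scale, looks_like_h * scale

w, h = backing_resolution(1920, 1080)   # the "looks like 1080p" setting
print(f"Rendered internally at {w} x {h} ({w * h / 1e6:.1f} M pixels), "
      f"then downscaled to the 2880 x 1800 panel")
# -> Rendered internally at 3840 x 2160 (8.3 M pixels), then downscaled ...
```

That's roughly four times the pixels of a plain 1080p frame before the downscale even happens, which is why the HD 4000 has to work hard even for 2D work.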
Resolution makes a huge difference with regard to AA. I have two 15.6-inch laptops, one at 1366 x 768 and the other at 1080p. The 1366 x 768 one NEEDED AA if you were going to game, to get rid of the jagged lines. The laptop with the 1080p screen does not really need AA; sure, if you look for it you can see jaggies, but you have to be fairly close to the screen and actively looking for them. 2x AA at 1080p is easily equivalent to 8x AA at 1366 x 768. Anyone running a game at native res on the 15-inch rMBP does not need AA, as the pixels are so small they effectively cannot be seen.
And yes, text is smaller at 1080p, but set font scaling to 125% in Windows and everything is fixed.
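For reference, a quick back-of-the-envelope comparison of the pixel densities involved (the diagonals are nominal spec-sheet sizes, so the numbers are approximate):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch given the pixel dimensions and the diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

panels = [
    ("15.6-inch 1366 x 768",       1366,  768, 15.6),
    ("15.6-inch 1920 x 1080",      1920, 1080, 15.6),
    ("15.4-inch rMBP 2880 x 1800", 2880, 1800, 15.4),
]

for name, w, h, diag in panels:
    print(f"{name}: {ppi(w, h, diag):.0f} ppi")
# ~100 ppi vs ~141 ppi vs ~221 ppi: the denser the panel, the less
# visible the jaggies at a given viewing distance, so less AA is needed.
```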
[Screenshot: Skyrim at minimum settings at 1080p (textures high)]
[Screenshot: Skyrim ultra (shadows med) at 1280 x 720]
The ultra quality at 1280 x 720 looks better (mainly because of AF) but gets significantly worse frame rates. (Note: both screenshots were uploaded at the same resolution, because otherwise one would be much bigger than the other, which would defeat the PPI argument.) Ultra at 720p gets 56 fps. Low with high textures at 1080p gets 60 fps at 75-80% GPU usage; without v-sync this would probably be around 75 fps. Adding AF (which costs almost nothing but would clear up the muddy ground) and bumping a few other settings would likely make the two identical, or very close, at the same frame rate. (With AF the two look almost identical, but the 1080p one gets higher fps.) Disregarding texture quality, there are slightly more jaggies on the 1080p low screenshot.
Running everything maxed at 1080p including shadows gets 35 fps for that scene.
Running everything maxed at 720p including shadows gets 48 fps for that scene.
(Note: no FXAA, because it looks horrible.)
The relationship between resolution and fps is not linear.
1080p is 1920 x 1080, or about 2.07 M pixels; 720p is 1280 x 720, or about 0.92 M pixels, so 1080p has roughly 2.25 times the pixels. Yet 720p gets 48 fps instead of the expected ~79 (35 x 2.25 ≈ 79).
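A quick sanity check of that arithmetic, using the exact pixel counts rather than rounded millions:

```python
# Sanity check of the pixel-count arithmetic, using my maxed-out numbers
# (35 fps at 1080p, 48 fps at 720p for the same scene).

px_1080p = 1920 * 1080   # 2,073,600 pixels
px_720p  = 1280 * 720    #   921,600 pixels

ratio = px_1080p / px_720p           # 2.25
expected_720p_fps = 35 * ratio       # ~79 fps if fps scaled linearly with pixel count
actual_720p_fps = 48

print(f"1080p has {ratio:.2f}x the pixels of 720p")
print(f"Expected 720p fps under linear scaling: {expected_720p_fps:.0f}")
print(f"Actual 720p fps: {actual_720p_fps}")
# 48 vs ~79 -- the relationship between resolution and fps is clearly not linear.
```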
http://www.tomshardware.com/reviews/digital-storm-x17-radeon-hd-7970m-enduro,3345-6.html
Despite the fact that 1080p has about 2.25x the pixels of 720p, you will not see a single instance in which 720p gets twice the fps that 1080p does at the same settings.
1080p with 2x AA gets 40 fps (roughly comparable to 720p with 8x AA with regard to jaggies).
1080p without any AA gets 45 fps.
So 720p with 8x AA ends up roughly where 1080p with no AA does at the same settings, at least for Skyrim.
http://www.tomshardware.com/reviews/digital-storm-x17-radeon-hd-7970m-enduro,3345-7.html
Supporting my data, there is very little variation in frame rate between resolutions at the ultra setting.
I have only tested one game with a static scene, so more data points are needed for a real analysis. Personally, I would rather play at 1080p on medium than at 720p on ultra.
Computer specs:
Lenovo Y580
8 GB RAM
i7-3630QM
GTX 660M (runs at 1085/2500)