I just tried 4K (on my 1080p display) for Elite: Dangerous again.
Yeah, DSR/VSR can deliver better IQ than AA alone, but the cost is steep. My R9 290 needs its memory OC'd to 1500 MHz at increased voltage to get a decent framerate; the card is purely bottlenecked by memory speed at 4K, so the OC gave a nice bump (~35 fps -> 42 fps in station at medium/high settings). I also had to max out the power limit and the fan speed to get the most out of it, so the GPU is working much harder and pumping out a lot more heat.
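Just to sanity-check the "purely bottlenecked by memory speed" claim, here's some napkin math (a sketch, assuming the stock R9 290 memory clock of 1250 MHz on its 512-bit bus):

```python
# Does the fps bump line up with the memory OC? (R9 290: 512-bit bus,
# GDDR5, stock memory clock assumed to be 1250 MHz -> 5 Gbps effective)

def bandwidth_gb_s(mem_clock_mhz, bus_width_bits=512, gddr5_rate=4):
    """Effective memory bandwidth in GB/s (GDDR5 moves 4 bits per clock per pin)."""
    return mem_clock_mhz * 1e6 * gddr5_rate * (bus_width_bits / 8) / 1e9

stock = bandwidth_gb_s(1250)  # ~320 GB/s
oc    = bandwidth_gb_s(1500)  # ~384 GB/s

print(f"stock: {stock:.0f} GB/s, OC: {oc:.0f} GB/s, gain: {oc / stock - 1:.0%}")
# If the game really is bandwidth-bound, fps should scale by the same factor:
print(f"predicted fps after OC: {35 * oc / stock:.0f}")  # ~42, matching what I saw
```

A 1250 -> 1500 MHz OC is a 20% bandwidth increase, and 35 fps -> 42 fps is a 20% framerate increase, so the numbers line up almost perfectly with a pure bandwidth bottleneck.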
I don't think it's worth it. The IQ gain is real, but efficiency goes out the window, and the display's native resolution still caps how much of that detail you actually see. I don't see the point in maxing out my GPU on a resolution that is mostly wasted on an inferior display.
Going back to native 1080p, I do notice that text is a little blurrier and there is a bit more aliasing. At maxed settings at 1080p I get a consistent 60 fps; at similar settings at 4K I get 32 fps. At the distance I normally sit from the monitor, I can barely tell the difference on most textures. Small text is strikingly sharper at 4K, but it isn't worth the huge performance drop.
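For scale, here's the raw pixel math (note the 60 fps at 1080p is almost certainly a vsync cap, so the true gap is bigger than the fps ratio suggests):

```python
# 4K DSR renders exactly 4x the pixels of native 1080p.
native = 1920 * 1080   # 2,073,600 pixels
dsr_4k = 3840 * 2160   # 8,294,400 pixels

print(f"pixel load: {dsr_4k / native:.0f}x")   # 4x
print(f"fps drop:   {60 / 32:.2f}x")           # ~1.9x (capped at 60, so really more)
```

So the GPU is shading 4x the pixels for a difference I mostly can't see from normal viewing distance.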
For those of us playing modern games, DSR/VSR probably has little appeal unless we have the GPU grunt to spare. If I owned two GTX 980 Tis, I'd probably own a 4K monitor as well, making DSR irrelevant. I see it as a good way to stress-test our hardware, but other than that, it's kinda gimmicky imo.
I guess it's only worth it if 4K DSR/VSR can be run while still maintaining the monitor's refresh rate.