
2560x1440 on native 1920x1080 screen? lol

Rendering at a resolution larger than your monitor's native resolution is a known trick. It produces better image quality. It is very similar to using SSAA; I believe it's even better. It's a bit like rendering internally at a higher colour depth, even when the output is still only 24-bit.
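To make the SSAA comparison concrete, here's a toy sketch (illustrative only, not how any particular GPU does it): render a hard diagonal edge at double resolution, then box-filter each 2x2 block down to the target size. The averaging is what kills the jaggies.

```python
import numpy as np

def render_edge(w, h):
    """1.0 above the diagonal, 0.0 below -- a worst case for aliasing."""
    y, x = np.mgrid[0:h, 0:w]
    return (x * h > y * w).astype(float)

def downsample_2x(img):
    """Average each 2x2 block: 4 samples per output pixel, like 4x SSAA."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

hi = render_edge(64, 64)   # stand-in for the oversized render
lo = downsample_2x(hi)     # stand-in for what reaches the screen
# Where the four high-res samples disagreed, the output pixel is an
# intermediate grey instead of pure black/white -- that's the anti-aliasing.
print(sorted(set(np.round(lo.ravel(), 2))))
```

The same idea scales to any integer factor; non-integer factors (like 1440p to 1080p) need a fancier filter but blend edges the same way.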

There are people whose hobby is producing nice screenshots from games. They use tools like EnbSeries or SweetFX, but they also render at huge resolutions to produce their screenshots. For some reason this works better than simply using SSAA. They don't care about the performance hit, because they don't care about the running game, only the stills (= screenshots).

Here is an example of a guy (jim2point0) who makes very nice screenshots.
https://secure.flickr.com/photos/jim2point0/sets/
 
I guess I did not realize it would be as simple as just creating a larger custom resolution. Not only did I think my screen would not display it properly in games, but I thought my cable would be a limitation.


Yes, the monitor doesn't magic more pixels into existence; instead it's performing a supersampling process through its scaler hardware. Interesting point from VirtualLarry about the potential for toyota to have a DVI-I connection between his monitor and GPU.

The screenshot size is easily explained: that's the resolution the GPU is rendering at. Still no magic pixels, just good scaling hardware in a lot of those non-budget Dell monitors.

And yes, I am almost positive that I have DVI-I.
 
Then AGAIN, tell me why, in games, I am getting crisper visuals, less aliasing, and the performance hit of 2560x1440? If I don't really have those pixels there, then the image should look the same, not show up as 2560x1440 in a screenshot, and not impact performance.

The performance of 1440p is because your GPU is rendering a 1440p image. The picture looks cleaner because you have "zoomed out": you are taking a 1440p picture and squeezing it into 1080p. Open a high-resolution picture in Windows Picture Viewer and zoom in on it and you will see jagged edges, but as you zoom out the edges are blended and the picture looks crisper.
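A one-dimensional toy version of that "squeezing" (purely illustrative): halving resolution by simply dropping pixels keeps the hard steps, while halving by averaging blends them.

```python
import numpy as np

# A strip of "1440p" pixels with two hard edges
jaggy = np.array([0., 0., 0., 1., 1., 1., 0., 0.])

nearest  = jaggy[::2]                         # drop pixels: edges stay hard
averaged = jaggy.reshape(-1, 2).mean(axis=1)  # blend pairs: edges soften

print(nearest)    # only pure 0s and 1s survive
print(averaged)   # a 0.5 grey appears where an edge crosses a pixel pair
```

Real downscalers use better filters than a plain pair average, but the blending is the same reason the squeezed image looks smoother.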
 
The performance of 1440p is because your GPU is rendering a 1440p image. The picture looks cleaner because you have "zoomed out": you are taking a 1440p picture and squeezing it into 1080p. Open a high-resolution picture in Windows Picture Viewer and zoom in on it and you will see jagged edges, but as you zoom out the edges are blended and the picture looks crisper.

So I have the visuals and performance of 1440 but it's not 1440? Sure makes sense. lol

And by "zoomed out" I hope you don't mean it's literally a different image I am seeing in the game, because 1080 and 1440 line up perfectly. It's just that 1440 looks crisper and has less aliasing. And the screenshots at 2560x1440 look exactly like 2560 screenshots should.
 
So I have the visuals and performance of 1440 but it's not 1440? Sure makes sense. lol

And by "zoomed out" I hope you don't mean it's literally different spots I am seeing in the game, because 1080 and 1440 line up perfectly. It's just that 1440 looks crisper and has less aliasing.

No, I don't. I will leave this to someone else to explain, since you continue to fail to grasp my explanations :whistle:
 
I don't have any newer Dell monitors to check with. On the two older ones at the house, when you hit the menu button it shows the current resolution. Do they still work this way? What's shown, if so?
 
I was doing this like 10 years ago when encoding video.

You'd scale up to a massive resolution to apply noise reduction and sharpening, then downscale it back to the normal resolution, and you'd get a bit more of a "natural" image. Ran super fast on my 2.8GHz A64 S939!
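That upscale → filter → downscale workflow can be sketched like this (purely illustrative: a naive 2x upscale and a box blur stand in for a real encoder's resampler and noise-reduction filter):

```python
import numpy as np

rng = np.random.default_rng(0)
# A 16x16 "frame": a smooth gradient plus sensor-style noise
frame = np.clip(np.linspace(0, 1, 16)[None, :]
                + rng.normal(0, 0.2, (16, 16)), 0, 1)

# Step 1: upscale 2x (naive pixel repetition)
up = frame.repeat(2, axis=0).repeat(2, axis=1)

# Step 2: denoise at the larger size (3x3 box blur as a stand-in)
k = np.ones((3, 3)) / 9.0
pad = np.pad(up, 1, mode="edge")
denoised = sum(pad[i:i + up.shape[0], j:j + up.shape[1]] * k[i, j]
               for i in range(3) for j in range(3))

# Step 3: downscale back to the original size by block averaging
down = denoised.reshape(16, 2, 16, 2).mean(axis=(1, 3))

print(frame.std(), down.std())  # the processed frame is smoother
```

Filtering at the larger size lets the kernel work at sub-(original)-pixel scale, which is one plausible reason the result looks more "natural" than filtering at native resolution.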

But seriously, this has been around for ages and is very easy with NVidia cards. I was playing Dishonored at just under 4K and it looked amazing. All you're doing, as other people have said, is adding SSAA, which is always going to be better than MSAA. And post-process AA like FXAA will work better with more pixel information too (most likely blurring less).

It still randomly stops working, though. My machine currently won't even accept 2560x. Probably need to nuke the drivers and start fresh or something, blah :/
 
And yes, I am almost positive that I have DVI-I.

The reason that I brought that up is that DVI-I carries both a 165MHz TMDS signal (on the digital pins) and a traditional analog VGA signal.

So the question is: if both GPU and monitor are DVI-I capable (capable of both digital and analog signals), which type is it using?
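Some back-of-envelope pixel-clock arithmetic shows why this matters (the ~25% blanking overhead is an approximation; exact CVT/CVT-RB timings differ slightly):

```python
# Single-link DVI TMDS limit, per the spec
SINGLE_LINK_TMDS_HZ = 165_000_000

def approx_pixel_clock(w, h, hz, blanking=1.25):
    """Active pixels per second, inflated ~25% for blanking intervals."""
    return w * h * hz * blanking

for w, h in [(1920, 1080), (2560, 1440)]:
    clk = approx_pixel_clock(w, h, 60)
    verdict = "fits" if clk <= SINGLE_LINK_TMDS_HZ else "exceeds"
    print(f"{w}x{h}@60: ~{clk / 1e6:.0f} MHz -> {verdict} single-link DVI")
```

So a 1080p@60 signal fits comfortably under 165MHz, while a real 2560x1440@60 signal would need roughly 270+ MHz, which is why the downsampled output over this cable has to be 1080p.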
 
Well then, what do you call it when I am sitting here on the desktop at 2560x1440 with tiny icons and way more real estate? My 1920x1080 wallpaper has black borders around it, so yes, I am getting 2560x1440 on the screen.

Again, in games there is less aliasing and they look crisper, and if I take a screenshot it's 2560x1440, so what do you call that? Also, if there were not more pixels being shown, then how do you explain the benchmarks?
Given your setup, it sounds like your GPU is rendering at 2560x1440 (or whatever you set it at), then downsampling it to a 1080p signal before sending it to the monitor. I do not think your cable can possibly carry anything close to a 3200x1800 signal, so I think the scaling is being done before the signal is sent to the monitor.
 
Given your setup, it sounds like your GPU is rendering at 2560x1440 (or whatever you set it at), then downsampling it to a 1080p signal before sending it to the monitor. I do not think your cable can possibly carry anything close to a 3200x1800 signal, so I think the scaling is being done before the signal is sent to the monitor.

That would mean his GPU is doing the downsampling, which most Nvidia cards can do, and I assume AMD cards can as well.
 
I don't have any newer Dell monitors to check with. On the two older ones at the house, when you hit the menu button it shows the current resolution. Do they still work this way? What's shown, if so?

The OSD menu on my Dell U2711 shows both the current resolution and the type of input, but it's a pretty old model. Still, there's nothing worth upgrading it to. I'd like both 120Hz and a 2560 IPS screen; I know that's possible with modding for half the price I paid for my Dell.

By the way, by driving it over DP and connecting my headphones to the monitor, do I incur any loss in audio quality?
 