On "retina"/Hi-res screens

sufs

Junior Member
May 9, 2012
5
0
0
Hi,

As some of you might know, Apple is probably readying "retina" screens across most of their new MacBook Pro/Air and iMac lines.

Also, with this comes the endless whining about GPU performance not being able to handle gaming at these resolutions.

However, I can't see why this should pose a problem. Surely any modern GPU should be able to map one rendered pixel onto the four pixels that occupy the same physical space as a single pixel on the previous generation of screens, and do all of this with little to no overhead?

Correct me if I'm wrong, but using this sort of "scaler" makes the whole argument about worse 3D performance on hi-res screens moot. That assumes, of course, that Apple, or any other vendor for that matter, goes the same route they did with the iPhone 3G => iPhone 4 and iPad 2 => iPad 3 and quadruples the pixel count per unit area.
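
A minimal sketch of the kind of scaling I mean, assuming plain nearest-neighbour integer scaling (Python and the function name are mine, purely for illustration):

Code:
import numpy as np

def integer_upscale(frame, factor=2):
    # Duplicate every rendered pixel into a factor x factor block.
    # frame is (height, width, channels); there is no filtering or
    # blending, so the whole thing is essentially a memory copy.
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

# e.g. a 1440x900 render filling a 2880x1800 panel
rendered = np.zeros((900, 1440, 3), dtype=np.uint8)
displayed = integer_upscale(rendered, 2)
print(displayed.shape)  # (1800, 2880, 3)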


On a side note, wouldn't hi-res screens make way for less use of anti-aliasing, which (last time I checked) has a major impact on 3D performance?
 

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
I am not sure about the new MacBooks, but I think the majority of games on the iPad 3 are being scaled up to the "Retina" resolution. Playing games @ 2048-by-1536 is definitely a strain on any single GPU, regardless of AA.
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
On a side note, wouldn't hi-res screens make way for less use of anti-aliasing, which (last time I checked) has a major impact on 3D performance?
No, they would not reduce the need for rotated grid AA (or >= 8 sparse samples), because of the angles at which jaggies appear. AA also doesn't really have a "major impact" on 3D performance, any more than an uber-high resolution does.

AA will be needed to reduce edge aliasing as long as conventional rasterization is used. I don't believe it will be a problem with ray tracing, but edge aliasing is an inherent feature of conventional rasterization.

There is no such thing as a retina display anyway, because the image is still made of pixels. More pixels per inch is better than fewer pixels per inch, but "retina display" is also a marketing gimmick (like 1080p was) to cover up for the piss-poor quality of output devices as well as lossy file formats. I'd much rather see 1920x1200 displays with better power circuitry than they currently use, extra-long-life RGB LEDs, and 120, 180, or even 240 Hz* input than 2400p stuff that only displays 70% of the NTSC color gamut and has an assload of input lag.

*DL-DVI isn't enough for more than 1920x1200 @ 120 Hz, but DL-DVI isn't the future anyway.
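
A rough back-of-the-envelope on that footnote, assuming dual-link DVI tops out at a 2 x 165 MHz pixel clock and roughly CVT reduced-blanking timings (the blanking figures are approximate):

Code:
# Approximate pixel clock needed for 1920x1200 @ 120 Hz with reduced blanking
h_total = 1920 + 160   # ~2080 pixels per line including horizontal blanking
v_total = 1200 + 35    # ~1235 lines per frame including vertical blanking
refresh = 120          # Hz

pixel_clock = h_total * v_total * refresh
print(pixel_clock / 1e6)            # ~308 MHz

dl_dvi_limit = 2 * 165e6            # dual-link DVI: two links at 165 MHz each
print(pixel_clock < dl_dvi_limit)   # True, but with very little headroom left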
 
Last edited:

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
Everything is a "retina display" at far enough viewing distance...

Next person to use the stupid term "retina display" gets an icepick in the retina.
 

sufs

Junior Member
May 9, 2012
5
0
0
No, they would not reduce the need for rotated grid AA (or >= 8 sparse samples), because of the angles at which jaggies appear. AA also doesn't really have a "major impact" on 3D performance, any more than an uber-high resolution does.

AA will be needed to reduce edge aliasing as long as conventional rasterization is used. I don't believe it will be a problem with ray tracing, but edge aliasing is an inherent feature of conventional rasterization.

There is no such thing as a retina display anyway, because the image is still made of pixels. More pixels per inch is better than fewer pixels per inch, but "retina display" is also a marketing gimmick (like 1080p was) to cover up for the piss-poor quality of output devices as well as lossy file formats. I'd much rather see 1920x1200 displays with better power circuitry than they currently use, extra-long-life RGB LEDs, and 120, 180, or even 240 Hz* input than 2400p stuff that only displays 70% of the NTSC color gamut and has an assload of input lag.

*DL-DVI isn't enough for more than 1920x1200 @ 120 Hz, but DL-DVI isn't the future anyway.

Thanks, that cleared up a lot of misconceptions I had. I'm very aware of the "retina" marketing gimmick; however, I'd rather swallow the gimmick and get hi-res screens instead of the 720p crap they put on everything nowadays.

What would be the benefits of 120 Hz+ input in terms of image quality?
 

digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76
Doesn't increase IQ, but it decreases input lag.

Input lag (time from input to output) and frame lag (time from frame to frame) are different, although the time between frames is factored into input lag. Many monitors have an input lag far greater than the time lapse between two frames.

The statement you made may or may not be true; it really depends on the setup of the display. It is quite possible that a 120 Hz version of a monitor could have more input lag than a 60 Hz version: the components added to the 120 Hz version could actually take longer to process the image even though it is accepting and displaying images at 120 Hz.

I would say that the benefits of a 120 Hz display are perceived smoothness and a decreased likelihood of experiencing microstutter in high-end setups.
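
To put rough numbers on that distinction, a quick sketch (the processing delays are invented purely for illustration):

Code:
# Frame interval shrinks with refresh rate...
for hz in (60, 120):
    print(hz, "Hz ->", round(1000 / hz, 1), "ms between frames")  # 16.7 / 8.3

# ...but total input lag also includes whatever the panel's electronics add.
# Hypothetical processing delays, just to show how a 120 Hz panel can still
# end up laggier than a 60 Hz one:
slow_120hz_panel = 1000 / 120 + 25.0   # ~33.3 ms with a slow image processor
fast_60hz_panel = 1000 / 60 + 5.0      # ~21.7 ms with a quick one
print(slow_120hz_panel, fast_60hz_panel)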
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Monitors with 4x as many pixels as we routinely see today might push graphics cards to the limit, but that is a good thing. We use anti-aliasing at the moment to compensate for how poor the monitors are, and we could use a somewhat better algorithm for interpolating the additional pixels until graphics cards catch up and can run those resolutions natively.

These 4x resolution screens will help not just games but text rendering as well. All the "greyness" at the edges of characters can finally disappear and be replaced with nice crisp black text rather than the horrid sub-pixel-rendered rubbish we have today.
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
Crap. Give me proper blacks, no lag, and no blur first. OLED can't come fast enough.
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
AA will be needed to reduce edge aliasing as long as conventional rasterization is used. I don't believe it will be a problem with ray tracing, but edge aliasing is an inherent feature of conventional rasterization.
Aliasing and the dynamic scene problem are pretty much the only two things keeping us from using ray tracing in modern engines. It takes an incredible amount of computing power to bring a ray-traced image up to par, with regard to aliasing, with a scene rendered using conventional shaders.
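
For anyone wondering where that computing power goes: with one primary ray per pixel an edge is either hit or missed, so silhouettes come out stair-stepped, and the usual fix is to fire many jittered rays per pixel and average them. A toy sketch, using a hypothetical slanted edge as the "scene":

Code:
import random

def trace(sx, sy):
    # Stand-in for firing one primary ray: returns 1.0 if it hits a
    # hypothetical object bounded by the slanted edge sx + sy < 200,
    # 0.0 otherwise. A real tracer would return a shaded colour.
    return 1.0 if sx + sy < 200 else 0.0

def render_pixel(x, y, samples_per_pixel=16):
    # Average many jittered rays through one pixel. One ray per pixel
    # gives a hard, stair-stepped silhouette; smoothing it out costs
    # roughly samples_per_pixel times as many rays, which is where the
    # extra computing power goes.
    total = 0.0
    for _ in range(samples_per_pixel):
        total += trace(x + random.random(), y + random.random())
    return total / samples_per_pixel

print(render_pixel(135, 64, 1))   # 0.0 or 1.0: a hard, aliased edge
print(render_pixel(135, 64, 64))  # ~0.5: fractional coverage, softened edge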
 
Last edited:

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
Crap. Give me proper blacks, no lag, and no blur first. OLED can't come fast enough.
I could be mistaken, but OLED won't have less lag unless you're planning on using analog. I believe that blue OLEDs will always have a shorter lifespan than red and green ones.

I personally don't see OLED tech as a legitimate successor to LCD tech, since OLED has some issues and LCDs can still be vastly improved; the one exception is that LCDs will never have as low a response time as an OLED.
Aliasing and the dynamic scene problem are pretty much the only two things keeping us from using ray tracing in modern engines. It takes an incredible amount of computing power to bring a ray-traced image up to par, with regard to aliasing, with a scene rendered using conventional shaders.
Okay, I didn't know that. :) I had thought that ray tracing didn't have aliasing, but then I don't know much about it.
 
Last edited:
Feb 25, 2011
16,776
1,466
126
OP: This is already happening, and it can be implemented by the game dev. It basically does what you say: rendering at 1024x768, or even something in between like 1600x1200, then upscaling to 2048x1536.
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
Everything is a "retina display" at far enough viewing distance...

Next person to use the stupid term "retina display" gets an icepick in the retina.

Okay, so what word should people use to describe them?

"High resolution" doesn't work, because some screens are high res yet are not "retina-style" displays at a typical seated distance from a computer.

So what term should we use? Why NOT use retina? It's one word, where any other description requires half a sentence to convey that you're talking about low pixel pitch plus high resolution.
 

borisvodofsky

Diamond Member
Feb 12, 2010
3,606
0
0
You kids gotta stop using that Apple catchphrase...

If you stand far enough back from any display, it's essentially a retina display, since the concept only depends on the visual angle each pixel subtends at the eye.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
However, I can't see why this should pose a problem. Surely any modern GPU should be able to map one rendered pixel onto the four pixels that occupy the same physical space as a single pixel on the previous generation of screens, and do all of this with little to no overhead?

Correct me if I'm wrong, but using this sort of "scaler" makes the whole argument about worse 3D performance on hi-res screens moot. That assumes, of course, that Apple, or any other vendor for that matter, goes the same route they did with the iPhone 3G => iPhone 4 and iPad 2 => iPad 3 and quadruples the pixel count per unit area.
As others have noted, you are correct. GPU scaling is effectively free, with no meaningful overhead. The trick is that in order to avoid the ugly, blurry pixel interpolation we see today when running at lower resolutions (e.g. 1920 on a 2560 display), you need to use integer scaling, so that an image is increased by 2x2, 3x3, 4x4, etc. This is the basis for what Apple did with their Retina displays, where they quadrupled the resolution of both the iPhone and the iPad. Since Apple has a fixed device resolution, all old programs were immediately and cleanly scaled up.

If the rumors are true, this is also where Apple is going with their desktop displays. The new MacBooks would have panels with 4 times the resolution of their existing panels, allowing existing content designed for those screens to scale up cleanly. So a game could continue to render at 1440x900 and easily be scaled up for display purposes, while the lower rendering resolution will allow games to continue to be practical on the relatively low performance of a laptop GPU.

Of course, in practice it's going to be more complex than that. From what we've seen with console games, game makers have no qualms about rendering at oddball resolutions that don't scale cleanly, counting on the blurriness of interpolation to hide their aliasing. So you may see games that render at 1680x900 on a 2880x1800 screen, or something else screwy like that.

On a side note, wouldn't hi-res screens make way for less use of anti-aliasing, which (last time I checked) has a major impact on 3D performance?
It reduces, though it does not eliminate, the need for AA. Fundamentally, AA exists to solve the fact that discrete pixels betray our continuous vision, and while a higher pixel density makes those discrete pixels harder to see, they're still discrete pixels. For such a display the extra density is in effect a form of ordered grid anti-aliasing, which studies have found doesn't hide aliasing as well as a rotated/sparse grid. Basically, you may still see some aliasing on high resolution displays, but until we have those displays it's going to be hard to say just how much aliasing actually remains. Most likely geometry aliasing will be reduced, but shader aliasing will still be visible (particularly specular lighting).
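
For anyone curious about the ordered-vs-rotated grid point, a small sketch of 4x sample positions inside one pixel (the offsets are a common textbook pattern, not any particular vendor's):

Code:
# Four sample positions inside a unit pixel, (x, y) in [0, 1).
ordered_grid = [(0.25, 0.25), (0.75, 0.25),
                (0.25, 0.75), (0.75, 0.75)]

# Rotated/sparse grid: every sample sits in its own column AND its own row,
# so near-horizontal and near-vertical edges (the worst-looking jaggies)
# cross more distinct sample rows/columns and resolve to more coverage levels.
rotated_grid = [(0.375, 0.125), (0.875, 0.375),
                (0.125, 0.625), (0.625, 0.875)]

def coverage(samples, edge_y):
    # Fraction of samples below a near-horizontal edge at height edge_y.
    return sum(y < edge_y for _, y in samples) / len(samples)

for e in (0.2, 0.5, 0.8):
    print(e, coverage(ordered_grid, e), coverage(rotated_grid, e))
# The ordered grid only ever reports 0, 0.5 or 1 for a horizontal edge;
# the rotated grid can report 0, 0.25, 0.5, 0.75 or 1.
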
Okay, so what word should people use to describe them?

"High resolution" doesn't work, because some screens are high res yet are not "retina-style" displays at a typical seated distance from a computer.

So what term should we use? Why NOT use retina? It's one word, where any other description requires half a sentence to convey that you're talking about low pixel pitch plus high resolution.
The "most correct" term would be a HiDPI display, which is the term both Apple and MS have been using for the 4x scaled bitmap image + native resolution text rendering modes their OSes use for driving these displays. There's no ambiguity there, it has capitalization, and it has neat alliteration.:p
 
Last edited:

mavere

Member
Mar 2, 2005
186
0
76
OP: This is already happening, and it can be implemented by the game dev. It basically does what you say: rendering at 1024x768, or even something in between like 1600x1200, then upscaling to 2048x1536.

It's a bit of a damned-if-you-do, damned-if-you-don't situation, however, when you're not running at native res. At 1024x, the game lacks detail but is sharp(er), while at 1600x or whatever, the game has more detail but is blurry due to interpolation.

Non-Retina games on an iPad have the misfortune of looking comparatively bad because of the other Retina assets. I wonder if we'll come across the same psychological aspect here. At the very least, assuming HiDPI screens proliferate across all platforms, Nvidia and AMD will get a lot of new orders.

Also, Apple has chosen Retina as the marketing term for a very specific technological step: high ppi screen with an OS-supported quarter-res fallback. Considering that we're discussing Apple products, it makes sense to use their branded name to describe their own products, and it would make further sense to save any pedantry for when we discuss other platforms. ;)
 

dagamer34

Platinum Member
Aug 15, 2005
2,591
0
71
I can't see the OS exposing the true resolution of the display unless the application specifically asks to do so. Doing it any other way would break compatibility with older games which ask for exclusive fullscreen use.