
If the next gen console games will be made for 1080p...

-Slacker-

...Then how will they run on 720p TVs, which I assume a lot of people will still own?

Will the games be optimized for 2 different resolutions, or will they be output at 720p or less, and then upscaled to 1080p like they are on the PS3 and Xbox 360?
 
I thought most of the ones with good 3D visuals run at 720p or less, and they're upscaled to fit the resolution of the screen.
 
So if the games on the next gen consoles will be 1080p, then will it be possible to downscale them to 720p if that's the native resolution of your TV? And will that increase performance, or...?
 
I didn't know you could still even buy anything that wasn't 1080p in the last couple years outside of 13" kitchen TVs. I'm keeping my eyes open for a gently used 480p flat panel exclusively for PS2-GCN-Xbox-DC era consoles. I'm spoiled to the point that even 480p upscaled looks bad on a 1080p display.

Current gen consoles, 360 and PS3 anyway, are perfectly able to do native 1080p framebuffers. Game devs choosing not to use them because of performance costs as games get more complex is another story.
 
...Then how will they run on 720p TVs, which I assume a lot of people will still own?

Will the games be optimized for 2 different resolutions, or will they be output at 720p or less, and then upscaled to 1080p like they are on the PS3 and Xbox 360?

They'll run just fine on 720p televisions. In the 360 dashboard you can toggle between 480i/p, 720p, or 1080p. Whatever you select is what the 360 will output. Even games that are internally rendered at 720p will be upscaled to 1080p if that is the resolution you tell your Xbox to output. My 360 even seems to pick up on whatever rez the device it's hooked up to supports. I didn't even need to go into the settings before it output to my projector in 720p, only to output at 1080p when I brought it over to a friend's house and hooked it up to his TV. I imagine the PS3 has similar settings.

Not that it really matters though. Every 720p television I see has "Maximum supported rez 1080p" or something on it. If the console doesn't scale it down for you, the tv likely will.
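Out of curiosity, the auto-detect behavior described above can be sketched in a few lines. This is just an illustration of the idea; the mode lists and names are made up, not how the 360 firmware actually works:

```python
# Toy sketch of output-mode selection: honor the dashboard setting when the
# display supports it, else fall back to the best mode both sides can do.
# Mode lists and names here are illustrative, not actual console behavior.
CONSOLE_MODES = ["480i", "480p", "720p", "1080i", "1080p"]

def pick_output_mode(display_modes, dashboard_setting=None):
    supported = [m for m in CONSOLE_MODES if m in display_modes]
    if dashboard_setting in supported:
        return dashboard_setting
    return supported[-1] if supported else "480i"

print(pick_output_mode(["480p", "720p"]))            # 720p projector -> 720p
print(pick_output_mode(["480p", "720p", "1080p"]))   # friend's TV -> 1080p
```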

I didn't know you could still even buy anything that wasn't 1080p in the last couple years outside of 13" kitchen TVs.

A lot of 32" ones you see going for $200-300 are still 720p today. It makes sense, as the 1080p 32" ones are often 50% more expensive, yet at 32 inches you're unlikely to tell the difference between 720p and 1080p on a TV you're staring at from half a room away. You really do need a bigger display to get the most out of 1080p. At least so long as you're talking about console gamers. Some of the PC gamers on this board threaten to gouge their eyes out if their tiny 22" isn't 2560x1600+. 😛
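To put a rough number on the "half a room away" claim, here's a back-of-the-envelope check using the common one-arcminute (~60 pixels per degree) acuity rule of thumb. The numbers and assumptions are mine, not from anyone's post:

```python
import math

def pixels_per_degree(diagonal_in, width_px, distance_ft):
    """Pixels subtended per degree of vision for a 16:9 panel viewed head-on."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # horizontal size from diagonal
    px_size_in = width_in / width_px                  # physical size of one pixel
    distance_in = distance_ft * 12
    # angle subtended by one pixel, in degrees
    px_angle_deg = math.degrees(2 * math.atan(px_size_in / (2 * distance_in)))
    return 1 / px_angle_deg

# ~60 px/deg corresponds to the usual 1-arcminute visual acuity threshold:
for w in (1280, 1920):
    print(w, round(pixels_per_degree(32, w, distance_ft=8), 1))
# At 8 ft, a 32" 720p panel already gives ~77 px/deg, past the ~60 threshold,
# so the extra pixels of 1080p are largely invisible at that distance.
```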
 
A lot of 32" ones you see going for $200-300 are still 720p today. It makes sense, as the 1080p 32" ones are often 50% more expensive, yet at 32 inches you're unlikely to tell the difference between 720p and 1080p on a TV you're staring at from half a room away. You really do need a bigger display to get the most out of 1080p. At least so long as you're talking about console gamers. Some of the PC gamers on this board threaten to gouge their eyes out if their tiny 22" isn't 2560x1600+. 😛

It's not so much the DPI as it is the scaling, which has a profound effect on image sharpness. Whether you can resolve individual pixels or not, you can certainly perceive the wax-paper-like effect the filter/scale operation has on the overall picture.

Even at 32", 720p is noticeably less spectacular on a 1080p panel, and vice versa. No matter what the size of the screen, I'm a huge proponent of always running native res.

That's why I have a 15 kHz RGB CRT for my 16-bit consoles and will be looking for an old 480p flat panel for anything after SNES/GEN/PS1 but before PS3/360.

Running PS3/360 on a 1080p screen all the time, even if certain games don't actually use a 1080p framebuffer, isn't as bad: with all the multi-pass rendering, AA, etc. that goes on anyway, it's pretty hard to say what the native res is supposed to be in the first place. But you can definitely tell when a PS2 over 480p component is run on a 1080p screen; even on a 32", it looks horrible.
 
I think you missed the point of what I was trying to say with my post. I wasn't trying to say "480i? SCALE IT TO 1080 TEE VEES LAWL". Obviously running something at its native res looks better. But even if the Xbox 720 or whatever is able to render games in 1080p natively, there is no reason why it could not do so in 720p as well if you told the system to in the system settings. Heck, the game may even run at a slightly higher fps when set to output at a lower resolution. Your Super Nintendo stuff obviously doesn't have the option and likely outputs 240p or something no matter what.
 
I don't think games designed for 1080p having unreadable text on 720p TVs will be a problem like it was for SDTVs this gen.
 
No matter what the size of the screen, I'm a huge proponent of always running native res.
Obviously running something at its native res looks better.
How do you guys figure this? For comparison's sake, I took these two screenshots, the first rendered at 720p, and the second rendered at 1080p and then scaled down to 720p; all other settings are the same:

[Screenshot 1: rendered natively at 720p]

[Screenshot 2: rendered at 1080p, then downsampled to 720p]


Can you point out any way in which the native image looks better than the downsampled one?
 
Yes. I'm on mobile and can't show you. Look at the aliasing and sampling artifacts on the utility poles, for example. They appear softer and not "pixel perfect". Note how in the native image you have solid black vertical lines with a distinct break between object and sky, but in the scaled one there are fuzzy, unclear intermediate lines and blotches that are the averages of pixels between the pole and sky. More examples: between the hand and gun, the fence and sky, etc.

These are just a few small, isolated examples; the effect is present throughout the image and contributes to an overall softer, fuzzier picture when full screen and in motion, as the averages change.

And you used software to resize the image offline, with advanced, CPU-intensive sampling and filtering, in non-real time anyway. That operation is nowhere near as good done in real time by display processors and scalers. Then factor in that downscaling, the less common case, produces results superior to the more common upscaling: in one you have extra information you're averaging and reducing, which isn't as bad. In the other, the one most of us experience, you're averaging too few pixels to produce duplicate data that was never there, and doing so at non-integer multiples, which makes upscaled games, especially 480p, look like you have wax paper over your screen. A red pixel next to a blue pixel is no longer two pixels, but 20 pixels of various shades of purple.
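The red-next-to-blue point is easy to demonstrate with a toy linear upscale at a non-integer factor (2.25x happens to be exactly the 480 -> 1080 vertical ratio). This is my own illustration, not the algorithm of any real scaler:

```python
def upscale_1d(row, factor):
    """Linearly interpolate a row of (R, G, B) pixels to a non-integer multiple,
    mimicking what a naive scaler does to a hard color edge."""
    out, n = [], len(row)
    for i in range(int(n * factor)):
        src = i / factor                    # fractional source position
        lo = int(src)
        frac = src - lo
        hi = min(lo + 1, n - 1)
        out.append(tuple(round(a * (1 - frac) + b * frac)
                         for a, b in zip(row[lo], row[hi])))
    return out

red, blue = (255, 0, 0), (0, 0, 255)
# A hard red/blue edge picks up intermediate purples after a 2.25x upscale:
print(upscale_1d([red, red, blue, blue], 2.25))
```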

A static image isn't a big deal, but when it's in motion on a full screen you can tell that there is a resolution mismatch.
 
How do you guys figure this? For comparison's sake, I took these two screenshots, the first rendered at 720p, and the second rendered at 1080p and then scaled down to 720p; all other settings are the same:

Can you point out any way in which the native image looks better than the downsampled one?

For the second one, did you just take a 1080p screenshot and then resize/resample it in Photoshop? Or did you actually have it render in 1080p and make the game downsample it? Is there a difference?

Not a rhetorical question; I'm legitimately not sure how you would even make a PC game render at one resolution but display at a lower resolution.
 
Yes. I'm on mobile and can't show you. Look at the aliasing and sampling artifacts on the utility poles, for example. They appear softer and not "pixel perfect". Note how in the native image you have solid black vertical lines with a distinct break between object and sky, but in the scaled one there are fuzzy, unclear intermediate lines and blotches that are the averages between the pole and sky.

And you used software to resize the image offline, with advanced, CPU-intensive sampling and filtering, in non-real time anyway. That operation isn't as good done in real time by display processors and scalers.

The downsampled one obviously looks better and I don't see how you could possibly argue otherwise; however, I doubt those screenshots are actually indicative of what a console game would look like at 720p native vs. 1080p downsampled to 720p.
 
No it doesn't. You just prefer a soft blurry picture, obviously.

You see what appears to be "better" because the fence and other "billboarded" items aren't as broken up due to sampling/aliasing errors present in the native image. Downscaling "anti-aliases" these at the cost of making the whole scene have a thick, blurry feel to it that smudges fine detail.

The right way is to apply proper AA techniques such as "transparency AA" to resolve aliasing issues with those individual items at the framebuffer's native resolution, not to downscale. Downscaling the entire screen produces inferior results compared to performing correct supersampled FSAA.
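For reference, the resolve step of ordered-grid supersampling is just a per-pixel box filter over the subsamples. Here's a minimal numpy sketch of that step (my illustration, with made-up data), regardless of where one stands in this debate:

```python
import numpy as np

def ssaa_resolve(hires, n):
    """Box-filter an (H*n, W*n, 3) supersampled frame down to (H, W, 3):
    each output pixel is the average of its n x n block of subsamples."""
    h, w, c = hires.shape
    assert h % n == 0 and w % n == 0
    return hires.reshape(h // n, n, w // n, n, c).mean(axis=(1, 3))

# A 2x2 frame rendered at 4x4 subsamples, left half red, right half black:
hires = np.zeros((4, 4, 3))
hires[:, :2] = [1.0, 0.0, 0.0]
print(ssaa_resolve(hires, 2)[:, :, 0])  # red channel of the resolved frame
```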
 
The current consoles still leave a lot on the table in terms of performance. There is absolutely no console on the market today that can render 1080p natively, with AA enabled, and keep a constant 60 fps with graphics like BF3's.

I hope the new consoles are forward-thinking and will allow at least 1600p graphics, eventually. Look at the original Xbox. Hardly anyone had 720p or 1080i displays when it launched in 2001, but it definitely found its legs as time moved on. Xbox games at 720p or 1080i were MILES ahead of their PS2 counterparts late in the life-cycle.
 
Look at the aliasing and sampling artifacts on the utility poles, for example. They appear softer and not "pixel perfect".
Sure, I specifically turned off AA to take those shots to emphasize that effect. The fence sections provide even better examples, as sections that look like random pixels floating in space in the native image look a whole lot more like an actual fence in the downsampled one.

Anyway, had I used AA, the native image wouldn't have those "solid black vertical lines with a distinct break between object and sky" either; rather, both would show what one might describe as "fuzzy, unclear intermediate lines and blotches that are the averages between the pole and sky", though "fuzzy unclear" and "blotches" aren't the terms I'd choose in either case. So, am I to take it you are of the opinion that rendered images look better without AA?

These are just a few small, isolated examples; the effect is present throughout the image and contributes to an overall softer, fuzzier picture when full screen and in motion, as the averages change.
The texture detail is obviously softer/fuzzier in parts of the native image, the cracks in the ground to the left and right of the gun being the most notable examples. I specifically turned off AF to emphasize this difference, though it will still show to a lesser extent with AF on.

And you used software to resize the image offline, with advanced, CPU-intensive sampling and filtering, in non-real time anyway. That operation isn't as good done in real time by display processors and scalers.
I used standard bicubic resampling in Photoshop, which provides comparable results to the scaling I've seen on plenty of displays, including my own plasma.
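If anyone wants to reproduce the comparison themselves, the Photoshop step is roughly equivalent to a bicubic resize in Pillow. The file names below are placeholders:

```python
from PIL import Image

# Approximate the workflow described above: take a 1080p screenshot and
# bicubic-resample it down to 720p. File names are placeholders.
shot = Image.open("screenshot_1080p.png")
shot.resize((1280, 720), resample=Image.BICUBIC).save("downsampled_720p.png")
```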

A static image isn't a big deal, but when it's in motion on a full screen you can tell that there is a resolution mismatch.
Yeah, I often play games at higher than native resolution on my plasma, and I can tell there is "a resolution mismatch" because it looks much better when running at higher than native resolution. This holds particularly true in motion, as the downsampling helps smooth out the aliasing, which looks especially nasty as it crawls across the screen when the view moves.

For the second one, did you just take a 1080p screenshot and then resize/resample it in Photoshop? Or did you actually have it render in 1080p and make the game downsample it? Is there a difference?
I used Photoshop, since CSS doesn't have the option to set the rendering resolution separately from the display resolution, and because it really doesn't make a notable difference. If you'd like to see for yourself, I recommend ARMA II, as it does allow changing the rendering resolution separately from the display resolution, and there are demos available.
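If memory serves, ARMA II keeps the render resolution in its ArmA2.cfg separately from the display resolution. Treat the key names below (Render_W/Render_H vs. Resolution_W/Resolution_H) as from-memory assumptions that may differ by version; here's a quick helper to flip them:

```python
import re

def set_render_resolution(cfg_path, w, h):
    """Rewrite the (assumed) Render_W/Render_H keys in ArmA2.cfg so the game
    renders internally at w x h while the display resolution stays put."""
    with open(cfg_path) as f:
        text = f.read()
    text = re.sub(r"Render_W=\d+", f"Render_W={w}", text)
    text = re.sub(r"Render_H=\d+", f"Render_H={h}", text)
    with open(cfg_path, "w") as f:
        f.write(text)

# e.g. render at 1080p on a 720p display:
# set_render_resolution("ArmA2.cfg", 1920, 1080)
```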
 