I've been using my PS3 and Xbox 360 on my cheapo 32" Olevia LCD with one HDMI input for two years now. While 720p is nice, the enthusiast in me always wants the best but can't afford it.
My PC took a crap the other day, so I went out and bought an HDMI-to-DVI adapter and connected my PS3 to my 24" 1920x1200 Gateway monitor.
I enabled 1080p output in the XMB, and the XMB itself was nice and sharp. When I loaded up MGS4, I was half expecting to be stunned, but instead I was horrified by how the game looked.
I looked on the back of the box and it said 1080p, but sure enough, my monitor was reporting 720p. I turned on 1:1 pixel mapping and the game looked good again, but what's the point if I lose more than half the screen to black bars?
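To put rough numbers on it, here's my own back-of-the-envelope arithmetic, assuming the console really is sending a plain 1280x720 frame that gets centered on the 1920x1200 panel:

```python
# Rough sketch of what 1:1 pixel mapping does on my setup (my own numbers,
# assuming a 1280x720 frame centered on a 1920x1200 panel).
src_w, src_h = 1280, 720        # what the console is actually sending
panel_w, panel_h = 1920, 1200   # native resolution of the Gateway monitor

coverage = (src_w * src_h) / (panel_w * panel_h)
print(f"Image covers {coverage:.0%} of the panel")           # ~40%
print(f"Black bars: {(panel_w - src_w) // 2} px left/right, "
      f"{(panel_h - src_h) // 2} px top/bottom")             # 320 px and 240 px
```

So with 1:1 mapping the game is only lighting up about 40% of the pixels I paid for.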
Next I tried unselecting 720p in the XMB, thinking 1080p might be forced, but that only made it look considerably worse.
I tried RE5, Fallout 3 and Dead Space, all with more or less the same result. Instead of the nice, clean, crisp picture my PC gave me, all I got was a fuzzy, stretched, low-res joke. Same with my Xbox 360 using the same cable and adapter.
Now I understand why it is happening. The systems are rendering the games at 720p or lower natively, and then there's some poor attempt at scaling, similar to running a PC game at a non-native resolution.
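If that's what's going on, the mush makes sense: stretching 720 lines to fill 1200 is a messy non-integer scale, so every source pixel gets smeared across fractional screen pixels. A quick sketch of the math (my own numbers, assuming a simple bilinear-style stretch):

```python
# How far a 720p frame has to be stretched to fill the monitor's height.
render_h = 720    # vertical resolution the game actually renders at
monitor_h = 1200  # 24" 1920x1200 Gateway monitor

scale = monitor_h / render_h
print(f"Stretch factor: {scale:.3f}x")                        # 1.667x, non-integer
print(f"One source row spreads across {scale:.3f} screen rows")  # pixels land between rows
```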
What I'm wondering is whether the same thing happens on 1080p TVs. Everyone I know who owns a nice 1080p TV and a PS3 tells me how amazing games look and how I'm missing out by only having 720p sets.
Yet when I hook the systems back up to my 720p TV, I am amazed by how much better it looks than at 1080p.
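For comparison, if my cheap 32" Olevia is a typical 1366x768 panel (an assumption on my part; I haven't checked the specs), the stretch it has to do is tiny:

```python
# Same calculation for the 720p TV (1366x768 panel is assumed, not confirmed).
render_h = 720   # what the console outputs
tv_h = 768       # assumed native height of the 32" Olevia

print(f"Stretch factor: {tv_h / render_h:.3f}x")  # 1.067x, almost 1:1
```

An almost 1:1 stretch would leave the image basically intact, which might be part of why the cheap set looks so much cleaner to me.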
Is there something different about how actual TVs handle non-native resolutions? I just can't wrap my mind around someone spending $2000 on a legit TV and then being greeted by the god-awful mess that I was.
Other than the screen being filled instead of showing the black bars you get with 1:1 pixel mapping, there is no indication that either system is doing anything besides stretching a non-native resolution.
