posting in 1080p :D


mrSHEiK124

Lifer
Mar 6, 2004
Originally posted by: UNCjigga
You shouldn't have overscan issues unless your display is not native 1080p resolution. I have the same problems running DVI to my Sony Grand WEGA III--the resolution is 1366x768 or something funky like that. Even when I get PowerStrip to do its native resolution, it still overscans (but I think it's because this older Sony doesn't accept DVI resolutions outside of HD-spec; the idiots won't even let it take its native res!! :|)

You might be able to fix the overscan though if you're willing to tinker with your TV's service menu.

I also had the same problem with my 1366x768 (720p claimed) 24" Samsung LCD over DVI/HDMI. If I feed it a 1366x768 DVI signal from my HTPC, it'll resample to 1280x720 and back to 1366x768 for display, as it only accepts ATSC resolutions through the HDMI port. Works fine with VGA though: 1366x768 is displayed just as it's transmitted with zero overscan, and I get zero fuzziness. Dunno if it's the card, LCD, cable, or a combination of all three, but it's awesome.
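
For anyone curious why that HDMI path looks soft, here's a minimal sketch of what the 1366 -> 1280 -> 1366 round trip does to one-pixel-wide detail. It assumes a simple linear scaler (the TV's real scaler is unknown, so treat the exact numbers as illustrative only):

```python
# Rough sketch: why resampling 1366 -> 1280 -> 1366 blurs 1:1 pixel detail.
# Assumes plain linear interpolation; the TV's actual scaler may differ.

def resample(pixels, new_width):
    """Linearly interpolate a 1-D row of pixel values to new_width samples."""
    old_width = len(pixels)
    out = []
    for i in range(new_width):
        # Map each output pixel centre back into the source row.
        pos = i * (old_width - 1) / (new_width - 1)
        lo = int(pos)
        hi = min(lo + 1, old_width - 1)
        frac = pos - lo
        out.append(pixels[lo] * (1 - frac) + pixels[hi] * frac)
    return out

# One-pixel-wide alternating detail (like fine text) across a 1366-pixel row.
row = [255 if x % 2 == 0 else 0 for x in range(1366)]

# HDMI path: panel-native 1366 squeezed to the 1280 ATSC grid, then
# stretched back out to the 1366-pixel panel for display.
round_trip = resample(resample(row, 1280), 1366)

err = sum(abs(a - b) for a, b in zip(row, round_trip)) / len(row)
print(f"mean per-pixel error after the round trip: {err:.1f}")
print("a 1:1 path (like the VGA input here) would report 0.0")
```

The single-pixel checkerboard gets averaged toward grey on the way down to 1280 and never comes back on the way up, which is the fuzziness you see; a 1:1 path leaves the row untouched.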
 

SaltBoy

Diamond Member
Aug 13, 2001
I don't get the overscan thing anyway. Why IS there overscan on HDTVs? Is it required per FCC regulations or what? :confused: