Originally posted by: TheSnowman
Originally posted by: jiffylube1024
That does exactly what people above have said: when you set a resolution higher than your LCD screen supports, it only displays as many pixels as the screen has, and you have to scroll when you reach the edges of the desktop. There is no way to 'interpolate up' extra resolution on an LCD!
Surely there has to be a way? Plenty of TVs do it, LCD or otherwise.
No they don't. An LCD is a matrix of a finite number of pixels.
Originally posted by: Cheesetogo
Why can't you display a higher resolution by interpolating up? On CRTs, sometimes at higher resolutions, they cannot display all the pixels, but they are still at a higher res. By the way, when I increased the resolution on my LCD, I didn't have to scroll around the desktop. Everything actually got smaller, so it was somehow displaying a higher res. The max res was 12x10, but I somehow got it to 16x12.
Impossible unless it's some virtual desktop thing.
Originally posted by: TheSnowman
I get the impression that you are the one who doesn't understand the concept, as you said "interpolating down" while interpolation is adding information between what is there, and that isn't something that happens when you downscale an image. When you downscale, the image is dithered, not interpolated; interpolation is used in upscaling. So your "interpolating down" is an oxymoron.
Dithering is what happens when a color is simulated (see the quick sketch below). What usually happens with scaling is some rudimentary supersampling, but the algorithm has to be fast enough for the DSP to handle. There are decent algorithmic/digital Gaussian (i.e. CRT-like) scaling methods; it's just that the speed of the IC can't keep up.
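To make that concrete, here's a toy sketch of 'simulating a color' by ordered dithering, using a 2x2 Bayer matrix in plain Python. The names and the matrix are just for illustration; no panel's DSP actually runs this.

```python
# Toy ordered-dithering sketch: simulate a grayscale shade using only
# black (0) and white (1) pixels via a 2x2 Bayer threshold matrix.
# Illustrative only; a real panel DSP uses its own (undisclosed) logic.

BAYER_2X2 = [[0.25, 0.75],
             [1.00, 0.50]]  # normalized thresholds

def dither(gray, width, height):
    """gray: target shade in [0, 1]; returns a height x width 1-bit image."""
    return [[1 if gray >= BAYER_2X2[y % 2][x % 2] else 0
             for x in range(width)]
            for y in range(height)]

# A 50% gray comes out as a checkerboard of on/off pixels:
for row in dither(0.5, 8, 4):
    print(''.join('#' if p else '.' for p in row))
```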
It absolutely is being interpolated.
Wikipedia: In the mathematical subfield of numerical analysis interpolation is a method of constructing new data points from a discrete set of known data points.
answers.com: Mathematics. To estimate a value of (a function or series) between two known values.
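In code, 'constructing new data points between known ones' is just this, a toy 1-D sketch (upscale_2x is my own made-up name, not anything a scaler chip exposes; real scalers work in 2-D):

```python
# Toy linear-interpolation sketch: upscale a 1-D row of pixel values by
# constructing new samples between the known ones, per the definitions
# quoted above. Real scalers use 2-D (bilinear/bicubic) kernels.

def upscale_2x(row):
    """Roughly double a row's resolution by inserting the average of
    each adjacent pair of known pixels (simple linear interpolation)."""
    out = []
    for a, b in zip(row, row[1:]):
        out.append(a)            # known data point
        out.append((a + b) / 2)  # new point estimated between the two
    out.append(row[-1])          # last known pixel has no right neighbor
    return out

print(upscale_2x([0, 100, 50]))  # [0, 50.0, 100, 75.0, 50]
```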
It blends adjacent pixels to fit the image into fewer, effectively 'bigger' pixels. Say you have a 2048x1536 input and you need to convert it to a 1024x768 output: each 2x2 block of input pixels, e.g. (1,1), (1,2), (2,1), and (2,2), is blended into one output pixel (1,1). With supersampling it almost always takes the plain average of those input pixels to form the new output. Generally speaking there's not a whole lot of quality lost (this is what SSAA does).
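Here's that 2:1 averaging as a toy Python sketch. It's a plain box filter, which is roughly what I'm describing; downsample_2x is my own name, and a real scaler IC may weight the pixels differently:

```python
# Toy 2:1 box-filter downsample: every 2x2 block of input pixels is
# averaged into one output pixel (e.g. 2048x1536 in -> 1024x768 out).

def downsample_2x(img):
    """img: list of rows of grayscale values; both dimensions even."""
    return [[(img[y][x] + img[y][x+1] + img[y+1][x] + img[y+1][x+1]) / 4
             for x in range(0, len(img[0]), 2)]
            for y in range(0, len(img), 2)]

src = [[0, 0, 255, 255],
       [0, 0, 255, 255]]       # 4x2 input
print(downsample_2x(src))      # [[0.0, 255.0]], a 2x1 output
```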
Originally posted by: TheSnowman
But again, semantics aside, my point was that many TVs, LCD or otherwise, do allow one to run a higher resolution than there are pixels on the screen, similar to what Sultan is looking to do with his LCD.
No they don't! The higher input resolution is being interpolated to fit the finite matrix, no matter how you want to word it. This is called downsampling, and if the aspect ratios are different you lose a lot of quality in the process. An LCD with a native resolution of 1280x1024 is ALWAYS showing 1280x1024. The INPUT resolution is higher; the resolution of the LCD is NOT changing. With CRTs it isn't changing either, but the electron guns spread out more and the scaling is inherently analog.
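To put the fixed-matrix point in code: whatever the input resolution, the output grid is always the panel's native matrix. A toy Python sketch (nearest-neighbor for brevity, where a real scaler would interpolate; scale_to_native is my own made-up name):

```python
# Toy sketch of the fixed-matrix point: any input resolution gets
# resampled onto the panel's one real grid. Nearest-neighbor for brevity.

NATIVE_W, NATIVE_H = 1280, 1024  # the panel's only physical resolution

def scale_to_native(img, in_w, in_h):
    """Map any in_w x in_h input onto the fixed NATIVE_W x NATIVE_H grid."""
    return [[img[y * in_h // NATIVE_H][x * in_w // NATIVE_W]
             for x in range(NATIVE_W)]
            for y in range(NATIVE_H)]

# Feed it a "1600x1200" input: the output is still exactly 1280x1024.
src = [[(x + y) % 256 for x in range(1600)] for y in range(1200)]
out = scale_to_native(src, 1600, 1200)
print(len(out[0]), len(out))  # 1280 1024
```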