Why are we still stuck with "scanning" screen input and display?

24601

Golden Member
Jun 10, 2007
Why do screens still "scan" as if an electron gun is still the driver to the screen?

We should get a standard to output to the display as close to instantaneously as possible, and for the screen image to be updated uniformly.

The year is 2007, it is the future :O
 

Gryz

Golden Member
Aug 28, 2010
Bits through an HDMI or DVI cable are still sent sequentially. Or maybe a few at the same time in parallel. But definitely not all at once.
 

24601

Golden Member
Jun 10, 2007
Bits through an HDMI or DVI cable are still sent sequentially. Or maybe a few at the same time in parallel. But definitely not all at once.

I know, that's what my question is about :D
 

BrightCandle

Diamond Member
Mar 15, 2007
If they are going to change something about the current interface, it's not the order of the bits that matters, I think; it's the fixed interval on which it's done. LCDs don't need to be refreshed; they can maintain the same image for as long as needed. Given that, it would be nice to have the refresh rate be an "up to" number rather than fixed at the top.

This would allow any game running below the ideal fps to send out its image immediately and have the minimum latency possible with vsync. It would remove the need for frames to land on 16 ms or 33 ms boundaries and instead allow all the values in between as well, effectively reducing the latency in these scenarios and making it more consistent.

It's a relatively simple change that would improve game animation and image quality for a large number of people.
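
To put rough numbers on it, here's a quick sketch (made-up frame times, Python just for the arithmetic) of how a fixed 60 Hz interval rounds every slow frame up to the next 16.7 ms boundary, while an "up to" refresh would show the frame the moment it's done:

Code:
import math

REFRESH_HZ = 60
INTERVAL_MS = 1000 / REFRESH_HZ  # ~16.7 ms between fixed refreshes

def present_time_fixed(render_ms):
    # With a fixed interval (vsync), the frame waits for the next refresh boundary.
    return math.ceil(render_ms / INTERVAL_MS) * INTERVAL_MS

def present_time_variable(render_ms):
    # With an "up to 60 Hz" interval, the frame is scanned out as soon as it is ready.
    return max(render_ms, INTERVAL_MS)

for render_ms in (10.0, 18.0, 25.0, 40.0):
    print("render %5.1f ms -> fixed %5.1f ms, variable %5.1f ms"
          % (render_ms, present_time_fixed(render_ms), present_time_variable(render_ms)))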
 

24601

Golden Member
Jun 10, 2007
If they are going to change something about the current interface, it's not the order of the bits that matters, I think; it's the fixed interval on which it's done. LCDs don't need to be refreshed; they can maintain the same image for as long as needed. Given that, it would be nice to have the refresh rate be an "up to" number rather than fixed at the top.

This would allow any game running below the ideal fps to send out its image immediately and have the minimum latency possible with vsync. It would remove the need for frames to land on 16 ms or 33 ms boundaries and instead allow all the values in between as well, effectively reducing the latency in these scenarios and making it more consistent. It's a relatively simple change that would improve game animation and image quality for a large number of people.

I don't think you realize that screens are still "refreshed" by "scan lines" as if LCDs have scanning electron guns in them.

I'm wondering why we haven't even fixed that simple part of it yet.

Those "runt frames" you see are entirely the fault of this paradigm.

If it were done properly, you wouldn't see "runt frames"; you would see pure micro-stutter instead.

The fixed interval exists entirely because of the "scanning electron gun" paradigm in display input and display driving.

PC gaming has subsidized every advance in microprocessors and computing in general since their dawn.

It's time it subsidized another revolutionary advance :p.
 

KingFatty

Diamond Member
Dec 29, 2010
Your premise is flawed, I think? The rate of electron gun scanning across the screen was orders of magnitude faster than the rate of the video processing; the two are unrelated.

Any similarity between the electron gun scanning and the way graphical information is processed is just coincidental.

In other words, there are good reasons to use that approach for electron guns, and there are other, different reasons for using a similar approach for video card processing.

The paradigm currently used has been valid on its own, and you can't cast aspersions on it just because it happens to be similar to the paradigm used by electron gun scanning. Perhaps it's just the best way, and your speculation about how we need to "fix" it is simply not based on good evidence?
 

24601

Golden Member
Jun 10, 2007
Your premise is flawed, I think? The rate of electron gun scanning across the screen was orders of magnitude faster than the rate of the video processing; the two are unrelated.

Any similarity between the electron gun scanning and the way graphical information is processed is just coincidental.

In other words, there are good reasons to use that approach for electron guns, and there are other, different reasons for using a similar approach for video card processing.

The paradigm currently used has been valid on its own, and you can't cast aspersions on it just because it happens to be similar to the paradigm used by electron gun scanning. Perhaps it's just the best way, and your speculation about how we need to "fix" it is simply not based on good evidence?

Take your rhetorical questions to a substance-less conversation.
Bait-maor.
 

Daedalus685

Golden Member
Nov 12, 2009
It is entirely true that the screen is refreshed in lines, very much akin to how a cathode ray worked. At least in result: each pixel is told, one (or a few) at a time, what colour to change to.

The issue is just one of practicality. To change all of the pixels at once you'd need each pixel connected to its own memory, increasing the board complexity by many orders of magnitude. Additionally, unless you had a signal cable with several million leads, the monitor would have to receive the pixel information and distribute it to a buffer to await simultaneous display, greatly increasing input lag (or requiring far more processing power).

The way we do it now is similar to a cathode ray because both data transfers were serial and it made sense to just adapt the same signal at first; it is simply far more efficient. It doesn't much matter whether we raster the pattern (though that is simple to code and build) or use some other pattern. We will never get away from serial display interfaces until we move away from pixels entirely in whatever the next paradigm is.

We use serial to reduce complexity, and in some cases to make the thing possible to manufacture or function at all.
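
To put rough figures on the complexity trade-off, here's a back-of-the-envelope sketch (my own assumed numbers: 1080p, 24 bits per pixel, 60 Hz, and a 4-lane link roughly like DVI/DisplayPort):

Code:
WIDTH, HEIGHT = 1920, 1080
BITS_PER_PIXEL = 24
REFRESH_HZ = 60
LANES = 4  # assumed, roughly DVI/DisplayPort-like

pixels = WIDTH * HEIGHT                                   # ~2.07 million
serial_bits_per_second = pixels * BITS_PER_PIXEL * REFRESH_HZ

# Serial scan-out: all pixels share a few fast lanes.
print("serial: %.2f Gbit/s total, %.2f Gbit/s per lane"
      % (serial_bits_per_second / 1e9, serial_bits_per_second / LANES / 1e9))

# Fully parallel: one dedicated (slow) signal path per pixel, plus per-pixel
# storage behind each one -- the lead count is what explodes.
print("parallel: %s separate leads, each updated at %d Hz"
      % (format(pixels, ","), REFRESH_HZ))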
 

Arkadrel

Diamond Member
Oct 19, 2010
This thread = mind blown.
Good question.

@Daedalus685

Damn it Daedalus stop talking logic at us!
Killer of dreams :p

We just want a refresh of 1 complete picture at a time, instead of 1 pixel or 1 line at a time.
Is that too much to ask? :)

Complete pictures at a time = no screen tears, right?

Make it happen, someone!


If they are going to change something about the current interface, it's not the order of the bits that matters, I think; it's the fixed interval on which it's done.

What exactly would that accomplish though? Wouldn't you be introducing lag/micro-stutter
in the form of these non-fixed refresh intervals?
 

Daedalus685

Golden Member
Nov 12, 2009
This thread = mind blown.
Good question.

@Daedalus685

Damn it Daedalus stop talking logic at us!
Killer of dreams :p

We just want a refresh of 1 complete picture at a time, instead of 1 pixel or 1 line at a time.
Is that too much to ask? :)

Complete pictures at a time = no screen tears, right?

Make it happen, someone!

Heh, a complete screen change would mean no screen tears, but from your perspective the result would be identical to vsync, just with FAR more expensive monitors and likely more lag as well (or cables that cost several thousand dollars).

I just recently ordered a cable for a phased-array ultrasound probe. It is about half a cm in diameter and has 64 shielded leads. It cost 2400 per cable. We'd need about 32,000 of those cables to send a parallel screen image :)
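
The arithmetic behind that figure, for what it's worth (a quick sketch assuming one lead per pixel of a 1080p panel and the probe cable above):

Code:
LEADS_PER_CABLE = 64
CABLE_DIAMETER_CM = 0.5
pixels = 1920 * 1080                          # one lead per pixel -> 2,073,600 leads

cables = pixels / LEADS_PER_CABLE             # ~32,400 cables
flat_width_m = cables * CABLE_DIAMETER_CM / 100

print("cables needed: %.0f" % cables)
print("width if laid side by side: %.0f m" % flat_width_m)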
 

Fox5

Diamond Member
Jan 31, 2005
Right now, the graphics driver and the monitor need to synchronize to 60 Hz. Why not just send a packet stating the pixel number (imagine the top left is 0, and the bottom right is whatever the final pixel number is), followed by the data to update? If no data is received, the monitor can still maintain the last image. Transfer is still serial, but not linked to a 60 Hz refresh rate. I feel like we might see this in mobile first if it ever happens, where there might be power savings in doing so. Heck, isn't Intel already planning to do this for their mobile chipsets, with extra DRAM added to the display?
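
Purely for illustration, something like this hypothetical packet (the 32-bit index and 8-bit color fields are made up, not any real standard):

Code:
import struct

def encode_pixel_update(pixel_index, rgb):
    # 32-bit pixel index followed by 8-bit R, G, B values (7 bytes per update).
    r, g, b = rgb
    return struct.pack(">IBBB", pixel_index, r, g, b)

def decode_pixel_update(packet):
    index, r, g, b = struct.unpack(">IBBB", packet)
    return index, (r, g, b)

# Update only the top-left pixel to red; every other pixel keeps its last value
# until a packet arrives for it.
packet = encode_pixel_update(0, (255, 0, 0))
print(len(packet), "bytes per updated pixel ->", decode_pixel_update(packet))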
 

Daedalus685

Golden Member
Nov 12, 2009
Right now, the graphics driver and the monitor need to synchronize to 60 Hz. Why not just send a packet stating the pixel number (imagine the top left is 0, and the bottom right is whatever the final pixel number is), followed by the data to update? If no data is received, the monitor can still maintain the last image. Transfer is still serial, but not linked to a 60 Hz refresh rate. I feel like we might see this in mobile first if it ever happens, where there might be power savings in doing so. Heck, isn't Intel already planning to do this for their mobile chipsets, with extra DRAM added to the display?

We synchronize because it requires less data to be sent. You'd roughly double the required bandwidth and frame buffer by sending the pixel location along with the colour information, just to save some board complexity in the display.
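
Rough arithmetic behind the "roughly double" estimate (assuming 24-bit colour and an address just big enough for a 1080p panel):

Code:
import math

COLOUR_BITS = 24
pixels = 1920 * 1080
address_bits = math.ceil(math.log2(pixels))       # 21 bits for a 1080p panel

per_pixel_synchronized = COLOUR_BITS              # raster order: position is implicit
per_pixel_addressed = COLOUR_BITS + address_bits  # position sent with every pixel

print("address needs %d bits -> %.2fx the payload per pixel"
      % (address_bits, per_pixel_addressed / per_pixel_synchronized))
# ~1.9x, and effectively 2x once the address is padded to whole bytes.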
 

Arkadrel

Diamond Member
Oct 19, 2010
Right now, the graphics driver and the monitor need to synchronize to 60 Hz. Why not just send a packet stating the pixel number (imagine the top left is 0, and the bottom right is whatever the final pixel number is), followed by the data to update? If no data is received, the monitor can still maintain the last image. Transfer is still serial, but not linked to a 60 Hz refresh rate. I feel like we might see this in mobile first if it ever happens, where there might be power savings in doing so. Heck, isn't Intel already planning to do this for their mobile chipsets, with extra DRAM added to the display?
hmmmm....

Wouldn't you need to assign some type of memory that "remembered" the colour/location of each pixel then, and did some sort of check before sending new signals to each location that needed an update?

That process sounds like it would add a delay of some sort, i.e. input lag.

Also, what happens if they're out of sync? Would the screen just stay frozen on the last input it received in sync?
How long a "delay" would be acceptable (to help ensure sync)?

How much power would this save the screen/video card?
As a power-saving technique it sounds interesting, though.


I just recently ordered a cable for a phased-array ultrasound probe. It is about half a cm in diameter and has 64 shielded leads. It cost 2400 per cable. We'd need about 32,000 of those cables to send a parallel screen image :)
O_O

A 160-meter-diameter cable for a monitor? (32,000 x 0.5 cm = 16,000 cm = 160 m)
You're pulling my leg .... right?

You know that when the cable to the monitor is "thicker" than the monitor is big,
something is wrong with your design philosophy.

I guess I'm just stuck with screen tears..... a 100% no-screen-tear monitor should be a goal of some sort
that monitor manufacturers are trying to achieve, though. Maybe one day it'll be realised by some crazy inventor out there.... I can still dream.
 

Daedalus685

Golden Member
Nov 12, 2009
hmmmm....

Wouldn't you need to assign some type of memory that "remembered" the colour/location of each pixel then, and did some sort of check before sending new signals to each location that needed an update?

That process sounds like it would add a delay of some sort, i.e. input lag.

Also, what happens if they're out of sync? Would the screen just stay frozen on the last input it received in sync?
How long a "delay" would be acceptable (to help ensure sync)?

How much power would this save the screen/video card?
As a power-saving technique it sounds interesting, though.


O_O

A 160-meter-diameter cable for a monitor? (32,000 x 0.5 cm = 16,000 cm = 160 m)
You're pulling my leg .... right?

You know that when the cable to the monitor is "thicker" than the monitor is big,
something is wrong with your design philosophy.

I guess I'm just stuck with screen tears..... a 100% no-screen-tear monitor should be a goal of some sort
that monitor manufacturers are trying to achieve, though. Maybe one day it'll be realised by some crazy inventor out there.... I can still dream.

Heh, that example was just to illustrate a point. The kind of cable you'd need is not practical at all and likely isn't even possible to build :). Granted, the cable I bought was a specialty item, but it's already at about 32 AWG (or 34, I forget) per strand. You don't get much finer for a cable you want to last and work.
 

Wall Street

Senior member
Mar 28, 2012
A 160-meter-diameter cable for a monitor? (32,000 x 0.5 cm = 16,000 cm = 160 m)
You're pulling my leg .... right?

Cables are round; it would only be 160 meters if you made a flat cable.

More seriously, 1920x1080x32 @ 60 Hz requires about 4 Gb/s just for the color data, not including any error-correction or encoding overhead. Since few external interfaces deliver more than 10-20 Gb/s, I don't anticipate instant pixel refreshes.
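
The arithmetic, for anyone who wants to check it (the 8b/10b line at the end is my assumption about the link encoding, roughly what DVI/HDMI-style links use):

Code:
WIDTH, HEIGHT, BPP, HZ = 1920, 1080, 32, 60

raw_bits_per_second = WIDTH * HEIGHT * BPP * HZ
print("%.2f Gbit/s raw color data" % (raw_bits_per_second / 1e9))          # ~3.98
print("%.2f Gbit/s on the wire with 8b/10b-style encoding"
      % (raw_bits_per_second * 10 / 8 / 1e9))                              # ~4.98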

That being said, I think the most promising idea would be to use a couple of Thunderbolt connectors to connect a PC to a display with the GPU inside the display, so the pixels would be driven almost directly off the frame buffer at HDTV-like refresh rates (480 Hz, for example).

The main issue is that there is very little market demand. Look how few models of 120 Hz and 144 Hz displays are available. This is clearly only demanded by a small subset of hardcore gamers.
 

Arkadrel

Diamond Member
Oct 19, 2010
Cables are round; it would only be 160 meters if you made a flat cable.
Hahahah, you're right :D
Hadn't thought about that.


I think the most promising idea would be to use a couple of Thunderbolt connectors to connect a PC to a display with the GPU inside the display, so the pixels would be driven almost directly off the frame buffer at HDTV-like refresh rates (480 Hz, for example).
Interesting concept/idea.

I take it that's to overcome the data transfer rates needed to do instant full-screen refreshes?

That would mean your GPU was stuck (fused) to the TV/monitor, right?
If you introduced a socket for GPUs in TVs/monitors you'd be back at square one.
(same problem, different place)

So in order to get a better GPU, every time you wanted a video card upgrade you'd need to buy a TV + video card in one.
 

Wall Street

Senior member
Mar 28, 2012
That would mean your GPU was stuck (fused) to the TV/monitor, right?
If you introduced a socket for GPUs in TVs/monitors you'd be back at square one.
(same problem, different place)

So in order to get a better GPU, every time you wanted a video card upgrade you'd need to buy a TV + video card in one.

That would be the idea. A cable can only have a limited number of pins before you end up with a cost-prohibitive cable, but a chip on a PCB can have many more lines for less cost, and since the lines are shorter they could run at a higher frequency.

Even with a socket in the TV/monitor it could be faster (PCIe x16 has much higher bandwidth than Thunderbolt), but having the GPU on the same PCB as the panel controller would ideally give the highest bandwidth.
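
Some ballpark numbers (approximate, from memory) for how single links compare against what a 1080p panel would need at higher refresh rates:

Code:
# Approximate single-link budgets, from memory.
LINKS_GBIT = {"Thunderbolt": 10, "Thunderbolt 2": 20, "PCIe 3.0 x16": 126}

def needed_gbit(width=1920, height=1080, bpp=24, hz=60):
    return width * height * bpp * hz / 1e9

for hz in (60, 120, 480):
    need = needed_gbit(hz=hz)
    fits = [name for name, cap in LINKS_GBIT.items() if cap >= need]
    print("%3d Hz needs ~%5.1f Gbit/s -> fits on: %s" % (hz, need, ", ".join(fits)))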

Again, cost and a lack of demand hold this back, though, since the current 60 Hz standard (with some displays supporting 24 Hz / 72 Hz) is more than good enough for video, office work and 90% of games.