Is DVI limited to 1280x1024? And if so, why?

pm

Elite Member Mobile Devices
Jan 25, 2000
7,419
22
81
I have an NEC Multisync FP1350X and I noticed that the DVI output only goes up to 1280x1024. Then I noticed that this same resolution is the max DVI resolution on a whole lot of other monitors and video cards. Then I realized that I had never seen a resolution that was higher than this on DVI.

Is there a limit of 1280x1024 on DVI, and if so why is this the case? If not, then why does it seem like that is the maximum resolution that most cards and monitors support? It seems like the maximum is 1280x1024 - at least a few web searches seemed to confirm this. With monitors getting bigger, 1280x1024 is going to increasingly be "too low". I personally would like the option of 1600x1200 DVI on my 22" monitor.

DVI looks crisper and nicer than analog. Why give it such a low limit?
 

shadowfaX

Senior member
Dec 22, 2000
893
0
0
I was wondering the same thing too... excuse the non-technical speak I use here, but is it because it has to do with some sort of... hm... "stretching" of the pixels that CRTs are capable of doing?
 

crypticlogin

Diamond Member
Feb 6, 2001
4,047
0
0
If you're asking why the industry hasn't adopted larger resolutions, "blame" the Digital Display Working Group (DDWG) (half the links are members-only), since they're the consortium that handles the standardization. The DVI we have right now is only the first revision, and since *most* LCD monitors manufactured recently are in the 1024x768 to 1280x1024 range, it would seem wasteful to support anything higher. There is probably some behind-the-scenes work going on right now to approve and standardize the next revision.

If you're asking why the DVI interface itself doesn't lend itself to higher bandwidths, and thus higher resolutions and refresh rates... well, I'm guessing it's a mix of the market demand mentioned above and getting *a* standard in place before moving up (a la USB 1.1 -> 2.0).
 
heliomphalodon

May 15, 2002
245
0
0
The current limit is most certainly NOT 1280x1024 -- I have a Dell 2000FP with a native resolution of 1600x1200@60Hz and it works beautifully with the DVI output from either my 3Dlabs GVX1 Pro or my Gainward GeForce4 Ultra/750XP. In fact, the Gainward card drives both my Dell 2000FP and my Dell 1900FP (at 1600x1200@60Hz DVI and 1280x1024@60Hz DVI respectively) simultaneously without difficulty.

That being said, there is a standard-mandated upper limit on pixel rate just above 1600x1200@60Hz. Check these links:
http://www.3dlabs.com/product/dviresolution.htm
http://www.ddwg.org/dvi.html

BTW, I was considering buying a Matrox Parhelia until I learned that it can't support both my panels simultaneously at full resolution in digital mode...
 

Den

Member
Jan 11, 2000
168
0
0
60 Hz may be OK on a flat panel, but it looks hideous (to me and a lot of others, though admittedly not to everyone) on a CRT. The link you provided does indeed say it is limited to 1280x1024 at 85 Hz....
 
heliomphalodon

May 15, 2002
245
0
0
>> The link you provided does indeed say it is limited to 1280x1024 at 85 Hz... <<

Actually, what it says is:
"Single Link DVI supports a maximum bandwidth of 165 MHz (1920x1080 at 60 Hz, 1280x1024 at 85Hz)."

The two resolutions quoted are just examples. A simple calculation (counting visible pixels per second) shows:

1920x1080x60 = 124,416,000
1600x1200x60 = 115,200,000
1280x1024x85 = 111,411,200

I agree that a 60Hz refresh-rate is abysmal on a CRT, but it works fine on an LCD panel.
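
To make the comparison with the 165 MHz figure explicit, here is a quick Python sketch (my own illustration, not taken from the spec page): the numbers above count only visible pixels, while the pixel clock the link actually has to carry also includes blanking intervals. The total timings assumed below are the commonly quoted VESA/CEA ones, so treat them as illustrative.

```python
# Rough sanity check of modes against the single-link DVI pixel clock cap.
SINGLE_LINK_MHZ = 165.0  # single-link TMDS limit quoted above

# (name, active_w, active_h, total_w, total_h, refresh_hz)
# The total (active + blanking) timings are the commonly used VESA/CEA
# figures as far as I know -- treat them as illustrative, not authoritative.
modes = [
    ("1280x1024@85", 1280, 1024, 1728, 1072, 85),
    ("1600x1200@60", 1600, 1200, 2160, 1250, 60),
    ("1920x1080@60", 1920, 1080, 2200, 1125, 60),
]

for name, aw, ah, tw, th, hz in modes:
    active_rate = aw * ah * hz / 1e6   # visible pixels only, as in the numbers above
    pixel_clock = tw * th * hz / 1e6   # what the TMDS link actually has to carry
    verdict = "fits single link" if pixel_clock <= SINGLE_LINK_MHZ else "needs dual link"
    print(f"{name}: {active_rate:.1f} Mpix/s active, "
          f"{pixel_clock:.1f} MHz pixel clock -> {verdict}")
```

Even with blanking included, 1600x1200@60 comes out around 162 MHz, just under the 165 MHz single-link cap - which fits the observation that the limit sits right above that mode.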
 

CTho9305

Elite Member
Jul 26, 2000
9,214
1
81
Originally posted by: heliomphalodon

I agree that a 60Hz refresh-rate is abysmal on a CRT, but it works fine on an LCD panel.

I think that's why: at 60 Hz, the available bandwidth is still enough to drive an LCD at a resolution that is VERY high.
 

Kujack

Junior Member
Jul 3, 2002
12
0
0
Simply put, NO.

Most LCDs only support up to 1024 natively. Some will support higher, but I don't believe there is a practical limit to DVI's possible resolution.

It's probably the monitor that is the limiting factor.
 

kylef

Golden Member
Jan 25, 2000
1,430
0
0
Refresh rates for CRTs and LCDs are different concepts. There is a very good reason that 60 Hz looks horrible on a CRT but fine on an LCD panel. I realize that I'm going offtopic here, but I want to clear up a few misconceptions:

Each pixel in a digital LCD panel is addressable, and as such there is no need to "refresh" the entire screen just to change a single pixel value. In a CRT, the entire screen is repainted at each refresh interval, whether or not any screen pixels have changed in the video buffer since the last refresh. LCD panels, on the other hand, can change only those pixels which need to be changed and leave the others "untouched." As a result, there is no refresh "flicker" in LCD screens, contrasting sharply with the unpleasant flickering that our eyes notice when CRT refresh rates are too low.

Indeed, the idea of a "refresh rate" for LCD panels is somewhat misleading because there is no mandatory whole-screen refresh. To ascertain the "screen update speed" of flat panel displays, we must look elsewhere. Most TFT arrays inside LCD flat panel displays have pixel response times that are considerably slower than a 60 Hz refresh would require (and much slower than the phosphor in CRTs). In fact, I am unaware of a commercial TFT unit that achieves 60 Hz speed. For instance, the Viewsonic VE510+ has a relatively speedy pixel response time of 25 milliseconds. If you do the math, the fastest that this screen can possibly update is 40 Hz (but note that there is no fundamental physical reason that the display should need to refresh synchronously). So the DVI flat panel refresh rate that you see is actually the "video buffer transmission rate" -- in other words, the rate at which changes to the screen are sent from the video card to the LCD flat panel to request pixel updates.
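
To spell out that 25 ms -> 40 Hz arithmetic, here's a trivial sketch (my own illustration; the 16 ms figure is just a hypothetical faster panel thrown in for comparison):

```python
# Back-of-the-envelope: how fast can a panel with a given pixel response time
# actually complete full pixel transitions? Illustrative arithmetic only.

def max_transition_rate_hz(response_time_ms: float) -> float:
    """Fastest rate (Hz) at which a pixel can finish a full transition."""
    return 1000.0 / response_time_ms

for ms in (25.0, 16.0):  # 25 ms as in the VE510+ example; 16 ms is hypothetical
    print(f"{ms:.0f} ms response time -> roughly "
          f"{max_transition_rate_hz(ms):.0f} full transitions per second")
```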

A 25 millisecond response time is fine for watching NTSC video (30 Hz) or playing slow games, but hardcore gamers have known for quite some time that flat panel displays simply do not meet the requirements of high-speed high-frame-rate games. Many TFT manufacturers are working to improve pixel response time to accommodate this lucrative market.
 

Shalmanese

Platinum Member
Sep 29, 2000
2,157
0
0
Which raises the question: why are you using DVI for an analog monitor anyway?

DVI was designed for TFTs, which only require a 60 Hz refresh.
 

pm

Elite Member Mobile Devices
Jan 25, 2000
7,419
22
81
Which raises the question: why are you using DVI for an analog monitor anyway?
I have played at length with the analog and digital inputs to my monitor and there is no doubt in my mind that the digital simply looks better. I couldn't explain exactly why, but there is a noticeable difference, and I have no doubt that if I were to do some form of experiment where someone switched it back and forth between analog and digital, I would consistently pick the digital as the better looking of the two. Like I said, I couldn't describe it exactly, but the digital one looks "crisper".

I have never managed to get the DVI output of my GF3 card to work under Linux, so I use analog on Linux frequently, but if I accidentally boot up Windows with the analog mode still on, then it doesn't take me more than a minute of thinking "why does the screen look so weird?" before I realize and switch it back to DVI. Whether or not it was designed for LCDs, it makes for a cleaner picture on CRTs.
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
There is a frequency limit up there somewhere... but DVI can go to two transmission channels (dual link) feeding one display, halving the per-link transmission frequency when that happens. What holds us back currently is that there is a huge jump from 1280x1024 to 1600x1200 - 1400x1050 panels are starting to pop up in notebooks, but not yet as standalone displays. Lack of support for that resolution in graphics card drivers is part of the reason.
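
As a rough sketch of what dual link buys you (my own illustration, assuming the 165 MHz per-link figure quoted earlier in the thread; the example clock values are made up): the pixel stream is split across two TMDS links, so each link only has to carry half of the total pixel clock.

```python
# Sketch of the dual-link idea: split the pixel stream across two TMDS links
# so each one carries half the total pixel clock. The 165 MHz per-link limit
# is the figure quoted earlier in the thread; the example clocks are made up.

PER_LINK_LIMIT_MHZ = 165.0

def links_needed(total_pixel_clock_mhz: float) -> int:
    """1 if a single link suffices, 2 if the mode has to go dual link."""
    return 1 if total_pixel_clock_mhz <= PER_LINK_LIMIT_MHZ else 2

for clock in (162.0, 200.0):
    n = links_needed(clock)
    print(f"{clock:.0f} MHz total -> {n} link(s), {clock / n:.1f} MHz per link")
```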

regards, Peter
 

hans007

Lifer
Feb 1, 2000
20,212
18
81
DFP was the standard DVI was built on. DFP could only do 1280x1024; they doubled the available bandwidth with DVI. There is a long article on tomshardware.com about it.
 

NuclearFusi0n

Diamond Member
Jul 2, 2001
7,028
0
0
Originally posted by: pm

Whether or not it was designed for LCDs, it makes for a cleaner picture on CRTs.

um, CRTs can't use digital-only connectors
 

Mday

Lifer
Oct 14, 1999
18,647
1
81
Originally posted by: NuclearFusi0n

um, CRTs can't use digital-only connectors

Traditional CRTs cannot. However, there has been word of CRTs that accept a digital signal; obviously that means the monitor has a DAC. I think ViewSonic has such a product.

There are different flavors of "DVI", usually noted with a -something at the end. Each lends compatibility with existing standards via adapters. For example, ATI's dongles: they have a DVI-to-VGA adapter as well as a DVI-to-component one.

 

NuclearFusi0n

Diamond Member
Jul 2, 2001
7,028
0
0
Originally posted by: Mday

There are different flavors of "DVI", usually noted with a -something at the end. Each lends compatibility with existing standards via adapters. For example, ATI's dongles: they have a DVI-to-VGA adapter as well as a DVI-to-component one.
http://www.hometheaterhifi.com/qa/images/DVI-video-connectors-diagram.jpg
Check out the top two. Those pins on the left of the connector aren't digital, they are analog. All a "DVI to VGA converter" does, like the ATI ones, is use those pins.
 

pm

Elite Member Mobile Devices
Jan 25, 2000
7,419
22
81
um, CRTs can't use digital-only connectors
My NEC CRT monitor definitely has DVI. I take a DVI cable and hook the DVI output of my GF3 Ti200 to the DVI input of my NEC FP1350X. There is definitely no analog signal going through the connectors.

Here's a review of the monitor that I have.

Quoting from the review:
Among the first major product releases by the new corporation is the NEC MultiSync FP1350X, a 22" (20" viewable image) monitor capable of resolutions up to 1920x1440 at 76Hz. The FP1350X incorporates what NEC-Mitsubishi calls Ambix technology: compatibility with both digital and analog video cards. Indeed, the back of the monitor provides both mini D-sub and DVI-I connectors as well as USB connectors for monitor control.
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
umm pm, you are aware that DVI-I connectors do carry the classic analog VGA signal side by side with the digital signal, aren't you?

You can identify which of the signals your display device uses by looking at the blade in its DVI connector. Blade in parallel to the long sides of the connector -> analog. Blade upright -> digital.

regards, Peter
 

pm

Elite Member Mobile Devices
Jan 25, 2000
7,419
22
81
Yes, I am aware of that, Peter. But I'm pretty sure that when I set up all of the DVI cabling and push the DVI button on the monitor that I am using the digital input.

My thinking:
  • my monitor's specs clearly state that it supports digital inputs.
  • pressing the "DSub" button that switches between the analog input and the digital input changes the screen noticeably.
  • the DVI output of my card doesn't work under Linux and thus I don't get a signal - but the analog output works fine.
  • the maximum resolution of the monitor in Windows with DVI enabled is 1280x1024. With the analog input selected it is 1920x1440.
  • the display properties tab, under Advanced -> device selection, says "digital".

I'm pretty sure that when I hook it up, I am digital.
 
heliomphalodon

May 15, 2002
245
0
0
Couldn't it just be that the NEC MultiSync FP1350X has an internal D/A converter appropriate for the DVI digital input? Looking at the information (for the FP1375X) on the necmitsubishi web site seems to indicate that this is the case. It seems to me that the later in the signal path that the conversion to analog takes place, the better. Performing the conversion inside the monitor (even if it's a CRT) would eliminate any problem of signal degradation in the monitor cable.

Does the FP1350X driver distinguish between the analog and digital input modes? My Dell 2000FP has both analog and digital inputs -- if I use the analog input, the display properties (under XP Pro) call the monitor a "Dell 2000FP" but if I use the digital input, it's called a "Dell 2000FP (Digital)" and the list of available modes for the video card ("Hide modes that this monitor cannot display." is checked) is different as well.
 

Buddha Bart

Diamond Member
Oct 11, 1999
3,064
0
0
The degradation could be in the cable, but more than likely the D/A conversion in the monitor is just done with much nicer electronics than they pack onto your average GeForce. Kinda like how Anand explained in the recent article on futzing with the low-pass filters on GeForce cards. I imagine that if the monitor does the conversion inside the main shielding it already has to have for all the interference it puts off, its parts don't need to conform to the FCC spec like AGP cards would (seeing as they have nothing to shield them).

bart
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
If NEC really use the digital signal, they sure have one big advantage - it's not so much about filtering, it's that connectors and length of cable introduce quite a bit of parasitic capacitance that will mess up your analog signal quality no matter what you do. Keeping the signal digital (and thus lossless) as far as possible will keep it clearer - something the audio community has already found out a while ago.

regards, Peter
 

BeeMojo

Junior Member
May 10, 2001
22
0
0
There seems to be a little confusion here between DVI, DVI-I, and DVI-D.

DVI-D is used for digital connections only, and you can have some pretty long cables with it. It is for digital monitors, like LCDs, plasmas, etc., not for CRTs.

On a video card with both analog and DVI outputs, usually the DVI will not work with a CRT. However, the newer DVI specs have analog signals passing through the DVI connector as well.

If you think that a CRT with a DVI connector on the back is "digital", it's NOT; if it were, there would have to be an internal DAC to convert the signal back to analog so the CRT could use it. In this case, the only thing that is happening is that the DVI connector is becoming a standard that supports both analog and digital signals in one form factor.

The limit for TRUE digital is as stated in the spec above, and is set not by the connector and spec alone, but by the TMDS transmitter on the video card (similar in function to the RAMDAC on an analog-only video card). Most of these are limited to 1600x1200. DVI is the second widely used digital connector; DFP was the first, and it had a max of 1280x1024 (though I have run them up to 1600x1024).

Many cards used external TMDS transmitters. You may remember that the GeForce2 used an internal one, and it sucked, so most manufacturers put in an external one made by Silicon Image. Now, internal TMDS transmitters are becoming common.

So in terms of a pure digital signal to a flat panel, 50 to 60 Hz is adequate for 99% of situations. Digital flat panels have some drawbacks, and in situations where the frame rate (not the refresh rate) needs to be above 60, a high-end CRT will be better.

The best thing about digital is that there are no color or focus issues; all cards produce crisp, high-contrast pictures. This levels the playing field and makes companies like Matrox no better than anyone else running a DVI port in terms of picture quality, because the conversion to analog is gone (unless you are using a CRT).

Correct me if I am wrong.
 
heliomphalodon

May 15, 2002
245
0
0
Originally posted by: BeeMojo

If you think that a CRT with a DVI connector on the back is "digital", it's NOT; if it were, there would have to be an internal DAC to convert the signal back to analog so the CRT could use it. In this case, the only thing that is happening is that the DVI connector is becoming a standard that supports both analog and digital signals in one form factor.

Correct me if I am wrong.

I think you might be wrong with respect to the NEC monitors that have been discussed in this thread. Take a careful look at the documentation available on the NEC-Mitsubishi web site.

1) The page says that the monitor incorporates PanelLink technology.
2) The User's Manual states that the monitor comes with both a DVI-D to DVI-D cable and a DVI-A to D-Sub cable.
3) One of the monitor controls is "DVI Selection" -- under which either an analog or a digital signal is chosen on the monitor's DVI input.
4) The specifications of the monitor list "DVI Digital Input -- TMDS, 165MHz MAX, single link"
5) The monitor includes "Ambix" technology, which is NEC-Mitsubishi's digital display technology and is compatible with DFP.

I think you're right with respect to the vast majority of CRTs, but the NEC-Mitsubishi MultiSync FP1350X and FP1375X are exceptions -- they actually support true DVI-D digital inputs.

If I hadn't already migrated to dual LCD panels, I'd be looking hard at the FP1375X for my next monitor.