Are there any video cards which support output on 2 x DVI + 1 x HDMI ?

Axonn

Senior member
Oct 14, 2008
216
0
0
Hello everybody ::- )

I have 1 x DVI input monitor, 1 x Analog input (via DVI adaptor) monitor and 1 x HDMI (which goes to a Denon Receiver which is connected to my Panasonic TV).

My Radeon 6850 can only handle 2 of these at the same time, which is VERY annoying. I always use my DVI input monitor, but I am annoyed that I have to disconnect my 2nd monitor whenever I want to use the HDMI.

I already understand that there is a limitation on the number of digital outputs on video cards.

Is there any video card which has moved past this frustrating limitation?

Any other ways to circumvent this?

Thank you in advance!
 

Jimzz

Diamond Member
Oct 23, 2012
4,399
190
106
Yeah, get a card that supports DisplayPort and use active adapters for the DVI part of it.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Would you pay $30 to add the 3rd digital output (DVI) to your existing 6850?

All you need to buy is an active single-link DisplayPort-to-DVI adapter. I think your card uses Mini DisplayPort? If not, you can get mini or full-size versions for around $30.

That will let you drive all 3 displays at the same time. You could throw a cheap DVI-HDMI adapter on that displayport adapter if you want.

Here's an example from Newegg (make sure you consider the active, not passive, adapters):

http://www.newegg.com/Product/Product.aspx?Item=N82E16814999034
 

Axonn

Senior member
Oct 14, 2008
216
0
0
Thanks for answering, guys ::- ). This is my video card:

http://aphnetworks.com/reviews/gigabyte_radeon_hd_6850_1gb_oc/3

So apparently it also has a DisplayPort.

But I think you might have misunderstood my question, because you mention getting a 3rd digital output, DVI.

I don't need 3 DVI.

I need 1 DVI (main monitor), 1 HDMI (TV) and 1 Analog (secondary monitor which works via a DVI adapter).

Anyway, inferring from your information, here is what I suppose I should do, so please tell me if this is correct:

1. I leave my DVI main monitor how it is.
2. I leave my HDMI TV how it is.
3. I add an active DisplayPort adapter: I plug it into the DisplayPort so that I have DisplayPort -> DVI -> Analog. This, in total, means *TWO ADAPTERS*: one from DP to DVI and one from DVI to my analog monitor.

Is all this correct?
 

Lorthreth

Member
Aug 14, 2004
120
0
86
paint.ruokamo.eu
Reading the HD 6850's manual, there are two options for you:
  • HDMI+DP+DVI
  • "CRT"+HDMI+DP
So either get a DP-to-DVI-D adapter and use it for your main monitor + HDMI + analog adapter.
Or get a DP-to-VGA adapter and use it for the analog monitor + HDMI + DVI main monitor.

The DP-to-DVI-D adapters can't be used with a simple analog adapter, because the analog adapter needs DVI-I, which carries the analog signals; DVI-D carries only digital signals.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Your suggestion is correct, but I would change it as follows:

1. Put your DVI main monitor on the displayport using the adapter below
2. Leave HDMI as-is
3. Use the DVI-VGA analog adapter on the card's DVI port. (You likely already have this adapter; they cost about $1 and have been included in the box with pretty much every video card sold for over a decade.)

My justification is that I prefer to *not* piggyback multiple adapters together if possible, so I like using at most one adapter per port. Also, I'm thinking your video card may treat the displayport as display #1, so it's nice to have your main monitor on that (e.g., try the option to identify monitors). This can have an effect when you are switching between different eyefinity/extended desktops, but is pretty minor.

Please note, the card you linked appears to have a full-size DisplayPort. So, you'd get this adapter instead:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814999030

Edit: PS Are you planning on getting a new monitor any time soon, like one of those 27" 1440p models you see at places for $300-400? If so, you could just use that monitor's native DisplayPort input and save the money by avoiding a dedicated active DisplayPort-to-DVI adapter.
 
Last edited:

Axonn

Senior member
Oct 14, 2008
216
0
0
Thank you Lorthreth & KingFatty.

KingFatty, big kudos to you for both posts. Indeed, your alternative makes more sense to me. I will do exactly like that. BTW, should I get an "active" adapter in this case as well? What's the difference between active and passive anyway?

As for the monitor: I have a 4-year-old Samsung 244T. It has a PVA panel (very high quality at the time) with 8-bit color. There's no way I'm going to change it anytime soon, and other good-quality monitors are pretty expensive, so I'm gonna stick with this one a while longer.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Yes, I suggest getting the "active" adapter. The difference is that it enables the use of 3 simultaneous displays and supports output up to around 1920x1200 (that's how I'm using mine, on a Samsung SyncMaster T240 @ 1920x1200).

The passive adapter would only work if you limit yourself to 2 displays: the card would use one of its two timing clocks for the passive DisplayPort adapter, and the other clock would go to either the remaining DVI port or the HDMI port. So the card has 2 clocks to work with; a passive adapter doesn't add another clock, but an active adapter does (which is what lets you drive 3 monitors). The $30 active adapter is the better value because it gives you the flexibility to use any single-link display and still get triple monitor. Note: dual-link active adapters run around $100, but you'd only need one to connect a dual-link monitor to the DisplayPort, and that's really only an issue if you're connecting multiple dual-link monitors; otherwise you'd simply use the card's dual-link DVI port to run that dual-link 1440p monitor.
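
If it helps to picture the clock counting, here's a toy Python sketch of the rule as I understand it (just my own illustration, not anything from AMD's documentation): every non-DisplayPort output eats one of the card's two internal clocks, while a native DisplayPort monitor or an active adapter brings its own.

INTERNAL_CLOCKS = 2  # the 6850's two internal timing clocks (per the rule above)

def can_drive(outputs):
    # DVI, HDMI, VGA, and *passive* DP adapters each consume one internal clock;
    # a native DP monitor or an *active* DP adapter supplies its own clock.
    clocks_needed = sum(1 for o in outputs if o not in ("dp-native", "dp-active"))
    return clocks_needed <= INTERNAL_CLOCKS

# Your target setup: DVI monitor on the active DP adapter, HDMI TV, VGA monitor on the DVI port.
print(can_drive(["dp-active", "hdmi", "vga"]))   # True -> all 3 run at once
# The same three displays with a passive DP adapter instead:
print(can_drive(["dp-passive", "hdmi", "vga"]))  # False -> only 2 of the 3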
 

Axonn

Senior member
Oct 14, 2008
216
0
0
Thanks for the comprehensive explanation KingFatty. Now I know what to do ::- D.