Connect ATI Radeon to Dell Monitor

mediarays

Junior Member
Jun 16, 2012
5
0
0
Hi,
I have a Radeon HD video card with a DisplayPort output and a Dell U2311H monitor, which also has a DisplayPort input. I want to use the DisplayPort on the monitor because the other connections are taken up by my Xbox and PS3, so it's the only one available.
Which is the best option?
1. Connect DisplayPort to DisplayPort using a standard DisplayPort cable.
2. Connect using HDMI to DisplayPort (I think this is not available).
3. Connect using DVI to DisplayPort.
Also, will there be any quality loss when using DisplayPort to VGA or DVI?
Help me decide.
 

palladium

Senior member
Dec 24, 2007
539
2
81
In your situation you can use any of the options except VGA; I don't see why there should be any difference in image quality between the three digital connections. If you have a DP cable, use that, simply because it's the easiest option.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
VGA is horrible.
HDMI tends to have scaling issues.
DVI always works.
DP only makes sense if you need the bandwidth, which you usually don't. The DP connectors are also horrible and an accident waiting to happen in terms of physical damage.
 

mediarays

Junior Member
Jun 16, 2012
5
0
0
VGA is horrible.
HDMI tends to have scaling issues.
DVI always works.
DP only makes sense if you need the bandwidth, which you usually don't. The DP connectors are also horrible and an accident waiting to happen in terms of physical damage.
Actually, my question is this: my graphics card and monitor both have DisplayPort, so can I connect them with a DisplayPort-to-DisplayPort cable or not? I want to buy this cable: http://bit.ly/M1xXat.
Since DisplayPort to HDMI does not work, I am asking this question; please come to the point.
 

birthdaymonkey

Golden Member
Oct 4, 2010
1,176
3
81
Be careful when choosing a DisplayPort cable to use with a Radeon video card and a Dell monitor. There is an issue with many of them connecting pin 20, which is supposed to be left unconnected. This creates weird problems. For example, when my monitor was plugged in via a StarTech mini-DP to DP cable, my computer simply would not boot. You could plug the monitor in after it had booted and it worked fine, however. I think the Belkin cable you indicated earlier is a good one, but other popular cables (such as the Monoprice ones) will have problems. Also, are you sure your video card's port is full-size DP and not mini-DP? Most gaming-class cards use mini-DP connectors.
 

palladium

Senior member
Dec 24, 2007
539
2
81
DP only makes sense if you need the bandwidth, which you usually don't. The DP connectors are also horrible and an accident waiting to happen in terms of physical damage.

Really? I thought they were pretty robust, though the lack of retaining screws does suck.

DisplayPort has NOTHING on DVI... I'm all about DVI over here.

I think it's the other way around. Connectors aside, DP is superior to DVI in every respect. On my current Dell U2711, I find DL-DVI very cumbersome to deal with: if the connectors are slightly loose on either end I get image distortion (this does not happen with single-link DVI), and I can't have 10-bit colour output over DL-DVI. Stuff like 4K2K would definitely require DP.
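To put some rough numbers on the bandwidth point, here is a quick back-of-the-envelope sketch of my own (not from anyone's spec sheet): it estimates the payload bit rate a few modes need, assuming roughly 10% blanking overhead, and compares that against approximate payload rates for single/dual-link DVI and 4-lane DP 1.1/1.2. Exact pixel clocks depend on the timing standard, so treat these as ballpark figures only.

```python
# Back-of-the-envelope: display-mode bandwidth vs. approximate link capacity.
# Assumptions: ~10% blanking overhead; the link limits below are payload
# rates (8b/10b line-coding overhead already excluded).

LINK_LIMITS_GBPS = {
    "single-link DVI":  3.96,   # 165 MHz pixel clock x 24 bpp
    "dual-link DVI":    7.92,   # 2 x 165 MHz x 24 bpp
    "DP 1.1 (4 lanes)": 8.64,   # 10.8 Gbit/s raw minus 8b/10b overhead
    "DP 1.2 (4 lanes)": 17.28,  # 21.6 Gbit/s raw minus 8b/10b overhead
}

def required_gbps(width, height, refresh_hz, bits_per_channel, blanking=1.10):
    """Approximate payload bit rate (Gbit/s) for a given video mode."""
    pixel_rate = width * height * refresh_hz * blanking    # pixels per second
    return pixel_rate * bits_per_channel * 3 / 1e9          # 3 channels (RGB)

modes = [
    ("1920x1080 @ 60 Hz, 8 bpc",  1920, 1080, 60, 8),
    ("2560x1440 @ 60 Hz, 8 bpc",  2560, 1440, 60, 8),
    ("2560x1440 @ 60 Hz, 10 bpc", 2560, 1440, 60, 10),
    ("3840x2160 @ 60 Hz, 8 bpc",  3840, 2160, 60, 8),
]

for name, w, h, hz, bpc in modes:
    need = required_gbps(w, h, hz, bpc)
    fits = [link for link, cap in LINK_LIMITS_GBPS.items() if cap >= need]
    print(f"{name}: ~{need:.1f} Gbit/s -> fits: {', '.join(fits) or 'none'}")

# Note: even where the raw rate would fit, DVI's TMDS pixel format carries
# 8 bits per channel, so 10-bit output needs DP (or HDMI deep colour)
# regardless of bandwidth.
```

Under these assumptions, 2560x1440 at 8 bpc is comfortable over DL-DVI, while 4K-class modes only fit DP 1.2, which matches the 4K2K point above.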
 

borisvodofsky

Diamond Member
Feb 12, 2010
3,606
0
0
Really? I thought they were pretty robust, though the lack of retaining screws does suck.



I think it's the other way around. Connectors aside, DP is superior to DVI in every respect. On my current Dell U2711, I find DL-DVI very cumbersome to deal with: if the connectors are slightly loose on either end I get image distortion (this does not happen with single-link DVI), and I can't have 10-bit colour output over DL-DVI. Stuff like 4K2K would definitely require DP.

10-bit? Oh please, only uber pro photo PRINTING guys (not even post people) use 10-bit, and they certainly wouldn't use a ghetto 10-bit U2711.

They'd use a high-end CRT, or an NEC :thumbsup:

And what media comes in 10-bit? LOLOL.

I got 10-bit on my ghetto Epson 8350, have yet to use the mode, epic useless.
 

Mistwalker

Senior member
Feb 9, 2007
343
0
71
If your video card and display both natively support DisplayPort, there is NO reason NOT to use it. No adapters needed, maximum bandwidth, win/win.
 

mediarays

Junior Member
Jun 16, 2012
5
0
0
Be careful when choosing a DisplayPort cable to use with a Radeon video card and a Dell monitor. There is an issue with many of them connecting pin 20, which is supposed to be left unconnected. This creates weird problems. For example, when my monitor was plugged in via a StarTech mini-DP to DP cable, my computer simply would not boot. You could plug the monitor in after it had booted and it worked fine, however. I think the Belkin cable you indicated earlier is a good one, but other popular cables (such as the Monoprice ones) will have problems. Also, are you sure your video card's port is full-size DP and not mini-DP? Most gaming-class cards use mini-DP connectors.
Yes, it has only a full-size DisplayPort, not mini-DP. The Belkin costs $55 and the other cable costs only $6, and both are DP-to-DP cables. :)
 

palladium

Senior member
Dec 24, 2007
539
2
81
10-bit? Oh please, only uber pro photo PRINTING guys (not even post people) use 10-bit, and they certainly wouldn't use a ghetto 10-bit U2711.

They'd use a high-end CRT, or an NEC :thumbsup:

And what media comes in 10-bit? LOLOL.

I got 10-bit on my ghetto Epson 8350, have yet to use the mode, epic useless.

I'll use an NEC or a high-end CRT if I have the cash :)

Many newer Japanese anime are encoded in 10-bit H.264. Even without 10-bit media, you still benefit from the higher colour precision through reduced dithering and better gradient reproduction. Moreover, I don't see any disadvantage to using DP to enable 10-bit output versus using DL-DVI on the exact same monitor with "only" 8-bit. I do concede that the difference is minimal, but why not use it if it is there?
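As a tiny illustration of the gradient point, here is a throwaway calculation of my own (the 2560-pixel width is just an assumed example panel): quadrupling the number of per-channel levels makes each step in a full-width grey ramp about four times narrower, which is why banding, and the dithering needed to hide it, are reduced.

```python
# Illustrative only: banding step width in a full-width grey ramp.
WIDTH_PX = 2560  # assumed panel width (e.g. a 2560x1440 monitor)

for bits in (8, 10):
    levels = 2 ** bits               # distinct values per channel
    band_px = WIDTH_PX / levels      # pixels per visible step in the ramp
    print(f"{bits} bpc: {levels} levels, ~{band_px:.1f} px per band")

# 8 bpc: 256 levels -> ~10 px bands; 10 bpc: 1024 levels -> ~2.5 px bands.
```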
 

iCyborg

Golden Member
Aug 8, 2008
1,342
59
91
10-bit? Oh please, only uber pro photo PRINTING guys (not even post people) use 10-bit, and they certainly wouldn't use a ghetto 10-bit U2711.

They'd use a high-end CRT, or an NEC :thumbsup:

And what media comes in 10-bit? LOLOL.

I got 10-bit on my ghetto Epson 8350, have yet to use the mode, epic useless.
How about higher supported resolutions, spread-spectrum clocking to reduce EMI, audio support, the ability to adjust the number of lanes and the link rate for optimal signal quality and power consumption, driving DVI and HDMI through a passive adapter (VGA needs an active one), support for multiple monitors over one cable, a fast auxiliary channel capable of supporting USB 3.0... LOLOL...

Sure, you personally may not need many or all of them, but can you name one advantage that DVI has over DP?