Converting R9 390 output to VGA

Altus7

Junior Member
Feb 19, 2014
24
0
16
I need to get my new R9 390 to work with my VGA CRT. I think I've found the right kind of DAC adapter that actively converts DP to VGA, but I'm worried it might not handle the bandwidth of 1600x1200 @85Hz @32bpp.

http://www.startech.com/eu/AV/Displ...ayPort-to-VGA-Video-Adapter-Converter~DP2VGA2

Our adapter will work at 5.8GHz. A resolution of 1600x1200@85 only requires 4.9GHz so this adapter should work fine for your set up.

Do they mean that I only need 4.9Gbit/s and that their adapter supports 5.8? Shouldn't my display mode require 1600 * 1200 * 85 * 32 = 5222400000, or 5.2Gbit/s?
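Checking my arithmetic with a quick Python sketch (raw payload only, ignoring blanking and any link overhead):

```python
# Raw payload for 1600x1200 @ 85 Hz @ 32bpp: pixels per frame, times
# refresh rate, times bits per pixel. No blanking, no link overhead.
bits_per_second = 1600 * 1200 * 85 * 32
print(bits_per_second)                        # 5222400000
print(f"{bits_per_second / 1e9:.1f} Gbit/s")  # 5.2 Gbit/s
```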
 

freeskier93

Senior member
Apr 17, 2015
487
19
81
Does your R9 390 not have a DVI-I port? Or is that not a thing anymore...

I mean the adapter says up to 1920x1200; it seems reasonable to assume it will work with common refresh rates for CRTs.

Really the bigger question is WTF are you hooking up an R9 390 to a CRT monitor?!
 

MongGrel

Lifer
Dec 3, 2013
38,466
3,067
121
Really the bigger question is WTF are you hooking up an R9 390 to a CRT monitor?!

Kind of wonder about that myself a bit.

It's just a matter of the right cable I imagine, but does seem a bit odd.

Sounds like you might have considered a monitor upgrade prior to a GPU, but I guess I really do not know what GPU you had prior to that.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Do they mean that I only need 4.9Gbit/s and that their adapter supports 5.8? Shouldn't my display mode require 1600 * 1200 * 85 * 32 = 5222400000, or 5.2Gbit/s?

If they are saying that their adapter works up to 5.8GHz, and they stand behind that statement, you are safe to buy it; if it doesn't work, you can just return it. Also, see if you can find it cheaper on Amazon or somewhere, as that price is about double what it costs in the US.
 

Seba

Golden Member
Sep 17, 2000
1,596
258
126
Instead of spending 35 EUR on that adapter, you should buy an LCD monitor with at least a DVI input.
 

cyclohexane

Platinum Member
Feb 12, 2005
2,837
19
81
Instead of spending 35 EUR on that adapter, you should buy an LCD monitor with at least a DVI input.

This. And you can't make the argument that CRT is better than flat panels, not in 2015 going on 2016.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Do they mean that I only need 4.9Gbit/s and that their adapter supports 5.8? Shouldn't my display mode require 1600 * 1200 * 85 * 32 = 5222400000, or 5.2Gbit/s?
VGA is analog, not digital. I'm not sure where that "5.8GHz" stuff comes from, but regardless that is a bog-standard DP to VGA adapter. It will work just fine for your needs.

And technically your math is off on the display. Due to the padding required for monitor blanking intervals, the resolution transmitted from the video card to the adapter is going to be closer to 1700 x 1275 or so (note that this is just a quick & dirty guess). On the other hand, color data is only 24bpp (the last 8 bits are useless alpha info that isn't transmitted). So the bandwidth requirement is somewhere around 1700x1275*85*24, or around 4.4Gbps.
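To put numbers on that, a quick Python sketch (the padded 1700 x 1275 totals below are just my rough guess at the blanking intervals, not exact CVT timings):

```python
# Link-bandwidth estimate for 1600x1200 @ 85 Hz over a DP-to-VGA adapter.
# The padded 1700 x 1275 totals are a rough guess at the blanking
# intervals, not exact CVT timings.

def bandwidth_gbps(h_total, v_total, refresh_hz, bits_per_pixel):
    """Bits on the wire per second, in Gbit/s."""
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

naive = bandwidth_gbps(1600, 1200, 85, 32)   # no blanking, alpha included
padded = bandwidth_gbps(1700, 1275, 85, 24)  # blanking padded, alpha dropped

print(f"naive:  {naive:.1f} Gbit/s")   # 5.2
print(f"padded: {padded:.1f} Gbit/s")  # 4.4
```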
 

MajinCry

Platinum Member
Jul 28, 2015
2,495
571
136
CRTs don't have any of that ghosting nonsense, and there are several models with 5:4 resolutions higher than 1280x1024 and 4:3 resolutions higher than 1024x768.

Square is justice.
 

freeskier93

Senior member
Apr 17, 2015
487
19
81
VGA is analog, not digital. I'm not sure where that "5.8GHz" stuff comes from, but regardless that is a bog-standard DP to VGA adapter. It will work just fine for your needs.

And technically your math is off on the display. Due to the padding required for monitor blanking intervals, the resolution transmitted from the video card to the adapter is going to be closer to 1700 x 1275 or so (note that this is just a quick & dirty guess). On the other hand, color data is only 24bpp (the last 8 bits are useless alpha info that isn't transmitted). So the bandwidth requirement is somewhere around 1700x1275*85*24, or around 4.4Gbps.

VGA is analog, yes, but since DP is digital the conversion is bandwidth limited.
 

SPBHM

Diamond Member
Sep 12, 2012
5,065
418
126
1600x1200 at 85Hz, that's probably a pretty nice CRT;
my cheap CRTs were all 1280x1024 max with horrible flicker.
 

f2bnp

Member
May 25, 2015
156
93
101
Wish I could use my Lacie Electron 19 Blue IV with my R9 290 as well. High resolution CRTs look amazing.
I might just go ahead and buy an adapter as well.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
CRTs don't have any of that ghosting nonsense, and there are several models with 5:4 resolutions higher than 1280x1024 and 4:3 resolutions higher than 1024x768.

Square is justice.

Yeah, the closer to square screens fill more of your field of view... said no one with two working eyes ever.
 

SPBHM

Diamond Member
Sep 12, 2012
5,065
418
126
Yeah, the closer to square screens fill more of your field of view... said no one with two working eyes ever.

4:3 is fine. When 4:3 CRTs died, 5:4, which is even more square, became the standard for PCs for a while,

and my "dream CRT" at the time was 16:10, which I still think is the best aspect ratio for PCs.

[Image: GDM-FW900]
 

Altus7

Junior Member
Feb 19, 2014
24
0
16
VGA is analog, not digital. I'm not sure where that "5.8GHz" stuff comes from, but regardless that is a bog-standard DP to VGA adapter. It will work just fine for your needs.

And technically your math is off on the display. Due to the padding required for monitor blanking intervals, the resolution transmitted from the video card to the adapter is going to be closer to 1700 x 1275 or so (note that this is just a quick & dirty guess). On the other hand, color data is only 24bpp (the last 8 bits are useless alpha info that isn't transmitted). So the bandwidth requirement is somewhere around 1700x1275*85*24, or around 4.4Gbps.

Thanks, that's exactly what I was wondering. I remember reading about monitor blanking, but wasn't sure whether to include it. It also made no sense to calculate for 24 bits when I select 32bit truecolor in Windows.

What I asked them about was the limit of the input bandwidth that the DAC could handle before the output VGA would suffer quality loss, compared to the digital input.

Perhaps it's 4.9Gbit with DisplayPort overhead? Wouldn't the DAC have to create the blanking lines, since the digital input is pure 1600x1200?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
4:3 is fine. When 4:3 CRTs died, 5:4, which is even more square, became the standard for PCs for a while,

and my "dream CRT" at the time was 16:10, which I still think is the best aspect ratio for PCs.

I went from this to this to this. Between faded whites/colours (CRTs lose 50%+ of their brightness over time), geometry distortion issues, fuzzy text, eye strain/headaches over extended daily use, and tiny CRT screen size, no proper height adjust, swivel or vertical pivot rotation, I do not miss my CRT days at all. Outside of twitch FPS gaming and deep black levels, CRT's limitations are just too high. I cannot imagine using a 19-23 CRT for productivity, movies, picture viewing and games in 2016 with so many amazing monitors that do both.

It's actually possible now to purchase a 27" 4K IPS monitor for about $500-550 US:

Dell Ultra HD 4k Monitor P2715Q 27-Inch Screen LED-Lit Monitor

And even a 27" 4K IPS + FreeSync:
LG Electronics MU67 27MU67 27" Screen LED-Lit Monitor
 

Altus7

Junior Member
Feb 19, 2014
24
0
16
Instead of spending 35 EUR on that adapter, you should buy a LCD monitor which has at least a DVI input.

This. And you can't make the argument that CRT is better than flat panels, not in 2015 going on 2016.

*sigh*

I'll gladly buy a flatscreen once I can get one with no more than 1ms of ghosting, CRT-quality brightness/contrast, and a price under 500 EUR.

(http://www.blurbusters.com/faq/60vs120vslb/)

Until then I'll stick with my trusty ViewSonic Professional Series P97f+.


Wish I could use my Lacie Electron 19 Blue IV with my R9 290 as well. High resolution CRTs look amazing.
I might just go ahead and buy an adapter as well.

This guy knows what's up.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Thanks, that's exactly what I was wondering. I remember reading about monitor blanking, but wasn't sure whether to include it. It also made no sense to calculate for 24 bits when I select 32bit truecolor in Windows.

What I asked them about was the limit of the input bandwidth that the DAC could handle before the output VGA would suffer quality loss, compared to the digital input.

Perhaps it's 4.9Gbit with DisplayPort overhead? Wouldn't the DAC have to create the blanking lines, since the digital input is pure 1600x1200?
The digital input isn't pure 1600x1200. For compatibility reasons we pad for blanking intervals on digital connections as well. Hence my earlier math.

Anyhow, I've used a ton of these before. I can assure you, they're just fine for 1600x1200@85Hz.:)
 

Bramble

Junior Member
Dec 29, 2015
4
0
0
I'm trying to use a DisplayPort-to-VGA adaptor from my R9 390 to my Mitsubishi 2070SB display.
It's a high-quality CRT graphics monitor with true colour, max res 2048x1536 @86Hz.
(For those who ask why I'd use a CRT monitor: it was a £1000 monitor new - still available as N.O.S. for £1200.)

For some reason the max res Crimson gives is something like 1280x1050 (I'll check when I get home from work) and it won't accept any of my custom resolutions,
though it picks up that it's a Mitsubishi 2070.

Any ideas what's going on?

Win 10 Pro 64 with the latest updates, Sapphire 390 Nitro, X99, latest Crimson drivers
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
I'm trying to use a DisplayPort-to-VGA adaptor from my R9 390 to my Mitsubishi 2070SB display.
It's a high-quality CRT graphics monitor with true colour, max res 2048x1536 @86Hz.
(For those who ask why I'd use a CRT monitor: it was a £1000 monitor new - still available as N.O.S. for £1200.)

For some reason the max res Crimson gives is something like 1280x1050 (I'll check when I get home from work) and it won't accept any of my custom resolutions,
though it picks up that it's a Mitsubishi 2070.

Any ideas what's going on?

Win 10 Pro 64 with the latest updates, Sapphire 390 Nitro, X99, latest Crimson drivers
2048 x 1536 is an incredibly high resolution for most of these adapters. We'd need to know more about the specific adapter you're using. I'm not 100% sure if there's a controller out there that can support 2048x1536@60Hz with CVT timings, let alone 86Hz.
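As a rough illustration of why, here's a quick Python sketch. The flat 25% horizontal / 4% vertical blanking fractions are a crude stand-in for real CVT timings, so treat the results as estimates, not spec values:

```python
# Very rough pixel-clock estimate for the modes in this thread. The flat
# 25% horizontal / 4% vertical blanking fractions are a crude stand-in
# for real CVT timings (assumption, not spec).

def pixel_clock_mhz(h_active, v_active, refresh_hz,
                    h_blank_frac=0.25, v_blank_frac=0.04):
    h_total = h_active * (1 + h_blank_frac)
    v_total = v_active * (1 + v_blank_frac)
    return h_total * v_total * refresh_hz / 1e6

for w, h, hz in [(1600, 1200, 85), (2048, 1536, 86)]:
    print(f"{w}x{h}@{hz}Hz: ~{pixel_clock_mhz(w, h, hz):.0f} MHz")
```

With those assumptions, 1600x1200@85 lands around 212 MHz and 2048x1536@86 around 352 MHz, which is why the higher mode is so much more demanding on the adapter's DAC.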
 

Bramble

Junior Member
Dec 29, 2015
4
0
0
2048 x 1536 is an incredibly high resolution for most of these adapters. We'd need to know more about the specific adapter you're using. I'm not 100% sure if there's a controller out there that can support 2048x1536@60Hz with CVT timings, let alone 86Hz.
Hi - thanks
I'm not trying to run it at 2048x1536 (it's just that it's massively underestimating my max res) - I just want it to do 1600x1200.

I didn't go for the StarTech converter (or an unbranded equivalent).
I went for a generic one (there basically seem to be two converters on the market, branded under different names). This one seemed to support higher resolutions than the StarTech:

Here's my adaptor's spec;
Supports Resolution:
2560 x 1600
1920 x 1080
1920 x 1200
1440 x 900
1280 x 720


Supports Video Bandwidth up to 10.8Gbps
Supports 8-bit & 10-bit Deep Color
Supports 1Mbps Bidirectional Auxiliary Channel

It looks like it only supports 16:9 and 16:10, but reviews I've read of identically specced converters (probably from the same factory) say it does 4:3 as well.

For the timings in the custom section, should I be using CVT?
(I wasn't sure, but tried several of them)

Thanks in advance


 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Yes, you should be using standard CVT timings (and not CVT-R, which is for LCDs). However I admittedly have never seen one of these adapters default to such a low resolution.

You can go ahead and try Custom Resolution Utility and see if that solves any of your problems: http://www.monitortests.com/forum/Thread-Custom-Resolution-Utility-CRU

I've used it more than I have Crimson, and I know off of the top of my head that it can properly generate timings for a CRT monitor.
 

Bramble

Junior Member
Dec 29, 2015
4
0
0
Yes, you should be using standard CVT timings (and not CVT-R, which is for LCDs). However I admittedly have never seen one of these adapters default to such a low resolution.

You can go ahead and try Custom Resolution Utility and see if that solves any of your problems: http://www.monitortests.com/forum/Thread-Custom-Resolution-Utility-CRU

I've used it more than I have Crimson, and I know off of the top of my head that it can properly generate timings for a CRT monitor.
Thanks a lot.

I'll try that.

Do you know if I should set the display as progressive or interlaced? (I figure progressive)

I'm wondering if it's windows that's set the resolution so low, or the crimson drivers.
It's a clean install of Win 10 and it defaulted the display to 800x600 before suddenly upping it to its current resolution.
(I've got auto updates set as 'ask first', in the group policy settings.)

Cheers
 

Aeather

Junior Member
Jan 8, 2016
3
0
0
OP, were you ever able to figure it out?

I have an MSI 390 as well, and I'm looking at 1600x1200 at 85Hz with my new CRT.

I ordered a DVI-D dual-link cable, which should arrive tomorrow; however, after viewing this thread I also ordered a DP-to-VGA adapter for the higher bandwidth that may be required.