Multi-Monitor Setup

Isura

Member
Aug 1, 2005
100
0
76
Building a new machine. I have a 30" and two 20" monitors. I want to use a Radeon 4870 on the 30", and some cheap/vanilla video card to drive the two 20" monitors.

The 30" is used for some gaming on moderate settings (1920x1200 medium detail, mostly 1680x1050). The 20"s are for browsing and other apps, nothing graphically intense (I plan to run Windows 7).

Base setup is
Core 2 Quad Yorkfield Q9550
Gigabyte GA-EP45-UD3P
Radeon HD 4870
Cheap ATI card
DDR2 1066 memory

As I understand it, the Gigabyte motherboard switches from PCIe x16 to x8 with two video cards.
Does that split the available bandwidth in half, or will the 4870 use the majority of it? I don't want the weak card slowing down my graphics.

Do I need a board with two PCIe x16 slots for two 4870 cards?

Edit: I haven't built a machine in 5 years. Is the 4870 sufficient for modern games like FIFA or NBA 2K9?
 

imported_Shaq

Senior member
Sep 24, 2004
731
0
0
You should be able to use some cheap PCI or PCIe x1/x4 video card for web browsing etc., and you'll still have the full x16 for the 4870.
 

faxon

Platinum Member
May 23, 2008
2,109
1
81
Wait, so are you using 1 or 2 4870s? If two, why are you sinking $300 into a dual-card config like that with next-gen hardware only a month out? And you wouldn't need a cheap ATI card if you pick a card with a lot of outputs. All the reference designs I have seen for the 5870s, for example, have dual DVI plus HDMI or DisplayPort outputs (4 outputs total). If you want a 4870, and you want it now, you can easily run 3 monitors with this one

http://www.newegg.com/Product/...x?Item=N82E16814102825

It has VGA, HDMI, and DVI outputs; the DVI should be perfectly fine for the 2560x1600 monitor, and then the two 20" monitors can run off HDMI and VGA.
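A quick back-of-envelope check (a sketch in Python, just for illustration) of why the 30" has to go on the DVI port specifically: 2560x1600 at 60 Hz exceeds the 165 MHz single-link DVI pixel-clock limit, so it needs the card's dual-link DVI, while VGA and HDMI handle the 20"s fine.

```python
# Rough pixel-clock estimate for a given mode. This counts only active
# pixels and ignores blanking intervals, so the real clock is higher still.
def active_pixel_clock_mhz(width, height, refresh_hz):
    return width * height * refresh_hz / 1e6

clock = active_pixel_clock_mhz(2560, 1600, 60)
single_link_dvi_limit = 165  # MHz, single-link TMDS limit in the DVI 1.0 spec

print(f"{clock:.2f} MHz active pixel clock")   # ~245.76 MHz before blanking
print("dual-link DVI required:", clock > single_link_dvi_limit)
```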

Edit: if you want more performance, which I would recommend for a 30" monitor, the 4890 Vapor-X has 4 outputs: DisplayPort, HDMI, DVI, and VGA.

http://www.newegg.com/Product/...x?Item=N82E16814102848

Edit: I should mention that, from benchmarks I have seen, a 4870 or 4890 CrossFire config does bottleneck on a P45 board vs. the equivalent X48 board. The benchmark I saw was 4890 CF on a GA-EP45-DQ6 vs. a GA-EX48-DQ6, and the X48 was significantly faster.

http://www.tweaktown.com/artic...performance/index.html

As you can see, even 4850 CF bottlenecks on x8 CrossFire vs. x16. The difference there was between 15% and 100% depending on the resolution. You have a 30" monitor, so pay close attention to the differences there. I think the main reason for such huge gains with x16, though, is that the 4850 CF combo was probably a pair of 512MB cards, so they were going out to system memory for additional frame buffer. Still, it paints an ugly picture.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
if you want a 4870, and you want it now, you can easily run 3 monitors with this one

Are you sure it supports more than 2 concurrent, independent displays? Even with the various connection options, isn't it still basically a dual-head card that offers only 2 independent outputs?
 

Isura

Member
Aug 1, 2005
100
0
76
faxon,

Thanks for the reply. I don't need to run CrossFire, just a single 4870. I didn't realize you could run 3 monitors off a single card. I find the image quality drops with VGA vs. DVI, and since I use the 20"s for reading text, I want it clear and bright. I've never tried HDMI; how does its image quality compare to DVI?
 

Chaotic42

Lifer
Jun 15, 2001
34,589
1,749
126
I wouldn't mix nvidia and ATI cards together. I'm running 3 monitors with a 4870x2 and a 4650. It's.... interesting. I get the impression that ATI never meant for anyone to use more than two monitors or one card.
 

Isura

Member
Aug 1, 2005
100
0
76
Originally posted by: Chaotic42
I wouldn't mix nvidia and ATI cards together. I'm running 3 monitors with a 4870x2 and a 4650. It's.... interesting. I get the impression that ATI never meant for anyone to use more than two monitors or one card.

Do you mean excessive heat or power?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Originally posted by: Isura

As I understand, the Gigabyte motherboard switches from PCI 16x to 8x with 2 video cards.
Does that split the available bandwidth by 1/2, or will the 4870 use the majority of it? I don't want the weak card slowing down my graphics.
Somehow I doubt the drop to 8x is going to impact a single 4870 much, if at all.
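To put rough numbers on it, here's a minimal sketch of the theoretical per-direction PCIe 2.0 link bandwidth (assuming the board's slots run at PCIe 2.0 rates of 5 GT/s per lane with 8b/10b encoding): x8 still leaves 4 GB/s, which a single 4870 rarely saturates.

```python
# Theoretical per-direction PCIe 2.0 bandwidth: 5 GT/s per lane,
# with 8b/10b line encoding (8 data bits per 10 transferred bits).
def pcie2_bandwidth_gbs(lanes):
    gt_per_s = 5.0       # giga-transfers per second, per lane
    encoding = 8 / 10    # 8b/10b encoding overhead
    return lanes * gt_per_s * encoding / 8  # GB/s (8 bits per byte)

for lanes in (16, 8, 4, 1):
    print(f"x{lanes}: {pcie2_bandwidth_gbs(lanes):.1f} GB/s")
# x16 gives 8.0 GB/s, x8 gives 4.0 GB/s
```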
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Originally posted by: Chaotic42
I wouldn't mix nvidia and ATI cards together. I'm running 3 monitors with a 4870x2 and a 4650. It's.... interesting. I get the impression that ATI never meant for anyone to use more than two monitors or one card.

I wouldn't say that; I've used many different ATI multi-monitor configurations over the years. They were on the 3rd generation of IGP + concurrent video card before anyone else offered it (allowing 3 monitors using onboard graphics plus a single additional card), and they've offered quad-display PCI cards since at least the Rage chip.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,572
10,208
126
Originally posted by: rbV5
if you want a 4870, and you want it now, you can easily run 3 monitors with this one

Are you sure it supports more than 2 concurrent, independent displays? Even with the various connection options, isn't it still basically a dual-head card that offers only 2 independent outputs?

It's still a dual-head graphics chipset, even if it does have 3 or 4 outputs on the back panel.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Originally posted by: VirtualLarry
Originally posted by: rbV5
if you want a 4870, and you want it now, you can easily run 3 monitors with this one

Are you sure it supports more than 2 concurrent, independent displays? Even with the various connection options, isn't it still basically a dual-head card that offers only 2 independent outputs?

It's still a dual-head graphics chipset, even if it does have 3 or 4 outputs on the back panel.

That's what I suspected. I've been out of the graphics loop for the past couple of years, so I wasn't sure if things had changed that much.

Details of supported minor features are a ridiculous PITA to track down, and it seems worse now than it used to be (and it used to be bad enough).