Update: 970 SLI or 290X CF ? Type of cooler?


destrekor

Lifer
Nov 18, 2005
28,799
359
126
And before power consumption sophistry takes hold here:
[Attached image: Power_04.png (power consumption chart)]

From http://www.techspot.com/review/898-geforce-gtx-970-sli-4k-gaming/page6.html

Nobody said the 290 was a power hog, that belongs more to the 290x. And stock 970 is rather low, but manufacturers are adding breathing room for higher overclocks so TDP limits are raised.

The 290 stock is low, but the 290 overclocked to reach 290x levels is not so low I reckon, though they never bothered reporting power usage for the OC'd cards.

And... addressing both of your previous posts: it's like you didn't even look at the entire article you linked me to. Sure, you can cherry-pick a few scenarios that make the 290 and overclocked 290 Crossfire look good, but I noticed far more scenarios where it didn't, and in almost every case the 970's frame times were better, even at 4K. There were only a few situations where the 290 even beat the 970 at 4K.

So, I'm not sure what you set out to prove? That the 290 is a good value? Sure, I get that. But just overclocking it doesn't suddenly make it that much more of an outstanding product, especially when the competition is overclocked too. Frankly, I would have liked to see the 290X in Crossfire and overclocked thrown into that review.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Nobody said the 290 was a power hog, that belongs more to the 290x. And stock 970 is rather low, but manufacturers are adding breathing room for higher overclocks so TDP limits are raised.

The 290 stock is low, but the 290 overclocked to reach 290x levels is not so low I reckon, though they never bothered reporting power usage for the OC'd cards.

And... addressing both of your previous posts: it's like you didn't even look at the entire article you linked me to. Sure, you can cherry-pick a few scenarios that make the 290 and overclocked 290 Crossfire look good, but I noticed far more scenarios where it didn't, and in almost every case the 970's frame times were better, even at 4K. There were only a few situations where the 290 even beat the 970 at 4K.

So, I'm not sure what you set out to prove? That the 290 is a good value? Sure, I get that. But just overclocking it doesn't suddenly make it that much more of an outstanding product, especially when the competition is overclocked too. Frankly, I would have liked to see the 290X in Crossfire and overclocked thrown into that review.

Yeah, someone needs to sit down and do a roundup with the 290/290X/970/980 and show 1080p, 1440/1600p, and 4K, single-card and SLI/CF, with the latest drivers. I think there have been performance improvements in some games since then too, not just driver improvements.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
So as soon as a benchmark with 290 on top is found... ok.

I don't know why the thread crapping and trolling. It's a legitimate thing to want to see. Both AMD and Nvidia have new drivers, and all the benchmarks we can look at for the 970 and 980 use older versions. What has changed? Well, looking at Guru3D's SLI benchmarks, the newer article using 3-way SLI shows better frame times than two cards did at launch. Of note: even the SLI 980s had dropped frames at launch, which aren't there in the tri-SLI review.
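For readers unfamiliar with frame-time analysis, the metrics these reviews report mostly boil down to percentiles and outlier counts over per-frame render times. A minimal sketch, using made-up sample data rather than Guru3D's actual numbers:

```python
# Made-up frame times in milliseconds; two slow frames simulate stutter.
sample_ms = [16.7, 16.9, 17.1, 16.8, 33.4, 16.7, 17.0, 16.9, 16.8, 45.2]

def percentile(times, p):
    """Nearest-rank percentile of a list of frame times (ms)."""
    s = sorted(times)
    k = max(0, min(len(s) - 1, round(p / 100 * len(s)) - 1))
    return s[k]

avg_fps = 1000 * len(sample_ms) / sum(sample_ms)
p99 = percentile(sample_ms, 99)
# A frame taking much longer than the refresh interval reads as a drop/stutter.
drops = sum(1 for t in sample_ms if t > 2 * 16.7)

print(f"avg fps: {avg_fps:.1f}, 99th percentile: {p99:.1f} ms, drops: {drops}")
```

This is why a card can look fine on average FPS yet feel stuttery: the average hides the 33 ms and 45 ms outliers that the percentile and drop counts expose.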
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
So as soon as a benchmark with 290 on top is found... ok.

There are games where 290 will come out on top.

What I'm actually curious to see is a new Guru3D test: they, among many others, measured the frame times and saw frame drops, while other sites weren't really showing much in the way of frame-time problems.

But also, I'd like to see if there are changes on the Maxwell cards. The R9 series were fairly well established whereas the Maxwell cards were quite fresh - surely some metrics have changed, for better or worse.

I'm annoyed that the thread got bogged down in that direction anyhow.
 

garagisti

Senior member
Aug 7, 2007
592
7
81
There are games where 290 will come out on top.

What I'm actually curious to see is a new Guru3D test: they, among many others, measured the frame times and saw frame drops, while other sites weren't really showing much in the way of frame-time problems.

But also, I'd like to see if there are changes on the Maxwell cards. The R9 series were fairly well established whereas the Maxwell cards were quite fresh - surely some metrics have changed, for better or worse.

I'm annoyed that the thread got bogged down in that direction anyhow.
That makes two of us. Off topic as it is, I think you should direct the complaint at the origin of the problem... I didn't think my urging caution would start what is effectively a pissing contest. I fully realise they're trading blows for the most part, unless it's a rather skewed test, which one can hardly call a benchmark. It's why I urged you to take your time picking; at best you'll have 30 days (depending on how things are where you are) in case you change your mind.

Things have changed on the 970 front, but that apart, the drivers may be improving; how much scope for improvement remains is a good question. Then again, I'd be very surprised if Maxwell cards aren't already quite optimised by now. IIRC Nvidia first launched Maxwell on mobile, and they've been writing drivers for it for a while now. I'd be very surprised if they suddenly improved performance on pre-existing titles by a fair few percent.
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
That makes two of us. Off topic as it is, I think you should direct the complaint at the origin of the problem... I didn't think my urging caution would start what is effectively a pissing contest. I fully realise they're trading blows for the most part, unless it's a rather skewed test, which one can hardly call a benchmark. It's why I urged you to take your time picking; at best you'll have 30 days (depending on how things are where you are) in case you change your mind.

Things have changed on the 970 front, but that apart, the drivers may be improving; how much scope for improvement remains is a good question. Then again, I'd be very surprised if Maxwell cards aren't already quite optimised by now. IIRC Nvidia first launched Maxwell on mobile, and they've been writing drivers for it for a while now. I'd be very surprised if they suddenly improved performance on pre-existing titles by a fair few percent.

Your point regarding optimization is a fair one. There was also Maxwell on the desktop as early as February of last year in the form of the GTX 745 and 750 series. However, the larger chips in any form, mobile or desktop, were not launched until the fall, so any optimization issue may not have been as readily apparent.

I grant you that they are likely fully optimized by now, and were likely running *mostly* optimally at launch, but there could stand to be improvements, especially in SLI frame pacing if there were actually any issues to begin with, as well as SLI scaling in specific titles or generalized per-game performance issues. Those are always ongoing issues though, so perhaps my question of optimization had been poorly framed.

Dealing with potential compatibility issues in a secondary OS that doesn't even properly support DIY system builds adds a layer of complexity that I really despise.

In so many ways, the 290X does seem like the better route, but not only is there the worry of getting those cards to work in OS X with proper multi-monitor support, I also have to worry about how much that may threaten my CPU overclock. The system can get a bit of an upgrade in cooling with additional fans, and perhaps I am underestimating how much that upgrade will help. And Mantle sure would be nice until DX12 takes hold of the market.

The 970s are mostly a slam dunk in terms of compatibility; that said, I have to worry about frame pacing of the cards in SLI at 5760x1080. I know full well I can't run that resolution at absolute max settings in most modern games; I don't think any 2-way SLI/CF setup can truly boast that (quadfire and 4-way SLI, on the other hand, absolutely). But I'm hoping to stumble upon clear and honest benchmarks showing playable framerates across games.
I haven't really seen any evidence that the bad results obtained thus far are strictly the result of the frame buffer allocation. Frankly, it seems like when the driver allows the game to actually utilize that much memory, it's really just a last-ditch effort to provide SOME help for settings that are overwhelming the GPU. If the settings are simply too much for the GPU to handle, it won't play well on any card. And in many games, settings that are too high can still be playable in most areas, but suddenly some scene simply demands far too much, which might be what triggers the spike in VRAM use.

People are going to debate that viewpoint and say that simply requiring more VRAM is the issue. It's sort of a chicken-vs-egg scenario, and I just cannot determine which. I WANT to get the 970 for ease of compatibility and the fact that, why yes, it does in fact create less heat than the 290X, and I simply want to ensure I don't cause any issues in my case. My next build will be far, far more analyzed for such weaknesses, but it is what it is at the moment. But if it won't be ideal in terms of smooth gameplay for the foreseeable future, I might have to bite down and suffer through the growing pains of getting cards to work in OS X. I just don't want to go down that route, considering I never really had a perfect system in the first place, and the 560 Tis were in a similar boat: sort of supported, but far less than later cards.
 

garagisti

Senior member
Aug 7, 2007
592
7
81
Your point regarding optimization is a fair one. There was also Maxwell on the desktop as early as February of last year in the form of the GTX 745 and 750 series. However, the larger chips in any form, mobile or desktop, were not launched until the fall, so any optimization issue may not have been as readily apparent.

I grant you that they are likely fully optimized by now, and were likely running *mostly* optimally at launch, but there could stand to be improvements, especially in SLI frame pacing if there were actually any issues to begin with, as well as SLI scaling in specific titles or generalized per-game performance issues. Those are always ongoing issues though, so perhaps my question of optimization had been poorly framed.

Dealing with potential compatibility issues in a secondary OS that doesn't even properly support DIY system builds adds a layer of complexity that I really despise.

In so many ways, the 290X does seem like the better route, but not only is there the worry of getting those cards to work in OS X with proper multi-monitor support, I also have to worry about how much that may threaten my CPU overclock. The system can get a bit of an upgrade in cooling with additional fans, and perhaps I am underestimating how much that upgrade will help. And Mantle sure would be nice until DX12 takes hold of the market.

The 970s are mostly a slam dunk in terms of compatibility; that said, I have to worry about frame pacing of the cards in SLI at 5760x1080. I know full well I can't run that resolution at absolute max settings in most modern games; I don't think any 2-way SLI/CF setup can truly boast that (quadfire and 4-way SLI, on the other hand, absolutely). But I'm hoping to stumble upon clear and honest benchmarks showing playable framerates across games.
I haven't really seen any evidence that the bad results obtained thus far are strictly the result of the frame buffer allocation. Frankly, it seems like when the driver allows the game to actually utilize that much memory, it's really just a last-ditch effort to provide SOME help for settings that are overwhelming the GPU. If the settings are simply too much for the GPU to handle, it won't play well on any card. And in many games, settings that are too high can still be playable in most areas, but suddenly some scene simply demands far too much, which might be what triggers the spike in VRAM use.

People are going to debate that viewpoint and say that simply requiring more VRAM is the issue. It's sort of a chicken-vs-egg scenario, and I just cannot determine which. I WANT to get the 970 for ease of compatibility and the fact that, why yes, it does in fact create less heat than the 290X, and I simply want to ensure I don't cause any issues in my case. My next build will be far, far more analyzed for such weaknesses, but it is what it is at the moment. But if it won't be ideal in terms of smooth gameplay for the foreseeable future, I might have to bite down and suffer through the growing pains of getting cards to work in OS X. I just don't want to go down that route, considering I never really had a perfect system in the first place, and the 560 Tis were in a similar boat: sort of supported, but far less than later cards.
Well, it's your dough and your system, and importantly, if you are not comfortable with something, then you are not comfortable with something. If the thought of going AMD makes you squeamish, then there's little that should be delaying your decision.

I can't speak to compatibility with a Hackintosh, but if temps and heat are a big concern, then the 295X2 is always there, though IIRC you said something about case and radiator fit. Or VisionTek 290s with EK water blocks for $340-odd each, if you fancy that.

I sincerely will be impressed if they squeeze more out of the 9xx series... the drivers are already working very well with the segmented memory blocks of the 970, so Nvidia has the hang of them. How they will perform with DX11.3 and DX12 may be a question mark, but no one can answer that yet. Now GM200, that is a different chip, no pun intended, as it will have more resources, better compute, and so on.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
So, I'm not sure what you set out to prove? That the 290 is a good value? Sure, I get that. But just overclocking it doesn't suddenly make it that much more of an outstanding product, especially when the competition is overclocked too. Frankly, I would have liked to see the 290X in Crossfire and overclocked thrown into that review.


Can you read?

3% average lead for the 970; 4% when overclocked. For $175 more. If you buy 970s over 290s at 4K, you're throwing away $175 for 3% more performance. That's the bottom line. Rationalize that away however you like.
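As a quick back-of-the-envelope on that claim (the $175 gap and 3-4% lead are the figures quoted in this post, not independently verified):

```python
# Cost per extra percent of performance, using the post's own numbers.
price_gap = 175      # extra dollars for the 970 pair over the 290 pair
lead_stock = 3       # % average lead at 4K, stock
lead_oc = 4          # % average lead at 4K, overclocked

print(f"stock: ${price_gap / lead_stock:.2f} per extra % of performance")
print(f"OC:    ${price_gap / lead_oc:.2f} per extra % of performance")
```

Roughly $58 (stock) or $44 (overclocked) per percentage point of extra performance, which is the value argument being made here.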
 
Last edited:

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Yeah, someone needs to sit down and do a roundup with the 290/290X/970/980 and show 1080p, 1440/1600p, and 4K, single-card and SLI/CF, with the latest drivers. I think there have been performance improvements in some games since then too, not just driver improvements.

The review I linked does exactly that. Read it.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
The review I linked does exactly that. Read it.


Old drivers. The newest ones from Nvidia are 347.25, and TechSpot is using 344.16. I don't know the newest AMD version.

I want to see a round up with the latest fixes and such for both sides.
 
Last edited:

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Old drivers. The newest ones from Nvidia are 347.25, and TechSpot is using 344.16. I don't know the newest AMD version.

I want to see a round up with the latest fixes and such for both sides.

Agreed, but that's the most recent review there is that includes aftermarket models, overclocking, and frame-time analysis. The latest few rounds of drivers seem to have been more features and bug fixes than large performance increases, on both sides of the equation. The results are almost certainly very close to today's reality, within a percent on average, though particular games may see slightly larger variances. We might see updated numbers at the next high-end release, but I doubt we'll see a renewed evaluation just for driver updates; these round-ups take too much time. IMO the driver situation affects Kepler vs. Maxwell and GCN vs. Kepler comparisons a lot more than GCN vs. Maxwell in terms of relative performance.
 
Last edited:

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Agreed, but that's the most recent review there is that includes aftermarket models, overclocking, and frame-time analysis. The latest few rounds of drivers seem to have been more features and bug fixes than large performance increases, on both sides of the equation. The results are almost certainly very close to today's reality, within a percent on average, though particular games may see slightly larger variances. We might see updated numbers at the next high-end release, but I doubt we'll see a renewed evaluation just for driver updates; these round-ups take too much time. IMO the driver situation affects Kepler vs. Maxwell and GCN vs. Kepler comparisons a lot more than GCN vs. Maxwell in terms of relative performance.


Yeah and I would also like to see a good comparison of heat and noise output between various brands of cards. I could hardly find much when I was searching for my new card(s). There is probably just not enough time for some sites to do all this.
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
One thing I have been searching hard for is a comparison of system heat between configurations. You see exactly that for many CPU tests and especially for case tests, but the only temperature we see is the GPU temp itself.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
One thing I have been searching hard for is a comparison of system heat between configurations. You see exactly that for many CPU tests and especially for case tests, but the only temperature we see is the GPU temp itself.

Very true, this could help people decide if a reference card is actually better for their setup.
 

garagisti

Senior member
Aug 7, 2007
592
7
81
Yeah and I would also like to see a good comparison of heat and noise output between various brands of cards. I could hardly find much when I was searching for my new card(s). There is probably just not enough time for some sites to do all this.
In another thread, the 290 Tri-X was shown handling noise under Crysis 3 load much better than some other cards.
http://forums.anandtech.com/showpost.php?p=37122380&postcount=11

Think you'll have to narrow down the cards you're interested in buying, then look for individual reviews comparing them against reference cards and other models.
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
Well, I've ordered one MSI 290X Lightning, some more RAM (now I'll be at 16GB instead of 8GB), 4x Phanteks F140SP fans, and 1x Prolimatech 140mm Ultra Sleek Vortex fan.

That Prolimatech is seemingly the best of the very very very limited selection of 140mm slim-profile fans (15mm instead of 25mm), so it will fit in the upper side intake space in which an ordinary fan would run up against the CPU cooler. It should have enough space around it to still push out air, but I'll be watching for the possibility of turbulence, in which case I'll either force it to run at low speed or get rid of it altogether.

These Phanteks look like they should be massive upgrades over the side intake and rear exhaust I have now, and of course they add a lot by putting two exhausts up top as well. The front intakes will remain stock for now; I'm trying to find 120mm LED fans that can be controlled by the light switch on the Corsair 400R. It seems they are 3-pin, but Corsair went funky and they aren't really standard pin-outs. Not sure if I can replace those and retain that function with higher-CFM fans or not.

I won't really benefit from Crossfire for now, not on one 1080p monitor. Plus, MSI only allows 1 rebate per household, so I won't get $60 back in rebates for the two cards, only $30 for one. So perhaps waiting a bit will see the card drop in price a little. And if the next generation is enticing and also compatible with OS X, it might be easier/less costly to sell and then buy new card(s). Plus this will give me time to judge how well the interior cooling of the system performs now that heat is being dumped directly into the case, along with a total fan overhaul with far more exhaust potential and overall airflow.
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
Well, I've ordered one MSI 290X Lightning, some more RAM (now I'll be at 16GB instead of 8GB), 4x Phanteks F140SP fans, and 1x Prolimatech 140mm Ultra Sleek Vortex fan.

That Prolimatech is seemingly the best of the very very very limited selection of 140mm slim-profile fans (15mm instead of 25mm), so it will fit in the upper side intake space in which an ordinary fan would run up against the CPU cooler. It should have enough space around it to still push out air, but I'll be watching for the possibility of turbulence, in which case I'll either force it to run at low speed or get rid of it altogether.

These Phanteks look like they should be massive upgrades over the side intake and rear exhaust I have now, and of course they add a lot by putting two exhausts up top as well. The front intakes will remain stock for now; I'm trying to find 120mm LED fans that can be controlled by the light switch on the Corsair 400R. It seems they are 3-pin, but Corsair went funky and they aren't really standard pin-outs. Not sure if I can replace those and retain that function with higher-CFM fans or not.

I won't really benefit from Crossfire for now, not on one 1080p monitor. Plus, MSI only allows 1 rebate per household, so I won't get $60 back in rebates for the two cards, only $30 for one. So perhaps waiting a bit will see the card drop in price a little. And if the next generation is enticing and also compatible with OS X, it might be easier/less costly to sell and then buy new card(s). Plus this will give me time to judge how well the interior cooling of the system performs now that heat is being dumped directly into the case, along with a total fan overhaul with far more exhaust potential and overall airflow.


That's a winning outcome. The MSI 290X Lightning is a solid single-card solution. Frankly, after having done multiple multi-GPU setups from both AMD and Nvidia, I won't be going back. A single card is just an easier way to ensure you are getting all the performance of the card. If you aren't gaming at 4K, I'd go with the fastest single card you can afford every time.

The 290X Lightning is great; you may want to adjust the fan curve if noise bothers you. From what I read, that card keeps the GPU at 70C but will ramp up the fans to get there; you're fine letting the GPU hit 80C with slower-spinning fans.

Phanteks fans are the best 140mm fans I've used, and I'm a quiet-PC fanatic.
 

Pneumothorax

Golden Member
Nov 4, 2002
1,181
23
81
Congratz! Just make sure you run some kind of 'driver cleaner' type of program BEFORE you install the card. I didn't do that when I went from a 970 SLI to a 290x CF setup and the stutter was so bad in my games I thought I made a very wrong decision in switching. Now it runs super smooth.
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
That's a winning outcome. The MSI 290X Lightning is a solid single-card solution. Frankly, after having done multiple multi-GPU setups from both AMD and Nvidia, I won't be going back. A single card is just an easier way to ensure you are getting all the performance of the card. If you aren't gaming at 4K, I'd go with the fastest single card you can afford every time.

The 290X Lightning is great; you may want to adjust the fan curve if noise bothers you. From what I read, that card keeps the GPU at 70C but will ramp up the fans to get there; you're fine letting the GPU hit 80C with slower-spinning fans.

Phanteks fans are the best 140mm fans I've used, and I'm a quiet-PC fanatic.

Well, I will be going dual-GPU, no ifs, ands, or buts about it. I just couldn't afford to get both cards right now, not with everything else I needed. As I don't have my triple-monitor setup at the moment, due to the current living situation and lack of desk space, I'm not stressing the second card just yet. But once I am ready to get my Surround (err... Eyefinity) setup going again, it will be a must. And since 5760x1080 is 75% of 4K in pixel count, that demands a good GPU solution, especially since it has a lower pixel density, so 4K's ability to get away with less/no AA doesn't apply here.

I'd love to go single-GPU, and not ever worry about driver efficiency or frame-pacing ever again, but a single GPU handling that kind of resolution at high settings just won't be happening, at least not any time soon.

I will need to try to get the second 290X before the 30 days are up, however, because I need to make sure I can handle Crossfire, both in terms of temperature and, just as importantly, how smooth gameplay is (frame pacing and whatnot). Hopefully I can make that happen.
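The 75%-of-4K figure above checks out exactly:

```python
# Quick check of the "5760x1080 is 75% of 4K" pixel-count claim.
surround = 5760 * 1080        # triple 1080p in Eyefinity/Surround
uhd_4k = 3840 * 2160          # UHD "4K"

print(surround, uhd_4k, surround / uhd_4k)  # 6220800 8294400 0.75
```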
 

Pneumothorax

Golden Member
Nov 4, 2002
1,181
23
81
Well, I will be going dual-GPU, no ifs, ands, or buts about it. I just couldn't afford to get both cards right now, not with everything else I needed. As I don't have my triple-monitor setup at the moment, due to the current living situation and lack of desk space, I'm not stressing the second card just yet. But once I am ready to get my Surround (err... Eyefinity) setup going again, it will be a must. And since 5760x1080 is 75% of 4K in pixel count, that demands a good GPU solution, especially since it has a lower pixel density, so 4K's ability to get away with less/no AA doesn't apply here.

I'd love to go single-GPU, and not ever worry about driver efficiency or frame-pacing ever again, but a single GPU handling that kind of resolution at high settings just won't be happening, at least not any time soon.

I will need to try to get the second 290X before the 30 days are up, however, because I need to make sure I can handle Crossfire, both in terms of temperature and, just as importantly, how smooth gameplay is (frame pacing and whatnot). Hopefully I can make that happen.

In my experience with two open-cooler 290Xs (Tri-X), the top card ran noticeably hotter/louder, so I ended up putting in a reference 290X with a Corsair H90/HG10 combo. Now the top GPU runs at 55C; the air-cooled bottom one is in the 70s. I'm thinking you may have a bigger issue, as the Lightning cooler is even thicker.
 
Last edited:

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
Well, I will be going dual-GPU, no ifs, ands, or buts about it. I just couldn't afford to get both cards right now, not with everything else I needed. As I don't have my triple-monitor setup at the moment, due to the current living situation and lack of desk space, I'm not stressing the second card just yet. But once I am ready to get my Surround (err... Eyefinity) setup going again, it will be a must. And since 5760x1080 is 75% of 4K in pixel count, that demands a good GPU solution, especially since it has a lower pixel density, so 4K's ability to get away with less/no AA doesn't apply here.

I'd love to go single-GPU, and not ever worry about driver efficiency or frame-pacing ever again, but a single GPU handling that kind of resolution at high settings just won't be happening, at least not any time soon.

I will need to try to get the second 290X before the 30 days are up, however, because I need to make sure I can handle Crossfire, both in terms of temperature and, just as importantly, how smooth gameplay is (frame pacing and whatnot). Hopefully I can make that happen.


Yea for triple screen you'll need to go Dual GPU. XDMA crossfire from the 290/290x line is pretty solid given games have good dual gpu support.

Have you looked into the ability to run the triple screens all from one DP 1.2 port? GPU via DP 1.2 to screen 1, screen 1 via DP to screen 2, screen 2 via DP to screen 3. The monitors would need to support MST, and I'm not sure if Eyefinity supports this setup, but DP 1.2 can drive up to four 1080p screens from one output.

http://www.displayport.org/cables/driving-multiple-displays-from-a-single-displayport-output/


Should let you get away with cleaner wiring and I believe vsync works on all monitors with a setup like this. Otherwise vsync has had issues in the past if multiple different video outputs are used from the gpu.



Edit: Eyefinity supports DP 1.2 MST-capable devices. Easiest would be to go with 3x MST-capable monitors, but an MST hub could be used to drive non-MST-capable displays.
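A rough bandwidth check suggests three 1080p60 streams do fit in one DP 1.2 link. This sketch assumes standard CEA-861 1080p60 timing (148.5 MHz pixel clock), 24-bit color, and 8b/10b link coding, and ignores MST packet overhead, so treat it as an approximation:

```python
# DP 1.2 (HBR2) effective link bandwidth vs. three 1080p60 MST streams.
lanes = 4
hbr2_gbps_per_lane = 5.4                          # raw line rate per lane
effective = lanes * hbr2_gbps_per_lane * 8 / 10   # 8b/10b coding -> 17.28 Gbps

per_display = 148.5e6 * 24 / 1e9                  # ~3.56 Gbps per 1080p60 stream
total = 3 * per_display                           # ~10.7 Gbps for three displays

print(f"link: {effective:.2f} Gbps, needed: {total:.2f} Gbps, fits: {total < effective}")
```

With roughly 10.7 Gbps needed against 17.28 Gbps available, the daisy-chain has comfortable headroom even after real-world overhead.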
 
Last edited:

destrekor

Lifer
Nov 18, 2005
28,799
359
126
In my experience with two open-cooler 290Xs (Tri-X), the top card ran noticeably hotter/louder, so I ended up putting in a reference 290X with a Corsair H90/HG10 combo. Now the top GPU runs at 55C; the air-cooled bottom one is in the 70s.

When the time comes, I'll have to see how that all comes together. If I can, I'll manage with fan profiles and hopefully that higher-flow side intake will help.

Generally, yes, the wisdom is that top cards, regardless of cooler type (ignoring AIO), always run hotter.

If the temperature seems too high even with the fan maxed, and/or that max fan is unbearably loud, I'll be investigating some options. Looking into the H90, it seems it should fit in the top fan mounts, but perhaps not. I know the H100 (and, IIRC, the H100i, which is only 2mm thicker) both fit, but those are 120mm radiators with spacing, which is how Corsair words the fit compatibility for the case. A true 140mm radiator might not fit due to hitting the motherboard, not sure.
If it fits, then it will fit even with the NH-D14 on, as that cooler doesn't extend past the motherboard; it actually falls short of that mark. That's one spot that would work, with the tubing running down by the cages. I don't know if a second one would fit in the back top slot; the tubing might be able to run down near the rear IO panel.

All the other AIO solutions I had seen were 38mm (IIRC) radiators, plus the 25mm fan. Seeing as the H100 is 25mm (so 50mm total), the H100i is 27mm (52mm), and the H90 is 27mm (52mm), there is that chance.
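Summing radiator plus fan thickness for the coolers mentioned makes the comparison concrete. Note the 60 mm clearance figure below is a hypothetical example, not a measured value for the Corsair 400R:

```python
# Radiator thickness (mm) for each AIO mentioned above, plus a standard fan.
fan = 25  # standard 25 mm fan
coolers = {"H100": 25, "H100i": 27, "H90": 27, "typical 38mm AIO": 38}

clearance_mm = 60  # hypothetical available depth before hitting the board
for name, rad in coolers.items():
    total = rad + fan
    verdict = "fits" if total <= clearance_mm else "too thick"
    print(f"{name}: {total} mm total -> {verdict}")
```

The point being that the Corsair units are 10 mm or so slimmer overall than the thick-radiator AIOs, which can be the difference between fitting the top mounts and not.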
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
Yea for triple screen you'll need to go Dual GPU. XDMA crossfire from the 290/290x line is pretty solid given games have good dual gpu support.

Have you looked into the ability to run the triple screens all from one DP 1.2 port? GPU via DP 1.2 to screen 1, screen 1 via DP to screen 2, screen 2 via DP to screen 3. The monitors would need to support MST, and I'm not sure if Eyefinity supports this setup, but DP 1.2 can drive up to four 1080p screens from one output.

http://www.displayport.org/cables/driving-multiple-displays-from-a-single-displayport-output/


Should let you get away with cleaner wiring and I believe vsync works on all monitors with a setup like this. Otherwise vsync has had issues in the past if multiple different video outputs are used from the gpu.



Edit: Eyefinity supports DP 1.2 MST-capable devices. Easiest would be to go with 3x MST-capable monitors, but an MST hub could be used to drive non-MST-capable displays.

I might look into that. It sure would be nice to stop dealing with a bunch of DVI. That was one thing I was bummed about in not going with some of the 970 or 980 models, as they all had 3 DP ports; on the AMD side, only the 295X2 had multiple DP ports. An MST hub would add a pretty penny to the setup, but it might just be worth it.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
When the time comes, I'll have to see how that all comes together. If I can, I'll manage with fan profiles and hopefully that higher-flow side intake will help.

Generally, yes, the wisdom is that top cards, regardless of cooler type (ignoring AIO), always run hotter.

If the temperature seems too high even with the fan maxed, and/or that max fan is unbearably loud, I'll be investigating some options. Looking into the H90, it seems it should fit in the top fan mounts, but perhaps not. I know the H100 (and, IIRC, the H100i, which is only 2mm thicker) both fit, but those are 120mm radiators with spacing, which is how Corsair words the fit compatibility for the case. A true 140mm radiator might not fit due to hitting the motherboard, not sure.
If it fits, then it will fit even with the NH-D14 on, as that cooler doesn't extend past the motherboard; it actually falls short of that mark. That's one spot that would work, with the tubing running down by the cages. I don't know if a second one would fit in the back top slot; the tubing might be able to run down near the rear IO panel.

All the other AIO solutions I had seen were 38mm (IIRC) radiators, plus the 25mm fan. Seeing as the H100 is 25mm (so 50mm total), the H100i is 27mm (52mm), and the H90 is 27mm (52mm), there is that chance.

I assume you saw the article at The Tech Buyers Guru about 780 Ti reference/custom coolers in SLI.
http://forums.anandtech.com/showthread.php?t=2418999

A great thing about XDMA crossfire is that you can space the cards further apart and get more air movement between them with your case fans.