Pretty sure this is the card...

Fun Guy

Golden Member
Oct 25, 1999
Some of you guys know about my new build - a triple-monitor research and statistics computer that will be doing double/secondary duty as a photo and video editing machine.

The specs (that I am 95% sure on) are as follows:

I will be using the video card in an extended-desktop, PLP, 4960 x 1600 configuration, as follows:

Dell 2007FP -->> Dell U3014 <<-- Dell 2007FP
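The 4960 x 1600 figure checks out if the two 2007FPs (1600x1200 native) are rotated to portrait beside the U3014 (2560x1600) in landscape. A quick sanity check:

```python
# PLP (portrait-landscape-portrait) desktop math:
# two Dell 2007FP panels (1600x1200 native) rotated to portrait,
# flanking a Dell U3014 (2560x1600) in landscape.
side_w, side_h = 1200, 1600      # 2007FP rotated 90 degrees
center_w, center_h = 2560, 1600  # U3014

total_w = side_w + center_w + side_w
total_h = max(side_h, center_h)
print(f"{total_w} x {total_h}")                   # 4960 x 1600
print(f"{total_w * total_h / 1e6:.1f} megapixels")  # 7.9 megapixels
```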

Thus far this is the top contender:

14-487-133-TS



I am using the Xeon to keep temps down and save power, as every little bit counts and adds up. Trying to do the same with the video card. What do you guys think?

Also, will it be possible to turn down the clocks if needed - presuming it won't affect day-to-day performance? I can always turn them back up if/when I need to...
 

tential

Diamond Member
May 13, 2008
Meh, I wouldn't get the 4GB model. It won't help you any, it will hurt your power consumption, and it's more expensive. So that's three knocks against it. The card isn't fast enough to utilize the extra VRAM. You wouldn't buy a 16GB VRAM model of the card, would you? More VRAM doesn't mean better.
 

Fun Guy

Golden Member
Oct 25, 1999
Meh, I wouldn't get the 4GB model. It won't help you any, it will hurt your power consumption, and it's more expensive. So that's three knocks against it. The card isn't fast enough to utilize the extra VRAM. You wouldn't buy a 16GB VRAM model of the card, would you? More VRAM doesn't mean better.
I'm doing it because some of the guys have told me that the extra 2GB on-card helps when editing large photos. Is that not correct?
 

alcoholbob

Diamond Member
May 24, 2005
You should probably wait a week and see how the GTX 950 pans out. There might even be a 4GB variant for $150.
 

Stuka87

Diamond Member
Dec 10, 2010
I'm doing it because some of the guys have told me that the extra 2GB on-card helps when editing large photos. Is that not correct?

Not unless you are using filters that support OpenCL, in which case you should really get an AMD card as they blow nVidia out of the water in OpenCL performance.

But for just driving the displays, the extra 2GB does nothing for you.
 

96Firebird

Diamond Member
Nov 8, 2010
How large are your photos? My 24MP NEF files never use more than 1GB VRAM in Lightroom or Perfect Photo Suite. Typically they use under 750MB. This is at 1920x1200, but I don't think it'd be different at your resolution unless you have multiple images on screen at once. Video editing may be different though, I've only done light editing with 1080p video and never checked VRAM usage.
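That under-1GB observation lines up with a rough back-of-envelope (assuming a single uncompressed RGBA working copy; Lightroom's actual buffering and caching will differ):

```python
# Rough estimate of one uncompressed image's size in memory.
# Assumes 4 channels (RGBA); real editors keep extra buffers/caches.
def image_mb(megapixels, channels=4, bits_per_channel=8):
    bytes_total = megapixels * 1e6 * channels * (bits_per_channel / 8)
    return bytes_total / (1024 ** 2)

print(f"24MP, 8-bit RGBA:  {image_mb(24):.0f} MB")                        # ~92 MB
print(f"24MP, 16-bit RGBA: {image_mb(24, bits_per_channel=16):.0f} MB")   # ~183 MB
```

Even a 16-bit working copy of a 24MP frame is well under 200MB, so a single image leaves plenty of headroom in 2GB of VRAM.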

Check about Nvidia support with PLP setups. In the past, their support for this has been lackluster. Things may have changed since then, but it is something to look into.

Edit - It may just be gaming that has PLP problems on Nvidia, non-gaming desktop usage might be fine.
 

Fun Guy

Golden Member
Oct 25, 1999
You should probably wait a week and see how the GTX 950 pans out. There might even be a 4GB variant for $150.
Wow, just looked at the specs and it might be running at 90W TDP!!! o_O

If so, and if this card has at least 3 x monitor outputs, this might be exactly what I need. :wub:
 

Fun Guy

Golden Member
Oct 25, 1999
Check about Nvidia support with PLP setups. In the past, their support for this has been lackluster. Things may have changed since then, but it is something to look into.

Edit - It may just be gaming that has PLP problems on Nvidia, non-gaming desktop usage might be fine.
Yes, that was what I had heard, that it was gaming where they had issues.

From what I've heard thus far, Nvidia's cards will kill AMD's offerings for my purposes, simply because AMD's run so flippin' hot. :twisted:
 

Leyawiin

Diamond Member
Nov 11, 2008
You should probably wait a week and see how the GTX 950 pans out. There might even be a 4GB variant for $150.

So that's why the GTX 750 Ti has dropped by $20 recently. Speaking of which, I think they can support three monitors. Would it be too weak for the OP's setup? Definitely power thrifty.
 

daveybrat

Elite Member
Super Moderator
Jan 31, 2000
I too think you should wait for the GTX 950 to launch before you make your final decision. That seems like the most likely route for now.
 

RussianSensation

Elite Member
Sep 5, 2003
Some of you guys know about my new build - a triple-monitor research and statistics computer that will be doing double/secondary duty as a photo and video editing machine.

The specs (that I am 95% sure on) are as follows:


1) Why are you choosing the 3.5Ghz E3-1241V3 vs. i7 4790K that boosts to 4.4Ghz out of the box with a minimum guaranteed clock speed of 4Ghz?

2) Why are you buying into an outdated architecture and outdated chipset (H97) when Skylake launches August 5th with USB 3.1, Ultra M.2 (PCIe 3.0 x4, 32Gb/sec; some boards will have 2 of these), Thunderbolt 3, and upgraded audio?

3) You are getting 32GB of DDR3-1600 - outdated RAM that can't be reused for future CPU upgrades over the next 3-5 years. For example, at least you'll probably be able to reuse DDR4-2400/2666 for Icelake in 2018. However, this 32GB of DDR3 is going to be unusable for next gen CPU roll-over.

The price premium for DDR4 vs. DDR3 is not that large anymore.

32GB DDR4-2133 = $180
32GB DDR4-2666 = $250

With 4x8GB sticks, there is still room for more RAM down the line if necessary too. Plus this DDR4 can be reused in the future if you decide to upgrade/swap X99/Skylake in 3-4 years.

4) What about your SSD/M.2 drive choice? You can't have a modern/fast system with a mechanical HDD as the backbone for your OS. With a mechanical drive, your multi-purpose productivity PC is basically an anchor. Did you allocate enough budget for this component?

5) If you aren't gaming, I suggest waiting for the GTX950 series on August 17th. It'll cost you less and use less power. Since you aren't gaming, you might as well save that money and put the savings from a cheaper videocard towards a faster CPU (or better yet, the Skylake platform). Don't forget that the Skylake-S/K series will have a GPU inside that could be used as an option for video encoding/decoding. In addition, you may not even need a GTX950/960 videocard if you use the Skylake IGP.

Either way, I think it makes sense to wait until August 5th to see how well Skylake does for video encoding/decoding and photo editing. In Photoshop for example, Skylake's preview benchmarks crush Haswell.

Intel also highlights new video decoding/encoding IGP capabilities:
[Intel slide: Skylake IGP video decode/encode capabilities]


6) As a side-note to Skylake, you should also strongly consider stepping up to a 5820K if you are going to be using your machine for photo/video editing. Among the choices on hand (4790K, Skylake i7-6700K, and 5820K OC), your CPU choice doesn't make a lot of sense to me. Maybe you could explain it better? For example, a 5820K @ 4.5Ghz + 16GB DDR4 + GTX950 will be a better video editing machine than what you have chosen. Certainly the 6700K will also crush that 3.5Ghz quad-core Haswell for not much more money.

Look at what other people are assembling for video editing:
http://forums.anandtech.com/showthread.php?t=2441523
 

alcoholbob

Diamond Member
May 24, 2005
I think he just has photographic memory...or maybe he runs a blog he hasn't told us about.
 

Fun Guy

Golden Member
Oct 25, 1999
1) Why are you choosing the 3.5Ghz E3-1241V3 vs. i7 4790K that boosts to 4.4Ghz out of the box with a minimum guaranteed clock speed of 4Ghz?
3.5Ghz with a turbo (as needed) freq of 3.9Ghz.

Mostly heat, noise, and power-related concerns; I'm willing to give up a little performance for a quieter, cooler, and less expensive-to-operate box.

2) Why are you buying into an outdated architecture and outdated chipset (H97) when Skylake launches August 5th with USB 3.1, Ultra M.2 (PCIe 3.0 x4, 32Gb/sec; some boards will have 2 of these), Thunderbolt 3, and upgraded audio?
How many devices out there can even take advantage of those specs right now?
3) You are getting 32GB of DDR3-1600 - outdated RAM that can't be reused for future CPU upgrades over the next 3-5 years. For example, at least you'll probably be able to reuse DDR4-2400/2666 for Icelake in 2018. However, this 32GB of DDR3 is going to be unusable for next gen CPU roll-over.

The price premium for DDR4 vs. DDR3 is not that large anymore.

32GB DDR4-2133 = $180
32GB DDR4-2666 = $250

With 4x8GB sticks, there is still room for more RAM down the line if necessary too. Plus this DDR4 can be reused in the future if you decide to upgrade/swap X99/Skylake in 3-4 years.
Yeah, I wish I had not already purchased the 32GB of DDR3, on sale....
4) What about your SSD/M.2 drive choice? You can't have a modern/fast system with a mechanical HDD as the backbone for your OS. With a mechanical drive, your multi-purpose productivity PC is basically an anchor. Did you allocate enough budget for this component?
Who said anything about a mechanical HDD?
5) If you aren't gaming, I suggest waiting for the GTX950 series on August 17th. It'll cost you less and use less power. Since you aren't gaming, you might as well save that money and put the savings from a cheaper videocard towards a faster CPU (or better yet, the Skylake platform). Don't forget that the Skylake-S/K series will have a GPU inside that could be used as an option for video encoding/decoding. In addition, you may not even need a GTX950/960 videocard if you use the Skylake IGP.
I will definitely check it out - thanks.
Either way, I think it makes sense to wait until August 5th to see how well Skylake does for video encoding/decoding and photo editing. In Photoshop for example, Skylake's preview benchmarks crush Haswell.
I agree - will do.
6) As a side-note to Skylake, you should also strongly consider stepping up to a 5820K if you are going to be using your machine for photo/video editing. Among the choices on hand (4790K, Skylake i7-6700K, and 5820K OC), your CPU choice doesn't make a lot of sense to me. Maybe you could explain it better? For example, a 5820K @ 4.5Ghz + 16GB DDR4 + GTX950 will be a better video editing machine than what you have chosen. Certainly the 6700K will also crush that 3.5Ghz quad-core Haswell for not much more money.
Like I said, heat, noise, and power draw are factors. I will not be running this machine in a company-paid-for, air-conditioned space, it will be in my home office.
 

alcoholbob

Diamond Member
May 24, 2005
Are you going to be rendering or doing intensive background processes 24/7 or something? CPUs all downclock with adaptive voltage/speedstep. My CPU overclocks to 4.6GHz while gaming but on the desktop 99% of the time it runs at 1.2GHz. Power really shouldn't be a concern.
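The reason that 1.2GHz idle matters so much: dynamic CPU power scales roughly as frequency times voltage squared (P ≈ C·V²·f), and SpeedStep drops both at once. A rough illustration (the voltages below are made-up but plausible figures, not measurements):

```python
# Dynamic power scales roughly as P = C * V^2 * f.
# Voltages here are illustrative assumptions, not measured values.
def relative_power(freq_ghz, volts, base_freq=4.6, base_volts=1.25):
    """Dynamic power draw relative to the full-speed operating point."""
    return (volts ** 2 * freq_ghz) / (base_volts ** 2 * base_freq)

idle = relative_power(1.2, 0.80)  # downclocked, undervolted desktop idle
print(f"Idle draws ~{idle:.0%} of full-load dynamic power")  # ~11%
```

So even a CPU that boosts aggressively under load spends most of a desktop workday drawing a small fraction of its rated power.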
 

tential

Diamond Member
May 13, 2008
Are you going to be rendering or doing intensive background processes 24/7 or something? CPUs all downclock with adaptive voltage/speedstep. My CPU overclocks to 4.6GHz while gaming but on the desktop 99% of the time it runs at 1.2GHz. Power really shouldn't be a concern.

I'm at 1GHz right now while browsing AnandTech on my 4770K, or thereabouts, usually even lower. So yeah, I'd see how Skylake has improved on this: that's on Haswell, and we've had Devil's Canyon, Broadwell, and now Skylake to improve on it since. I wouldn't knock Skylake at all.
 

alcoholbob

Diamond Member
May 24, 2005
I'm at 1GHz right now while browsing AnandTech on my 4770K, or thereabouts, usually even lower. So yeah, I'd see how Skylake has improved on this: that's on Haswell, and we've had Devil's Canyon, Broadwell, and now Skylake to improve on it since. I wouldn't knock Skylake at all.

Skylake is only supposed to bring a 13-15% power reduction compared to Haswell at the same clocks, so the savings aren't significant unless you are running a render farm with thousands of machines.
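To put a 15% reduction in dollar terms, here's a rough yearly estimate. The wattage, hours, and electricity rate are illustrative assumptions, not measured figures:

```python
# Rough yearly electricity cost of a CPU's power draw, and what a 15%
# reduction at the same clocks would be worth. All inputs are assumptions.
def yearly_cost(watts, cents_per_kwh=12, hours_per_day=8):
    """Dollars per year for a given average power draw."""
    kwh = watts / 1000 * hours_per_day * 365
    return kwh * cents_per_kwh / 100

base = yearly_cost(84)                  # e.g. a Haswell quad under load
saved = base - yearly_cost(84 * 0.85)   # 15% lower draw at the same clocks
print(f"${base:.2f}/yr, saving ${saved:.2f}/yr")  # $29.43/yr, saving $4.42/yr
```

A few dollars a year per machine: meaningful at datacenter scale, noise for a single home-office box.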