
New NON-GAMING (OMG!?!?) Rig - Advice?

Page 2 - AnandTech Forums
Ok, this is a bigger deal than I thought 🙂.....

First, how much of NV's claims are hype (AKA is it really going to be much better than the equivalent ATI card?) and how much is truth: http://www.nvidia.com/object/adobe_photoshop.html.

If that is pretty accurate, then it seems like a no-brainer that I should be getting an NV card over ATI. Of course, that's not what I wanted to hear since I have been looking at ATI cards this entire time and finally found one that met my requirements (silent cooling).

The stuff on that page is mostly hype. The only major improvement is the Mercury playback engine for Premiere. If you do want to do Nvidia, make sure you get a supported card! You can't just get any card to work, it has to be a card off of this list:
> Quadro 5000*
> Quadro 4000*
> Quadro 5000M*
> Quadro FX 5800

> Quadro FX 4800
> Quadro FX 4800 Mac
> Quadro FX 3800
> GeForce GTX 285
*Adobe qualification will be available with upcoming software release.

I personally wouldn't bother unless you use Premiere though.

Also, I think I misspoke a bit about the monitor thing....all I am basically saying is: if I am using 3 monitors with just one card (the 5770), and all I am doing is working on my main monitor (3007WFP-HC) with nothing on the other screens other than some file windows open, is it true that my performance is going to suffer (relative to if I just had one screen) due to the other monitors using up some of my vRAM?

If 3 monitors really will chew up equal amounts of the vRAM, and if NV's claims are accurate, it looks like I will be looking for 2 Nvidia cards now (and of course, the second card does not need to be powerful...).

There would be a small amount of memory used for the framebuffer, but in 2D mode that would be ~18MB for 2 1920x1200 panels using 32-bit color. That's a negligible amount really.
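To sanity-check that figure, here's the arithmetic in a few lines of Python (assuming 4 bytes per pixel at 32-bit color, single-buffered):

```python
# Back-of-the-envelope framebuffer size for two 1920x1200 panels
# at 32-bit color (4 bytes per pixel), single-buffered.

def framebuffer_mib(width, height, bytes_per_pixel=4):
    """Return the framebuffer size in MiB for one panel."""
    return width * height * bytes_per_pixel / 1024 / 1024

two_panels = 2 * framebuffer_mib(1920, 1200)
print(f"{two_panels:.1f} MiB")  # -> 17.6 MiB, matching the ~18MB figure
```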
 
See, that's what I thought, but then David put this:

I know it's nice to have 3 monitors but that kills the card (especially if they are three HD monitors) because it splits video ram among them equally. If you want three, get two Geforce 9800s (or better cards) with 1GB vRAM. This will help with video previews and I'm guessing photo editing as well. I've edited a lot of 4K photos at 1080p but have little experience with .RAW files.

You seem to know your stuff mfenn (and it sounds like what you are saying definitely disagrees with David's statement), but I just don't want to ignore what someone else said either, considering it's regarding a very critical element of what I get.

I guess the reason I am seemingly overly concerned here is, I already feel somewhat silly for dropping ~$800 for what is probably only a marginal increase over what I already have, so I would hate to do something (i.e. use 3 monitors with one card) that will legitimately cut into my already relatively small increase in performance. But yeah, if it really is as you describe mfenn - a totally negligible drop in performance - then 5770 it is.
 

Nvidia cards are better for prosumer computers because of CUDA. That is the difference. CUDA assists the processor in rendering video, 3D, and composition work. There is no question that Nvidia has far surpassed ATI in developing workstation graphics technology. The support from 3rd-party developers (Adobe, Autodesk) shows that. If you were just getting a gaming rig, I would recommend ATI for their better price/performance ratio IN GAMING, but no question GeForce and CUDA have my vote for this build.
 
Hahaha, this is what I get for being so hesitant - Newegg's 10% cashback ended tonight, which means I lost out on about $80-85. This is all your fault, David, for making me question my 5770 🙂.
 
Cashback was on the 27th (almost all day) and then for only another few hours on the 28th.

This is interesting...time for more questions than answers...I can't verify if this is true or not:

"The other issue is that the HD5770 does not support Double Precision math in line, unlike the HD4770, which means if the calculations require 64 bit math (or greater than 32bit FP) then it will run slower, and this is likely an issue when calculating images with base 48bit colour palettes in RAW format."
http://www.tomshardware.com/forum/278452-33-photoshop-nvidia-radeon

For a CUDA workaround for "non-supported" nV cards, see the link I posted earlier; also check here:
http://blog.krama.tv/hacking-adobe-premiere-cs5-to-enable-more-nvidia-cuda-cards/
 

What David says in that quote makes absolutely no sense. Think about it this way. The VRAM is divided into a few partitions, but let's focus on two. (I am grossly simplifying here for clarity.)

There is the actual framebuffer (probably about ~40MB with a triple display). The framebuffer is where the actual pixel data that is going out to the monitor will be stored. This is the only part of the VRAM that cares about where the pixels are on the monitor(s) (their coordinates essentially).

The other 95% of the VRAM is used as working memory for shader programs (i.e. what actually renders the 3D). These programs don't know and don't care what monitor or where on a monitor the output is going. They simply render the image in their own coordinate space and that render is later composited into the final image that is put into the framebuffer.

Bottom line: absolutely no video card is brain-dead enough to statically partition all available VRAM. Doing so makes no sense once you understand the pipeline.
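For the actual setup in question (a 2560x1600 30" plus two 1920x1200 panels), the same arithmetic shows how small the framebuffer slice is next to the card's total VRAM. This is a rough sketch that assumes the 1GB HD 5770 model and single-buffered surfaces; double buffering and desktop-composition surfaces roughly double it, which is where the ~40MB ballpark comes from:

```python
def fb_mib(w, h, bytes_per_pixel=4):
    """Framebuffer size in MiB for one panel at 32-bit color."""
    return w * h * bytes_per_pixel / 1024 ** 2

# The three panels in this thread: 3007WFP-HC + two 2405WFPs
displays = [(2560, 1600), (1920, 1200), (1920, 1200)]
total = sum(fb_mib(w, h) for w, h in displays)

vram = 1024  # MiB -- assuming the 1GB HD 5770 model
print(f"framebuffers: {total:.1f} MiB = {100 * total / vram:.1f}% of VRAM")
# -> framebuffers: 33.2 MiB = 3.2% of VRAM
```

Even doubled for back buffers, that leaves well over 90% of the VRAM free for shader working memory, regardless of how many monitors are attached.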
 

I must be educated by idiots... and I was spreading the idiocy everywhere. It would technically still have the extra resolution stressing it though right?

I just did a ton of research on CUDA in Premiere Pro CS5 (not really used much in CS4) and it looks promising. Although there is a short list supported by default (the high-end Quadros and the GTX 285), adding support for any other CUDA GPU is as simple as adding it to a text document. I'm still asking questions as to how stable it is, but if you can wait a day or so I should have answers. Where I found the hacks, the people testing seemed to recommend the new Fermi cards for their high CUDA core count. So they SEEM to run the Mercury Playback Engine in relative comfort.

I have to give a bow to mfenn. He has helped me with many problems and corrected me many times. I'm arguing for a technology I don't have as well, but I have done enough hours of research to regret getting my ATI. As soon as I come into some money in the next week, I am getting a 465 or 470 to replace it.
 

It's OK, we all have to learn sometime. 🙂 The extra resolution will take a little chunk out of the available memory, but it won't make much of a difference.
 
Ok, got all my stuff in, and after further debate, I think I've decided to go ahead and get an SSD - it just seems silly to do this big upgrade, then leave out something like the SSD that could legitimately help a system probably more than any other upgrade I did, just because I feel like they are still so overpriced.

My main hesitation though is the small size of these things. My old computer build (I finally got rid of it) had a 36GB Raptor (that I overpaid for at the time) as the main drive (with windows and my main programs), then bigger drives as backups/less accessed programs. I ended up hating this though, since I would inevitably end up filling up the drive somehow. The culprit was almost always temp files for things like lightroom/photoshop, and large cache/temp files for chrome/firefox. I vowed to never do that again, and I feel like I am about to jump right back on the ship.

So basically, as clarification, is the "strategy" for using SSD really just like I did with the raptor before (SSD as boot drive along with main programs), and if so, how do people manage the large temp files, etc., with these smaller 60GB drives???
 

Yes, that's the strategy pretty much. I've never had a problem with the browser caches though. I've migrated my profile since Firefox 0.9 and never deleted anything out of it and it's only 56MB.

EDIT: Hell, my whole roaming profile is only 5GB (most of that is Thunderbird).
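If you want to keep an eye on what's eating a small SSD, a quick script beats guessing. Here's a rough Python sketch that ranks the immediate subdirectories of a folder by total size, so runaway caches like that Lightroom/Chrome stuff show up immediately (the commented usage path is just an example):

```python
import os

def dir_size_bytes(path):
    """Total size of all regular files under path (skips symlinks)."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            fp = os.path.join(root, name)
            if not os.path.islink(fp):
                total += os.path.getsize(fp)
    return total

def biggest_subdirs(path, top=5):
    """Rank the immediate subdirectories of path by total size."""
    sizes = [(dir_size_bytes(os.path.join(path, d)), d)
             for d in os.listdir(path)
             if os.path.isdir(os.path.join(path, d))]
    return sorted(sizes, reverse=True)[:top]

# Example: audit your user profile for runaway caches
# for size, name in biggest_subdirs(os.path.expanduser("~")):
#     print(f"{size / 1024**2:8.1f} MiB  {name}")
```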
 
Yea, for whatever reason, Chrome temp/cache was getting up to like 4GB, so I deleted most everything in there, and I ended up effing up stuff (making me even more frustrated).....it's much easier to just never even worry about it 🙂.

So would you say 60GB would/should be enough for boot/adobe suite/zune software/a few other small programs (AKA if I am diligent at all with keeping this stuff in line, 60GB should be plenty), or is it likely that 60GB could still very well not be enough??
 

60GB will be plenty. You can get Windows 7 x64 and CS4 into 30GB if you really try.
 
mfenn is advising me to put the money into a 64GB right now. That seems to be the sweet spot for where my installations are at. I have CS4 and 3ds Max and other random programs like Office and AIM installed on my 150GB VelociRaptor. I likely wouldn't use the entire thing if not for secondary programs I could put on the other drive. Turns out you can get a 64GB SATA III and a 1TB Caviar Black for the same price as a SATA III 600GB VelociRaptor. So all in all it's a pretty good setup.
 
Alright, quick update:

Everything is installed and working properly - really happy with how everything worked out. I do have one problem though:

With the 5770 I have 3 monitors hooked up: 3007wfp-hc (2560x1600) into DVI, 2405wfp (1920x1200) into DVI, and a 2nd 2405wfp (1920x1200) into a DP -> VGA adapter. The first two monitors work perfectly, and the 2405 on the DP -> VGA is able to achieve its native 1920x1200 resolution, except the "quality" is horrible. It's hard to explain (I will post a picture later), but basically none of the pixels are crisp at all...it's like they all kinda bleed together a bit, and the colors are very drab. I tried the other 2405 with the adapter and got the same result. Not sure what the problem is - I am googling it right now, but can't find anyone with the same issue.

Any ideas?
 

VGA is simply not that great at driving 1920x1200, especially with cheap cables/adapters. What you're seeing is not surprising. You could try getting a higher-quality adapter/cables.

Since the 5000 series only supports 2 DVI outputs at once (even with passive adapters), the only true solution is an active DP to DVI adapter or a native DP display.
 
Yea - the cable is brand new and the adapter is a Sapphire one that goes with the card, so I DOUBT that's the problem, but I will pick up some different ones on the way home just to rule that out.

Assuming no adapter/cable combination fixes this problem, is there anything else to do? I ask because I have read about many people online able to achieve a perfect picture using the exact same setup (with the exception of another 24-inch instead of the 30-inch) - any way the 30-inch could be pulling too much power or something?
 

No, the 30" doesn't matter. Just because something comes with the card, doesn't mean that it is high quality. In fact, it usually means just the opposite. Why give away an expensive freebie when you can give away a cheap one?

Like I said, trial and error might get you a better picture, but it will never be perfect like a DVI one will be. Personally, I can always tell the difference between DVI and VGA at 1600x1200 and greater, and the difference looks just as you describe, albeit to a lesser extent.

"Many people" may have a much less discerning idea of perfect than you or I. If we can't observe their setups in person, their subjective quality impressions are useless.
 

The adapter didn't technically "come with" the card; it's just made by Sapphire for Eyefinity, and it got really good reviews on Newegg. Still worth trying a different adapter, obviously.

If I can get it to where the difference is negligible, I am perfectly fine, considering this will only be used for IM windows, folders, etc. But where it is now.....it's not even close to acceptable, and while I completely agree that many people can't tell quality issues (I swear, back in the day, people would get brand new TVs, hook up a DVD player incorrectly through component inputs (of course causing the colors to be totally off), and still claim "dude, my new TV is AMAZING!"), I can guarantee that anyone would be able to tell here.

I'll report back after new cables and adapter.
 

The Mercury Playback Engine has been "unlocked" for the newer Fermi cards and some of the older high-end Nvidia cards.

http://www.studio1productions.com/Articles/PremiereCS5.htm

It seems pretty simple when you get to the meat of the article; they just don't want people trying this with those cruddy low-profile cards. All you do is add your card to a list in a .txt file and your card supports Mercury Playback 🙂.
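For what it's worth, the edit the article describes boils down to appending your card's name to that text file. Here's a rough Python sketch of the idea - the path and filename are the ones given in the linked article and may differ by install, the card name has to match exactly what your GPU reports (e.g. "GeForce GTX 470"), and you should back the file up first:

```python
def add_cuda_card(list_path, card_name):
    """Append card_name to Premiere's supported-card list if missing."""
    with open(list_path, "r", encoding="utf-8") as f:
        cards = [line.strip() for line in f if line.strip()]
    if card_name in cards:
        return False  # already listed, nothing to do
    cards.append(card_name)
    with open(list_path, "w", encoding="utf-8") as f:
        f.write("\n".join(cards) + "\n")
    return True

# Hypothetical usage (path per the linked article; back the file up first):
# add_cuda_card(r"C:\Program Files\Adobe\Adobe Premiere Pro CS5"
#               r"\cuda_supported_cards.txt", "GeForce GTX 470")
```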
 