Workstation Graphics Cards worth it?

redechelon

Junior Member
Jul 21, 2010
6
0
0
Hello everyone,

I'm building a system for a friend and I have a few questions for someone familiar with these cards...

For 3D rendering and CAD, do workstation graphics cards have much benefit over gaming cards? I have read reviews comparing workstation cards to other workstation cards, but never to normal cards. No gaming will be done on this system.

The budget is $1k, so after a few reviews, I decided on a V4800 (~$150).
Would this be superior to an HD 5770, or even a GTX 460? Would CUDA help?

I have asked this question in a few other forums, but never got a straight answer... only that "a 5770 or GTX 460 will do the job". I know they would do the job, but which does the job better, and why?

Any insight would be helpful, thank you! :)
 

tweakboy

Diamond Member
Jan 3, 2010
9,517
2
81
www.hammiestudios.com
Yes, yes, yes, and yes to your question.

The pros who do CAD, video, and Photoshop CS5 work use cards in the $1k-$2k range.

These cards aren't meant for playing games, but they have more features than gaming cards, tailored for workstation work: CAD, video, Photoshop, 3D animation, 3ds Max, Maya. Pros use $1k-$2k cards because they have the right specs and they render much faster. They are OpenGL 3.1 compliant and render using OpenGL, though not the same OpenGL you play games with. Since they aren't suited for games, people keep a separate gaming card for games and general computer use.

For an audio workstation, a gaming card is the way to go, FYI.
 

Avalon

Diamond Member
Jul 16, 2001
7,571
178
106
From my understanding, typically a workstation class card will do the job better because its drivers are designed to handle that sort of thing differently than a consumer graphics card would, even if they are using similar GPUs under the hood.

Sounds like the V4800 is using a similar GPU to the Radeon 5670. Not sure if it would be faster than a Radeon 5770 in CAD, though. Maybe someone else here can clear that up.
 

redechelon

Junior Member
Jul 21, 2010
6
0
0
Awesome, thanks a lot guys.

So, the drivers play a large role here?

Yeah, I noticed that too... the V4800 is Redwood-based, which is why I was curious about a 5770 at the same price point.

I think I will recommend the V4800 if he doesn't plan on gaming AT ALL, unless someone else has more input.

Just to clear another thing up: I can't find a V4800 in any PSU calculator... is it safe to assume its power draw is very similar to a 5670's?
 

thedosbox

Senior member
Oct 16, 2009
961
0
0
I would suggest checking tweakboy's post history when trying to decide whether to take him seriously.

Driver quality and support are why professionals buy workstation cards. They can play games, albeit with poorer performance as their drivers are more focused on rendering quality (i.e. they don't have game specific "optimizations").
 

wwswimming

Banned
Jan 21, 2006
3,695
1
0
I would suggest checking tweakboy's post history when trying to decide whether to take him seriously.

Driver quality and support are why professionals buy workstation cards. They can play games, albeit with poorer performance as their drivers are more focused on rendering quality (i.e. they don't have game specific "optimizations").

You get similar driver quality and support by using gaming cards that have been out a year and have pulled to the head of their class - as a gaming card.

E.g. the 4850 1 GB, introduced two years ago. When I bought it, it was $120. Compatible with Max, Maya, SolidWorks, the entire Adobe suite, etc.

I've used the same approach (a stable gaming card) on other workstations with Pro/E and OneSpace Designer (HP's solid modelling program).

I have had expensive workstation cards bought for me. They work no better than a good gaming video card that I selected.

True, this approach keeps you from buying the newest, latest, greatest - but so what? "Newest, latest, greatest" means they haven't had time to tweak the drivers - which is what you're paying for with the pro card.
 

Voo

Golden Member
Feb 27, 2009
1,684
0
76
Sure, you can normally flash a last-gen gaming card to its workstation version and then use those drivers, but from looking at the Maya site it seems they don't support that: http://download.autodesk.com/us/qualcharts/2011/maya2011_qualifiedgraphics_win.pdf

They even mention that specifically: "Important: Although Autodesk tested the NVIDIA GeForce and ATI Radeon consumer graphics cards, it is Autodesk, NVIDIA, and AMD policy to only recommend and support the professional NVIDIA Quadro, ATI FirePro, and ATI FireGL graphics family cards."
It should be the same for most professional software out there - sure, you can do it and it may very well work perfectly fine, but I doubt that someone who buys extremely expensive software wants to save money by running unsupported hardware. It doesn't sound like such a great idea.
 

thedosbox

Senior member
Oct 16, 2009
961
0
0
You get similar driver quality and support by using gaming cards that have been out a year and have pulled to the head of their class - as a gaming card.

Uh huh. Good luck getting Adobe, ATI et al to accept your trouble-tickets.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,572
10,208
126
They even mention that specifically: "Important: Although Autodesk tested the NVIDIA GeForce and ATI Radeon consumer graphics cards, it is Autodesk, NVIDIA, and AMD policy to only recommend and support the professional NVIDIA Quadro, ATI FirePro, and ATI FireGL graphics family cards."
It should be the same for most professional software out there - sure, you can do it and it may very well work perfectly fine, but I doubt that someone who buys extremely expensive software wants to save money by running unsupported hardware. It doesn't sound like such a great idea.

Sure is a nice racket they have going there with their "workstation" cards.

Same exact damn chip, slightly different (more stable) drivers? Why can't they just provide the more stable drivers for the "gaming" versions of the card? It's a scam, I tell you.
 

DominionSeraph

Diamond Member
Jul 22, 2009
8,386
32
91
Sure is a nice racket they have going there with their "workstation" cards.

Same exact damn chip, slightly different (more stable) drivers? Why can't they just provide the more stable drivers for the "gaming" versions of the card? It's a scam, I tell you.

Well, they do put more work into their workstation drivers. And if they gave workstation performance to consumer cards, a lot of professionals would go with consumer cards, reducing the pool of people paying for high-level support. That would increase the price of support for the remaining customers, possibly pricing them out of the workstation market, which could make it economically unfeasible to maintain a workstation product. So the entire market would be stuck with consumer-level cards with half-assed drivers.
 

nenforcer

Golden Member
Aug 26, 2008
1,775
14
81
There are many specific optimizations put into workstation graphics card drivers that are not applicable to gaming cards.

Things such as various high-performance antialiasing techniques, level-of-detail handling, and additional API support for 3D modeling and rendering software such as Maya, Softimage, etc.

You are really paying for the software and all of the testing / validation because the hardware is more or less identical.

Depending on your usage of the card, it is more than worth it, especially where rendering time is concerned.
 

Voo

Golden Member
Feb 27, 2009
1,684
0
76
Same exact damn chip, slightly different (more stable) drivers? Why can't they just provide the more stable drivers for the "gaming" versions of the card? It's a scam, I tell you.
Um, no - completely different drivers... and that's exactly the reason those cards are that expensive. Far fewer people buy those cards, so the cost of developing the drivers is shared between fewer people - and yes, they know that professionals will pay more than the average gamer.

But the quality control standards for those professional cards, as well as the support, just cost money, and nobody who pays thousands of dollars for that kind of software would use a cheap gaming card to save a few hundred bucks - it's just not worth it if you consider the budgets we're usually talking about.
And if you really want to save money, you can always flash a workstation BIOS onto an older high-end chip - you're just one generation behind and don't get any support, but it will work.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Workstation cards crush gaming cards in professional 3D applications.

Here are a couple of reviews to look at.

Read this one first.
http://hothardware.com/Reviews/AMD-ATI-FirePro-V8800-Workstation-Graphics-Card/?page=1
You probably aren't interested in a V8800, you probably wouldn't be asking this question if you were, but they compare it to the 5970. You can see the difference in the scores yourself. Now, in fairness to the 5970, you are actually seeing no crossfire scaling. I don't know why they used the 5970. I guess they were shooting for the most dramatic comparison. I can guarantee you though that the 5970 drivers don't have any crossfire profiles for Maya, Max, etc... They should have compared the 5870. There wouldn't have been much of a difference in the 5870 and the 5970 scores, though. So, they'll do for comparison.

Then read this one.
http://hothardware.com/Reviews/AMD-ATI-FirePro-Roundup-V7800-V4800-V3800/
It compares the V8800 with the lower end workstation cards. The V8800 performs even better in the later review (better drivers, I'm assuming?). If you look at even the most basic workstation card though it absolutely destroys the scores put up by the 5970.

Add actual support to the equation as well and it's not a difficult decision, if you are at all serious about graphics work.

As far as improved rendering goes... The graphics card is only used for rendering the editing viewport of these programs. The actual final rendering of the animations, etc., is done by the CPU. If this is going to be a graphics workstation, you'll want the fastest single-thread processor plus as many cores/threads as the budget will allow. Most apps are single-threaded in the editing part of the program, but can use as many cores/threads for final rendering as you can supply.
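To illustrate the point (this is just a toy sketch of my own, not any renderer's actual code): the final render splits into independent tiles or frames, so it scales with however many cores you throw at it, while the interactive viewport is basically one sequential loop.

Code:
#include <algorithm>
#include <cstdio>
#include <thread>
#include <vector>

// Stand-in for an expensive per-tile render job (hypothetical workload).
static double render_tile(int tile) {
    double acc = 0.0;
    for (int i = 0; i < 5000000; ++i)
        acc += (tile * 0.001 + i) * 1e-9;
    return acc;
}

int main() {
    const int tiles = 64;
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::vector<double> result(tiles);
    std::vector<std::thread> pool;

    // No tile depends on any other, which is why offline/final rendering
    // can eat as many cores and threads as you can supply.
    for (unsigned w = 0; w < workers; ++w) {
        pool.emplace_back([&, w] {
            for (int t = (int)w; t < tiles; t += (int)workers)
                result[t] = render_tile(t);
        });
    }
    for (auto& th : pool) th.join();

    std::printf("Rendered %d tiles on %u worker threads\n", tiles, workers);
    return 0;
}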

On the cheap, I'd get a Phenom II X6 1055T. For a bit more, if you're more of an Intel fan, I'd get the i7-875K. Then I'd O/C the piss out of them. ;) These are bang-for-the-buck systems, of course. The sky's the limit if you have the funds.
 

Emulex

Diamond Member
Jan 28, 2001
9,759
1
71
The workstation cards are binned for their perfect math - the consumer cards are imperfect. They very much use the same core; it's just binning. Think of the top-of-the-line Xeon that is the same chip as your Core chip - it has a few extra features to protect from errors (end-to-end ECC) and may even use ECC memory (or RAID). 128-bit math needs to be perfect - on a consumer card it's fine to have an error here or there or to reset under stress; very much not fine when you have spent days manipulating data.

So yeah, it costs more. But likewise, workstation motherboards are low-end server boards with Westmere support and ECC RDIMM support. Why would you go with any less for your profession?
 

Voo

Golden Member
Feb 27, 2009
1,684
0
76
Workstation cards crush gaming cards in professional 3D applications.

Here are a couple of reviews to look at.

Read this one first.
http://hothardware.com/Reviews/AMD-ATI-FirePro-V8800-Workstation-Graphics-Card/?page=1
You probably aren't interested in a V8800, you probably wouldn't be asking this question if you were, but they compare it to the 5970. You can see the difference in the scores yourself. Now, in fairness to the 5970, you are actually seeing no crossfire scaling. I don't know why they used the 5970. I guess they were shooting for the most dramatic comparison. I can guarantee you though that the 5970 drivers don't have any crossfire profiles for Maya, Max, etc... They should have compared the 5870. There wouldn't have been much of a difference in the 5870 and the 5970 scores, though. So, they'll do for comparison.

Then read this one.
http://hothardware.com/Reviews/AMD-ATI-FirePro-Roundup-V7800-V4800-V3800/
It compares the V8800 with the lower end workstation cards. The V8800 performs even better in the later review (better drivers, I'm assuming?). If you look at even the most basic workstation card though it absolutely destroys the scores put up by the 5970.

Add actual support to the equation as well and it's not a difficult decision, if you are at all serious about graphics work.
Though if you're already using a gaming card (= no support), I think you'd try to flash a workstation BIOS onto it and then use the pro drivers. In that case the differences should be minimal, but you have to wait until someone releases the appropriate BIOS (and the hacks to get it to work), and you take on all the extra work and added uncertainty.

Possible savings: ~$1k
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Though if you're already using a gaming card (= no support), I think you'd try to flash a workstation BIOS onto it and then use the pro drivers. In that case the differences should be minimal, but you have to wait until someone releases the appropriate BIOS (and the hacks to get it to work), and you take on all the extra work and added uncertainty.

Possible savings: ~$1k

I've got no problem with that. In the end you have a workstation card. I was just posting because people were misinforming the OP that gaming cards can come anywhere near the performance of a workstation card in graphics apps. The V3800 is only a $120 card - look at the benchmarks.

If the OP truly isn't going to use this build for gaming then putting a gaming card (or an "unhacked" gaming card ;)) in it is a mistake.
 

rookwood

Junior Member
Jul 25, 2010
1
0
0
What type of rendering software are you using? I use Revit for 3D modeling and rendering, and if you go to the AUGI forum (Revit section, Hardware) you will see the ongoing dialog about which graphics cards work with past and current versions of Revit. Two years ago I had a $12,000 custom-built workstation that set the benchmark records on the forum. Additionally, I had a $6,500 BOXX mobile workstation. Two months out of warranty, the BOXX fried inside on a Friday afternoon, and I had a critical Monday morning deadline to meet. I went to Best Buy and picked up a $700 consumer-grade HP dv2000 laptop with integrated graphics. I've been running Revit on it up to and including the newly released 2011 version. Obviously it is no speed demon and struggles at times, but one beta version of Revit wouldn't open on my BOXX mobile - go figure!

There is a movement on the AUGI forum to move away from Xeon processors in favor of the i7 and to go with Dell or HP consumer desktops. There used to be a vast difference between "consumer" cards and "professional" cards. The professional cards (Quadro and Fire) are getting to the point where certain drivers may or may not work, and you find yourself using older drivers for newer versions of Revit. There is extreme frustration on the forum with Autodesk for not establishing a standard. Being on the Revit beta testing team, I cross my fingers and close my eyes each time I install and open a new Revit release.

I just spent two entire weeks configuring every Dell, HP, and Lenovo workstation scenario imaginable, and abandoned that yesterday in search of an HP consumer desktop with an i7 processor and an entry-level graphics card - a system that can be upgraded.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I wasn't familiar with Revit. After looking around some forums and Autodesk's site, from what I can see, Revit uses the CPU for rendering, not the GPU.
 

Sahakiel

Golden Member
Oct 19, 2001
1,746
0
86
If you are at all familiar with how CPUs work, then you can sort of understand why workstation cards might perform very differently even though they use the same processor design.
As far as I'm aware, everything you use to run your games gets translated from a common API into a processor-specific set of instructions. Two processors in the same generation likely use the same ISA, but two processors from different generations don't use the same instructions.
How does this relate? It helps explain why people talk about mature drivers even though they're still using the same graphics API. Each new chip needs tweaking, usually through thousands of users, because lab testing is too costly.

How workstation cards differ from gaming cards is through drivers. With games, you don't care about complete support for certain APIs, or good performance for every single call, or even color accuracy. With workstation cards, you care, which is why you buy them. Workstation cards are tweaked to provide better performance with better accuracy and better support than a gaming card.
Sure, you can more than likely run a gaming card with your graphics apps no problem. Usually software only bothers checking whether the API calls go through, not the chip name. But you run the risk of errors the eye can't see, and the lack of vendor support. If none of that is important to you, run with a gaming card; it won't make a difference. If it does, go buy a proper workstation card.
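If anyone wants to see what "checking whether the API calls go through" looks like, here's a minimal sketch (my own illustration, not taken from any CAD package) that creates a hidden OpenGL context with GLFW and queries the driver strings plus one extension - which is roughly all most applications ever look at:

Code:
#include <GLFW/glfw3.h>
#include <cstdio>

int main() {
    if (!glfwInit()) return 1;

    // Hidden window: we only want a GL context so we can ask the driver questions.
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);
    GLFWwindow* win = glfwCreateWindow(64, 64, "probe", nullptr, nullptr);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);

    // These strings come from the installed driver; a FirePro/Quadro driver
    // reports different renderer/version strings than the consumer driver does.
    std::printf("Vendor:   %s\n", (const char*)glGetString(GL_VENDOR));
    std::printf("Renderer: %s\n", (const char*)glGetString(GL_RENDERER));
    std::printf("Version:  %s\n", (const char*)glGetString(GL_VERSION));

    // Applications usually test for capabilities (extensions), not card names.
    const char* ext = "GL_ARB_vertex_buffer_object";
    std::printf("%s: %s\n", ext,
                glfwExtensionSupported(ext) ? "supported" : "missing");

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}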
 

VirtualLarry

No Lifer
Aug 25, 2001
56,572
10,208
126
The workstation cards are binned for their perfect math - the consumer cards are imperfect. They very much use the same core; it's just binning. Think of the top-of-the-line Xeon that is the same chip as your Core chip - it has a few extra features to protect from errors (end-to-end ECC) and may even use ECC memory (or RAID). 128-bit math needs to be perfect - on a consumer card it's fine to have an error here or there or to reset under stress; very much not fine when you have spent days manipulating data.

So yeah, it costs more. But likewise, workstation motherboards are low-end server boards with Westmere support and ECC RDIMM support. Why would you go with any less for your profession?

Are you saying that consumer cards ship with known defects in the SPs? Considering how many CUDA apps there are, and GPGPU in general, that doesn't seem wise, nor correct.
 

Voo

Golden Member
Feb 27, 2009
1,684
0
76
With workstation cards, you care, which is why you buy them. Workstation cards are tweaked to provide better performance with better accuracy and better support than a gaming card.
Ahm, no. Sure, there may be some bugs in one driver that aren't in the other, but better accuracy? If the architecture doesn't support double-precision floating point, no driver in the whole world will change that, and if it does support it, you get exactly that accuracy.

The only reason those drivers are better is that they're better optimized for those application areas, so they're faster; you won't magically lose a few bits of precision just because you switch from a workstation card to a gaming card of the same architecture.
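A quick way to see that precision comes from the arithmetic the hardware exposes and not from the driver (toy example of my own, nothing to do with any particular card):

Code:
#include <cstdio>

int main() {
    float  sum_f = 0.0f;
    double sum_d = 0.0;

    // Accumulate 0.1 ten million times. Single precision drifts visibly
    // once the running sum gets large; double precision stays close to
    // the exact value of 1,000,000.
    for (int i = 0; i < 10000000; ++i) {
        sum_f += 0.1f;
        sum_d += 0.1;
    }
    std::printf("float : %f\n", sum_f);  // noticeably off
    std::printf("double: %f\n", sum_d);  // ~1000000.0
    return 0;
}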
ATI presumably bins the workstation cards exactly like their gaming parts: if the chips reach certain performance goals at certain voltage levels and don't produce any errors, they can be sold; otherwise not. The same chips with another name on them - but if a card produces errors, they won't sell it as a gaming card or a workstation card.


But as 3DVagabond showed, the drivers really do affect performance, so if you want hardware support and don't want to play with hacked BIOSes (you also don't overclock servers, even though you could save money there), it's probably worth the money.
 

tweakboy

Diamond Member
Jan 3, 2010
9,517
2
81
www.hammiestudios.com
I always get one insulting, negative comment from someone on my posts.

You know who you are, DOS 6.22 / Windows 3.1.

I'm 32 and contributing to this site; your hate posts are not needed, nor do I get offended. Actually, I live to tick off 15-year-olds who think they know everything.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I think you definitely contribute, tweakboy, and in a positive way. Your style is entertaining, though. I think most comments are in good humor. ;)
 

mv2devnull

Golden Member
Apr 13, 2010
1,526
160
106
Are you saying that consumer cards ship with known defects in the SPs? Considering how many CUDA apps there are, and GPGPU in general, that doesn't seem wise, nor correct.
Nvidia naturally recommends their Tesla brand for CUDA. Those used to have more memory, a much higher price, and no monitor connector - so clearly different from graphics cards.

Nvidia does offer one unified driver for Linux, both for GeForce and Quadro cards. Naturally, the Quadro line has more "supported features" in the driver.

Hardware stereo: Quadro cards (the more expensive ones) do have a 3-pin mini-DIN for the shutter-glass sync signal. In the CRT era there was no real alternative, and in this "3D Vision" era - at least on Linux - mere sync-via-USB is no match for the dedicated sync connector.