
Titan: Excellent Performance per Dollar

lambchops511

Senior member
For the PhD minions out there, it's time to convince your advisor to buy Titan. We're seeing a 2x speedup over the GTX 680 without any code modifications! Time to tell your advisor that time is money: double your research throughput with Titan!
 
The real question is how much faster is Titan over the GTX 580? Everyone knows the 680's non-gaming performance was crippled significantly.
 
The real question is how much faster is Titan over the GTX 580? Everyone knows the 680's non-gaming performance was crippled significantly.

This is a mixed answer... it really depends on what you are doing. Very roughly, for our applications they are more or less equal, maybe a 10% edge for the 580s.
 
For the PhD minions out there, it's time to convince your advisor to buy Titan. We're seeing a 2x speedup over the GTX 680 without any code modifications! Time to tell your advisor that time is money: double your research throughput with Titan!

This reads like an infomercial. :hmm: Tone down the rhetoric!
 
Got to agree with Silver here - if it's maybe 10% over the 580s, then price-wise it's terrible compared to the 580s.

580s can be had for $200ish... Titan for a grand... paying $800 for a 10% improvement is not exactly great performance per dollar, unless that extra 10% is completely vital.
 
Got to agree with Silver here - if it's maybe 10% over the 580s, then price-wise it's terrible compared to the 580s.

580s can be had for $200ish... Titan for a grand... paying $800 for a 10% improvement is not exactly great performance per dollar, unless that extra 10% is completely vital.

I meant 10% over the 680... ~2x for Titan over the GTX 580. Where can you buy GTX 580s for $200? newegg.com still has them for $400...

$1k is really nothing... Consider the simple math of a PhD student costing $10/hr. Suppose you run your GPUs 24 hrs a day for 300 days a year (a good estimate); that's 7,200 hours. If you can cut your simulation time in half, that's 3,600 hours saved, and at $10/hr, that's still $36k.
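That back-of-the-envelope calculation, using the post's own assumptions (24 hrs/day, 300 days/year, time valued at $10/hr), works out to:

```python
# Back-of-the-envelope ROI estimate for a 2x GPU speedup.
# Assumptions (from the post, not measured): GPUs run 24 hrs/day
# for 300 days/year, and compute time is valued at $10/hr.
hours_per_year = 24 * 300            # 7,200 hours of GPU time
hours_saved = hours_per_year / 2     # a 2x speedup halves the runtime
dollars_saved = hours_saved * 10     # at $10/hr

print(hours_per_year, hours_saved, dollars_saved)  # 7200 3600.0 36000.0
```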

Previously we were running $20k 4x4 CPU machines; even the GTX 580s were giving much more performance per dollar.

Result? We're getting something like 30 Titans... as soon as our supply channel has them in stock 🙁
 
That makes no sense, as we know the 680's non-gaming performance was crippled compared to the 580 - the tests show the 680 is nowhere near the 580's performance there...
 
I've dealt with academics - and money is a very touchy subject, especially when it comes to overspending... hell, to even consider upgrading the computer labs I had to show, in several reports: why, the cost per dollar, whether the upgrade was worth the time and effort over what we have now, and the performance gain.

I know for a fact my university would take a long hard look at this before taking a bite *6-plus months of debates and reports*... but it's a smaller university and it doesn't like wasting money.
 
Lamb, thanks - that is a bit interesting... 🙂 Guess it can have its uses - hey, if you can get the budget for the Titans, awesome 🙂
 
I've dealt with academics - and money is a very touchy subject, especially when it comes to overspending... hell, to even consider upgrading the computer labs I had to show, in several reports: why, the cost per dollar, whether the upgrade was worth the time and effort over what we have now, and the performance gain.

I know for a fact my university would take a long hard look at this before taking a bite *6-plus months of debates and reports*... but it's a smaller university and it doesn't like wasting money.

I think the trick is you just need to show whoever is in charge of the $$ that you will get a good ROI. IMO, it's not a waste of money. If you don't purchase the hardware and get research results faster, another group will, and they will publish before you. Publish or perish http://www.phdcomics.com/comics/archive/phd100311s.gif.
 
The real question is how much faster is Titan over the GTX 580? Everyone knows the 680's non-gaming performance was crippled significantly.

A whole lot faster...
http://www.anandtech.com/show/6774/nvidias-geforce-gtx-titan-part-2-titans-performance-unveiled/3

[Benchmark charts from the linked article: 53221.png, 53222.png, 53225.png, 53223.png, 53224.png, 53400.png]
 
I meant 10% over the 680... ~2x for Titan over the GTX 580. Where can you buy GTX 580s for $200? newegg.com still has them for $400...

$1k is really nothing... Consider the simple math of a PhD student costing $10/hr. Suppose you run your GPUs 24 hrs a day for 300 days a year (a good estimate); that's 7,200 hours. If you can cut your simulation time in half, that's 3,600 hours saved, and at $10/hr, that's still $36k.

Previously we were running $20k 4x4 CPU machines; even the GTX 580s were giving much more performance per dollar.

Result? We're getting something like 30 Titans... as soon as our supply channel has them in stock 🙁

10% faster than a 680 for 2.5x the price is a great value?

In another thread you said they were 3x faster than both the 580/680.

I'm confused as to why your numbers keep changing between and within threads.

Your math also assumes that the grad student is working 24x300 and isn't doing anything but waiting for results to be returned.

Since I'm not all that familiar with GPU compute, can somebody point out to me why I shouldn't be highly skeptical of the OP's posts?
 
If anything, the benchmarks Cloudfire777 posted make the 7970 (stock, non-GHz ed) look like "Excellent Performance per Dollar", not the Titan.

Titan is the "best performance" card (but poor performance per dollar).


By the title of the thread, I thought it was started on April 1st. 😛

Yep, it's just a few days off the mark - was my thought too.
 
From the benchmarks posted, it looks like Titan is worth the money for those doing computations like that. If I was doing that, I would seriously consider getting Titans.
 
What the heck did Nvidia do to neuter Kepler in compute tasks? Does it really save many transistors or much wattage?

I'm thinking they nerfed Kepler on purpose to force students/professionals/edu to buy Tesla cards instead of desktop cards, like they did with the 580s.
 
I don't think you people understand what the OP is trying to say. At all.

First of all, a lot of professional software is highly CUDA-specialized, so the OP may not have any choice other than an Nvidia GPU.

Second, the OP is correct in his statement that Titan netted him over a 2x speedup. Titan is over 3.5x faster than the GTX 680 in double precision.

Third, he says that time is money, and since Titan is a heck of a lot faster than the 7970, they will pay the extra cost for a Titan, since it can finish a computation much faster than a 7970. With DGEMM (double precision) it can output twice what a 7970 can do.

GEMM measures the performance of dense matrix multiplication, and FFT is the Fast Fourier Transform; both numerical operations are important in a variety of scientific fields.
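For anyone unfamiliar, GEMM is the general matrix-multiply kernel (C = alpha*A*B + beta*C); here's a rough NumPy sketch of how DGEMM throughput gets estimated (illustrative only, not the benchmark behind the charts above):

```python
import time

import numpy as np

# Rough double-precision GEMM (DGEMM) throughput estimate.
# A dense n x n matrix multiply costs about 2*n^3 floating-point ops.
n = 512
a = np.random.rand(n, n)  # float64 by default, i.e. double precision
b = np.random.rand(n, n)

start = time.perf_counter()
c = a @ b
elapsed = time.perf_counter() - start

gflops = 2 * n**3 / elapsed / 1e9
print(f"~{gflops:.1f} GFLOP/s for n={n}")
```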
 
What the heck did Nvidia do to neuter Kepler in compute tasks? Does it really save many transistors or much wattage?
Probably not.

I'm thinking they nerfed Kepler on purpose to force students/professionals/edu to buy Tesla cards instead of desktop cards, like they did with the 580s.
They must have learnt it from Intel. Intel loves to do the same thing with low-end CPUs, turning features off on purpose.

Anyways, the title of the thread is really misleading.

In all 3 of the benchmarks where the 7970 is included it's better than the 680. 😛
^ Yep, stock 7970 (non-GHz ed) beating the 680 by huge amounts.
Also makes you question why in some tests they didn't use any AMD cards to compare.


If there was a card that was "excellent performance per dollar" it would be the 7970.


53222.png



680 = 133
7970 = 689 (~5x the 680) (costs ~$399)
7970 GHz ed = wasn't tested.
Titan = 1309 (~1.9x the 7970) (costs ~$999 = 2.5x the 7970)


53225.png



71.5 vs 63 = ~13% difference (a 7970 GHz ed would probably be faster).
Price difference = ~250%.
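Putting the scores and prices quoted above into a quick performance-per-dollar calculation (the GTX 680's ~$499 launch price is my assumption; it isn't in the thread):

```python
# Performance per dollar from the 53222.png scores and the prices
# mentioned in the thread; the GTX 680 price is an assumed launch price.
cards = {
    "GTX 680": (133, 499),
    "HD 7970": (689, 399),
    "Titan": (1309, 999),
}
for name, (score, price) in cards.items():
    print(f"{name}: {score / price:.2f} points per dollar")
# The stock 7970 comes out well ahead of the Titan on this metric.
```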
 